Batch Processing for Multiple Leads
This guide shows how to efficiently process multiple client intake records using the Screen API.
Overview
While the Screen API processes one record at a time, you can implement client-side batching to handle multiple leads efficiently.
Implementation Strategy
There are several approaches to handling batches of records:
1. Sequential Processing
The simplest approach is to process records one after another:
async function processLeads(leads) {
  const results = {
    valid: [],
    invalid: []
  };

  for (const lead of leads) {
    try {
      // Screen a single record
      const response = await fetch('https://www.caseverdict.com/v1/screen', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${API_KEY}`
        },
        body: JSON.stringify({ record: lead })
      });

      const data = await response.json();

      // Collect the record into the valid or invalid bucket
      if (data.success) {
        if (data.results.valid.length > 0) {
          results.valid.push(data.results.valid[0]);
        } else if (data.results.invalid.length > 0) {
          results.invalid.push(data.results.invalid[0]);
        }
      }
    } catch (error) {
      // A single failure should not stop the rest of the batch
      console.error(`Error processing lead: ${lead.email}`, error);
    }
  }

  return results;
}
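For reference, here is a minimal usage sketch. The API_KEY constant and the shape of the lead objects are illustrative assumptions; match them to your own configuration and intake schema.

// Illustrative setup: the key source and lead fields are placeholders
const API_KEY = process.env.CASEVERDICT_API_KEY;

const leads = [
  { email: 'jane@example.com', name: 'Jane Doe' },
  { email: 'not-an-email', name: 'Incomplete Record' }
];

processLeads(leads).then(({ valid, invalid }) => {
  console.log(`Valid: ${valid.length}, invalid: ${invalid.length}`);
});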
2. Parallel Processing with Concurrency Limits
For better throughput, process multiple records concurrently while respecting the API's rate limits:
async function processLeadsInParallel(leads, concurrencyLimit = 5) {
  const results = {
    valid: [],
    invalid: []
  };

  // Process leads in chunks to control concurrency
  for (let i = 0; i < leads.length; i += concurrencyLimit) {
    const chunk = leads.slice(i, i + concurrencyLimit);

    const promises = chunk.map(lead =>
      fetch('https://www.caseverdict.com/v1/screen', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${API_KEY}`
        },
        body: JSON.stringify({ record: lead })
      })
        .then(response => response.json())
        .then(data => {
          // Collect the record into the valid or invalid bucket
          if (data.success) {
            if (data.results.valid.length > 0) {
              results.valid.push(data.results.valid[0]);
            } else if (data.results.invalid.length > 0) {
              results.invalid.push(data.results.invalid[0]);
            }
          }
          return data;
        })
        .catch(error => {
          // A failed request should not abort the whole chunk
          console.error(`Error processing lead: ${lead.email}`, error);
          return null;
        })
    );

    // Wait for the whole chunk to finish before starting the next one
    await Promise.all(promises);

    // Add a small delay between batches to avoid rate limiting
    if (i + concurrencyLimit < leads.length) {
      await new Promise(resolve => setTimeout(resolve, 1000));
    }
  }

  return results;
}
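As a quick usage sketch, you might call the parallel version with a lower concurrency limit; the limit of 3 here is arbitrary and should be tuned against your account's rate limit.

// Screen up to 3 leads at a time; tune this against your rate limit
processLeadsInParallel(leads, 3).then(results => {
  console.log(`Valid: ${results.valid.length}, invalid: ${results.invalid.length}`);
});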
Rate Limiting Considerations
- The API has a rate limit of 100 requests per minute for PRO accounts
- Implement exponential backoff for retry logic (see the sketch after this list)
- Monitor 429 responses and reduce your concurrency when they appear
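As a rough guide, the sketch below wraps a single screening call with exponential backoff on 429 responses. The screenLead name, retry count, and delays are illustrative, not part of the API.

async function screenLead(lead, maxRetries = 3) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const response = await fetch('https://www.caseverdict.com/v1/screen', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${API_KEY}`
      },
      body: JSON.stringify({ record: lead })
    });

    // Anything other than 429 is returned to the caller as-is
    if (response.status !== 429) {
      return response.json();
    }

    // Back off exponentially: 1s, 2s, 4s, ...
    if (attempt < maxRetries) {
      const delay = 1000 * 2 ** attempt;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }

  throw new Error(`Rate limited after ${maxRetries} retries: ${lead.email}`);
}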
Example: Processing a CSV File
Here’s an example of processing leads from a CSV file:
const fs = require('fs');
const csv = require('csv-parser');

const leads = [];

// 1. Read the CSV file
fs.createReadStream('leads.csv')
  .pipe(csv())
  .on('data', (row) => leads.push(row))
  .on('end', async () => {
    // 2. Process the leads
    const processedResults = await processLeadsInParallel(leads);

    // 3. Save the results
    fs.writeFileSync(
      'valid_leads.json',
      JSON.stringify(processedResults.valid, null, 2)
    );
    fs.writeFileSync(
      'invalid_leads.json',
      JSON.stringify(processedResults.invalid, null, 2)
    );

    console.log(`Processed ${leads.length} leads:`);
    console.log(`- Valid: ${processedResults.valid.length}`);
    console.log(`- Invalid: ${processedResults.invalid.length}`);
  });
Best Practices
- Pre-validate data before sending to save API calls (a sketch follows this list)
- Implement retries with exponential backoff for transient errors
- Process in batches to balance throughput with reliability
- Log outcomes for auditing and troubleshooting
- Consider implementing a queue for very large datasets
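As one example of pre-validation, the sketch below drops records without a plausible email address before any API calls are made. The required field and the regex are assumptions about your intake schema, not requirements of the Screen API.

function preValidate(leads) {
  // Illustrative check: require something that looks like an email
  const emailPattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  return leads.filter(lead => lead.email && emailPattern.test(lead.email));
}

// e.g. inside the CSV handler above:
//   const processedResults = await processLeadsInParallel(preValidate(leads));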