$ man batch-processing

GTM · Automation & Scripts

Batch Processing

Processing a list of records all at once instead of one at a time. Load a CSV of 73 companies, enrich all of them, output a new CSV with the results. The script does the work while you do something else.


why it matters

interactive enrichment doesn't scale. if you're researching one company at a time in Clay or asking Claude about one prospect, you'll spend all day on 20 accounts. batch processing lets you define the workflow once — what data to pull, how to clean it, where to output it — and run it across hundreds of records. the script handles rate limiting, error recovery, deduplication, and progress logging. you come back to a finished CSV. this is the difference between "using AI" and "building AI systems."
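the rate limiting and retry behavior mentioned above can be sketched as a small wrapper. this is a minimal illustration, not the actual script; `call_with_retry` and its parameters are hypothetical names:

```python
import time

def call_with_retry(fn, *args, retries=3, base_delay=1.0, min_interval=0.5,
                    _last_call=[0.0]):
    """Throttle calls to at least min_interval apart; retry failures
    with exponential backoff. _last_call is a deliberate mutable default
    used as crude shared state across calls."""
    for attempt in range(retries):
        # simple rate limit: wait until min_interval has passed since the last call
        wait = min_interval - (time.monotonic() - _last_call[0])
        if wait > 0:
            time.sleep(wait)
        _last_call[0] = time.monotonic()
        try:
            return fn(*args)
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries, surface the error
            time.sleep(base_delay * 2 ** attempt)  # back off: 1s, 2s, 4s...
```

error recovery here is deliberately blunt (retry anything, then give up); a real script would usually distinguish rate-limit responses from hard failures.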

how I use it

I use Python scripts with csv.DictReader/DictWriter for batch processing. the pattern is always the same:

1. load the input CSV.
2. load the existing output CSV for deduplication.
3. iterate through each row.
4. call the API (Exa search, enrichment provider, etc.).
5. clean and structure the response.
6. append to the output CSV.
7. log progress.

every script supports a --test flag that only processes the first 3-5 records, so I can validate the output before running the full batch. I also build text cleaning functions that strip HTML entities, nav boilerplate, and excessive whitespace from API responses. raw API output is messy; cleaned output is usable. the cleaning step is half the work.
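the whole pattern fits in one small script. a minimal sketch, assuming a `domain` column in the input CSV and a stubbed `enrich` function standing in for the real API call:

```python
import argparse
import csv
import html
import os
import re

def clean_text(raw: str) -> str:
    """Strip HTML entities, leftover tags, and excess whitespace."""
    text = html.unescape(raw)
    text = re.sub(r"<[^>]+>", " ", text)  # drop leftover markup
    text = re.sub(r"\s+", " ", text)      # collapse whitespace
    return text.strip()

def enrich(domain: str) -> dict:
    # placeholder for the real API call (Exa search, enrichment provider, etc.)
    return {"summary": f"&quot;stub&quot; result for   {domain}"}

def run(input_path: str, output_path: str, test: bool = False) -> int:
    # (2) load existing output for deduplication
    done = set()
    if os.path.exists(output_path):
        with open(output_path, newline="") as f:
            done = {row["domain"] for row in csv.DictReader(f)}

    write_header = not os.path.exists(output_path)
    processed = 0
    # (1) load input CSV; append results so reruns resume where they left off
    with open(input_path, newline="") as inf, \
         open(output_path, "a", newline="") as outf:
        writer = csv.DictWriter(outf, fieldnames=["domain", "summary"])
        if write_header:
            writer.writeheader()
        # (3) iterate through each row
        for row in csv.DictReader(inf):
            if test and processed >= 3:   # --test: first few records only
                break
            if row["domain"] in done:
                continue                  # already enriched, skip
            result = enrich(row["domain"])            # (4) call the API
            writer.writerow({"domain": row["domain"],
                             "summary": clean_text(result["summary"])})  # (5)+(6)
            processed += 1
            print(f"[{processed}] {row['domain']} done")  # (7) log progress
    return processed

if __name__ == "__main__":
    p = argparse.ArgumentParser()
    p.add_argument("input_csv")
    p.add_argument("output_csv")
    p.add_argument("--test", action="store_true")
    args = p.parse_args()
    run(args.input_csv, args.output_csv, test=args.test)
```

because the output is opened in append mode and reloaded for dedup on each run, a crashed batch can simply be rerun and it picks up at the first unprocessed record.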


related terms
Enrichment Pipeline · Rate Limiting · Deduplication