

The parseInBatches function can parse incrementally from a stream of data as it arrives and emit "batches" of parsed data.

Batched parsing is only supported by a subset of loaders; check the documentation of each loader before using this function.

That said, parseInBatches can be called with any loader. Loaders that do not support batched parsing will wait until all data has arrived and then emit a single batch containing the parsed data for the entire input (effectively behaving as if parse had been called).


Parse CSV in batches (emitting a batch of rows every time data arrives from the network):

import {fetchFile, parseInBatches} from '@loaders.gl/core';
import {CSVLoader} from '@loaders.gl/csv';

const batchIterator = await parseInBatches(fetchFile(url), CSVLoader);
for await (const batch of batchIterator) {
  console.log(`Received batch of ${batch.length} rows`);
}

Parse CSV in batches, requesting an initial metadata batch:

import {fetchFile, parseInBatches} from '@loaders.gl/core';
import {CSVLoader} from '@loaders.gl/csv';

const batchIterator = await parseInBatches(fetchFile(url), CSVLoader, {metadata: true});
for await (const batch of batchIterator) {
  switch (batch.batchType) {
    case 'metadata':
      console.log('Metadata batch:', batch); // describes the data being loaded
      break;
    default:
      console.log(`Received batch of ${batch.length} rows`);
  }
}


async parseInBatches(data: DataSource, loaders: object | object[], options?: object): AsyncIterator

async parseInBatches(data: DataSource, options?: object): AsyncIterator

Parses data in batches from a stream, releasing each batch to the application while the stream is still being read.

Parses data with the selected loader object. An array of loaders can be provided, in which case an attempt will be made to autodetect which loader is appropriate for the file (using url extension and header matching).

  • data: loaded data, or an object that allows data to be loaded. Please refer to the table below for valid types.
  • loaders: can be a single loader or an array of loaders. If omitted, the list of registered loaders will be used (see registerLoaders).
  • options: optional, options for the loader (see the documentation of the specific loader).
  • url: optional, assists in the auto-selection of a loader if multiple loaders are supplied to loaders.


  • Returns an async iterator that yields batches of data. The exact format of the batches depends on the category of the loader.


  • The loaders parameter can also be omitted, in which case any loaders previously registered with registerLoaders will be used.

Input Types

| Data Type | Description | Comments |
| --- | --- | --- |
| Response | Response object, e.g. returned by fetch or fetchFile | Data will be streamed from the response.body stream |
| AsyncIterator | Iterator that yields promises that resolve to binary (ArrayBuffer) chunks or string chunks | |
| Iterator | Iterator that yields binary chunks (ArrayBuffer) or string chunks | String chunks only work for loaders that support textual input (iterators are converted into async iterators behind the scenes) |
| Promise | A promise that resolves to any of the other supported data types can also be supplied | |

Note that many other data sources can also be parsed by first converting them to Response objects, e.g. with fetchResource: HTTP URLs, data URLs, ArrayBuffer, String, File, Blob, ReadableStream etc.


| Option | Type | Default | Description |
| --- | --- | --- | --- |
| options.metadata | boolean | false | An initial batch with batchType: 'metadata' will be added with information about the data being loaded |
| options.batches.chunkSize? | number | N/A | When set, "atomic" inputs (like ArrayBuffer or string) are chunked, enabling batched parsing |
| options.fetch | object \| (url: string) => Response | {} | Options passed to fetch when loading data, or a custom fetch function |
| options.transforms | Transform[] | [] | An array with transforms that can be applied to the input data before parsing |