Bulk Read Rows

Catalyst allows you to perform bulk read jobs on a specific table present in the Data Store.

The SDK snippet below creates a Bulk Read job that reads thousands of records from a specific table and, if the job succeeds, generates a CSV file containing the results of the read operation. The table is referred to by its table name or its unique Table ID.

Note: You can also use the dataStore.table().bulkJob('read' | 'write') method to perform either a bulk read or a bulk write job.
Method Used | Description
bulkRead.createJob({ criteria, page, select_columns }) | Creates a new bulk read job.
bulkRead.getStatus(jobId) | Gets the status of a bulk read job.
bulkRead.getResult(jobId) | Gets the result of a bulk read job.
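The criteria argument to createJob is a plain object: a group_operator ('and' or 'or') combining an array of column conditions. As a minimal sketch of that shape, a small helper can assemble it from condition tuples (the helper itself and its column names are illustrative, not part of the SDK):

```javascript
// Build a bulk read criteria object from [column, comparator, value] tuples.
// The shape mirrors the createJob({ criteria }) argument shown below;
// buildCriteria is a hypothetical helper, not an SDK method.
function buildCriteria(groupOperator, conditions) {
  return {
    group_operator: groupOperator, // 'and' or 'or'
    group: conditions.map(([columnName, comparator, value]) => ({
      column_name: columnName,
      comparator, // e.g. 'equal', 'greater_than', 'starts_with'
      value: String(value)
    }))
  };
}

// Illustrative usage with sample column names:
const criteria = buildCriteria('or', [
  ['Department', 'equal', 'Marketing'],
  ['EmpID', 'greater_than', 1000]
]);
```

The resulting object can be passed directly as the criteria property of createJob.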

Copy the SDK snippet below to perform a bulk read job on a particular table.

// Bulk read on the 'sampleTable' table
const bulkRead = dataStore.table('sampleTable').bulkJob('read');

// Create a bulk read job
const bulkReadJob = await bulkRead.createJob({
  criteria: {
    group_operator: 'or',
    group: [
      { column_name: 'Department', comparator: 'equal', value: 'Marketing' },
      { column_name: 'EmpID', comparator: 'greater_than', value: '1000' },
      { column_name: 'EmpName', comparator: 'starts_with', value: 'S' }
    ]
  },
  page: 1,
  select_columns: ['EmpID', 'EmpName', 'Department']
});

// Get the bulk read job's status
await bulkRead.getStatus(bulkReadJob.job_id);

// Get the bulk read job's result
await bulkRead.getResult(bulkReadJob.job_id);
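A bulk read job runs asynchronously, so getStatus usually needs to be called more than once before fetching the result. A minimal polling sketch is shown below; it assumes the object resolved by getStatus carries a status field with terminal values like 'Completed' and 'Failed' (those field and status names are assumptions — check the actual getStatus response in your environment):

```javascript
// Poll a status-returning async function until it reports a terminal state.
// `getStatusFn` stands in for () => bulkRead.getStatus(jobId); the
// 'Completed' / 'Failed' status strings are assumed, not confirmed values.
async function waitForJob(getStatusFn, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const job = await getStatusFn();
    if (job.status === 'Completed') return job;
    if (job.status === 'Failed') throw new Error('Bulk read job failed');
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for bulk read job');
}
```

With the snippet above, a hypothetical usage would be: await waitForJob(() => bulkRead.getStatus(bulkReadJob.job_id)) before calling getResult.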

Note: A maximum of 200,000 rows can be read in a single bulk read job.
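Because of the 200,000-row cap, reading a larger table means submitting successive jobs with increasing page values. The sketch below uses an injected job-runner placeholder; the assumption that each page covers one 200,000-row window, and the runPageJob function itself, are illustrative rather than SDK API:

```javascript
// Read pages until a job returns fewer rows than the per-job maximum.
// `runPageJob(page)` is a hypothetical placeholder that stands in for
// creating a bulk read job with that page value and collecting its rows.
async function readAllPages(runPageJob, pageSize = 200000) {
  const rows = [];
  for (let page = 1; ; page++) {
    const pageRows = await runPageJob(page);
    rows.push(...pageRows);
    if (pageRows.length < pageSize) break; // last (partial or empty) page
  }
  return rows;
}
```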

Last Updated 2023-09-03 01:06:41 +0530


