Bulk Read Rows

Catalyst allows you to perform bulk read jobs on a specific table present in the Data Store.

The SDK snippet below creates a Bulk Read job that can read thousands of records from a specific table and, if the job succeeds, generates a CSV file containing the results of the read operation. The table is referred to by its unique Table ID. The datastore_service reference used below is already defined in the component instance page.

| Method Used | Description |
| --- | --- |
| bulk_read.create_job({ criteria, page, select_columns }) | Creates a new bulk read job. |
| bulk_read.get_status(job_id) | Gets the status of a bulk read job. |
| bulk_read.get_result(job_id) | Gets the result of a bulk read job. |
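The `criteria` argument to create_job is a dictionary that combines one or more column conditions with a group operator, as the snippet below illustrates. As a rough sketch, such a dictionary could be assembled with a small helper (the helper itself is illustrative and not part of the Catalyst SDK):

```python
# Illustrative helper (not part of the Catalyst SDK): builds the
# "criteria" dictionary in the shape that create_job() expects.
def build_criteria(group_operator, conditions):
    """conditions: list of (column_name, comparator, value) tuples."""
    return {
        "group_operator": group_operator,
        "group": [
            {"column_name": col, "comparator": cmp, "value": val}
            for col, cmp, val in conditions
        ],
    }

criteria = build_criteria("or", [
    ("Department", "equal", "Marketing"),
    ("EmpId", "greater_than", "1000"),
])
```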

Copy the SDK snippet below to perform a bulk read job on a particular table.

```python
# Bulk read
bulk_read = datastore_service.table("sampleTable").bulk_read()

# Create a bulk read job
bulk_read_job = bulk_read.create_job({
    "criteria": {
        "group_operator": "or",
        "group": [
            {"column_name": "Department", "comparator": "equal", "value": "Marketing"},
            {"column_name": "EmpId", "comparator": "greater_than", "value": "1000"},
            {"column_name": "EmpName", "comparator": "starts_with", "value": "S"}
        ]
    },
    "page": 1,
    "select_columns": ["EmpId", "EmpName", "Department"]
})

# Get the bulk read job's status
status = bulk_read.get_status(bulk_read_job["job_id"])

# Get the bulk read job's result
result = bulk_read.get_result(bulk_read_job["job_id"])
```
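Bulk read jobs run asynchronously, so the result is typically fetched only after the job reports completion. A minimal polling sketch is shown below; note that the `"status"` field name and the `"Completed"`/`"Failed"` values are assumptions here, so verify them against the actual response returned by get_status:

```python
import time

def wait_for_job(get_status, job_id, poll_seconds=5, timeout_seconds=300):
    """Polls get_status(job_id) until the job finishes or the timeout expires.

    Assumes the status response is a dict with a "status" key whose value
    becomes "Completed" or "Failed" -- check the real SDK response shape.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = get_status(job_id)
        if status.get("status") in ("Completed", "Failed"):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"Bulk read job {job_id} did not finish in time")
```

With the snippet above, this would be used as `wait_for_job(bulk_read.get_status, bulk_read_job["job_id"])` before calling get_result.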

Note: A maximum of 200,000 rows can be read in a single bulk read job.
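Because a single job reads at most 200,000 rows, larger tables are read page by page through the `page` parameter of create_job. The arithmetic can be sketched as follows (the 200,000 page size comes from the note above; the helper name is illustrative):

```python
import math

BULK_READ_PAGE_SIZE = 200_000  # maximum rows per bulk read job

def pages_needed(total_rows, page_size=BULK_READ_PAGE_SIZE):
    """Number of bulk read jobs (pages) needed to cover total_rows rows."""
    return math.ceil(total_rows / page_size) if total_rows > 0 else 0

# e.g. a table with 450,000 rows would need jobs with page 1, 2 and 3
```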

Last Updated 2024-09-20 18:35:48 +0530
