Bulk Write Rows

Catalyst enables you to perform bulk write jobs on a specific table in the Data Store. A bulk write operation can fetch thousands of records from a CSV file uploaded to the File Store and insert them into a specific table.

The table is referred to by its unique table ID that is generated by Catalyst during creation. The column in which the write operation must be performed is referred to by its unique column ID.

Note: To perform a bulk write operation, you must first upload the required data as a CSV file to the File Store. During the write job, the file is referred to by its unique file ID, which Catalyst generates when the file is uploaded.
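
The snippets on this page assume a dataStore reference obtained from an initialized Catalyst app. The following is a minimal sketch, assuming the zcatalyst-sdk-node package inside a function that receives the incoming request; the initialization style may differ for your function type:

    const catalyst = require('zcatalyst-sdk-node');

    module.exports = async (req, res) => {
        // Initialize the SDK with the incoming request and get a Data Store instance
        const app = catalyst.initialize(req);
        const dataStore = app.datastore();

        // ... bulk write calls shown below go here ...

        res.end();
    };
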
Method Used | Description
bulkWrite.createJob(fileId, { find_by, fk_mapping, operation }) | Create a new bulk write job on a specific table.
bulkWrite.getStatus(jobId) | Get the status of a bulk write operation.
bulkWrite.getResult(jobId) | Get the result of a bulk write operation.

Copy the SDK snippet below to perform a bulk write job on a particular table.

    
    const bulkWrite = dataStore.table('sampleTable').bulkJob('write');

    // create bulk write job
    const bulkWriteJob = await bulkWrite.createJob('file_id', {
        find_by: 'EmpID',
        fk_mapping: [
            { local_column: 'EmployeeID', reference_column: 'EmpID' },
            { local_column: 'DepartmentID', reference_column: 'DeptID' }
        ],
        operation: 'insert'
    });

    // get bulk write status
    await bulkWrite.getStatus(bulkWriteJob.job_id);

    // get bulk write result
    await bulkWrite.getResult(bulkWriteJob.job_id);
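
Bulk write jobs run asynchronously, so in practice you may want to poll the job status before fetching the result. The sketch below assumes the status response exposes a status string that reads 'Completed' or 'Failed'; the exact field names depend on the response shape returned by your SDK version:

    // Poll the bulk write job until it finishes, then fetch its result.
    // The `status` field and its values are assumptions; adjust to the actual
    // response returned by getStatus() in your SDK version.
    async function waitForBulkWrite(bulkWrite, jobId, intervalMs = 5000) {
        for (;;) {
            const details = await bulkWrite.getStatus(jobId);
            if (details.status === 'Completed') {
                return bulkWrite.getResult(jobId);
            }
            if (details.status === 'Failed') {
                throw new Error(`Bulk write job ${jobId} failed`);
            }
            // wait before polling again
            await new Promise((resolve) => setTimeout(resolve, intervalMs));
        }
    }

    const result = await waitForBulkWrite(bulkWrite, bulkWriteJob.job_id);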

Note: A maximum of 100,000 rows can be written at one time.
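
If your dataset is larger than this limit, one approach is to split it into chunks of at most 100,000 rows and run one bulk write job per chunk. The helper below is purely illustrative; writing each chunk back to a CSV file, uploading it to the File Store, and calling createJob() with the resulting file ID is left to your setup:

    // Split parsed CSV data rows (excluding the header) into chunks of at most
    // 100,000 rows, so each chunk can be submitted as a separate bulk write job.
    const MAX_ROWS_PER_JOB = 100000;

    function splitIntoJobs(rows, chunkSize = MAX_ROWS_PER_JOB) {
        const chunks = [];
        for (let i = 0; i < rows.length; i += chunkSize) {
            chunks.push(rows.slice(i, i + chunkSize));
        }
        return chunks;
    }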

Last Updated 2023-09-03 01:06:41 +0530
