Bulk Write Rows
Catalyst enables you to perform bulk write jobs on a specific table in the Data Store. A bulk write operation can fetch thousands of records from a CSV file uploaded to the File Store and insert them into a specific table.
The table is referred to by its unique table ID, which Catalyst generates when the table is created. The column on which the write operation must be performed is referred to by its unique column ID.
Note: To perform a bulk write operation, you must first upload the required data as a CSV file to the File Store. During the write job, the file is referred to by its unique file ID, which Catalyst generates when the file is uploaded.
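For illustration, the rows to be written could be prepared as a CSV file like the one sketched below before being uploaded to the File Store. The file name and column headers (EmpId, EmployeeID, DepartmentID) are assumptions chosen to match the SDK snippet later in this section, not values mandated by Catalyst; the header row should contain the column names of your target table.

```python
import csv

# Hypothetical rows; the keys mirror the columns used in the
# bulk write snippet below and are illustrative only.
rows = [
    {"EmpId": "1001", "EmployeeID": "E-1001", "DepartmentID": "D-10"},
    {"EmpId": "1002", "EmployeeID": "E-1002", "DepartmentID": "D-20"},
]

with open("employees.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["EmpId", "EmployeeID", "DepartmentID"])
    writer.writeheader()    # first row: column names of the target table
    writer.writerows(rows)  # data rows to be bulk-written
```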
Method Used | Description |
---|---|
`bulk_write.create_job(fileId, {find_by, fk_mapping, operation})` | Create a new bulk write job on a specific table. |
`bulk_write.get_status(job_id)` | Get the status of a bulk write operation. |
`bulk_write.get_result(job_id)` | Get the result of a bulk write operation. |
Copy the SDK snippet below to perform a bulk write job on a particular table. The datastore_service reference used below is already defined in the component instance page.
```python
bulk_write = datastore_service.table("Sample").bulk_write()

# Create a bulk write job
bulk_write_job = bulk_write.create_job('6759000000165103', {
    "find_by": 'EmpId',
    "fk_mapping": [
        {
            "local_column": 'EmployeeID',
            "reference_column": 'EmpId'
        },
        {
            "local_column": 'DepartmentID',
            "reference_column": 'DepId'
        }
    ],
    "operation": 'insert'
})

# Get the bulk write status
status = bulk_write.get_status(6759000000167103)

# Get the bulk write result
result = bulk_write.get_result(6759000000167103)
```
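Because a bulk write job runs asynchronously, you will typically poll its status before fetching the result. The sketch below is an assumption-laden example: it presumes `get_status()` returns a dict containing a `"status"` key with values such as `"In-Progress"` or `"Completed"`, which may differ from the actual response shape returned by your SDK version.

```python
import time

JOB_ID = 6759000000167103  # ID obtained when the job was created

# Poll the job every few seconds until it leaves the in-progress
# state. The response keys here are assumptions; inspect the actual
# response returned by your SDK version before relying on them.
while True:
    status = bulk_write.get_status(JOB_ID)
    if status.get("status") != "In-Progress":
        break
    time.sleep(5)

result = bulk_write.get_result(JOB_ID)  # fetch the final outcome
```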
Note: A maximum of 100,000 rows can be written at one time.
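If your source file exceeds this limit, one option is to split it into several CSV files of at most 100,000 rows each and run one bulk write job per file. A minimal sketch using only the Python standard library follows; the file names are placeholders.

```python
import csv

MAX_ROWS = 100_000  # per-job row limit noted above

# Split a large CSV into chunks of at most MAX_ROWS data rows,
# repeating the header row in every chunk.
with open("employees_full.csv", newline="") as src:
    reader = csv.reader(src)
    header = next(reader)
    chunk, part = [], 0

    def flush(chunk, part):
        with open(f"employees_part{part}.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)   # repeat the column names
            writer.writerows(chunk)   # write this chunk's rows

    for row in reader:
        chunk.append(row)
        if len(chunk) == MAX_ROWS:
            part += 1
            flush(chunk, part)
            chunk = []
    if chunk:  # write any remaining rows
        part += 1
        flush(chunk, part)
```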
Last Updated 2024-09-20 18:35:48 +0530