Best practices for loading 70K–100K records in a nightly batch process
I have a requirement to load 70K to 100K records every night from an external data table and to create cases based on initial filter conditions; the majority of the records will result in a case.
Once the cases are created, each case will fetch multiple data elements from multiple external data tables.
Please let me know the best practices for loading such a large volume of data in a nightly batch run. It would be helpful if you could share your previous experience working on similar tasks.
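For reference, here is a minimal sketch of the kind of paged load I have in mind, using plain JDBC with keyset pagination and a small worker pool for the per-record case-creation step. The table name `batch_input`, its columns, the connection details, and the helper methods are all illustrative assumptions, not an actual implementation (in Pega, case creation would typically be queued for background processing rather than called directly):

```java
import java.sql.*;
import java.util.concurrent.*;

public class NightlyBatchLoader {

    private static final int PAGE_SIZE = 1000; // records fetched per round trip
    private static final int WORKERS = 4;      // parallel case-creation threads

    public static void main(String[] args) throws Exception {
        // Connection details are placeholders; substitute your own data source.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/staging", "batch_user", "secret")) {
            ExecutorService pool = Executors.newFixedThreadPool(WORKERS);

            // Keyset pagination (WHERE id > ?) stays fast at 100K+ rows,
            // unlike OFFSET-based paging, which rescans skipped rows.
            long lastId = 0;
            String sql = "SELECT id, payload FROM batch_input "
                       + "WHERE id > ? AND status = 'NEW' ORDER BY id LIMIT ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setFetchSize(PAGE_SIZE);
                while (true) {
                    ps.setLong(1, lastId);
                    ps.setInt(2, PAGE_SIZE);
                    int rows = 0;
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            rows++;
                            lastId = rs.getLong("id");
                            final String payload = rs.getString("payload");
                            // Apply the initial filter, then hand the record
                            // off to a worker for case creation.
                            if (passesFilter(payload)) {
                                pool.submit(() -> createCase(payload));
                            }
                        }
                    }
                    if (rows < PAGE_SIZE) break; // last page reached
                }
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.HOURS);
        }
    }

    private static boolean passesFilter(String payload) {
        // Placeholder for the real initial filter conditions.
        return payload != null && !payload.isEmpty();
    }

    private static void createCase(String payload) {
        // Placeholder: in practice this would enqueue a case-creation
        // request for asynchronous processing.
        System.out.println("Creating case for: " + payload);
    }
}
```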