Question

Best practices for loading 70K to 100K records in a nightly batch process

Hi Team,

I have a requirement to load 70K to 100K records every night from an external data table and create cases based on initial filter conditions; we expect to create cases for the majority of the records.

Once the cases are created, each case will fetch multiple data elements from several external data tables.

Please let me know the best practices for loading such a large volume of data in a nightly batch run. It would also be helpful if you could share your experience working on similar tasks.

Regards

Comments


Pega
August 31, 2019 - 3:00am

Hi, 

You can try running multiple batch processes on a dedicated node. Configure one batch process (call it the master batch process) to allocate and queue work to the other batch processes based on a particular attribute, such as the number of records each batch process should handle.
You should have a dedicated node for these batch processes, and users should not log in to this node.
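
To make the master/worker split concrete, here is a minimal plain-Java sketch of the pattern (not Pega API code): a master loop slices the nightly extract into fixed-size chunks and hands each chunk to a pool of workers. The chunk size, worker count, and the SourceRecord, passesInitialFilter, and createCase names are all hypothetical placeholders; in a real Pega implementation the queueing would be done with the platform's own background-processing facilities rather than a thread pool.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class MasterBatchSketch {

    // Hypothetical record type standing in for one row of the external data table.
    record SourceRecord(long id) {}

    public static void main(String[] args) throws InterruptedException {
        // Simulate the nightly extract of 70K-100K records.
        List<SourceRecord> extract = new ArrayList<>();
        for (long i = 0; i < 80_000; i++) {
            extract.add(new SourceRecord(i));
        }

        int chunkSize = 5_000;   // records per worker batch (tune per environment)
        int workerCount = 4;     // parallel worker batch processes

        ExecutorService workers = Executors.newFixedThreadPool(workerCount);

        // Master loop: slice the extract into chunks and queue each chunk to a worker.
        for (int start = 0; start < extract.size(); start += chunkSize) {
            List<SourceRecord> chunk = new ArrayList<>(
                extract.subList(start, Math.min(start + chunkSize, extract.size())));
            workers.submit(() -> processChunk(chunk));
        }

        workers.shutdown();
        workers.awaitTermination(1, TimeUnit.HOURS);
    }

    // Worker: apply the initial filter and create a case for each qualifying record.
    static void processChunk(List<SourceRecord> chunk) {
        for (SourceRecord rec : chunk) {
            if (passesInitialFilter(rec)) {
                createCase(rec);
            }
        }
    }

    static boolean passesInitialFilter(SourceRecord rec) {
        return true;  // placeholder for the actual filter condition
    }

    static void createCase(SourceRecord rec) {
        // Placeholder: the real step would create the case and later fetch the
        // additional data elements from the other external tables.
    }
}
```

Keeping each chunk to a few thousand records is a common way to bound memory use and keep any single worker failure from affecting the whole nightly load; the right size for your environment would need to be confirmed by testing.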

The article below also covers a few points on performance load testing that might be of help to you.

https://community.pega.com/knowledgebase/articles/ten-best-practices-successful-performance-load-testing

Thanks

August 31, 2019 - 8:36am

Thanks, Karunesh, for your quick response.

Since our application is running on Pega Cloud, do you still recommend a separate node to run the batch process?