Question

How to handle very large data pages?

We are migrating a legacy client/server application (written in PowerBuilder) to the Pega platform (currently version 8.1.1). Our concern is the possibility of needing to handle very large data pages. "Very large" could mean upwards of 70,000 records, and it would not be atypical to return 5-10K records fairly routinely. In addition, the user community for this application is close to 1,000 users (only about half concurrent).

While we would be using paging (whether normal or progressive) to present only a limited number of records in the end-user GUI, the data page fetch (via a Connect REST service) would likely pull all of the data into the clipboard. This approach is memory intensive.

My questions are:

1. Is there a prescribed pattern that Pega recommends for handling and presenting large data pages to the end user?
2. Is there a client-side approach (via JavaScript, React, or Angular) whereby large data sets can be brought to the browser and then fed to Pega through the Pega JS API? (A rough sketch of what I mean follows this list.)
3. Are there alternative Pega design strategies that have worked for others when dealing with large data sets?
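
To make question 2 concrete, here is a rough TypeScript sketch of the idea. The `/api/records` endpoint, the `DataRow` shape, and the caching class are all hypothetical; how the page slice would actually be handed to Pega through the JS API is exactly what I'm asking about.

```typescript
// Hypothetical illustration of question 2: pull a large result set into
// browser memory once, then page through it locally. The /api/records
// endpoint and the DataRow shape are placeholders, not a real service.
interface DataRow {
  id: string;
  [field: string]: unknown;
}

class ClientSideCache {
  private rows: DataRow[] = [];

  // Fetch the full data set once (this is the memory-intensive part).
  async load(url: string): Promise<void> {
    const response = await fetch(url);
    if (!response.ok) throw new Error(`Fetch failed: ${response.status}`);
    this.rows = await response.json();
  }

  // Hand the UI (or a Pega section, via whatever JS API hook applies)
  // only one page at a time.
  page(pageIndex: number, pageSize: number): DataRow[] {
    const start = pageIndex * pageSize;
    return this.rows.slice(start, start + pageSize);
  }
}

// Usage: 70,000 rows live in browser memory; the grid sees 50 at a time.
const cache = new ClientSideCache();
cache.load("/api/records").then(() => {
  console.log(cache.page(0, 50));
});
```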

The only technical constraint is that all data interaction is via Connect REST services. We currently use no Report Definitions because this data isn't stored in the Pega data tables.
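
For context, one alternative we are considering would push paging into the REST contract itself, so the clipboard only ever holds one page. A minimal sketch, assuming the backing service accepts `page` and `pageSize` query parameters (the names are ours, not a Pega or service standard):

```typescript
// Hypothetical server-side paging contract: the client (or a Pega data
// page parameterized the same way) asks for one page per call, so at most
// pageSize records are ever materialized at once.
interface PageResponse<T> {
  items: T[];
  totalCount: number; // lets the UI render pager controls
}

async function fetchPage<T>(
  baseUrl: string,
  pageIndex: number,
  pageSize: number
): Promise<PageResponse<T>> {
  const url = `${baseUrl}?page=${pageIndex}&pageSize=${pageSize}`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Fetch failed: ${response.status}`);
  return response.json();
}

// Usage: each grid page is a separate, small REST call.
fetchPage<Record<string, unknown>>("/api/records", 0, 50)
  .then((page) => console.log(page.items.length, "of", page.totalCount));
```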

Comments


December 4, 2019 - 3:23am

About point 2:

I wouldn't advise you to keep a lot of data on the client side. Browsers, in general, are not very efficient performance-wise compared with your servers.

Even if a given machine is fast, you can never rely on it: you never know what hardware is on the user's side (CPU, memory), how many tabs they have open, and so on.

Whenever you can, perform all data-related manipulation on the backend, and present to the user only the data they need at the current moment.
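
To illustrate that advice (the endpoint and parameter names below are invented for the example), sorting and filtering become request parameters rather than operations over 70,000 rows in browser memory, and the browser only ever receives the current page:

```typescript
// Hypothetical sketch: instead of sorting/filtering a large array on the
// client, the browser sends the criteria and receives only the current page.
interface QueryOptions {
  sortBy?: string;
  sortDir?: "asc" | "desc";
  filter?: string;
  page: number;
  pageSize: number;
}

async function queryRecords(baseUrl: string, opts: QueryOptions) {
  const params = new URLSearchParams({
    page: String(opts.page),
    pageSize: String(opts.pageSize),
  });
  if (opts.sortBy) params.set("sortBy", opts.sortBy);
  if (opts.sortDir) params.set("sortDir", opts.sortDir);
  if (opts.filter) params.set("filter", opts.filter);

  const response = await fetch(`${baseUrl}?${params}`);
  if (!response.ok) throw new Error(`Fetch failed: ${response.status}`);
  return response.json(); // only the rows the user needs right now
}

// Usage: the server does the heavy lifting over the full data set.
queryRecords("/api/records", {
  sortBy: "lastName",
  sortDir: "asc",
  page: 0,
  pageSize: 50,
}).then(console.log);
```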