Need approach to upload very large files to an external system

We are saving attachments to an external system using a REST service. Currently we set the maximum attachment size to 10 MB. A user has requested increasing the limit to 1 GB. We would like to know whether that would affect system performance, and whether there is a better way to do this.


December 3, 2019 - 8:11pm


It will cause slowness to some extent depending on your infrastructure capacity.

If you split the file and create multiple attachments, that may help, but it depends on how you need the final result assembled on the other side.
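To illustrate the splitting idea, here is a minimal Python sketch. The chunk size, file paths, and the server-side reassembly step are all assumptions for illustration; the actual REST call that uploads each part is out of scope and would depend on your service.

```python
import os

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB per part; tune to your attachment limit

def split_file(path, chunk_size=CHUNK_SIZE):
    """Yield (index, bytes) pairs so each part stays under the size limit."""
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield index, chunk
            index += 1

def reassemble(parts):
    """Server-side counterpart: concatenate the parts in index order."""
    return b"".join(data for _, data in sorted(parts))
```

Each `(index, chunk)` pair would be sent as its own attachment request; keeping the index with the part is what lets the receiver put the file back together regardless of arrival order.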

December 3, 2019 - 10:34pm

Hi Ajay,

Thanks for posting the query. Please find below a link that should help with this requirement.

December 4, 2019 - 3:17am

Never send such large files over the network via Pega.

Try storing the file on a local machine and, depending on your OS, create a batch/bash background process that transfers the files over the network.

In general, though, I would rethink the overall approach for files this large; in my opinion, Pega should not store and maintain them.
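As a sketch of that background-transfer pattern, the following Python script (usable in place of a batch/bash script) drains an outbox directory on a schedule such as cron or Task Scheduler. The `outbox` and `delivered` paths are placeholder assumptions; in practice the destination would be a network mount or an upload call.

```python
import shutil
from pathlib import Path

OUTBOX = Path("outbox")        # where the application drops large files (assumed layout)
DELIVERED = Path("delivered")  # stand-in for the real network destination

def transfer_pending(outbox=OUTBOX, destination=DELIVERED):
    """Move every pending file out of the outbox; run periodically from cron."""
    destination.mkdir(parents=True, exist_ok=True)
    moved = []
    for path in sorted(outbox.glob("*")):
        if path.is_file():
            shutil.move(str(path), str(destination / path.name))
            moved.append(path.name)
    return moved
```

The point of the pattern is that the interactive request only writes the file locally and returns quickly; the slow network transfer happens out of band.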

December 4, 2019 - 11:12am

Hi Ajay,

It will definitely cause performance issues, as sending such large attachments takes more time. An alternative is to store attachments in the database, but I would suggest not going that route, because it will also degrade performance to some degree.

There is a nice feature in Pega to connect to a repository such as Alfresco; you can use that to store attachments. If you are on Pega Cloud, I think you can use the Pega Cloud repository as well.