This all depends on how you implemented the export. Is the only thing you've built a REST API that returns 5000 records, after which the request is finished?
More likely you created a microflow that handles the export. Retrieving 5000 records over and over within the same microflow transaction keeps all of those objects in memory until the transaction ends, and that will cause the memory issue.
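The batching idea can be sketched outside Mendix as well. Below is a minimal Python sketch (not Mendix-specific); `fetch_page` is a hypothetical retrieval callback standing in for a limited/offset retrieve inside a microflow loop:

```python
def export_in_batches(fetch_page, batch_size=500):
    """Yield records one batch at a time instead of holding all of them.

    fetch_page(offset, limit) is a hypothetical callback; in a microflow
    the equivalent is a retrieve with limit/offset inside a loop, so only
    one batch of objects is in memory at a time.
    """
    offset = 0
    while True:
        batch = fetch_page(offset, batch_size)
        if not batch:
            break
        yield batch
        offset += batch_size

# Usage with a fake in-memory source standing in for the database.
records = list(range(5000))
fake_fetch = lambda off, lim: records[off:off + lim]
total = sum(len(b) for b in export_in_batches(fake_fetch))
```

The point of the sketch: the caller never holds all 5000 records at once, only the current batch, which is what keeps memory flat.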
Please share more information about how you implemented the export so we can give a more precise answer/solution.
The more data you export, the heavier the JSON response body of your REST API becomes. So not only is your network call slower, your server may also enforce a response-size limit (better check this with DevOps).
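To make the response-size point concrete, here is a small illustrative Python comparison (the record shape is made up): one 5000-record body versus the same data split into pages, where each individual response stays small enough to avoid size limits.

```python
import json

# Hypothetical dataset: 5000 records with a made-up shape.
records = [{"id": i, "name": f"row {i}"} for i in range(5000)]

# Single response: the whole dataset serialized into one heavy body.
full_body = json.dumps(records)

# Paged responses: the same data split into 500-record pages, so each
# response body (and each network call) stays small.
page_size = 500
pages = [json.dumps(records[i:i + page_size])
         for i in range(0, len(records), page_size)]

largest_page = max(len(p) for p in pages)
```

Each page here is roughly a tenth the size of the full body, which is the whole argument for paginating the export endpoint.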
But don't you think the underlying issue is why DataGrid2 struggles in the first place? Or, deeper still: do you or your business users really need this much data to be exported?