REST memory

Hi all,

Whenever we run a REST service to pull data from our Mendix app into Azure Data Factory (and other consumers), the app crashes with a Java out-of-memory error because of the size of the data (1.2 million records). Does anyone have advice on best practices, or on how to throttle the retrieve that the REST service performs?

Thanks, Garion
2 answers

Hi Garion,

You should pass additional parameters to the retrieve, such as Offset and Amount. Your Mendix application can then use those parameters to return only the requested amount of data (capped at, for instance, 1000 records). That way, you can retrieve the full dataset in several consecutive calls, incrementing the Offset each time by the Amount you are retrieving.
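In Mendix this paging would live in the microflow behind the published REST operation, but the idea can be sketched in a few lines of Python. Everything here (the `records` list, the `get_page` helper, the cap of 1000) is illustrative, not Mendix API:

```python
# Sketch of the server-side paging described above: return one slice of
# the dataset per call, and cap the page size so a single request can
# never pull all 1.2 million records into memory at once.

MAX_AMOUNT = 1000  # server-side cap on page size

def get_page(records, offset=0, amount=MAX_AMOUNT):
    """Return at most `amount` records starting at `offset`."""
    amount = min(amount, MAX_AMOUNT)  # enforce the cap even if the caller asks for more
    return records[offset:offset + amount]

# Example: a large dataset, but each call only materialises one page.
data = list(range(1_200_000))
page = get_page(data, offset=2000, amount=1000)
# page holds records 2000..2999
```

In a Mendix microflow, the equivalent is a Retrieve activity with Offset and Amount set from the request's query parameters, with the Amount clamped before the retrieve runs.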



I think what Marius means is that you can add query parameters for offset and amount to your published REST service. You then call the service multiple times to get batches of, for instance, 1000 records:

GET /rest/myservice/largeresource?amount=1000&offset=0
GET /rest/myservice/largeresource?amount=1000&offset=1000
GET /rest/myservice/largeresource?amount=1000&offset=2000
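On the consumer side, those calls become a simple loop that stops once a short (or empty) page comes back. A hedged sketch in Python, where `fetch_page` is a stand-in for the real HTTP call (e.g. `requests.get(...)` against the URL above), simulated here with an in-memory dataset of 2500 records:

```python
# Client-side offset/amount paging loop for the calls shown above.

def fetch_page(offset, amount):
    # Stub for the real request, which would be something like:
    #   requests.get(base + "/rest/myservice/largeresource",
    #                params={"amount": amount, "offset": offset}).json()
    total = 2500  # pretend the server holds 2500 records
    return list(range(offset, min(offset + amount, total)))

def fetch_all(amount=1000):
    """Page through the resource until a short page signals the end."""
    results, offset = [], 0
    while True:
        page = fetch_page(offset, amount)
        results.extend(page)
        if len(page) < amount:  # last page reached
            break
        offset += amount
    return results

records = fetch_all()
# records now contains all 2500 simulated rows, fetched 1000 at a time
```

Azure Data Factory's REST connector can drive the same pattern declaratively via its pagination rules, so you may not need custom code at all on that end.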

Even better, I think, is to use continuation. For instance:

First, GET /rest/myservice/largeresource returns { "data": [ 1, 2, 3, …, 1000 ], "next": 1001 }.
Then, GET /rest/myservice/largeresource?startFrom=1001 returns { "data": [ 1001, 1002, …, 2000 ], "next": 2001 }, and so on until no "next" value is returned.