Hi folks,

I would like to know whether anyone has run into the same problem as me. I am trying to upload a JSON file containing a huge number of records. My microflow has a simple JSON Import activity that puts the data into a temporary (non-persistent) object; the data is later copied into the actual persistent objects. Because the number of records is so large, I am experiencing performance issues during the upload, so I want to try a batching mechanism here.

My first idea is to read the content from the file in batches by keeping an offset, and to pass that offset value to the JSON Import activity. However, I see that the JSON Import activity takes a custom parameter that accepts only a single value. I am not sure how to make it take a range of values so that I can read the file batch by batch and feed each batch into the import.

Has anyone gone through this type of scenario before? Please help.

Regards,
Nirmal Kumar
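P.S. To make the batching idea concrete, here is a rough sketch in plain Java of the kind of chunked reading I have in mind, using Jackson's streaming MappingIterator instead of loading the whole file at once. The file name records.json, the BATCH_SIZE of 1000, and the processBatch helper are placeholders for illustration only, not my actual microflow logic:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class BatchedJsonReader {

    private static final int BATCH_SIZE = 1000; // records per batch (placeholder value)

    public static void main(String[] args) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        // Stream the elements of the top-level JSON array one record at a
        // time instead of parsing the entire file into memory up front.
        try (MappingIterator<JsonNode> it =
                 mapper.readerFor(JsonNode.class).readValues(new File("records.json"))) {
            List<JsonNode> batch = new ArrayList<>(BATCH_SIZE);
            while (it.hasNext()) {
                batch.add(it.next());
                if (batch.size() == BATCH_SIZE) {
                    processBatch(batch); // hand one full batch off for import
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                processBatch(batch); // flush the final partial batch
            }
        }
    }

    private static void processBatch(List<JsonNode> batch) {
        // Placeholder: in my microflow this would be the JSON Import step
        // that creates the temporary non-persistent objects.
        System.out.println("Processing batch of " + batch.size() + " records");
    }
}
```

The idea is that each batch of records would correspond to one import call, instead of one import over the entire file, so memory use stays bounded by the batch size rather than the file size.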