Is there an example for asynchronous file upload for S3

Does anybody have an example of how to implement an asynchronous file upload to AWS S3 storage with Mendix, to support the upload of large files (e.g. several GB)?
asked
5 answers

Dear Joachim,

 

Thank you for clarifying your requirements for uploading large files to Amazon S3 without temporary storage in the Mendix filesystem/database. Understanding your need to support files up to 4GB and avoid out-of-memory issues is crucial for us to provide you with the best possible solution.

 

To further assist you, I have a couple of questions regarding your current setup and the challenges you're facing:

  1. Could you share more about the deployment strategy of your Mendix instance? Are you testing this locally, is the app deployed in the Mendix cloud, or are you using a private cloud for this Mendix application?
  2. It would also be helpful to know more about when the Mendix application runs out of memory. Specifically, does this issue occur when the file is being transferred from your client to the Mendix runtime, or is it happening when the Mendix runtime attempts to upload the file to its file storage backend?

 

Your response will not only allow us to understand your situation better but will also help us explore the most appropriate approach to support your needs. In addition, sharing this information in the thread will greatly benefit the wider Mendix community, who may be dealing with similar file management challenges.

 

Best,

Trong

answered

Dear Joachim,

 

I'm Trong, a developer with the team that created the AWS connectors at Mendix.

 

I'm intrigued to learn more about your specific needs. Could you share some insights into why you are considering an asynchronous approach for this functionality? Are there particular constraints or requirements that prevent you from using a synchronous method? It's important to note that the existing Amazon S3 Connector in Mendix is built on the synchronous AWS client. That client does support operations tailored to larger files, such as CreateMultipartUpload, UploadPart, and CompleteMultipartUpload, which might suit your use case; however, these operations are not exposed in the current version of the Amazon S3 Connector.
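
For illustration, here is a minimal sketch of how those three operations fit together using the AWS SDK for Java v2, as a custom Mendix Java action might call them. The bucket name, object key, region and part size are placeholders rather than anything from this thread, and a real implementation would also need error handling, including AbortMultipartUpload when something fails:

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CompleteMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.CompletedMultipartUpload;
import software.amazon.awssdk.services.s3.model.CompletedPart;
import software.amazon.awssdk.services.s3.model.CreateMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.CreateMultipartUploadResponse;
import software.amazon.awssdk.services.s3.model.UploadPartRequest;
import software.amazon.awssdk.services.s3.model.UploadPartResponse;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MultipartUploadSketch {

    // Placeholder values for illustration only.
    private static final String BUCKET = "my-example-bucket";
    private static final String KEY = "uploads/large-file.bin";
    private static final int PART_SIZE = 8 * 1024 * 1024; // 8 MiB; S3 requires parts of at least 5 MiB

    public static void main(String[] args) throws IOException {
        try (S3Client s3 = S3Client.builder().region(Region.EU_CENTRAL_1).build();
             InputStream in = Files.newInputStream(Paths.get("large-file.bin"))) {

            // 1. Start the multipart upload and remember its upload id.
            CreateMultipartUploadResponse created = s3.createMultipartUpload(
                    CreateMultipartUploadRequest.builder().bucket(BUCKET).key(KEY).build());
            String uploadId = created.uploadId();

            // 2. Stream the source in fixed-size chunks; only one part is held in memory at a time.
            List<CompletedPart> parts = new ArrayList<>();
            byte[] buffer = new byte[PART_SIZE];
            int partNumber = 1;
            int read;
            while ((read = readFully(in, buffer)) > 0) {
                UploadPartResponse response = s3.uploadPart(
                        UploadPartRequest.builder()
                                .bucket(BUCKET).key(KEY)
                                .uploadId(uploadId)
                                .partNumber(partNumber)
                                .build(),
                        RequestBody.fromBytes(Arrays.copyOf(buffer, read)));
                parts.add(CompletedPart.builder().partNumber(partNumber).eTag(response.eTag()).build());
                partNumber++;
            }

            // 3. Ask S3 to assemble the uploaded parts into the final object.
            s3.completeMultipartUpload(CompleteMultipartUploadRequest.builder()
                    .bucket(BUCKET).key(KEY)
                    .uploadId(uploadId)
                    .multipartUpload(CompletedMultipartUpload.builder().parts(parts).build())
                    .build());
        }
    }

    // Fills the buffer as far as possible; returns the number of bytes read, or -1 at end of stream.
    private static int readFully(InputStream in, byte[] buffer) throws IOException {
        int total = 0;
        while (total < buffer.length) {
            int n = in.read(buffer, total, buffer.length - total);
            if (n < 0) {
                break;
            }
            total += n;
        }
        return total == 0 ? -1 : total;
    }
}

Because only one part is buffered at a time, memory use stays at roughly the part size regardless of how large the file is.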

 

Your feedback on this matter will not only help us understand your requirements better but also enable us to assist you more effectively. Additionally, your response could greatly benefit the broader Mendix community who might be facing similar challenges or considering similar implementations.

 

Best,

Trong

answered

Dear Trong,

 

Maybe I used the term "asynchronously" in the wrong way. My intention is to build a Mendix app that is able to upload files to Amazon S3 without storing them temporarily in a Mendix filesystem/database. I would like to support uploading of large files (up to 4GB) without running into "out of memory" errors in the Mendix app.

 

I saw that the Mendix S3 connector expects a FileDocument. I assume something like a stream or chunked uploading would be necessary to scale.

answered

Dear Trong,

 

Sorry for not following up. I would still like to use the S3 connector. The question is not so much about asynchronous or synchronous; I would like to build a Mendix application that reads and writes large files (up to 4GB) on AWS S3.

 

I do not want to have the whole file content in either the database or in memory, since in both cases it will break the deployment.

 

You mentioned multipart, which is not yet implemented in the connector. Is my assumption correct that it is not possible to work with large files with Mendix and S3 without storing the whole content either in memory or in the database?

answered

Hi Joachim,

 

Indeed, your understanding is correct. As of now, we lack a platform-supported operation for multipart uploads and downloads in the Amazon S3 Connector. However, I'm happy to share the code for our proof-of-concept widget, which uses an Amazon S3 pre-signed URL for authentication, and I can also share our approach to video streaming if that interests you.

 

Can you elaborate more on the details of your use case, the reading of files specifically?

 

With regard to uploading, we've developed a proof-of-concept widget that lets the Mendix client upload files directly into a designated Amazon S3 bucket, bypassing the Mendix runtime. This asset is, however, just that: a proof of concept.
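
To illustrate the server-side half of that approach, here is a sketch (not the widget code itself) of generating a pre-signed PUT URL with the AWS SDK for Java v2; the bucket, key, region and expiry below are placeholders:

import java.time.Duration;

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.presigner.S3Presigner;
import software.amazon.awssdk.services.s3.presigner.model.PresignedPutObjectRequest;
import software.amazon.awssdk.services.s3.presigner.model.PutObjectPresignRequest;

public class PresignedUploadUrlSketch {

    public static void main(String[] args) {
        // Bucket, key, region and expiry are placeholders for illustration.
        try (S3Presigner presigner = S3Presigner.builder().region(Region.EU_CENTRAL_1).build()) {

            PutObjectRequest putRequest = PutObjectRequest.builder()
                    .bucket("my-example-bucket")
                    .key("uploads/large-file.bin")
                    .build();

            PresignedPutObjectRequest presigned = presigner.presignPutObject(
                    PutObjectPresignRequest.builder()
                            .signatureDuration(Duration.ofMinutes(15))
                            .putObjectRequest(putRequest)
                            .build());

            // Hand this URL to the client; the upload itself goes straight to S3.
            System.out.println(presigned.url());
        }
    }
}

The Mendix client or widget then issues an HTTP PUT with the file bytes against the returned URL, so the payload never passes through the Mendix runtime.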

 

For reading, we encountered a case where a client wanted to stream large video files into a Mendix application without preloading the entire content into memory. We solved this by streaming the video content through Amazon CloudFront.

 

Best,

Trong

answered