How to handle high concurrent load in Mendix (1000+ concurrent users)

We are working on a backend for a mobile app. On a schedule, a push notification is sent to all users (up to 200k), who then open the app. This causes massive peak loads. Static data requests are handled by the SpeedyRest module, which we think is great. Dynamic data requests follow the normal REST flow. The most time-consuming endpoint is the Prize endpoint: users claim a prize there, and the claim needs to check whether there is stock and, if so, associate it with the user. We are facing a couple of challenges:

- The stock update is currently not accurate. When peak load hits, the stock sometimes even goes up. We are going to test with EndTransaction from the ObjectHandling module. Any tips are welcome (a sketch of the direction we are considering is below).
- We are unable to handle more than 400 concurrent requests. Any tips to get to 1000+ concurrent?

We realise that more context may be needed to answer this properly, but perhaps people can share their experience with high-traffic apps.
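For illustration of the stock problem: the lost updates disappear if the stock check and decrement happen as a single atomic statement at the database level, rather than as a read followed by a separate write. Below is a minimal sketch of that pattern in plain Java/JDBC against a hypothetical `prize` table with `id` and `stock` columns; it is a general illustration, not the Mendix-native way of doing this.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PrizeClaimer {

    /**
     * Tries to claim one unit of stock for the given prize.
     * The WHERE clause guarantees the decrement only happens while stock is
     * still available, so concurrent claims can never drive stock below zero
     * or, via lost updates, make it appear to go up again.
     *
     * @return true if a unit was claimed, false if the prize is sold out.
     */
    public boolean claimPrize(Connection connection, long prizeId) throws SQLException {
        String sql = "UPDATE prize SET stock = stock - 1 WHERE id = ? AND stock > 0";
        try (PreparedStatement stmt = connection.prepareStatement(sql)) {
            stmt.setLong(1, prizeId);
            // The database applies the check-and-decrement atomically per row,
            // so the update count tells us whether this request won a unit.
            return stmt.executeUpdate() == 1;
        }
    }
}
```

Associating the claimed unit with the user would then happen in the same transaction, so a failed association rolls the decrement back as well.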
asked
1 answer

Mendix has assisted the customer in addressing the question above, and the app has successfully handled all required traffic. For anybody facing a comparable situation, here is some info.

1000+ concurrent users is not uncommon for Mendix applications. The complicating factor for this app is the peak caused by the push notifications, although similar peaks can be caused by other traffic-generating activities. In this app, all users hit the same functionality at the same moment and access the same data.

To handle the peak in user requests, additional Mendix runtime instances were started. In addition, the database was resized to support a higher number of concurrent database actions. Finally, it was identified that for this use case, handling the functionality asynchronously through a queue would reduce the sizing requirements.
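As an illustration of the queue-based approach (a general sketch in plain Java, not the specific Mendix mechanism used for this app), the endpoint only enqueues each claim during the peak, while a small fixed pool of workers drains the queue at a rate the database can sustain. The `ClaimRequest` fields and pool size below are hypothetical.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class ClaimQueue {

    /** A claim request captured at the endpoint; fields are for illustration only. */
    public record ClaimRequest(long userId, long prizeId) {}

    private static final int WORKER_COUNT = 4;

    private final BlockingQueue<ClaimRequest> queue = new LinkedBlockingQueue<>(100_000);
    private final ExecutorService workers = Executors.newFixedThreadPool(WORKER_COUNT);

    /** Called from the REST endpoint: cheap and non-blocking, so it scales with the peak. */
    public boolean submit(ClaimRequest request) {
        // offer() returns false when the queue is full, which the endpoint can map
        // to a "try again later" response instead of overloading the database.
        return queue.offer(request);
    }

    /** Starts the background workers that process claims at a controlled rate. */
    public void start() {
        for (int i = 0; i < WORKER_COUNT; i++) {
            workers.submit(() -> {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        ClaimRequest request = queue.take();
                        processClaim(request);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
    }

    private void processClaim(ClaimRequest request) {
        // Placeholder: perform the atomic stock decrement and associate the
        // prize with the user, ideally in one database transaction.
    }
}
```

The user is then informed of the outcome asynchronously (for example by polling or a follow-up notification), so the synchronous request path does nothing more than enqueue.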

After these improvements, the app has successfully handled 2000+ concurrent users in production.

answered