Mendix <-> locally hosted LLM?

Has anyone ever implemented a scenario where the Mendix app is expected to communicate with a locally hosted LLM (on LM Studio or Ollama, for instance)? How would one go about it?
Handle networking: if the LLM is running locally, you may need to:

- Use the local hostname and port for testing — by default Ollama serves on `http://localhost:11434` and LM Studio's local server on `http://localhost:1234`.

- Configure networking/firewall rules to allow communication between devices if the Mendix app runs on a different server or device.
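Both Ollama and LM Studio expose an OpenAI-compatible HTTP API, so the call is an ordinary REST request — in Mendix itself this would typically be a Call REST activity in a microflow, but the shape of the request can be sketched in Python. The Ollama default port and the model name below are assumptions; adjust them to whatever you have running locally:

```python
import json
import urllib.request

# Assumed endpoint: Ollama's OpenAI-compatible API on its default port.
# LM Studio exposes the same API shape on http://localhost:1234/v1.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt, model="llama3"):
    """Build the URL and JSON payload for an OpenAI-style chat completion.

    Kept separate from the HTTP call so it can be tested without a
    running server; "llama3" is a placeholder model name.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return BASE_URL + "/chat/completions", payload

def chat(prompt, model="llama3"):
    """Send the request to the local LLM and return the reply text."""
    url, payload = build_chat_request(prompt, model)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same URL and JSON body can be plugged into a Mendix REST call; only the host/port part changes when the app and the LLM run on different machines.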

 

Optional: use a proxy: if the app is deployed to the cloud, expose the local LLM API to it securely via a reverse proxy or VPN, since the cloud-hosted app cannot reach `localhost` on your machine directly.
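One way to make that switch painless is to keep the LLM base URL out of the code entirely and read it from configuration, so moving from `localhost` to a proxied URL needs no code change. The environment-variable name `LLM_BASE_URL` here is just an illustration, not a Mendix or Ollama convention (in Mendix you would use an app constant instead):

```python
import os

def llm_base_url(default="http://localhost:11434/v1"):
    """Return the LLM API base URL.

    Falls back to a local Ollama endpoint when the (hypothetical)
    LLM_BASE_URL environment variable is unset, so the same code works
    in local development and behind a reverse proxy in the cloud.
    """
    return os.environ.get("LLM_BASE_URL", default)
```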
