DB connection issue

I have successfully established a connection to my DB, which is in Snowflake. However, I ran into an error: Mendix expects JSON, but Snowflake returns Apache Arrow. The Mendix documentation asks me to run a query in Snowflake so that the JDBC result format is changed to JSON. However, my company wiki says that adding a JVM option also fixes the issue. Comparing the two, option 1 seems better because it will also work in the production environment, unlike the second option. Which should I go with?
asked
2 answers

This is a known issue when using Snowflake with Mendix, especially on Java 17/21. Snowflake returns results in Apache Arrow format by default, but Mendix expects JSON.


Mendix Support recommends forcing Snowflake to return JSON instead:


ALTER USER <your_user> SET JDBC_QUERY_RESULT_FORMAT='JSON';


This is the preferred solution for production, as it ensures compatibility without relying on JVM workarounds.



answered

Why This Happens

Snowflake’s newer JDBC drivers return results in Apache Arrow format by default.

The Mendix Database Connector / external database calls expect JSON-compatible result sets. If Arrow is returned, Mendix cannot deserialize the results and throws an error.

Option 1 – Correct Production Approach

Run this in the Snowflake session, or modify the connection properties:


ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';

Or add connection property:


JDBC_QUERY_RESULT_FORMAT=JSON

This forces Snowflake to return JSON instead of Arrow.

  • Works in all environments
  • Stable in production
  • Explicit and predictable
  • Follows Snowflake + JDBC best practice

This is the correct architectural solution.
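If you open the JDBC connection yourself (for example, from a custom Java action), the same session parameter can also be passed as a driver property. A minimal sketch, assuming the Snowflake JDBC driver accepts `JDBC_QUERY_RESULT_FORMAT` as a connection property; the helper name is my own:

```java
import java.util.Properties;

public class SnowflakeJsonProps {
    // Hypothetical helper: builds driver properties that force JSON result
    // sets instead of Apache Arrow for the whole session.
    static Properties jsonResultProperties(String user, String password) {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        // Equivalent to: ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'
        props.setProperty("JDBC_QUERY_RESULT_FORMAT", "JSON");
        return props;
    }

    public static void main(String[] args) {
        Properties p = jsonResultProperties("MENDIX_USER", "secret");
        System.out.println(p.getProperty("JDBC_QUERY_RESULT_FORMAT")); // prints JSON
    }
}
```

These properties would then be handed to `DriverManager.getConnection(url, props)` in the usual way.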

Option 2 – JVM Option

Some suggest adding a JVM flag to disable Arrow globally.

Problems:

  • Not environment-safe
  • Can break after driver updates
  • Harder to maintain
  • Depends on runtime config
  • Not portable across environments

It is a workaround, not a clean solution.

Mendix Best Practice

Mendix expects structured JSON when using:

  • Database Connector
  • REST-like response handling
  • Import mappings

So always configure the data source to return a compatible format rather than patching the JVM.

Final Recommendation

Use:


ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';

or set it in the connection string.

That is the correct, production-ready solution.



If You Are Using the Database Connector Module

You need to modify the connection properties, not the JVM.

Option A — Add It in the JDBC URL (Recommended)

In your Database Connector configuration (usually in a constant or connection string), update the JDBC URL like this:


jdbc:snowflake://<account>.snowflakecomputing.com/?JDBC_QUERY_RESULT_FORMAT=JSON

Example:


jdbc:snowflake://abc123.eu-central-1.snowflakecomputing.com/?db=MYDB&schema=PUBLIC&warehouse=WH1&JDBC_QUERY_RESULT_FORMAT=JSON

That’s it.

This forces Snowflake to return JSON instead of Apache Arrow.

  • Works in all environments
  • Production-safe
  • No JVM change needed
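The URL edit above can also be done programmatically if your connection string is assembled at runtime. A small sketch (the helper name is my own) that appends the parameter whether or not the URL already has a query string:

```java
public class JdbcUrlHelper {
    // Hypothetical helper: appends JDBC_QUERY_RESULT_FORMAT=JSON to a
    // Snowflake JDBC URL, reusing an existing query string if present.
    static String withJsonResultFormat(String baseUrl) {
        String sep = baseUrl.contains("?") ? "&" : "/?";
        return baseUrl + sep + "JDBC_QUERY_RESULT_FORMAT=JSON";
    }

    public static void main(String[] args) {
        System.out.println(withJsonResultFormat(
            "jdbc:snowflake://abc123.eu-central-1.snowflakecomputing.com/?db=MYDB"));
        // prints jdbc:snowflake://abc123.eu-central-1.snowflakecomputing.com/?db=MYDB&JDBC_QUERY_RESULT_FORMAT=JSON
    }
}
```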

Option B — Execute ALTER SESSION Before Query

If you cannot change the URL, then:

In your microflow, before executing the query:

  1. Add an Execute Statement action
  2. Run:

ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';

Then run your actual SELECT query.

This works too — but URL-level configuration is cleaner.
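In plain JDBC terms, the two statements just have to run on the same connection, in order. A sketch of that ordering, assuming the Snowflake JDBC driver is on the classpath; the helper names are my own:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.List;

public class SessionFormatWorkaround {
    // The ALTER must run on the same session, before the actual query.
    static List<String> statementsInOrder(String query) {
        return List.of(
                "ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'",
                query);
    }

    // Sketch only: needs the Snowflake JDBC driver and real credentials.
    static void runWithJsonFormat(String url, String user, String pwd,
                                  String query) throws Exception {
        try (Connection conn = DriverManager.getConnection(url, user, pwd);
             Statement st = conn.createStatement()) {
            for (String sql : statementsInOrder(query)) {
                st.execute(sql);
            }
        }
    }
}
```

If the connection is pooled, note that the ALTER only affects the session it runs on, which is why the URL-level property is the safer default.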

Do NOT Use the JVM Option

Avoid adding JVM parameters like:


-Dnet.snowflake.jdbc.disableArrow=true

Reasons:

  • Not portable
  • Can break after driver upgrade
  • Harder to maintain across environments
  • Not recommended for production

Best Practice

Set it in:


JDBC URL → JDBC_QUERY_RESULT_FORMAT=JSON

That is the cleanest and most production-ready solution.

answered