SNOW-755767: snowflake-jdbc driver fails with JDK >= 16 #589
I have the same problem.
Downgrading to JDK 16 "fixes" the problem.
Adding the JVM argument --add-opens java.base/java.nio=ALL-UNNAMED seems to solve the problem.
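Whether that flag is actually in effect can be checked at runtime with the java.lang.Module API (JDK 9+); a minimal sketch, where the class name and printed message are my own and not part of the driver:

```java
public class AddOpensCheck {
    public static void main(String[] args) {
        // Plain classpath code (like this class) lives in the unnamed module.
        Module unnamed = AddOpensCheck.class.getModule();
        Module javaBase = Object.class.getModule();
        // isOpen returns true only when the JVM was started with
        // --add-opens java.base/java.nio=ALL-UNNAMED (or an equivalent setting),
        // which is what lets Arrow reach java.nio internals reflectively.
        boolean opened = javaBase.isOpen("java.nio", unnamed);
        System.out.println("java.nio opened to unnamed module: " + opened);
    }
}
```

Running it without the flag on JDK 16+ should print false; with the flag, true.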
Is there a solution without downgrading to JDK 16?
@sfc-gh-kdama @sfc-gh-mknister Could you please look at this issue? It seems like #682 solves it. Releasing this would be really appreciated, because our Snowflake apps are failing with JDK 17. Is there any chance to include this fix in the next release?
I also vote for fixing it. For me, the driver fails even with the add-opens flag.
I wasn't able to create the ticket, but I found one that's already been reported: #533
It looks like ARROW-12747 was finally resolved three days ago in version 8.0.0.
Adding a VM argument is not a good way to do this. I'd only use it if it's really necessary; otherwise, I avoid VM arguments in production at runtime because they are a black box to me.
The issue can be marked as resolved as of snowflake-jdbc driver version 3.13.19.
I haven't tested it, but are you sure, @vkopichenko?
Oh. It seems I was deceived by my own eager expectations.
This should be addressed in #1017 |
After upgrading to 3.13.20 we are still seeing errors with JDK 17.
Confirmed that the problem still occurs with Apache Arrow v8.0. We'll need to look into this and see what's happening here.
Apache Arrow 9.0.0 (https://arrow.apache.org/release/9.0.0.html) was released August 3rd; does this correct the issue?
@mdiskin it won't, unfortunately. The issue really requires all developers to plan a migration to modules; otherwise, as long as any project (whether that's Arrow, Snowflake-JDBC, or Netty) uses reflection, you will always see these warnings/errors on JDK 9 and above. These problems manifested as mere warnings, but turned into errors once the JDK enabled strong encapsulation by default, starting with JDK 16. The issue described here is directly caused by Apache Arrow Memory, and the same code that leverages the sun.misc package exists in v9.0. Here's the output of the jdeps command when run against some of the classes involved in the issue in the Apache Arrow project:
So the real solution is to use the following guide to migrate all the projects: As you can imagine, that will probably take time. I see Arrow started working on this. Even though the JIRA was closed as "incomplete," there is a PR that's still in progress:
I still face the same issue: JDK 17, Spring Boot 3, and Snowflake JDBC version 3.13.28. Thanks
Appending &JDBC_QUERY_RESULT_FORMAT=JSON to the connection URL worked for me.
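For illustration, here is a minimal sketch of supplying that session parameter as a JDBC connection property instead of a URL suffix. The property-based form, the helper name, and the credential values are my assumptions for the example; no actual connection is attempted:

```java
import java.util.Properties;

public class JsonResultFormat {
    // Builds connection properties that request the JSON result format,
    // avoiding the Arrow code path that needs sun.misc.Unsafe access.
    // User/password values are placeholders, not real credentials.
    static Properties jsonResultProps(String user, String password) {
        Properties props = new Properties();
        props.put("user", user);
        props.put("password", password);
        // Session parameter equivalent to &JDBC_QUERY_RESULT_FORMAT=JSON on the URL.
        props.put("JDBC_QUERY_RESULT_FORMAT", "JSON");
        return props;
    }

    public static void main(String[] args) {
        Properties props = jsonResultProps("my_user", "my_password");
        // Passing these to DriverManager.getConnection(url, props) would apply
        // the parameter at session creation (connection attempt omitted here).
        System.out.println(props.getProperty("JDBC_QUERY_RESULT_FORMAT"));
    }
}
```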
@sfc-gh-wfateem Unless I'm mistaken, Arrow is already compatible with JDK 8+ according to their documentation: https://arrow.apache.org/docs/java/install.html#id3. There are still open items in their issue tracker, though; the most important one is apache/arrow#38915, which has only one open issue left, apache/arrow#39000. Is that correct? Does that last item need to be closed for the Snowflake JDBC driver to be compatible with JDK 8+?
Hi @sfc-gh-wfateem, sorry for being insistent, but were you able to see my last comment? Thanks
@paoliniluis The problem is not compatibility with JDK 8+; the problem is with JDK 16 and onwards. Please refer to the KB article here. All items in #38915 would need to be implemented, including #39000, to address the issue highlighted in the KB article. Once that is implemented, we would need to include that Arrow dependency in the Snowflake JDBC JAR. I hope that addresses your question.
@sfc-gh-wfateem the Arrow team confirmed that they're not going to address the issue (apache/arrow#39000 (comment)); any plans to complete this?
@paoliniluis do you mean for us to step in and make the necessary changes in Arrow? If that's what you mean, then no, we don't have plans to do that. If Arrow can't address the issue directly, then there's not much we can do on our end, I'm afraid.
@sfc-gh-wfateem thanks for confirming. I guess there are a lot of projects waiting on this to happen. If it will never happen, I would encourage your team to close this issue and document the workaround mentioned above so that those projects can move forward.
@paoliniluis yes, that probably makes more sense at this point. I personally wasn't aware that Arrow decided against the change until you brought it up. Thanks for sharing that with us.
To wrap this up, we went into detail explaining the nature of the issue in the following article: The workaround for JDK 16 is to add the following JVM argument: For JDK 17 and higher, you need to add the following Java option: At the time of this comment, Arrow had decided that they weren't going to make the necessary changes needed to unblock this issue: apache/arrow#39000 (comment) If Arrow comes back and fixes this, then we can revisit this issue. In the meantime, we're going to move forward with closing it, as there isn't much more that can be done other than providing everyone with the workarounds mentioned.
Are there any JDBC drivers for Snowflake that don't use the Arrow library? Workaround for Spring:

spring:
  datasource:
    hikari:
      connection-init-sql: ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'
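Outside Spring, a rough plain-JDBC equivalent of Hikari's connection-init-sql is to run the same ALTER SESSION statement on every freshly opened connection. The class and method names below are illustrative, and the main method only prints the statement rather than connecting anywhere:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class JsonSessionInit {
    // The same statement Hikari would run via connection-init-sql:
    // it forces JSON result sets, bypassing the Arrow reader.
    static final String INIT_SQL =
            "ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'";

    // Call once on each new connection before handing it to application code.
    static void initSession(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute(INIT_SQL);
        }
    }

    public static void main(String[] args) {
        // No live Snowflake connection here; just show the statement that would run.
        System.out.println(INIT_SQL);
    }
}
```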
It works for me with JDK 21 and snowflake-jdbc 3.13.29.
@sfc-gh-dszmolka please avoid marking unresolved issues as "completed" and "closed" because this is misleading. |
Not sure why you got the impression I marked it as completed or closed, @grepwood, but my colleague provided a pretty good description of why he closed it for now. Please see #589 (comment)
@sfc-gh-dszmolka sorry, I tagged the wrong person. As for why I had the impression this was considered fixed: I don't think this is anywhere near being fixed, because Arrow won't fix it: apache/arrow#39000 (comment)
Would this project accept a PR that replaces Arrow as the implementation? If the Snowflake driver starts using such a library, there is a decent chance that adoption will follow and the fork/alternative would succeed. I would be embarrassed, as the Arrow project, to lose market share to a competing library, but something has got to give; some people can only be reasoned with by avoidance.
@grepwood sorry for the confusion. I rectified the status now. @alexanderankin It's a bit more complicated than merely changing the Arrow dependency in the JDBC driver. You have to make a change at the backend as well because the data the driver downloads is in Arrow format, so changing that on the client's side won't fix the problem. |
@sfc-gh-wfateem yes, to clarify: I am suggesting that someone (not Snowflake; willing volunteers, if any) fork the Arrow library and remove the erroneous parts that are no longer compatible with modern Java (use of internal APIs and so on). The question to the snowflake-jdbc maintainers is: would you be willing to try using that new library? Again, not asking for fixes; asking about the likelihood of accepting a PR that would replace the dependency with a new one and update any import statements, e.g. from
@alexanderankin I see. If you or someone else is willing to take on that task, wouldn't it make more sense to contribute that to Arrow directly rather than create a new project? There are reasons why the Arrow team tabled this; it's not an easy issue to address. You would basically need to replace Netty as a dependency of Arrow as well, or go and make changes in Netty. Even if you manage to fork it and address these problems, I am skeptical we'll want to switch to that forked dependency unless we know it's going to be properly maintained and that thorough testing takes place to ensure these changes won't cause performance regressions (or any other issues).
So if we would like to fix this issue the way end users would prefer, we should perhaps concentrate our efforts on Netty?
env:
macOS 11.6
intellij
openjdk17
snowflake-jdbc driver v3.13.8 jar from the Maven repo
stack trace:
Exception in thread "main" net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver internal error: Fail to retrieve row count for first arrow chunk: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available.
    at net.snowflake.client.jdbc.SnowflakeResultSetSerializableV1.setFirstChunkRowCountForArrow(SnowflakeResultSetSerializableV1.java:1061)
    at net.snowflake.client.jdbc.SnowflakeResultSetSerializableV1.create(SnowflakeResultSetSerializableV1.java:550)
    at net.snowflake.client.jdbc.SnowflakeResultSetSerializableV1.create(SnowflakeResultSetSerializableV1.java:467)
    at net.snowflake.client.core.SFResultSetFactory.getResultSet(SFResultSetFactory.java:29)
    at net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:219)
    at net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:134)
    at net.snowflake.client.core.SFStatement.execute(SFStatement.java:743)
    at net.snowflake.client.core.SFStatement.execute(SFStatement.java:639)
    at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:238)
    at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQuery(SnowflakeStatementV1.java:133)
workaround:
Change JDK 17 to JDK 11.