
SNOW-618478: Unable to authenticate with a private key by using spring datasource properties #1053

Closed
ZigZag59 opened this issue Jun 27, 2022 · 25 comments
Labels: Backlog, feature, status-triage_done (Initial triage done, will be further handled by the driver team)

Comments

@ZigZag59

We are using the standard configuration to set up Spring datasources. When we want to use a private key to authenticate, the key is interpreted as a string rather than a private key.

net.snowflake.client.jdbc.SnowflakeSQLLoggedException: Invalid parameter value type: java.lang.String, expected type: java.security.PrivateKey.

spring.datasource.username=${snowflake.username}
spring.datasource.driverClassName=net.snowflake.client.jdbc.SnowflakeDriver
spring.datasource.url=jdbc:snowflake://${snowflake.account}.snowflakecomputing.com/?warehouse=${snowflake.warehouse}&db=${snowflake.database.name}&schema=${snowflake.schema}
spring.datasource.dataSourceProperties.privatekey=MII...YKF

Our end goal is to retrieve these properties from Azure App Configuration/Key Vault.

@github-actions github-actions bot changed the title Unable to authenticate with a private key by using spring datasource properties SNOW-618478: Unable to authenticate with a private key by using spring datasource properties Jun 27, 2022
@sfc-gh-wfateem
Collaborator

Hi @ZigZag59,

I was able to reproduce this using Hikari as well. The only workaround right now is to set your private key file and private key passphrase in your URL, for example:
spring.datasource.url=jdbc:snowflake://${snowflake.account}.snowflakecomputing.com/?warehouse=${snowflake.warehouse}&db=${snowflake.database.name}&schema=${snowflake.schema}&private_key_file=/PATH_TO/key.p8&private_key_file_pwd=YOUR_PRIVATE_KEY_PASSPHRASE

@sfc-gh-bgooley

@SimbaGithub

@sfc-gh-hbarile

sfc-gh-hbarile commented Sep 23, 2022

Hi @sfc-gh-wfateem, my customer is also trying to create a Hikari connection pool that uses a private key instead of a password. Is this the only workaround code snippet we can point them to?

@j2201

j2201 commented Feb 17, 2023

We can create another DataSource using privateKey and set it as the data source for Hikari, for example:

import java.util.Properties;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

Properties properties = new Properties();
properties.put("user", user);
properties.put("privateKey", privateKey); // a java.security.PrivateKey instance
properties.put("db", database);
// wrap the properties in a plain DataSource, then hand it to Hikari
DriverManagerDataSource dmds = new DriverManagerDataSource(url, properties);
HikariConfig config = new HikariConfig();
config.setDataSource(dmds);
return new HikariDataSource(config);

@sfc-gh-wfateem
Collaborator

This should be addressed by Hikari now in PR 1895.

@NitinSharma1991

Which HikariCP version has this change? And which Spring Boot version?

@sfc-gh-wfateem
Collaborator

@NitinSharma1991 I'm reopening this issue because it looks like I mistakenly assumed that changes were made in that PR; I see now that it was closed and the Hikari team commented that they're not going to make any changes. We'll need to take a closer look.

@sfc-gh-wfateem sfc-gh-wfateem reopened this Feb 5, 2024
@sfc-gh-dszmolka sfc-gh-dszmolka added the status-triage_done Initial triage done, will be further handled by the driver team label Apr 26, 2024
@josecsotomorales

We currently use HikariCP + Snowflake JDBC in our project and hit this issue while attempting to inject privateKey as a data source property. Looking at the source code, there's no reason for the driver to require privateKey as an object; a string should be fine, and the driver could implicitly convert it to a java.security.PrivateKey object. Thoughts?

Code Snippet:

import org.bouncycastle.asn1.pkcs.PrivateKeyInfo
import org.bouncycastle.jce.provider.BouncyCastleProvider
import org.bouncycastle.openssl.PEMParser
import org.bouncycastle.openssl.jcajce.JcaPEMKeyConverter

import java.io.StringReader
import java.security.{PrivateKey, Security}
import java.sql.{Connection, DriverManager, ResultSet, Statement}
import java.util.Properties

object TestJdbc {

  // Private key as a string
  private val PRIVATE_KEY_STRING: String =
    """-----BEGIN PRIVATE KEY-----
      |-----END PRIVATE KEY-----
      |""".stripMargin

  private object PrivateKeyReader {
    def getFromString(privateKeyString: String): PrivateKey = {
      Security.addProvider(new BouncyCastleProvider())
      val pemParser = new PEMParser(new StringReader(privateKeyString))
      val pemObject = pemParser.readObject.asInstanceOf[PrivateKeyInfo]
      pemParser.close()

      val converter = new JcaPEMKeyConverter().setProvider(BouncyCastleProvider.PROVIDER_NAME)
      converter.getPrivateKey(pemObject)
    }
  }

  def main(args: Array[String]): Unit = {
    val url = "jdbc:snowflake://account-prod.snowflakecomputing.com"
    val prop = new Properties()
    prop.put("user", "service_user")
    prop.put("privateKey", PrivateKeyReader.getFromString(PRIVATE_KEY_STRING))
    prop.put("db", "SNOWFLAKE_SAMPLE_DATA")
    prop.put("schema", "TPCH_SF1")
    prop.put("warehouse", "COMPUTE_WH")
    prop.put("role", "READ_ROLE")

    val conn: Connection = DriverManager.getConnection(url, prop)
    val stat: Statement = conn.createStatement()
    val res: ResultSet = stat.executeQuery("select 1")
    if (res.next()) {
      println(res.getString(1))
    }
    conn.close()
  }
}

@sfc-gh-wfateem
Collaborator

Hi @josecsotomorales,

Thanks for the comment. Yes, you're right: the map should have stored <String, String> key/value pairs. However, the JDBC code has historically used the Properties hashtable to store objects, and that requires a broader cleanup.
In the meantime, though, can you not pass PRIVATE_KEY_FILE as a string representing the full path and file name of your private key file and PRIVATE_KEY_FILE_PWD with the passphrase to decrypt the key?
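For example, a minimal sketch of that with HikariCP (the URL, user, and key path below are placeholders):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

HikariConfig config = new HikariConfig();
config.setJdbcUrl("jdbc:snowflake://myaccount.snowflakecomputing.com/");
config.setUsername("service_user");
// both values are plain strings, so Hikari can forward them to the driver as-is
config.addDataSourceProperty("private_key_file", "/PATH_TO/key.p8");
config.addDataSourceProperty("private_key_file_pwd", "YOUR_PRIVATE_KEY_PASSPHRASE");
HikariDataSource dataSource = new HikariDataSource(config);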

I'm assuming you're using an encrypted key and just not sharing the full code used to decrypt it. If the original reason for using this Bouncy Castle code was similar problems to those described in #1683, then JDBC driver version 3.16.0 introduces a JVM argument you can enable, -Dnet.snowflake.jdbc.enableBouncyCastle=true, which basically does all of this for you under the hood. You can refer to PR #1671.

Are any of these options possible for you to try?

@ets
Contributor

ets commented Jul 2, 2024

@sfc-gh-wfateem I just opened a PR to resolve this issue. As further justification of the need, I'd add my blocking concern:

I want to leverage this driver through Spark.read in a cluster. In order to use private_key_file, I'd have to make the file available to every executor in the cluster. If I instead use private_key_base64, I can read the bytes from disk once and allow Spark to pass the base64 encoded String to all the executors transparently.

Happy to make adjustments to the PR as necessary. I'd really like to get this mainlined ASAP, and I'm certain all the Hikari users represented here would appreciate it as well.

@josecsotomorales

+1

@sfc-gh-wfateem
Collaborator

Hi @ets,

That makes total sense. Thanks for the contribution!
I'll discuss it with the team and see if anyone has cycles to review the PR.

@shindiogawa

+1

@Hitsu-360

+1

@gasscoelho

+1

@RafaelOsiro

+1

@sfc-gh-wfateem
Collaborator

@ets I just want to make sure I understand the expectations on how to use that new PRIVATE_KEY_BASE64 parameter that you added.
Let's say you generate a key using the following command:

openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes256 -inform PEM -out key-aes256.p8

The file key-aes256.p8 will contain something like the following:

-----BEGIN ENCRYPTED PRIVATE KEY-----
MIIFNTBfBgkqhkiG9w0BBQ0wUjAxBgkqhkiG9w0BBQwwJAQQDIHvF6uHcdw6AaYE
...
ju8yLyOIiXDGVX6MkCZNSpb/CUrYEjLT1VGSlGF4FhK1RJSPzdzp2X4=
-----END ENCRYPTED PRIVATE KEY-----

So a user should pass this to the JDBC driver:

PRIVATE_KEY_BASE64=MIIFNTBfBgkqhkiG9w0BBQ0wUjAxBgkqhkiG9w0BBQwwJAQQDIHvF6uHcdw6AaYE...ju8yLyOIiXDGVX6MkCZNSpb/CUrYEjLT1VGSlGF4FhK1RJSPzdzp2X4=

Is my understanding correct?

@ets
Contributor

ets commented Jul 3, 2024

@sfc-gh-wfateem Negative. The value of PRIVATE_KEY_BASE64 is just the base64-encoded byte content of the entire file.

For example:
PRIVATE_KEY_BASE64 = Base64.getEncoder().encodeToString(Files.readAllBytes(Paths.get("key-aes256.p8")))

We only use Base64 encoding because we can't pass around the byte[] for our purposes...we need a string.

If you take a look at SessionUtilKeyPair in the PR you'll see two methods that illustrate this further:
extractPrivateKeyFromFile
extractPrivateKeyFromBase64

The former reads the bytes from the PRIVATE_KEY_FILE file, and the latter Base64-decodes the bytes from the PRIVATE_KEY_BASE64 string; after that, they both call extractPrivateKeyFromBytes.
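Roughly, as a sketch (illustrative only, not the actual code in the PR; extractPrivateKeyFromBytes stands for the shared parsing/decryption logic described above):

PrivateKey extractPrivateKeyFromFile(String privateKeyFile, String privateKeyPwd) throws Exception {
  // private_key_file: read the raw key bytes from disk
  return extractPrivateKeyFromBytes(Files.readAllBytes(Paths.get(privateKeyFile)), privateKeyPwd);
}

PrivateKey extractPrivateKeyFromBase64(String privateKeyBase64, String privateKeyPwd) throws Exception {
  // private_key_base64: Base64-decode the same bytes from a string instead of a file
  return extractPrivateKeyFromBytes(Base64.getDecoder().decode(privateKeyBase64), privateKeyPwd);
}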

@cdrowley

cdrowley commented Jul 5, 2024

For reference and my own understanding:

  • the linked PR above (@ets great work) will address the problem of passing private keys to pooling libs like Hikari
  • the code snippet above from @josecsotomorales works for a single connection, but not for pooling libs (so my snippet below would not currently work)
  • once the linked PR is finished, users could simply use privateKeyString and wouldn't need to parse the strings into keys themselves
val config = new HikariConfig()
val dataSource = new SnowflakeBasicDataSource()
val privateKey = PrivateKeyReader.getFromString(privateKeyString) // privateKeyString ala '-----BEGIN ...'

dataSource.setPrivateKey(privateKey)
...
config.setDataSource(dataSource)
new HikariDataSource(config)

@ets
Contributor

ets commented Jul 5, 2024

Setting a DataSource like that should work too, but there's an easier approach. With the PR in place, you can just pass the new private_key_base64 property into the config using addDataSourceProperty:

val privateKeyString = Base64.getEncoder().encodeToString(Files.readAllBytes(Paths.get("key-aes256.p8")))

val hc: HikariConfig = new HikariConfig()
hc.setJdbcUrl(jdbcUrl)
hc.setUsername(user)
hc.addDataSourceProperty("private_key_base64", privateKeyString)
new HikariDataSource(hc)

@sfc-gh-wfateem
Collaborator

I'm closing this off since this was released in v3.19.0 and the new parameters private_key_base64 and private_key_pwd have been documented here.

Thanks for the contribution @ets!

@jase52476

jase52476 commented Oct 28, 2024

Hi @sfc-gh-wfateem,

I'm using the v3.19.0 driver, and I cannot get this to work at all. I've gone through this thread; I'm also using the Hikari connection pool, and when I try private_key_base64 it gives me this error:

Private key provided is invalid or not supported: net.snowflake.client.jdbc.internal.org.bouncycastle.operator.OperatorCreationException: 1.2.840.113549.1.5.13 not available: Cannot find any provider supporting AES/CBC/PKCS7Padding

If I decrypt the file with OpenSSL and try the base64 without the corresponding private_key_pwd property, it logs an error but still connects (I assume it figures out that the key is not encrypted and uses the file contents):

2024-10-28 11:56:24.620 ERROR [REDACTED] --- [           main] c.a.c.d.d.c.SnowflakeCopernicusConfig    :  Error reading private key file: unable to convert key pair: null
2024-10-28 11:56:24.633  INFO [REDACTED] --- [           main] com.zaxxer.hikari.HikariDataSource       :  snowflakeConnectionPool - Starting...
2024-10-28 11:56:24.669  INFO [REDACTED] --- [           main] n.s.client.jdbc.SnowflakeConnectionV1    :  Initializing new connection
2024-10-28 11:56:24.685  INFO [REDACTED] --- [           main] net.snowflake.client.core.SFSession      :  Opening session with server: [REDACTED]snowflakecomputing.com:443/, account: [REDACTED], user: [REDACTED], password is not provided, role: null, database: [REDACTED], schema: [REDACTED], warehouse: [REDACTED], validate default parameters: null, authenticator: null, ocsp mode: FAIL_OPEN, passcode in password: null, passcode is not provided, private key is not provided, disable socks proxy: null, application: null, app id: JDBC, app version: 3.19.0, login timeout: null, retry timeout: null, network timeout: null, query timeout: null, connection timeout: null, socket timeout: null, tracing: null, private key file: null, private key pwd is provided, enable_diagnostics: not provided, diagnostics_allowlist_path: null, session parameters: client store temporary credential: null, gzip disabled: null, browser response timeout: null
2024-10-28 11:56:24.687  INFO [REDACTED] --- [           main] net.snowflake.client.core.SFBaseSession  :  Driver OCSP mode: FAIL_OPEN, gzip disabled: false and no proxy
2024-10-28 11:56:24.690  INFO [REDACTED] --- [           main] net.snowflake.client.core.SFSession      :  Connecting to GLOBAL Snowflake domain
2024-10-28 11:56:24.707  INFO [REDACTED] --- [           main] net.snowflake.client.core.FileUtil       :  Cache file creation: Accessing file: C:\Users\[REDACTED]\AppData\Local\Snowflake\Caches\ocsp_response_cache.json
2024-10-28 11:56:26.472  INFO [REDACTED] --- [           main] net.snowflake.client.core.FileUtil       :  Write to cache: Accessing file: C:\Users\[REDACTED]\AppData\Local\Snowflake\Caches\ocsp_response_cache.json
2024-10-28 11:56:26.915  INFO [REDACTED] --- [           main] net.snowflake.client.core.SFSession      :  Session [REDACTED] opened in 2,241 ms.
2024-10-28 11:56:26.915  INFO [REDACTED] --- [           main] n.s.client.jdbc.SnowflakeConnectionV1    :  Connection initialized successfully in 2,255 ms. Session id: [REDACTED]
2024-10-28 11:56:27.050  INFO [REDACTED] --- [           main] com.zaxxer.hikari.HikariDataSource       :  snowflakeConnectionPool - Start completed.

I also tried creating my own decryption method using the examples, and when I decrypt the original .p8 file, it says the algorithm is 1.2.840.113549.1.1.1 and not 1.2.840.113549.1.5.13.

I was getting this error in my decryption method but I overcame it like this:

PKCS8EncryptedPrivateKeyInfo encryptedPrivateKeyInfo = new PKCS8EncryptedPrivateKeyInfo(pemObject.getContent());
String passphrase = pass; // obtained via dependency injection
JceOpenSSLPKCS8DecryptorProviderBuilder builder = new JceOpenSSLPKCS8DecryptorProviderBuilder();

builder.setProvider(new BouncyCastleProvider());

InputDecryptorProvider pkcs8Prov = builder.build(passphrase.toCharArray());
privateKeyInfo = encryptedPrivateKeyInfo.decryptPrivateKeyInfo(pkcs8Prov);

I got the same error but adding this fixed it for me:

builder.setProvider(new BouncyCastleProvider());

But it looks like the decryption that the driver is doing may not be doing this step?

I used Bouncy Castle v1.78, which worked. Is there a way to tell the driver to use the provided Bouncy Castle instead of the internal one? Or is there something else I'm missing?

@sfc-gh-wfateem
Collaborator

@jase52476 I don't believe your issue is related to the new private_key_base64 property. I recommend you try the thin JDBC driver library and then provide the Bouncy Castle JARs manually on the classpath.
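For example (assuming the artifact IDs snowflake-jdbc-thin, bcprov-jdk18on, and bcpkix-jdk18on; the versions are placeholders), the dependencies would look roughly like this:

            <dependency>
                <groupId>net.snowflake</groupId>
                <artifactId>snowflake-jdbc-thin</artifactId>
                <version>3.19.0</version>
            </dependency>
            <dependency>
                <groupId>org.bouncycastle</groupId>
                <artifactId>bcprov-jdk18on</artifactId>
                <version>1.78</version>
            </dependency>
            <dependency>
                <groupId>org.bouncycastle</groupId>
                <artifactId>bcpkix-jdk18on</artifactId>
                <version>1.78</version>
            </dependency>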

If you're still having issues with this, then I would recommend you open a separate issue so we can unpack what it is that's special about your scenario and see if it's reproducible on our end.

@jase52476

@sfc-gh-wfateem I'm following the instructions provided for using key pair authentication and it's not working if we use the encrypted key file and passphrase:

props.put("private_key_file", "/tmp/rsa_key.p8");
props.put("private_key_file_pwd", "dummyPassword");

This is the driver/version I'm using.

            <dependency>
                <groupId>net.snowflake</groupId>
                <artifactId>snowflake-jdbc</artifactId>
                <version>3.19.0</version>
            </dependency>

I finally got it to work. I'm using SpringBoot and HikariCP for the connection.

It looks like my main issue was that I was using HikariConfig.setJdbcUrl and HikariConfig.setDriverClassName. When I do this, I am unable to call addDataSourceProperty("privateKey", pk), because it runs into the non-String property issue mentioned earlier in this thread.

I had to end up doing this:

datasource.snowflake.hikari.dataSourceClassName=net.snowflake.client.jdbc.SnowflakeBasicDataSource
datasource.snowflake.hikari.dataSourceProperties[url]=jdbc:snowflake://[REDACTED].snowflakecomputing.com
datasource.snowflake.hikari.dataSourceProperties[user]=[REDACTED]
datasource.snowflake.hikari.dataSourceProperties[databaseName]=[REDACTED]
datasource.snowflake.hikari.dataSourceProperties[warehouse]=[REDACTED]
datasource.snowflake.hikari.dataSourceProperties[schema]=[REDACTED]

Then, for the private key, I had to write my own extraction method with BouncyCastle (v1.78) and then add in the property manually:

extractPrivateKey(resourceLoader.getResource(getKeyFile()), getPassphrase())
    .ifPresent(pk -> getHikari().addDataSourceProperty("privateKey", pk));

Regarding this error:
Private key provided is invalid or not supported: org.bouncycastle.operator.OperatorCreationException: 1.2.840.113549.1.5.13 not available: Cannot find any provider supporting AES/CBC/PKCS7Padding

This error happens in this class in the driver jar:

net.snowflake.client.core.SessionUtilKeyPair.extractPrivateKeyWithBouncyCastle(byte[] privateKeyBytes, String privateKeyPwd)

I'm not sure if it's possible to change it to this instead (following the solution here):

// Current:
InputDecryptorProvider pkcs8Prov =
    new JceOpenSSLPKCS8DecryptorProviderBuilder().build(privateKeyPwd.toCharArray());

// Proposed:
InputDecryptorProvider pkcs8Prov =
    new JceOpenSSLPKCS8DecryptorProviderBuilder()
        .setProvider(new BouncyCastleProvider()) // this line made the error go away for me
        .build(privateKeyPwd.toCharArray());

@sfc-gh-wfateem
Collaborator

sfc-gh-wfateem commented Oct 30, 2024

@jase52476 For this particular issue, regarding how to use the new property with Hikari, the objective was for you to read the private key data and pass it as a base64-encoded string using the property private_key_base64, and, if the key is encrypted, provide the passphrase using the property private_key_pwd.

You can't use the option privateKey which is why this new property was developed. The privateKey option expects an instance of PrivateKey, which won't work when you try to pass that to Hikari.
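Concretely, a minimal sketch of that with Hikari (driver v3.19.0+; the URL, user, key path, and passphrase are placeholders):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

// base64-encode the raw bytes of the (optionally encrypted) PKCS#8 key file
String privateKeyBase64 =
    Base64.getEncoder().encodeToString(Files.readAllBytes(Paths.get("/PATH_TO/key-aes256.p8")));

HikariConfig config = new HikariConfig();
config.setJdbcUrl("jdbc:snowflake://myaccount.snowflakecomputing.com/");
config.setUsername("service_user");
config.addDataSourceProperty("private_key_base64", privateKeyBase64);
// only needed if the key file is encrypted
config.addDataSourceProperty("private_key_pwd", "YOUR_PRIVATE_KEY_PASSPHRASE");
HikariDataSource dataSource = new HikariDataSource(config);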

I'm not clear on what your issue was when you attempted to use private_key_file and private_key_file_pwd. Were you getting this error or something else?

Private key provided is invalid or not supported: org.bouncycastle.operator.OperatorCreationException: 1.2.840.113549.1.5.13 not available: Cannot find any provider supporting AES/CBC/PKCS7Padding

Did you also add the following JVM argument when you tried passing those two options?
-Dnet.snowflake.jdbc.enableBouncyCastle=true

If you have already tried that and you want us to look into this error, then let's open a new issue and discuss it separately. In that new issue, just let us know how you generated the private key (i.e. the OpenSSL command or whatever else you were using) so we can try to reproduce your problem and take it from there. The reason is that you shouldn't really need to set the provider explicitly; the JDBC driver already adds BouncyCastleProvider as a security provider at startup unless it detects that you have already loaded the BouncyCastleFipsProvider. I suspect the issue has to do with the fact that we shade and relocate the Bouncy Castle classes. But again, that's a whole different topic and a separate issue entirely from this one.
