flink-connector-jdbc
[FLINK-33761][Connector/JDBC] Add Snowflake JDBC Dialect
Feature: New JDBC connector dialect - connects to Snowflake.
Testing: As Snowflake is a SaaS offering, I did not provide any tests for it; we don't want to incur additional expenses. Note that this is a new feature, so any potential breakages in this PR should not impact existing users.
Usage Example: This is how I use it in my sample code:
public static Table getSnowflakeTable(StreamTableEnvironment tEnv, SnowflakeConfig config) {
    tEnv.createTemporaryTable(
            "XXX",
            TableDescriptor.forConnector("jdbc")
                    .schema(Schema.newBuilder()
                            .column("F1", DataTypes.BIGINT())
                            .column("F2", DataTypes.TIMESTAMP().bridgedTo(java.time.LocalDateTime.class))
                            .build())
                    .option("driver", config.getDriver())
                    // https://docs.snowflake.com/developer-guide/jdbc/jdbc-configure#connection-parameters
                    .option("url", config.getUrl() + "&CLIENT_TIMESTAMP_TYPE_MAPPING=TIMESTAMP_NTZ")
                    .option("table-name", "Snowflake-Sample-Table-Replace-This")
                    .build());
    return tEnv.sqlQuery("SELECT * FROM XXX");
}
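A side note on the url option above: it appends the CLIENT_TIMESTAMP_TYPE_MAPPING session parameter with a hard-coded "&", which assumes the configured URL already carries a query string. A minimal sketch of appending the parameter either way (the class and helper names here are hypothetical, not part of this PR):

```java
// Hypothetical helper: appends a Snowflake session parameter to a JDBC URL,
// choosing '?' when the URL has no query string yet and '&' otherwise.
public class SnowflakeUrlExample {
    public static String withParam(String url, String key, String value) {
        String sep = url.contains("?") ? "&" : "?";
        return url + sep + key + "=" + value;
    }

    public static void main(String[] args) {
        String base = "jdbc:snowflake://myaccount.snowflakecomputing.com/?db=MYDB";
        // Appends with '&' because the base URL already has a query string.
        System.out.println(withParam(base, "CLIENT_TIMESTAMP_TYPE_MAPPING", "TIMESTAMP_NTZ"));
    }
}
```

This keeps the sample code working even when config.getUrl() returns a bare URL without connection parameters.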
Thanks for opening this pull request! Please check out our contributing guidelines. (https://flink.apache.org/contributing/how-to-contribute.html)
@davidradl WDYT?
Hello all,
Is there any progress on this? will it be merged, if yes, when do you think this would be?
Thank you!
Ecaterina, I have been waiting for months now, almost a year. I am using this in prod with 1.19.1. @EcaterinaL @davidradl
Thank you for your prompt answer! Do you actively use it? Have you merged it with your master?
I actively use it, yes. You can build from the branch.
Hi @borislitvak @EcaterinaL , thanks for the changes - I cannot merge this as I am not a committer. I notice the branch has conflicts. I have suggested this be included in the forthcoming JDBC connector for Flink 1.20.
@snuyanzin is this something you can help with reviewing/merging once the conflicts are resolved please?
@davidradl, I used this repo to try to create a connector myself; however, I found that some of the methods (e.g. createInputSerializer()) are not available in Flink 1.19, so I have to refactor. Is it possible to resolve some of them in 1.20? And when do you think the new version will be available? :)
Thank you!
Hi @EcaterinaL , I am not sure what you mean by "Is it possible to resolve some of them in 1.20?" I suggest checking the git history to see when the methods were removed and how callers now get this capability. At the moment it is early days in getting a JDBC connector out that works with 1.20; I initiated a conversation on the dev list to get this started. The release manager has not been agreed yet, so I do not have a timescale for you.
Thank you @davidradl, I'll look at the history, good idea. I'm going to keep monitoring this thread; it is important for me to have this connector ready in the coming months.
No worries - I also suggest that you respond to my dev list post with your need. I proposed that we cut a 1.20 release of the Flink JDBC connector off main; then, if your PR is merged, this dialect will be included.