
Unable to ingest data from PostgreSQL into Druid (deployed using Helm)


I deployed Druid with the Helm chart, following the commands from https://github.com/apache/druid/tree/master/helm/druid, and the deployment succeeded. However, when I submitted a task with the following spec:

{
  "type": "index_parallel",
  "id": "sairam_testing_postgresql_100",
  "spec": {
    "dataSchema": {
      "dataSource": "test-ingestion-postgresql-100",
      "timestampSpec": {
        "format": "iso",
        "column": "created_at"
      },
      "dimensionsSpec": {
        "dimensions": [
          "xcall_id","app_id"
        ]
      }
    },
    "ioConfig": {
      "type": "index_parallel",
      "inputSource": {
        "type": "sql",
        "database": {
          "type": "postgresql",
          "connectorConfig": {
            "connectURI": "jdbc:postgresql://35.200.128.167:5432/mhere_trans",
            "user": "postgres@jiovishwam-frp-att-prod-mhere-trans-psql-db-1",
            "password": "lFRWncdXG4Po0e"
          }
        },
        "sqls": [
          "SELECT xcall_id ,app_id,created_at FROM transactions limit 10"
        ]
      }
    },
    "maxNumConcurrentSubTasks": 2,
    "tuningConfig": {
      "type": "index_parallel",
      "partitionsSpec": {
        "type": "dynamic"
      }
    }
  }
}


it fails with the following error:

Failed to submit task: Cannot construct instance of org.apache.druid.firehose.PostgresqlFirehoseDatabaseConnector, problem: java.lang.ClassNotFoundException: org.postgresql.Driver at [Source: (org.eclipse.jetty.server.HttpInputOverHTTP); line: 1, column: 969] (through reference chain: org.apache.druid.indexing.common.task.batch.parallel.ParallelIndexSupervisorTask["spec"]->org.apache.druid.indexing.common.task.batch.parallel.ParallelIndexIngestionSpec["ioConfig"]->org.apache.druid.indexing.common.task.batch.parallel.ParallelIndexIOConfig["inputSource"]->org.apache.druid.metadata.input.SqlInputSource["database"])

NOTE: I tried the same thing with the quickstart and hit a similar issue (fixed by manually adding the PostgreSQL JAR to the lib directory), but I am not sure how to handle this when Druid is deployed with the Helm chart in production.
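One possible approach, sketched here as an assumption rather than a confirmed fix: the ClassNotFoundException for org.postgresql.Driver typically means the postgresql-metadata-storage extension (which bundles the PostgreSQL JDBC driver) is not in Druid's extensions load list. With the Helm chart, that list can be supplied through the chart's configVars values (which are passed to Druid as druid_* environment variables) instead of copying JARs into lib by hand. The exact key names below should be verified against the chart's values.yaml:

# values.yaml override (a sketch; assumes the apache/druid chart's configVars
# passthrough of druid_* runtime properties)
configVars:
  # Include postgresql-metadata-storage so the PostgreSQL JDBC driver is on the classpath
  druid_extensions_loadList: '["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage"]'

The override would then be applied with something like `helm upgrade --install druid helm/druid -f values.yaml`, after which the task could be resubmitted.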

sairamdevarashetty · Jul 15 '22