Bug
Resolution: Not a Bug
Major
3.2.0.Final
Critical
In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.
Bug report
For bug reports, provide this information, please:
What Debezium connector do you use and what version?
3.2.0.Final
What is the connector configuration?
I am using Docker Compose to create an environment with Kafka, Kafka Connect, and three databases. All my Debezium source connectors work just fine; the problem is the sink connector using the JDBC Postgres driver. The full Docker Compose file and the connector configurations are included below.
What is the captured database version and mode of deployment?
Docker Compose on a local Linux machine.
Sink database: Postgres 16.4
Source database: Informix (ibmcom/informix-developer-database image)
What behavior do you expect?
I expect the sink connector to consume the Kafka messages and write them to Postgres.
What behavior do you see?
The sink connector fails with:
java.sql.SQLException: No suitable driver
HHH000342: Could not obtain connection to query metadata
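For reference, here is a minimal standalone Java sketch of how java.sql.DriverManager produces this kind of SQLException when no registered driver accepts a JDBC URL (the URL is simply copied from the sink configuration further below; whether the call succeeds depends on the driver jar being visible on the classpath and on the URL prefix matching one a registered driver claims):

import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    public static void main(String[] args) {
        // DriverManager asks each registered java.sql.Driver whether it accepts the URL;
        // if none does, getConnection() fails with a "No suitable driver" SQLException.
        String url = "jdbc:postgres://postgres:5432/postgres?user=postgres&password=postgres";
        try {
            DriverManager.getConnection(url);
            System.out.println("connected");
        } catch (SQLException e) {
            System.out.println(e.getMessage());
        }
    }
}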
Do you see the same behaviour using the latest released Debezium version?
I haven't tried the CR, but I see the same behaviour with 3.2.0 and 3.1.0.
Do you have the connector logs, ideally from start till finish?
(You might be asked later to provide DEBUG/TRACE level log)
Yes:
Log is attached.
How to reproduce the issue using our tutorial deployment?
Yes. I see the same problem when using MySQL as the sink connector.
------
Here is my Docker Compose:
services:
  # Data Base Systems
  #
  # Informix DB
  informix:
    # -----------------------------
    # To connect to the DB:
    # $ docker-compose exec informix bash
    #
    # To display logs:
    # $ docker logs -f informix
    # -----------------------------
    # Create schemas:
    # docker-compose exec informix bash -c '/docker-entrypoint-initdb.d/schema.sh'
    # docker-compose exec informix bash -c '/docker-entrypoint-initdb.d/schema2.sh'
    #
    # Populate data
    # docker-compose exec informix bash -c '/docker-entrypoint-initdb.d/insert.sh'
    # docker-compose exec informix bash -c '/docker-entrypoint-initdb.d/insert2.sh'
    #
    # User: informix
    # Pass: in4mix
    #
    # Informix needs to be set for CDC. To do that one needs to run the SQL script
    # syscdcv1.sql located at /opt/ibm/informix/etc
    #
    # To run it, you will need dbaccess, the command is:
    # $ dbaccess - syscdcv1.sql
    #
    # You might need to re-set the PATH env variable
    # $ export PATH=$PATH:/opt/ibm/informix/bin
    #
    # Docker Hub: https://hub.docker.com/r/ibmcom/informix-developer-database
    #
    # For version information look at running image
    # $ cat /opt/ibm/informix/ids.properties
    image: ibmcom/informix-developer-database
    container_name: informix
    hostname: informix
    ports:
      - "9088:9088" # JDBC port
    environment:
      LICENSE: accept
      TYPE: oltp
      #DBSERVERNAME: db_ifx1
      DB_USER: informix
      DB_NAME: informix
      DB_PASS: in4mix
      # DB_INIT: 1
      BASEDIR: /opt/ibm
      # sets Informix dir which is /opt/ibm/informix/
      INFORMIXDIR: /opt/ibm/informix
      INFORMIXSERVER: informix
      # INFORMIXSQLHOSTS: /opt/ibm/informix/etc/sqlhosts
      PATH: $PATH:/opt/ibm/informix/bin
    volumes:
      # This will mount current dir data/informix to docker-entrypoint-initdb.d
      - ./data.scripts/informix:/docker-entrypoint-initdb.d/
    command: [ "cdc_update.sh" ]
  #
  # Postgres DB
  postgres:
    # -----------------------------
    # To connect to the DB:
    # docker-compose exec postgres bash -c 'psql -U $POSTGRES_USER $POSTGRES_DB'
    # -----------------------------
    image: postgres:16.4
    container_name: postgres
    hostname: postgres
    restart: always
    environment:
      # POSTGRES_DB: test_db
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: public
    volumes:
      - ./data.scripts/postgres:/docker-entrypoint-initdb.d/
    healthcheck:
      test: [ "CMD-SHELL", "pg_isready" ]
      interval: 1s
      timeout: 5s
      retries: 10
    ports:
      - "5432:5432"

  # MySQL
  mysql:
    image: quay.io/debezium/example-mysql:${DEBEZIUM_VERSION}
    container_name: mysql
    hostname: mysql
    ports:
      - 3306:3306
    environment:
      - MYSQL_ROOT_PASSWORD=debezium
      - MYSQL_USER=mysqluser
      - MYSQL_PASSWORD=mysqlpw
  zookeeper:
    image: quay.io/debezium/zookeeper:${DEBEZIUM_VERSION}
    container_name: zookeeper
    ports:
      - "2181:2181"
      - "2888:2888"
      - "3888:3888"

  kafka:
    image: quay.io/debezium/kafka:${DEBEZIUM_VERSION}
    hostname: kafka
    container_name: kafka
    ports:
      - "9092:9092"
    depends_on:
      - zookeeper
    environment:
      - ZOOKEEPER_CONNECT=zookeeper:2181
      - KAFKA_CONFLUENT_SCHEMA_REGISTRY_URL=http://schema-registry:8081
      # NOTE: Kafka will drop messages with self signed certificates. There is a work around but it did not work.
      # Original with SSL
      #- KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=BROKER:PLAINTEXT,SSL:SSL,CONTROLLER:PLAINTEXT
      - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=BROKER:PLAINTEXT,CONTROLLER:PLAINTEXT,PLAIN:PLAINTEXT
      # Original with SSL
      #- KAFKA_LISTENERS=BROKER://0.0.0.0:9093,SSL://0.0.0.0:9092,CONTROLLER://0.0.0.0:9094
      - KAFKA_LISTENERS=BROKER://0.0.0.0:9093,PLAIN://0.0.0.0:9092,CONTROLLER://0.0.0.0:9094
      # Original with SSL
      #- KAFKA_ADVERTISED_LISTENERS=SSL://localhost:9092,BROKER://localhost:9093
      - KAFKA_ADVERTISED_LISTENERS=PLAIN://localhost:9092,BROKER://localhost:9093
      #- KAFKA_SSL_CLIENT_AUTH=required
      #- KAFKA_SSL_KEYSTORE_LOCATION=/opt/config/ssl/kafka-ssl-keystore.p12
      #- KAFKA_SSL_KEYSTORE_PASSWORD=top-secret
      #- KAFKA_SSL_KEYSTORE_TYPE=PKCS12
      #- KAFKA_SSL_TRUSTSTORE_LOCATION=/opt/config/ssl/kafka-ssl-truststore.p12
      #- KAFKA_SSL_TRUSTSTORE_PASSWORD=top-secret
      - KAFKA_INTER_BROKER_LISTENER_NAME=BROKER
    volumes:
      - ./resources:/opt/config/ssl:z
  # postgres:
  #   image: quay.io/debezium/example-postgres:${DEBEZIUM_VERSION}
  #   ports:
  #     - "5432:5432"
  #   environment:
  #     - POSTGRES_USER=postgres
  #     - POSTGRES_PASSWORD=postgres

  connect:
    image: quay.io/debezium/connect:${DEBEZIUM_VERSION}
    container_name: connect
    #hostname: connect
    ports:
      - "8083:8083"
    environment:
      # use host localhost if network_mode: host is set
      - CONNECT_BOOTSTRAP_SERVERS=localhost:9092
      - BOOTSTRAP_SERVERS=localhost:9092
      - GROUP_ID=1
      - CONFIG_STORAGE_TOPIC=my_connect_configs
      - OFFSET_STORAGE_TOPIC=my_connect_offsets
      - STATUS_STORAGE_TOPIC=my_connect_statuses
      - CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL=http://schema-registry:8081
      # Will set plugin path in properties file
      - CONNECT_PLUGIN_PATH=/kafka/connect,/kafka/connect/custom-plugins
      - CLASSPATH=.:/kafka/connect:/kafka/connect/custom-plugins
      #- CONNECT_SECURITY_PROTOCOL=SSL
      #- CONNECT_SSL_KEYSTORE_LOCATION=/opt/config/ssl/kafka-ssl-keystore.p12
      #- CONNECT_SSL_KEYSTORE_PASSWORD=top-secret
      #- CONNECT_SSL_KEYSTORE_TYPE=PKCS12
      #- CONNECT_SSL_KEY_PASSWORD=top-secret
      #- CONNECT_SSL_TRUSTSTORE_LOCATION=/opt/config/ssl/kafka-ssl-truststore.p12
      #- CONNECT_SSL_TRUSTSTORE_PASSWORD=top-secret
      #- CONNECT_SSL_TRUSTSTORE_TYPE=PKCS12
      #- CONNECT_PRODUCER_SECURITY_PROTOCOL=SSL
      #- CONNECT_PRODUCER_SSL_KEYSTORE_LOCATION=/opt/config/ssl/kafka-ssl-keystore.p12
      #- CONNECT_PRODUCER_SSL_KEYSTORE_PASSWORD=top-secret
      #- CONNECT_PRODUCER_SSL_KEYSTORE_TYPE=PKCS12
      #- CONNECT_PRODUCER_SSL_KEY_PASSWORD=top-secret
      #- CONNECT_PRODUCER_SSL_TRUSTSTORE_LOCATION=/opt/config/ssl/kafka-ssl-truststore.p12
      #- CONNECT_PRODUCER_SSL_TRUSTSTORE_PASSWORD=top-secret
      #- CONNECT_PRODUCER_SSL_TRUSTSTORE_TYPE=PKCS12
      #- CONNECT_CONSUMER_SECURITY_PROTOCOL=SSL
      #- CONNECT_CONSUMER_SSL_KEYSTORE_LOCATION=/opt/config/ssl/kafka-ssl-keystore.p12
      #- CONNECT_CONSUMER_SSL_KEYSTORE_PASSWORD=top-secret
      #- CONNECT_CONSUMER_SSL_KEYSTORE_TYPE=PKCS12
      #- CONNECT_CONSUMER_SSL_KEY_PASSWORD=top-secret
      #- CONNECT_CONSUMER_SSL_TRUSTSTORE_LOCATION=/opt/config/ssl/kafka-ssl-truststore.p12
      #- CONNECT_CONSUMER_SSL_TRUSTSTORE_PASSWORD=top-secret
      #- CONNECT_CONSUMER_SSL_TRUSTSTORE_TYPE=PKCS12
    #command:
    #  - /bin/bash
    #  - -c
    #  - |
    #    # JDBC Drivers
    #    # ------------
    #    # Informix JDBC Driver
    #    cd /kafka/connect/debezium-connector-informix
    #    curl https://repo1.maven.org/maven2/com/ibm/informix/jdbc/4.50.8/jdbc-4.50.8.jar --compressed --output informix-jdbc-4.50.3.jar
    #    # CLASSPATH
    #    #export CLASSPATH=".:/usr/share/confluent-hub-components/confluentinc-kafka-connect-datagen/lib/:/usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/:/home/appuser/plugins/:$CLASSPATH"
    #    #export plugin.path=/usr/local/share/kafka/plugins
    #    sleep infinity
    network_mode: host
    #network_mode: bridge
    volumes:
      - ./resources:/opt/config/ssl:z
      - ./bin/informix-jdbc-4.50.3.jar:/kafka/connect/debezium-connector-informix/informix-jdbc-4.50.3.jar
      - ./bin/ifx-changestream-client-1.1.3.jar:/kafka/connect/debezium-connector-informix/ifx-changestream-client-1.1.3.jar
      # Mount your connector plugins here if they are not included in the image
      - ./custom-plugins:/kafka/connect/custom-plugins
    depends_on:
      - kafka
      - postgres
  #
  # Second Kafka Connect Confluent
  #
  # Schema Registry
  #
  # Schema Registry:
  # URL: https://hub.docker.com/r/confluentinc/cp-schema-registry
  # initial version used: 7.2.1
  # latest: 7.9.1
  #
  schema-registry:
    image: confluentinc/cp-schema-registry:7.9.1
    hostname: schema-registry
    container_name: schema-registry
    depends_on:
      - kafka
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: 'broker:9092'
      SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
  # UI
  #
  # Kafka Connect UI
  kafka-connect-ui:
    image: landoop/kafka-connect-ui
    container_name: kafka-connect-ui
    environment:
      CONNECT_URL: "http://connect:8083" # Pointing to the Kafka Connect REST API
    ports:
      - "9001:8000"
    depends_on:
      - kafka
      - connect
      - schema-registry
    network_mode: bridge
    extra_hosts:
      - "connect:host-gateway"

  #
  # Kafka UI
  kafka-ui:
    container_name: kafka-ui
    image: provectuslabs/kafka-ui:latest
    #image: ashish1981/kafka-ui:latest
    ports:
      - "9004:8080"
    depends_on:
      - kafka
      - connect
      - schema-registry
    environment:
      #KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: "broker:9092"
      # JMX port
      KAFKA_CLUSTERS_0_METRICS_PORT: "9101"
      KAFKA_CLUSTERS_0_SCHEMAREGISTRY: "http://schema-registry:8081"
      KAFKA_CLUSTERS_0_KAFKACONNECT_0_NAME: connect
      KAFKA_CLUSTERS_0_KAFKACONNECT_0_ADDRESS: "http://connect:8083"
      DYNAMIC_CONFIG_ENABLED: 'true'

  #
  # kafbat-UI
  #
  #kafbat-ui:
  #  container_name: kafbat-ui
  #  image: ghcr.io/kafbat/kafka-ui:latest
  #  ports:
  #    - 9005:8080
  #  environment:
  #    DYNAMIC_CONFIG_ENABLED: 'true'
  #  volumes:
  #    - ./ui/kafka-ui.yml:/etc/kafkaui/dynamic_config.yaml
  #kafbat-ui:
  #  container_name: kafbat-ui
  #  hostname: kafbat-ui
  #  image: ghcr.io/kafbat/kafka-ui:latest
  #  ports:
  #    - 9005:8080
  #  environment:
  #    DYNAMIC_CONFIG_ENABLED: 'true'
  #    KAFKA_CLUSTERS_0_NAME: local
  #    # other properties, omitted
  #    #SPRING_CONFIG_ADDITIONAL-LOCATION: /tmp/config.yml
  #  volumes:
  #    - ./ui/kafka-ui.yml:/tmp/config.yml
  #  depends_on:
  #    - kafka
  #    - schema-registry
--------
Here is my Source connector that works just fine:
{
  "name": "informix-test2-source",
  "config": {
    "connector.class": "io.debezium.connector.informix.InformixConnector",
    "database.hostname": "informix",
    "database.port": "9088",
    "database.user": "informix",
    "database.password": "in4mix",
    "database.dbname": "test",
    "topic.prefix": "test2",
    "table.include.list": "test.informixserver.users",
    "schema.history.internal.kafka.bootstrap.servers": "broker:9092",
    "schema.history.internal.kafka.topic": "schemahistory.test2"
  }
}
-------
Here is my Postgres Sink Connector that does not work:
{
  "name": "informix-test2-sink",
  "config": {
    "connector.class": "io.debezium.connector.jdbc.JdbcSinkConnector",
    "topics": "test2",
    "tasks.max": "1",
    "connection.url": "jdbc:postgres://postgres:5432/postgres?user=postgres&password=postgres",
    "schema.history.internal.jdbc.url": "jdbc:postgres://postgres:5432/postgres?user=postgres&password=postgres",
    "jakarta.persistence.jdbc.url": "jdbc:postgres://postgres:5432/postgres?user=postgres&password=postgres",
    "connection.username": "postgres",
    "connection.password": "postgres",
    "connection.attempts": "10",
    "connection.provider": "org.hibernate.c3p0.internal.C3P0ConnectionProvider",
    "connection.pool.min_size": "5",
    "connection.pool.max_size": "32",
    "connection.pool.acquire_increment": "32",
    "connection.pool.timeout": "1800",
    "connection.restart.on.errors": "false",
    "hibernate.connection.driver_class": "org.postgresql.Driver",
    "hibernate.c3p0.idle_test_period": "300",
    "insert.mode": "upsert",
    "delete.enabled": "true",
    "primary.key.mode": "record_key",
    "schema.evolution": "basic",
    "use.time.zone": "UTC",
    "errors.log.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true",
    "//": "Test comment",
    "dialect.postgres.postgis.schema": "public"
  }
}
-------
I have my plugin.path properly set, as shown by cat /kafka/config/connect-distributed.properties:
plugin.path=/kafka/connect,/kafka/connect/custom-plugins
value.converter.schema.registry.url=http://schema-registry:8081
rest.host.name=4a:e1:32:9f:2d:9a
rest.port=8083
offset.flush.timeout.ms=5000
task.shutdown.graceful.timeout.ms=10000
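To double-check that the worker actually loaded the JDBC sink plugin from plugin.path, Kafka Connect's REST endpoint GET /connector-plugins lists every connector plugin the worker discovered (a plain curl http://localhost:8083/connector-plugins gives the same result); here is a small sketch using java.net.http:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ListConnectorPlugins {
    public static void main(String[] args) throws Exception {
        // The response is a JSON array of the connector classes found on plugin.path;
        // io.debezium.connector.jdbc.JdbcSinkConnector should be listed there.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connector-plugins"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}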
------
I have placed the Informix driver inside the Debezium Informix connector directory, and the source works just fine.
The Postgres driver is inside /kafka/connect/debezium-connector-jdbc/, as it came with the image:
-rw-r--r-- 1 kafka kafka 27439 Jan 11 2023 angus-activation-2.0.0.jar
-rw-r--r-- 1 kafka kafka 322137 Apr 15 2022 antlr4-runtime-4.10.1.jar
-rw-r--r-- 1 kafka kafka 4212623 Dec 20 2023 byte-buddy-1.14.11.jar
-rw-r--r-- 1 kafka kafka 501445 Dec 12 2019 c3p0-0.9.5.5.jar
-rw-r--r-- 1 kafka kafka 67815 Oct 19 2019 classmate-1.5.1.jar
-rw-r--r-- 1 kafka kafka 31627 Jun 25 11:30 debezium-api-3.2.0.CR1.jar
-rw-r--r-- 1 kafka kafka 256183 Jun 25 11:34 debezium-connector-jdbc-3.2.0.CR1.jar
-rw-r--r-- 1 kafka kafka 1354932 Jun 25 11:30 debezium-core-3.2.0.CR1.jar
-rw-r--r-- 1 kafka kafka 32320 Jun 25 11:33 debezium-sink-3.2.0.CR1.jar
-rw-r--r-- 1 kafka kafka 12934 Apr 26 2024 hibernate-c3p0-6.4.8.Final.jar
-rw-r--r-- 1 kafka kafka 67807 Jan 26 2023 hibernate-commons-annotations-6.0.6.Final.jar
-rw-r--r-- 1 kafka kafka 11578744 Apr 26 2024 hibernate-core-6.4.8.Final.jar
-rw-r--r-- 1 kafka kafka 26147 Mar 28 2022 istack-commons-runtime-4.1.1.jar
-rw-r--r-- 1 kafka kafka 63473 Dec 2 2021 jakarta.activation-api-2.1.0.jar
-rw-r--r-- 1 kafka kafka 10681 Oct 16 2021 jakarta.inject-api-2.0.1.jar
-rw-r--r-- 1 kafka kafka 165250 Feb 25 2022 jakarta.persistence-api-3.1.0.jar
-rw-r--r-- 1 kafka kafka 28607 Mar 31 2022 jakarta.transaction-api-2.0.1.jar
-rw-r--r-- 1 kafka kafka 127111 Mar 15 2022 jakarta.xml.bind-api-4.0.0.jar
-rw-r--r-- 1 kafka kafka 327493 Jun 8 2023 jandex-3.1.2.jar
-rw-r--r-- 1 kafka kafka 138596 Jan 27 2023 jaxb-core-4.0.2.jar
-rw-r--r-- 1 kafka kafka 908151 Jan 27 2023 jaxb-runtime-4.0.2.jar
-rw-r--r-- 1 kafka kafka 62562 Apr 28 2022 jboss-logging-3.5.0.Final.jar
-rw-r--r-- 1 kafka kafka 746140 Mar 27 02:41 mariadb-java-client-3.5.3.jar
-rw-r--r-- 1 kafka kafka 630956 Nov 9 2019 mchange-commons-java-0.2.19.jar
-rw-r--r-- 1 kafka kafka 1438398 Oct 27 2023 mssql-jdbc-12.4.2.jre8.jar
-rw-r--r-- 1 kafka kafka 2597591 Oct 14 2024 mysql-connector-j-9.1.0.jar
-rw-r--r-- 1 kafka kafka 5253051 Aug 3 2024 ojdbc11-21.15.0.0.jar
-rw-r--r-- 1 kafka kafka 1098494 May 28 10:12 postgresql-42.7.6.jar
-rw-r--r-- 1 kafka kafka 73267 Jan 27 2023 txw2-4.0.2.jar
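As a standalone check (classpath setup assumed, not shown here), the jars in this directory can be put on a plain JVM classpath to see which JDBC drivers actually register themselves and whether any of them accepts the connection URL used in the sink configuration:

import java.sql.Driver;
import java.sql.DriverManager;
import java.util.Collections;

public class ListJdbcDrivers {
    public static void main(String[] args) throws Exception {
        // JDBC 4 drivers on the classpath register themselves via META-INF/services/java.sql.Driver;
        // this prints each registered driver and whether it claims the sink's connection URL.
        String url = "jdbc:postgres://postgres:5432/postgres?user=postgres&password=postgres";
        for (Driver driver : Collections.list(DriverManager.getDrivers())) {
            System.out.printf("%s %d.%d accepts %s: %b%n",
                    driver.getClass().getName(),
                    driver.getMajorVersion(), driver.getMinorVersion(),
                    url, driver.acceptsURL(url));
        }
    }
}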
----------
CLASSPATH has also been set; here are the environment variables:
MYSQL_MD5=b35c3a2f8f32607eb1819a902ff8dec9
SHA512HASH=00722AB0A6B954E0006994B8D589DCD8F26E1827C47F70B6E820FB45AA35945C19163B0F188CAF0CAF976C11F7AB005FD368C54E5851E899D2DE687A804A5EB9
KAFKA_URL_PATH=kafka/4.0.0/kafka_2.13-4.0.0.tgz
STATUS_STORAGE_TOPIC=my_connect_statuses
MONGODB_MD5=84b931e8a449e39c5dbb7f6ccdf35a35
HOSTNAME=Dell-Precision-7560
JDBC_MD5=1179a2690fd0482a82c2277086279939
OFFSET_STORAGE_TOPIC=my_connect_offsets
CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL=http://schema-registry:8081
SQLSERVER_MD5=c2040c5d1161d86684e7caac89c736da
MAVEN_REPO_CENTRAL=
SCRIPTING_MD5=db6d061f0550a44661a43eee25f58bca
CONFIG_STORAGE_TOPIC=my_connect_configs
IBMI_MD5=6b6385d85814eaa139ac78aeb15c4bd9
INFORMIX_MD5=d3df90c98ef5a9cc4cd0dbac5847b54e
VITESS_MD5=87b6c5187359a20047ae466ef0ab8d53
OPENTELEMETRY_INSTRUMENTATION_VERSION=1.23.0
PWD=/kafka/connect/debezium-connector-jdbc
DEBEZIUM_VERSION=3.2.0.CR1
POSTGRES_MD5=612c304c9c39f8f165a1d07799834b36
container=oci
MAVEN_REPOS_ADDITIONAL=
HOME=/kafka
KAFKA_HOME=/kafka
APICURIO_VERSION=2.6.2.Final
MAVEN_DEP_DESTINATION=/kafka/libs
SPANNER_MD5=7d7bd6a8bf056779c996c97585c94382
TERM=xterm
OPENTELEMETRY_VERSION=1.23.1
KAFKA_VERSION=4.0.0
JOLOKIA_VERSION=1.7.2
SHLVL=1
EXTERNAL_LIBS_DIR=/kafka/external_libs
CLASSPATH=.:/kafka/connect:/kafka/connect/custom-plugins
GROUP_ID=1
KAFKA_CONNECT_PLUGINS_DIR=/kafka/connect
ORACLE_MD5=08821f423f6f2298190bed150d3a0ff9
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
MARIADB_MD5=059f2dd21f2c34a928ef02a79f53d4e4
DB2_MD5=972ab631f3c0d641944b5512a6b579d3
KAFKA_DATA=/kafka/data
BOOTSTRAP_SERVERS=localhost:9092
CONNECT_PLUGIN_PATH=/kafka/connect,/kafka/connect/custom-plugins
SCALA_VERSION=2.13
CONNECT_BOOTSTRAP_SERVERS=localhost:9092
_=/usr/bin/env
OLDPWD=/kafka
-------
I did a lot of research on the web and could not find any answer different from what I have done here in terms of the JDBC URL, setting plugin.path, etc. I have tried many different things.