Debezium / DBZ-8710

PDB database name is treated as UPPERCASE by default


    • Type: Bug
    • Resolution: Done
    • Priority: Major
    • Fix Version: 3.1.0.Beta1
    • Affects Version: 3.1.0.Alpha1
    • Component: oracle-connector
    • Labels: None

      By default, the source connector treats the PDB database name as uppercase, even when the connector is configured to force it to lowercase.

      Bug report

      For bug reports, provide this information, please:

      https://debezium.zulipchat.com/#narrow/channel/348250-community-oracle/topic/connector.20deafult.20considering.20pdb.20as.20uppercase
       

      What Debezium connector do you use and what version?

      2.6

      What is the connector configuration?

      sudo docker exec -it oraclefin-source-connector curl -i -X POST \
        -H "Accept:application/json" -H "Content-Type:application/json" \
        http://localhost:8083/connectors/ -d '{
        "name": "oraclefin-source-connector-cdb1-worker",
        "config": {
          "connector.class": "io.debezium.connector.oracle.OracleConnector",
          "tasks.max": "1",
          "database.url": "jdbc:oracle:thin:@//xxx:1596/UATCDB",
          "database.dbname": "UATCDB",
          "database.pdb.name": "ebsuat",
          "database.user": "xxx",
          "database.password": "xxx",
          "database.pdb.case.insensitive": "true",
          "database.connection.adapter": "logminer",
          "database.jdbc.property.oracle.jdbc.remarksReporting": "true",
          "database.jdbc.property.oracle.jdbc.V8Compatible": "true",
          "database.jdbc.property.oracle.jdbc.convertNcharLiterals": "false",
          "database.jdbc.property.oracle.jdbc.initializationQuery": "ALTER SESSION SET CONTAINER=ebsuat",
          "topic.prefix": "oraclefin",
          "decimal.handling.mode": "double",
          "heartbeat.interval.ms": "3000",
          "linger.ms": "5",
          "log.mining.archive.log.hours": "24",
          "log.mining.batch.size.default": "10000",
          "log.mining.batch.size.max": "500000",
          "log.mining.batch.size.min": "5000",
          "log.mining.scn.gap.detection.gap.size.min": "100000",
          "log.mining.scn.gap.detection.time.interval.max.ms": "10000",
          "log.mining.sleep.time.max.ms": "500",
          "log.mining.strategy": "online_catalog",
          "log.mining.query.filter.mode": "in",
          "log.mining.transaction.retention.hours": "24",
          "max.batch.size": "2048",
          "max.in.flight.requests.per.connection": "10",
          "max.queue.size": "16384",
          "max.request.size": "1048576",
          "poll.interval.ms": "2000",
          "producer.override.batch.size": "131072",
          "producer.override.linger.ms": "5",
          "producer.override.offset.flush.timeout.ms": "60000",
          "schema.history.internal.consumer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"xx\" password=\"xxxxx+/xxxx\";",
          "schema.history.internal.consumer.sasl.mechanism": "PLAIN",
          "schema.history.internal.consumer.security.protocol": "SASL_SSL",
          "schema.history.internal.kafka.bootstrap.servers": "xxx.eu-west-1.aws.confluent.cloud:9092",
          "schema.history.internal.kafka.topic": "oraclefin.cdc.connector",
          "schema.history.internal.producer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"xx\" password=\"xxxx+/xxx\";",
          "schema.history.internal.producer.sasl.mechanism": "PLAIN",
          "schema.history.internal.producer.security.protocol": "SASL_SSL",
          "schema.history.internal.skip.unparseable.ddl": "true",
          "schema.history.internal.store.only.captured.tables.ddl": "true",
          "schema.history.internal.store.only.captured.databases.ddl": "true",
          "snapshot.fetch.size": "5000",
          "snapshot.locking.mode": "none",
          "snapshot.mode": "initial",
          "snapshot.select.statement.overrides": "GL.GL_BALANCES",
          "snapshot.select.statement.overrides.GL.GL_BALANCES": "SELECT * FROM GL.GL_BALANCES WHERE rownum <1",
          "table.include.list": "ebsuat.GL.GL_BALANCES,ebsuat.GL.GL_CODE_COMBINATIONS,ebsuat.GL.GL_LEDGERS",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.basic.auth.credentials.source": "USER_INFO",
          "value.converter.basic.auth.user.info": "xx:xxx",
          "value.converter.schema.registry.url": "https://xxxx.aws.confluent.cloud",
          "key.converter": "io.confluent.connect.avro.AvroConverter",
          "key.converter.basic.auth.credentials.source": "USER_INFO",
          "key.converter.basic.auth.user.info": "xxx:xxx",
          "key.converter.schema.registry.url": "https://xxxx.aws.confluent.cloud"
        }
      }'

      What is the captured database version and mode of deployment?

      (E.g. on-premises, with a specific cloud provider, etc.)

      on-prem

      What behavior do you expect?

      The PDB name should be forced to lowercase, as configured.

      What behavior do you see?

      The connector always converts the PDB name to uppercase.

      Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      <Your answer>

      Do you have the connector logs, ideally from start till finish?

      (You might be asked later to provide DEBUG/TRACE level log)

      Caused by: Error : 65011, Position : 0, Sql = alter session set container="EBSUAT", OriginalSql = alter session set container="EBSUAT", Error Msg = ORA-65011: Pluggable database EBSUAT does not exist
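
      The ORA-65011 above follows from Oracle's identifier rules: quoted identifiers are case-sensitive, so a statement targeting "EBSUAT" cannot match a PDB created with the lowercase quoted name "ebsuat". A minimal sketch of the mismatch (the connector's internal upper-casing is inferred from the generated SQL in the log, not quoted from the Debezium source):

      ```shell
      # Sketch: the connector appears to upper-case the configured PDB name
      # before quoting it. Quoted Oracle identifiers are case-sensitive, so
      # "EBSUAT" and "ebsuat" name different PDBs.
      pdb_name="ebsuat"
      upper=$(printf '%s' "$pdb_name" | tr '[:lower:]' '[:upper:]')

      # What the log shows being executed (fails with ORA-65011 on a lowercase PDB):
      echo "alter session set container=\"$upper\""
      # prints: alter session set container="EBSUAT"

      # What a PDB created with a lowercase quoted name actually requires:
      echo "alter session set container=\"$pdb_name\""
      # prints: alter session set container="ebsuat"
      ```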

      How to reproduce the issue using our tutorial deployment?

      Create a Docker container for a source connector that connects to a PDB whose name is lowercase.
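
      A minimal repro sketch along those lines, stripped down to the PDB-related settings (hostname, port, service name, credentials, and topic names below are placeholders, not values from the report):

      ```shell
      # Register a connector against a PDB whose name is lowercase ("ebsuat").
      # All hosts and credentials are placeholders for a tutorial-style deployment.
      curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" \
        http://localhost:8083/connectors/ -d '{
        "name": "pdb-case-test",
        "config": {
          "connector.class": "io.debezium.connector.oracle.OracleConnector",
          "tasks.max": "1",
          "database.hostname": "oracle-host",
          "database.port": "1521",
          "database.dbname": "UATCDB",
          "database.pdb.name": "ebsuat",
          "database.user": "dbzuser",
          "database.password": "dbz",
          "topic.prefix": "pdbtest",
          "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
          "schema.history.internal.kafka.topic": "pdbtest.history"
        }
      }'
      # Observed failure in the connector log:
      #   ORA-65011: Pluggable database EBSUAT does not exist
      ```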

      Feature request or enhancement

      For feature requests or enhancements, provide this information, please:

      Which use case/requirement will be addressed by the proposed feature?

      <Your answer>

      Implementation ideas (optional)

      <Your answer>

              Assignee: Chris Cranford (ccranfor@redhat.com)
              Reporter: Ramesh raj (rameshrajch)
              Votes: 0
              Watchers: 3