Debezium / DBZ-4255

Reduce verbosity of logging Oracle memory metrics


Details

    Description

      Hi Team,

      I recently upgraded my Debezium Oracle setup to 1.7.0.Final. Since the upgrade I have been getting a large number of extra messages in the Connect log.

      I'm running 4 connectors and getting roughly 15,000 lines and 5 MB of extra logs each hour; this volume makes it very difficult to spot legitimate messages.

      I've been through the 1.7 documentation and can't find any parameter that controls this extra logging.

      Each connector emits 4-5 lines of messages every few seconds (presumably on each log mining iteration).

      Messages appear as:

      [2021-11-07 21:45:27,356] INFO Requested thread factory for connector OracleConnector, id = source named = db-history-config-check (io.debezium.util.Threads:270)
      [2021-11-07 21:45:27,359] INFO KafkaDatabaseHistory Consumer config: {key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, enable.auto.commit=false, group.id=source-dbhistory, bootstrap.servers=AWS_MSK:9092, fetch.min.bytes=1, session.timeout.ms=10000, auto.offset.reset=earliest, client.id=source-dbhistory} (io.debezium.relational.history.KafkaDatabaseHistory:219)
      [2021-11-07 21:45:27,359] INFO KafkaDatabaseHistory Producer config: {retries=1, value.serializer=org.apache.kafka.common.serialization.StringSerializer, acks=1, batch.size=32768, max.block.ms=10000, bootstrap.servers=AWS_MSK:9092, buffer.memory=1048576, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=source-dbhistory, linger.ms=0} (io.debezium.relational.history.KafkaDatabaseHistory:220)
      [2021-11-07 21:45:27,359] INFO Requested thread factory for connector OracleConnector, id = source named = db-history-config-check (io.debezium.util.Threads:270)
      [2021-11-07 21:45:27,577] INFO Oracle Session UGA 1MB (max = 2.01MB), PGA 74.53MB (max = 107.53MB) (io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource:194)
      [2021-11-07 21:45:27,579] INFO Oracle Session UGA 3.01MB (max = 4.13MB), PGA 8.73MB (max = 52.3MB) (io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource:194)
      [2021-11-07 21:45:27,806] INFO WorkerSourceTask{id=ora-sonata-source-connector2-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:510)
      [2021-11-07 21:45:27,809] INFO WorkerSourceTask{id=ora-sonata-source-connector1-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:510)
      [2021-11-07 21:45:27,811] INFO KafkaDatabaseHistory Consumer config: {key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, enable.auto.commit=false, group.id=cache-dbhistory, bootstrap.servers=AWS_MSK:9092, fetch.min.bytes=1, session.timeout.ms=10000, auto.offset.reset=earliest, client.id=cache-dbhistory} (io.debezium.relational.history.KafkaDatabaseHistory:219)
      [2021-11-07 21:45:27,811] INFO KafkaDatabaseHistory Producer config: {retries=1, value.serializer=org.apache.kafka.common.serialization.StringSerializer, acks=1, batch.size=32768, max.block.ms=10000, bootstrap.servers=AWS_MSK:9092, buffer.memory=1048576, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=cache-dbhistory, linger.ms=0} (io.debezium.relational.history.KafkaDatabaseHistory:220)
      [2021-11-07 21:45:27,811] INFO Requested thread factory for connector OracleConnector, id = cache named = db-history-config-check (io.debezium.util.Threads:270)
      [2021-11-07 21:45:27,889] INFO Oracle Session UGA 1MB (max = 3.01MB), PGA 65.17MB (max = 117.05MB) (io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource:194)
      [2021-11-07 21:45:28,893] INFO KafkaDatabaseHistory Consumer config: {key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, enable.auto.commit=false, group.id=source-dbhistory, bootstrap.servers=AWS_MSK:9092, fetch.min.bytes=1, session.timeout.ms=10000, auto.offset.reset=earliest, client.id=source-dbhistory} (io.debezium.relational.history.KafkaDatabaseHistory:219)
      [2021-11-07 21:45:28,893] INFO KafkaDatabaseHistory Producer config: {retries=1, value.serializer=org.apache.kafka.common.serialization.StringSerializer, acks=1, batch.size=32768, max.block.ms=10000, bootstrap.servers=AWS_MSK:9092, buffer.memory=1048576, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=source-dbhistory, linger.ms=0} (io.debezium.relational.history.KafkaDatabaseHistory:220)

      I reverted to Debezium 1.6 and the messages no longer appeared.
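      In case it helps, a possible interim workaround (sketched here, not taken from the Debezium documentation, and assuming the stock Kafka Connect Log4j setup in config/connect-log4j.properties) is to raise the level of the logger that emits the memory metrics, using the logger name shown in the messages above:

      # Hypothetical addition to config/connect-log4j.properties: suppresses INFO output
      # from the LogMiner streaming source that logs the "Oracle Session UGA ... PGA ..." metrics.
      log4j.logger.io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource=WARN

      Note that this would also hide any other INFO-level messages from that class; ideally the connector itself would log these memory metrics at DEBUG instead.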



    People

      Assignee: Chris Cranford (ccranfor@redhat.com)
      Reporter: David Ridge (dridge@bravurasolutions.com) (Inactive)
