When the consumer is closed, librdkafka internally closes the consumer first and then destroys the client. The logs from the consumer close step are emitted properly, but the logs from the destroy step are not, even though they are added to the queue during destroy as well. We will need to debug this further.
Description
The Producer and Consumer can use Python logging via the logger configuration option. This works well as long as the program is running. Unfortunately, the last log messages get lost when the program exits. Those messages can only be seen if the logger feature is not used.

How to reproduce
logger.py:
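The reproduction script itself was lost in extraction, so here is a minimal sketch of what such a logger.py could look like. The broker address, topic name, and logger name are all assumptions; adjust them to your environment. The Kafka part is guarded so the sketch degrades gracefully when confluent-kafka or a broker is unavailable.

```python
"""Hypothetical logger.py reproduction sketch."""
import logging
import sys


def make_logger() -> logging.Logger:
    # Standard Python logger handed to confluent_kafka via the
    # `logger` keyword argument.
    logger = logging.getLogger("producer.repro")  # name is an assumption
    logger.setLevel(logging.DEBUG)
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger


def run_repro(logger: logging.Logger) -> None:
    # Requires confluent-kafka and a reachable broker.
    try:
        from confluent_kafka import Producer
    except ImportError:
        print("confluent-kafka not installed; skipping repro")
        return
    producer = Producer(
        {"bootstrap.servers": "localhost:9092",  # assumption
         "debug": "broker"},
        logger=logger)
    producer.produce("test-topic", b"hello")     # topic name is an assumption
    producer.flush(5.0)
    # The program exits here.  With `logger` configured, the final
    # librdkafka termination messages never reach the handler; without
    # it, they appear on stderr.


if __name__ == "__main__":
    run_repro(make_logger())
```

Comparing the output of this script with and without the `logger` argument should show the missing trailing log lines described above.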
output with logger:
output without logger:
Expected behaviour
Both outputs should be similar.
Consumer.close() should really terminate the consumer. The Producer should provide a close() method to terminate it cleanly.

Checklist
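The requested clean-shutdown behaviour could be approximated today with a small wrapper. closing_producer below is a hypothetical helper, not part of confluent-kafka-python; Producer.flush() is the closest existing call, and note that it does not recover the librdkafka log messages lost at interpreter exit.

```python
import contextlib


@contextlib.contextmanager
def closing_producer(producer, timeout=10.0):
    # Hypothetical helper: yield the producer, then flush outstanding
    # messages on the way out -- the nearest approximation to the
    # Producer.close() requested in this issue.
    try:
        yield producer
    finally:
        producer.flush(timeout)
```

Usage would look like `with closing_producer(Producer(conf)) as p: p.produce(...)`, guaranteeing the flush even when the body raises.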
Please provide the following information:
- confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()):
- Client configuration: {...}
- Provide client logs (with 'debug': '..' as necessary)