After upgrading librdkafka from an earlier version to 2.5.3, I've encountered an issue where the KRB5CCNAME environment variable and the sasl.kerberos.kinit.cmd configuration are ignored by the broker threads during Kerberos authentication.
The initial kinit works correctly, and the logs show the Kerberos ticket being refreshed successfully. However, once the broker threads start, they do not pick up the credentials, and the Kafka consumer fails with the following error:
GSSAPI Error: No credentials were supplied, or the credentials were unavailable or inaccessible
(No Kerberos credentials available (default cache: FILE:/tmp/krb5cc_1000))
Reproduction Steps:
1. Set up a Kafka consumer with Kerberos authentication using the following configuration:
   sasl.kerberos.keytab
   sasl.kerberos.principal
   sasl.kerberos.kinit.cmd = KRB5CCNAME={path} kinit -c {path} -kt {keytab} {principal}
   sasl.kerberos.min.time.before.relogin = 120000
2. Set the KRB5CCNAME environment variable to a specific ticket cache file location.
3. Start the consumer and observe in the logs that the ticket is refreshed successfully.
4. After the broker threads start, Kerberos authentication fails with the "No Kerberos credentials available" error, attempting to access /tmp/krb5cc_1000 instead of the specified cache file.
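For reference, a minimal consumer configuration along the lines of the steps above might look like this. All paths, the principal, and the broker addresses below are placeholders for illustration, not the actual values from my environment:

```python
import os

# Hypothetical values, used only for illustration.
CACHE = "/var/run/myapp/krb5cc_app"          # dedicated ticket cache
KEYTAB = "/etc/security/keytabs/app.keytab"  # service keytab
PRINCIPAL = "app@EXAMPLE.COM"

# Point the process at the dedicated cache instead of the default one.
os.environ["KRB5CCNAME"] = CACHE

conf = {
    "bootstrap.servers": "broker1:9092,broker2:9092",
    "security.protocol": "sasl_ssl",
    "sasl.mechanisms": "GSSAPI",
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.principal": PRINCIPAL,
    "sasl.kerberos.keytab": KEYTAB,
    # Make kinit write to the same cache KRB5CCNAME points at.
    "sasl.kerberos.kinit.cmd": (
        f"KRB5CCNAME={CACHE} kinit -c {CACHE} -kt {KEYTAB} {PRINCIPAL}"
    ),
    "sasl.kerberos.min.time.before.relogin": "120000",
}

# from confluent_kafka import Consumer
# consumer = Consumer(conf)  # constructing the consumer triggers the initial kinit
```

The expectation is that both the main thread and the broker threads resolve the cache from KRB5CCNAME; the bug reported here is that the broker threads fall back to the default /tmp/krb5cc_{uid} cache instead.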
Logs:
DEBUG:deserializing_consumer:P_MainProcess:poll(98):SASLMECHS [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: Broker supported SASL mechanisms: GSSAPI
20240910.151059:DEBUG:deserializing_consumer:P_MainProcess:poll(98):AUTH [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: Auth in state AUTH_HANDSHAKE (handshake supported)
20240910.151059:DEBUG:deserializing_consumer:P_MainProcess:poll(98):STATE [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: Broker changed state AUTH_HANDSHAKE -> AUTH_REQ
20240910.151059:DEBUG:deserializing_consumer:P_MainProcess:poll(98):SASL [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: Initializing SASL client: service name kafka, hostname vmduse1kafka02.dbb-ops-dev.az.ses, mechanisms GSSAPI, provider Cyrus
20240910.151059:DEBUG:deserializing_consumer:P_MainProcess:poll(98):SASL [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: My supported SASL mechanisms: GSS-SPNEGO GSSAPI GS2-KRB5 GS2-IAKERB SCRAM-SHA-256 SCRAM-SHA-1 DIGEST-MD5 EXTERNAL CRAM-MD5 NTLM PLAIN LOGIN ANONYMOUS
20240910.151059:DEBUG:deserializing_consumer:P_MainProcess:poll(98):LIBSASL [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: GSSAPI client step 1
20240910.151059:CRITICAL:deserializing_consumer:P_MainProcess:poll(98):LIBSASL [rdkafka#consumer-1] [thrd:sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap]: sasl_ssl://vmduse1kafka02.dbb-ops-dev.az.ses:9092/bootstrap: GSSAPI Error: No credentials were supplied, or the credentials were unavailable or inaccessible (No Kerberos credentials available (default cache: FILE:/tmp/krb5cc_1000))
Expected Behavior: The Kafka consumer should use the correct ticket cache file specified in KRB5CCNAME for Kerberos authentication across all threads.
Workaround: I have temporarily resolved this issue by using a global kinit for all processes, but this leads to slower Kerberos refresh operations and is not ideal.
Additional Notes: This issue did not occur in the previous librdkafka version I was using (2.3, if I recall correctly), so the regression appears to have been introduced between these two versions.
Thank you for looking into this issue!
Checklist
librdkafka version (release number or git tag): 33882880
Client configuration:
{
    'bootstrap.servers': 'vmduse1kafka01.XXX:9092,vmduse1kafka02.XXX:9092',
    'sasl.mechanisms': 'GSSAPI',
    'security.protocol': 'sasl_ssl',
    'sasl.kerberos.service.name': 'kafka',
    'sasl.kerberos.principal': '[email protected]',
    'sasl.kerberos.keytab': 'platformui/resources/gilatplatformtestdev.key',
    'sasl.kerberos.min.time.before.relogin': '120000',
    'ssl.ca.location': 'platformui/resources/certs/ca-bundle.crt',
    'sasl.kerberos.kinit.cmd': 'kinit -c /usr/local/nms/run/cache/krb5cc_1000_12 -kt platformui/resources/gilatplatformtestdev.key [email protected]',
    'key.deserializer': None,
    'value.deserializer': <confluent_kafka.schema_registry.avro.AvroDeserializer object at 0x71caa959a9e0>,
    'group.id': 'gilatplatformtestdev-local-prod-test_kafka',
    'auto.offset.reset': 'earliest',
    'debug': 'security,broker,protocol',
    'logger': <Logger test_kafka_consumer (DEBUG)>
}
Note on the sasl.kerberos.kinit.cmd line above: I also tried it without the KRB5CCNAME environment variable, with the environment variable set but without kinit.cmd, and with a relative cache path.
Provide logs (with debug=.. as necessary) from librdkafka