IAM configuration not working
I use Kowl successfully with MSK on Kubernetes (Helm) using SCRAM-SHA-512 with the following configuration:

kowl:
  config:
    kafka:
      brokers:
        - ${broker1}
        - ${broker2}
        - ${broker3}
      tls:
        enabled: true
        insecureSkipTlsVerify: true
      sasl:
        enabled: true
        username: ${kafka_username}
        mechanism: SCRAM-SHA-512
When I try to use awsMskIam, the container is not created on the Kubernetes cluster and fails with CreateContainerConfigError. The following is the configuration I use; the accessKey and secretKey are valid keys with appropriate authorization (they work for other clients):
kowl:
  config:
    kafka:
      brokers:
        - ${broker1}
        - ${broker2}
        - ${broker3}
      tls:
        enabled: true
        insecureSkipTlsVerify: true
      sasl:
        enabled: true
        mechanism: AWS_MSK_IAM
        awsMskIam:
          accessKey: XXXXXXXX
          secretKey: XXXXXXXXXXXXX
Any hint? Thanks, Amit
What are the log messages / errors you see? Please also enable debug logging and post the output here.
state:
  waiting:
    message: couldn't find key kafka-sasl-password in Secret kafka/services-secrets
    reason: CreateContainerConfigError
@amitca71 I assume you tried to deploy Console/Kowl with Helm? Could you please provide more information about what commands you executed? I recently added a new Helm chart, but I wasn't expecting anyone to use it yet because it's fairly hidden. Thus I'm wondering which chart you are trying to use.
@weeco I do use Helm charts... the new chart was not working, so I set it to the older version:

repository = "https://raw.githubusercontent.com/cloudhut/charts/master/archives"
chart = "kowl"
version = "2.3.0"
By the way, when I put a fake value for kafka-sasl-password in the secret, the container is created and I get the regular connection-failure error:

{"level":"warn","ts":"2022-07-21T19:42:45.017Z","msg":"Failed to test Kafka connection, going to retry in 1s","remaining_retries":5}
{"level":"info","ts":"2022-07-21T19:42:46.018Z","msg":"connecting to Kafka seed brokers, trying to fetch cluster metadata"}
@amitca71 I don't plan to make any changes to the existing chart anymore, but I'll make sure that the new chart will be compatible with it. Thanks for filing the issue!
Thanks a lot!! Which version is it? Can you share the location?
@amitca71 Here: https://github.com/redpanda-data/console/tree/master/helm . Please let me know if you still have issues, then I'm happy to fix this :)
@weeco Hi, it took me some time, but I got back to it. It looks like authentication passes now, but there is an issue with authorization. It's probably not related to Helm, as I get the same issue with docker-compose. I can see that if I use a wrong key I get an authentication error, while when I use the correct key I get:
{"level":"info","ts":"2022-10-18T07:19:49.222Z","msg":"connecting to Kafka seed brokers, trying to fetch cluster metadata"} console_1 | {"level":"warn","ts":"2022-10-18T07:20:04.224Z","msg":"Failed to test Kafka connection, going to retry in 8s","remaining_retries":2}
I use the following example (with the resources changed to mine):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:Connect",
        "kafka-cluster:AlterCluster",
        "kafka-cluster:DescribeCluster"
      ],
      "Resource": [
        "arn:aws:kafka:us-east-1:0123456789012:cluster/MyTestCluster/abcd1234-0123-abcd-5678-1234abcd-1"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:*Topic*",
        "kafka-cluster:WriteData",
        "kafka-cluster:ReadData"
      ],
      "Resource": [
        "arn:aws:kafka:us-east-1:0123456789012:topic/MyTestCluster/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:AlterGroup",
        "kafka-cluster:DescribeGroup"
      ],
      "Resource": [
        "arn:aws:kafka:us-east-1:0123456789012:group/MyTestCluster/*"
      ]
    }
  ]
}
I can see that, according to the AWS documentation, there are two more parameters expected that don't appear anywhere in the Kowl / Redpanda Console code:

sasl.client.callback.handler.class: software.amazon.msk.auth.iam.IAMClientCallbackHandler
sasl.jaas.config: software.amazon.msk.auth.iam.IAMLoginModule required awsProfileName="default";

I tried the same using both quay.io/cloudhut/kowl:master and docker.redpanda.com/vectorized/console:latest, with the same results. Any idea? Thanks, Amit
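For what it's worth, those two properties are Java-client settings. Console is built on the Go franz-go client, where AWS_MSK_IAM is a SASL mechanism that receives the credentials directly, so there is no JAAS config or callback-handler class to set. A minimal sketch only, assuming franz-go's import paths and a placeholder broker address, of roughly what the awsMskIam block corresponds to on the Go side:

package main

import (
	"context"
	"crypto/tls"
	"fmt"

	"github.com/twmb/franz-go/pkg/kgo"
	"github.com/twmb/franz-go/pkg/sasl/aws"
)

func main() {
	ctx := context.Background()

	// AWS_MSK_IAM is just a SASL mechanism here; credentials are handed to it
	// directly instead of via Java system properties.
	mechanism := aws.ManagedStreamingIAM(func(ctx context.Context) (aws.Auth, error) {
		return aws.Auth{
			AccessKey: "XXXXXXXX",      // corresponds to sasl.awsMskIam.accessKey
			SecretKey: "XXXXXXXXXXXXX", // corresponds to sasl.awsMskIam.secretKey
		}, nil
	})

	client, err := kgo.NewClient(
		kgo.SeedBrokers("b-1.example.c9.kafka.us-east-1.amazonaws.com:9098"), // placeholder; IAM listener is port 9098
		kgo.DialTLSConfig(&tls.Config{}), // IAM auth requires TLS
		kgo.SASL(mechanism),
	)
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Ping fetches cluster metadata, which is roughly what the startup connection test does.
	fmt.Println("ping:", client.Ping(ctx))
}

So if authentication succeeds but metadata requests hang or fail, the missing piece is more likely on the IAM-policy side than in any client-side property.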
Any updates on this? I'm trying to configure MSK authentication using Helm as here, but no luck. How can I specify the IAM role?
console:
config:
kafka:
brokers:
- b-1.services-stage-clus.ccc.c9.kafka.eu-central-1.amazonaws.com:9096
- b-2.services-stage-clus.ccc.c9.kafka.eu-central-1.amazonaws.com:9096
clientId: redpanda-console
sasl:
enabled: false
mechanism: AWS_MSK_IAM
Guys, please... any ideas on this? The documentation in the Redpanda Helm chart is not very clear. I'm using IRSA:
console:
config:
logger:
level: debug
kafka:
brokers:
- b-1.services-stage-clus.ccc.c9.kafka.eu-central-1.amazonaws.com:9098
- b-2.services-stage-clus.ccc.c9.kafka.eu-central-1.amazonaws.com:9098
clientId: redpanda-console
sasl:
enabled: true
mechanism: AWS_MSK_IAM
tls:
enabled: true
I'm getting an error like this:
{"level":"debug","ts":"2022-12-05T12:52:20.557Z","msg":"opening connection to broker","source":"kafka_client","addr":"b-1.services
-stage-clus.xxxxx.c9.kafka.eu-central-1.amazonaws.com:9098","broker":"seed 0"}
{"level":"debug","ts":"2022-12-05T12:52:20.565Z","msg":"kafka connection succeeded","source":"kafka_client_hooks","host":"b-1.serv
ices-stage-clus.xxxxxx.c9.kafka.eu-central-1.amazonaws.com","dial_duration":0.007470003}
{"level":"debug","ts":"2022-12-05T12:52:20.565Z","msg":"connection opened to broker","source":"kafka_client","addr":"b-1.services-
stage-clus.xxxxxx.c9.kafka.eu-central-1.amazonaws.com:9098","broker":"seed 0"}
{"level":"debug","ts":"2022-12-05T12:52:20.565Z","msg":"connection initialized successfully","source":"kafka_client","addr":"b-1.s
ervices-stage-clus.xxxxxx.c9.kafka.eu-central-1.amazonaws.com:9098","broker":"seed 0"}
{"level":"debug","ts":"2022-12-05T12:52:20.565Z","msg":"wrote Metadata v9","source":"kafka_client","broker":"seed 0","bytes_writte
n":36,"write_wait":0.007605138,"time_to_write":0.000020163,"err":null}
{"level":"debug","ts":"2022-12-05T12:52:20.866Z","msg":"read Metadata v9","source":"kafka_client","broker":"seed 0","bytes_read":0
,"read_wait":0.000046957,"time_to_read":0.301091737,"err":"EOF"}
{"level":"warn","ts":"2022-12-05T12:52:20.866Z","msg":"read from broker errored, killing connection after 0 successful responses (
is sasl missing?)","source":"kafka_client","addr":"b-1.services-stage-clus.xxxxx.c9.kafka.eu-central-1.amazonaws.com:9098","broke
r":"seed 0","err":"EOF"}
{"level":"debug","ts":"2022-12-05T12:52:20.866Z","msg":"kafka broker disconnected","source":"kafka_client_hooks","host":"b-1.servi
ces-stage-clus.xxxxx.c9.kafka.eu-central-1.amazonaws.com"}
{"level":"debug","ts":"2022-12-05T12:52:20.866Z","msg":"retrying request","source":"kafka_client","tries":5,"backoff":2.5,"request
_error":"EOF","response_error":"EOF"}
Has this option ever worked with an IAM role? We are using a service account that has an IAM role for ArgoCD applications; deploying it with Helm like this
sasl:
# enabled: false
# username:
# password: # This can be set via the --kafka.sasl.password flag as well
# mechanism: AWS_MSK_IAM
keeps failing to connect to MSK with the following error: Internal error: SASL_AUTHENTICATION_FAILED:
When using SCRAM-SHA-512 with a Secrets Manager secret it works; using awsMskIam does not seem to work either.
Hi, I tried now with docker-compose as below and I'm getting this error:

console_1 | {"level":"debug","ts":"2023-01-31T13:27:55.792Z","msg":"connection initialized successfully","source":"kafka_client","addr":"xxx.amazonaws.com:9098","broker":"seed 1"}
console_1 | {"level":"debug","ts":"2023-01-31T13:27:55.793Z","msg":"wrote Metadata v9","source":"kafka_client","broker":"seed 1","bytes_written":36,"write_wait":1.2409851,"time_to_write":0.0002379,"err":null}
console_1 | {"level":"debug","ts":"2023-01-31T13:27:58.242Z","msg":"read Metadata v9","source":"kafka_client","broker":"seed 1","bytes_read":0,"read_wait":0.0008087,"time_to_read":2.4485623,"err":"context deadline exceeded"}
console_1 | {"level":"debug","ts":"2023-01-31T13:27:58.242Z","msg":"read from broker errored, killing connection","source":"kafka_client","addr":"xxx.us-east-2.amazonaws.com:9098","broker":"seed 1","successful_reads":0,"err":"context deadline exceeded"}
console_1 | {"level":"debug","ts":"2023-01-31T13:27:58.242Z","msg":"kafka broker disconnected","source":"kafka_client_hooks","host":"xxx.amazonaws.com"}
console_1 | {"level":"debug","ts":"2023-01-31T13:27:58.243Z","msg":"retrying request","source":"kafka_client","tries":2,"backoff":0.524828258,"request_error":"context deadline exceeded","response_error":"context deadline exceeded"}
kafka:
brokers: ["xxxxx.amazonaws.com:9098", "xxx.amazonaws.com:9098", "xxxx.amazonaws.com:9098"]
tls:
enabled: true
insecureSkipTlsVerify: true
sasl:
enabled: true
mechanism: AWS_MSK_IAM
awsMskIam:
accessKey: xxxx
secretKey: xxx
sessionToken: xxxx
Hey, I'm struggling to understand whether AWS_MSK_IAM can be used at all without explicitly setting the accessKey and secretKey parameters, instead using the IAM credentials that the AWS compute instance already has (ECS in my case). I would like to avoid having to create an AWS IAM user just for this. All of our clients authenticate to MSK using the AWS IAM SASL auth method, so we don't have to maintain key or user rotation of any kind.
Setting the sasl mechanism to AWS_MSK_IAM without setting the rest of the parameters doesn't seem to work, ending up with the following error:
{
"level": "warn",
"ts": "2023-06-30T15:24:08.505Z",
"msg": "read from broker errored, killing connection after 0 successful responses (is SASL missing?)",
"source": "kafka_client",
"addr": "<broker-addr>:9098",
"broker": "seed 0",
"err": "EOF"
}
Thanks in advance for any response.
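I can't say which Console versions (if any) fall back to the task-role credentials on their own, but purely as an illustration of what is being asked for here, this is how the two pieces fit together in a plain Go client: the MSK IAM SASL mechanism from franz-go can be fed from the AWS default credential chain (ECS task role, IRSA, environment variables) instead of static keys. The broker address is a placeholder:

package main

import (
	"context"
	"crypto/tls"
	"log"

	awscfg "github.com/aws/aws-sdk-go-v2/config"
	"github.com/twmb/franz-go/pkg/kgo"
	"github.com/twmb/franz-go/pkg/sasl/aws"
)

func main() {
	ctx := context.Background()

	// Resolve credentials from the default chain (ECS task role, IRSA, env vars, ...).
	cfg, err := awscfg.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}

	// Feed whatever the chain returns into the MSK IAM SASL mechanism. The
	// callback is invoked whenever a connection authenticates, so rotated
	// task-role credentials can be picked up without restarting.
	mechanism := aws.ManagedStreamingIAM(func(ctx context.Context) (aws.Auth, error) {
		creds, err := cfg.Credentials.Retrieve(ctx)
		if err != nil {
			return aws.Auth{}, err
		}
		return aws.Auth{
			AccessKey:    creds.AccessKeyID,
			SecretKey:    creds.SecretAccessKey,
			SessionToken: creds.SessionToken,
		}, nil
	})

	client, err := kgo.NewClient(
		kgo.SeedBrokers("<broker-addr>:9098"), // placeholder, IAM listener port
		kgo.DialTLSConfig(&tls.Config{}),      // IAM auth requires TLS
		kgo.SASL(mechanism),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	if err := client.Ping(ctx); err != nil {
		log.Fatal(err)
	}
	log.Println("authenticated via task-role credentials")
}

Whether Console exposes this behaviour through configuration is a separate question; the sketch only shows that no long-lived access key is required at the client-library level.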
Hey, I'm struggling to understand whether AWS_MSK_IAM can be used at all without explicitly setting accessKey or secretKey, instead using the IAM credentials that the AWS compute instance already has (ECS in my case). [...]
Hey did you find any solution?
We have a separate issue for env refresh, not sure what's going on in this issue.
(note #275 mentioned above)
Hey, I'm struggling to understand whether AWS_MSK_IAM can be used at all without explicitly setting accessKey or secretKey, instead using the IAM credentials that the AWS compute instance already has (ECS in my case). [...]
Hey did you find any solution?
No, this is the configuration I ended up using. I could not get IAM auth to work with MSK, so I ended up using a SASL username and password and storing those in AWS Secrets Manager.
"environment": [
{
"value": "${aws-region}",
"name": "AWS_REGION"
},
{
"value": "${endpoints}",
"name": "KAFKA_BROKERS"
},
{
"value": "true",
"name": "KAFKA_TLS_ENABLED"
},
{
"value": "SCRAM-SHA-512",
"name": "KAFKA_SASL_MECHANISM"
},
{
"value": "true",
"name": "KAFKA_SASL_ENABLED"
},
{
"value": "true",
"name": "KAFKA_TLS_INSECURESKIPTLSVERIFY"
}
],
"secrets": [
{
"valueFrom": "${pass}:password::",
"name": "KAFKA_SASL_PASSWORD"
},
{
"valueFrom": "${user}:username::",
"name": "KAFKA_SASL_USERNAME"
}
],
(Sorry for the late reply)