security-dashboards-plugin
[BUG] Failed to resolve if it's a readonly tenant: Error: Request Timeout after 30000ms
What is the bug? After upgrading OpenSearch Dashboards from 2.18 to 2.19, I was unable to log in. The login would time out, and I received the following error in the service logs:
Feb 19 18:14:48 sg-int-elastic04 opensearch-dashboards[4186]: {
"type": "log",
"@timestamp": "2025-02-19T18:14:48Z",
"tags": [
"error",
"plugins",
"securityDashboards"
],
"pid": 4186,
"message": "Failed to resolve if it's a readonly tenant: Error: Request Timeout after 30000ms\n at SecurityClient.dashboardsinfo (/usr/share/opensearch-dashboards/plugins/securityDashboards/server/backend/opensearch_security_client.ts:130:13)\n at processTicksAndRejections (node:internal/process/task_queues:95:5)\n at ReadonlyService.isReadonly (/usr/share/opensearch-dashboards/plugins/securityDashboards/server/readonly/readonly_service.ts:101:30)\n at ReadonlyService.hideForReadonly (/usr/share/opensearch-dashboards/src/core/server/security/readonly_service.js:18:13)\n at /usr/share/opensearch-dashboards/src/plugins/dashboard/server/plugin.js:48:14\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:52:21\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/resolve_capabilities.js:51:26\n at /usr/share/opensearch-dashboards/src/core/server/capabilities/routes/resolve_capabilities.js:53:26\n at Router.handle (/usr/share/opensearch-dashboards/src/core/server/http/router/router.js:174:44)\n at handler (/usr/share/opensearch-dashboards/src/core/server/http/router/router.js:140:50)\n at exports.Manager.execute (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/toolkit.js:60:28)\n at Object.internals.handler (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/handler.js:46:20)\n at exports.execute 
(/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/handler.js:31:20)\n at Request._lifecycle (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/request.js:371:32)\n at Request._execute (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/request.js:281:9)"
}
How can one reproduce the bug? Steps to reproduce the behavior:
- Upgrade OpenSearch and OpenSearch Dashboards from 2.18 to 2.19
What is the expected behavior? I am able to log into the server.
What is your host/environment?
- OS: Ubuntu
- Version 20.04.6 LTS
- Plugins:
  - Standard
  - https://github.com/fbaligand/kibana-enhanced-table
Do you have any screenshots? If applicable, add screenshots to help explain your problem.
Do you have any additional context?
- After rolling back Dashboards to 2.18, I still received the same error message: Failed to resolve if it's a readonly tenant: Error: Request Timeout after 30000ms
- I was able to successfully query the endpoint /_plugins/_security/dashboardsinfo manually, which the plugin seems unable to do.
- This seems similar to, but is probably unrelated to, issue 2164.
- Setting opensearch_security.multitenancy.enabled: false allowed me to log in and bypass this error.
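For reference, the workaround above is a one-line change in opensearch_dashboards.yml. Note this is a workaround rather than a fix: it bypasses the readonly-tenant check that times out at login, at the cost of disabling tenant separation.

```yaml
# Workaround: skip the readonly-tenant check that times out during login.
# Caution: this disables multi-tenancy (tenant separation) for all users.
opensearch_security.multitenancy.enabled: false
```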
opensearch_dashboards.yml
opensearch.hosts:
- "https://opensearch01:59200"
- "https://opensearch02:59200"
- "https://opensearch03:59200"
- "https://opensearch07:59200"
- "https://opensearch08:59200"
- "https://opensearch09:59200"
opensearch.ssl.verificationMode: none
opensearch.username: xxxxxxxxxxxxxxxxxxx
opensearch.password: xxxxxxxxxxxxxxxxxxx
opensearch.requestHeadersWhitelist:
- securitytenant
- Authorization
opensearch_security.multitenancy.enabled: true
opensearch_security.multitenancy.tenants.enable_global: true
opensearch_security.multitenancy.tenants.enable_private: false
opensearch_security.multitenancy.tenants.preferred:
- Private
- Global
opensearch_security.multitenancy.enable_filter: false
opensearch_security.readonly_mode.roles:
- kibana_read_only
opensearch_security.cookie.secure: true
opensearch_security.cookie.ttl: 86400000
opensearch_security.session.ttl: 86400000
opensearch_security.session.keepalive: true
server.port: 443
server.host: 0.0.0.0
server.ssl.enabled: true
server.ssl.certificate: /etc/opensearch-dashboards/certs/dashboard-public.crt
server.ssl.key: /etc/opensearch-dashboards/certs/dashboard-private.key
opensearch_security.ui.basicauth.login.brandimage: "https://example.com/logos/logo-dark-sans-tag-2028x256.png"
opensearch_security.ui.basicauth.login.title: "Example AdvancedLogging"
opensearch_security.ui.basicauth.login.subtitle: ""
[Triage] Thank you for filing this issue @msoler8785. Not sure why it didn't revert to the previous behavior when downgrading to 2.18, but this looks like it could be a bug.
I was able to successfully query the endpoint /_plugins/_security/dashboardsinfo, which the plugin seems unable to do.
Do you see any error in the opensearch logs?
Hi, I'm hitting the same issue here. OpenSearch logs:
[2025-06-19T16:00:43,810][WARN ][i.n.c.ChannelOutboundBuffer] [opensearch-node-1] Failed to mark a promise as success because it has failed already: DefaultChannelPromise@49778dbe(failure: io.netty.handler.codec.EncoderException: java.nio.channels.ClosedChannelException), unnotified cause:
io.netty.handler.codec.EncoderException: java.nio.channels.ClosedChannelException
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:107) ~[netty-codec-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.MessageToMessageCodec.write(MessageToMessageCodec.java:130) ~[netty-codec-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:891) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:875) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:984) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:868) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:113) ~[netty-codec-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:893) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:875) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:984) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:868) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.write(Netty4HttpPipeliningHandler.java:83) ~[transport-netty4-client-2.19.0]
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:891) [netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:956) [netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.AbstractChannelHandlerContext$WriteTask.run(AbstractChannelHandlerContext.java:1263) [netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) [netty-common-4.1.118.Final.jar:4.1.118.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166) [netty-common-4.1.118.Final.jar:4.1.118.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472) [netty-common-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569) [netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:998) [netty-common-4.1.118.Final.jar:4.1.118.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.118.Final.jar:4.1.118.Final]
at org.opensearch.common.util.concurrent.OpenSearchExecutors$PrivilegedOpenSearchThreadFactory$1.lambda$run$0(OpenSearchExecutors.java:454) [opensearch-2.19.0]
at java.base/java.security.AccessController.doPrivileged(AccessController.java:319) [?:?]
at org.opensearch.common.util.concurrent.OpenSearchExecutors$PrivilegedOpenSearchThreadFactory$1.run(OpenSearchExecutors.java:453) [opensearch-2.19.0]
at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
Caused by: java.nio.channels.ClosedChannelException
at io.netty.channel.embedded.EmbeddedChannel.checkOpen(EmbeddedChannel.java:928) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.embedded.EmbeddedChannel.ensureOpen(EmbeddedChannel.java:948) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.channel.embedded.EmbeddedChannel.writeOutbound(EmbeddedChannel.java:420) ~[netty-transport-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.http.HttpContentEncoder.encode(HttpContentEncoder.java:341) ~[netty-codec-http-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.http.HttpContentEncoder.encodeContent(HttpContentEncoder.java:274) ~[netty-codec-http-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.http.HttpContentEncoder.encodeFullResponse(HttpContentEncoder.java:232) ~[netty-codec-http-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.http.HttpContentEncoder.encode(HttpContentEncoder.java:191) ~[netty-codec-http-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.http.HttpContentEncoder.encode(HttpContentEncoder.java:57) ~[netty-codec-http-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.MessageToMessageCodec$2.encode(MessageToMessageCodec.java:85) ~[netty-codec-4.1.118.Final.jar:4.1.118.Final]
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:90) ~[netty-codec-4.1.118.Final.jar:4.1.118.Final]
... 24 more
I found the cause: Dashboards calls the API with the header Accept-Encoding: zstd, and that call hangs. See https://github.com/opensearch-project/OpenSearch/pull/17408
You can reproduce this with curl:
curl "http://IP:9200/_plugins/_security/dashboardsinfo?pretty" -u admin: -H 'Accept-Encoding: zstd'
@xiaoyuan0821 thank you for confirming. Can you confirm that this has been fixed in a later patch release of 2.19 (either 2.19.1 or 2.19.2)?
@cwperks Yes, this is already fixed in 2.19.1.
Thank you for confirming @xiaoyuan0821. Closing this issue.