UI Forwarding Expired Tokens to Codec Server
Describe the bug
Temporal UI is set up with a codec server and configured to forward the user's access token to it. Temporal appears to issue a token with a fairly short lifespan: when that token is forwarded to our codec server and the codec server validates it with the issuer, it gets back an error that the token has expired.
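For context on where the error surfaces, our codec server validates the forwarded Authorization header against the issuer roughly like this (a simplified sketch using the jose library; the issuer URL and claim checks are placeholders, not our exact configuration):

```ts
import { createRemoteJWKSet, jwtVerify } from 'jose';

// Placeholder issuer values -- substitute your own Okta org and authorization server.
const ISSUER = 'https://example.okta.com/oauth2/default';
const JWKS = createRemoteJWKSet(new URL(`${ISSUER}/v1/keys`));

// Validate the access token forwarded by Temporal UI before decoding payloads.
export async function verifyForwardedToken(authorizationHeader: string) {
  const token = authorizationHeader.replace(/^Bearer\s+/i, '');
  // jwtVerify checks the signature and the exp claim; an expired token is
  // rejected here, which is where we see the "token has expired" error.
  const { payload } = await jwtVerify(token, JWKS, { issuer: ISSUER });
  return payload;
}
```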
To Reproduce
- Set up Temporal Web with OIDC (we're using v2.21.3).
- Set up a codec server with OIDC validation against the issuer.
- Log into Temporal web.
- Validate that the new token does work and decodes workflows.
- Wait for the token to expire (approx. 30 min).
- Note that Temporal Web does not request a new login.
- Try to view an encoded workflow.
- Note errors.
- Manually hit the log-out button in the Web UI.
- Refresh page.
- Note decoding is now working.
Expected behavior
For Temporal UI to obtain a new valid token and forward that to the codec server when needed.
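By "obtain a new valid token" I mean something along the lines of the standard OAuth 2.0 refresh-token grant, sketched below; the endpoint and client values are placeholders, and I'm not claiming this is how the UI server is actually structured:

```ts
// Placeholder OIDC values -- these would come from the UI server's OIDC config.
const TOKEN_ENDPOINT = 'https://example.okta.com/oauth2/default/v1/token';
const CLIENT_ID = 'temporal-ui';

// Exchange a refresh token for a fresh access token before it is
// forwarded to the codec server.
export async function refreshAccessToken(refreshToken: string): Promise<string> {
  const response = await fetch(TOKEN_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type: 'refresh_token',
      refresh_token: refreshToken,
      client_id: CLIENT_ID,
    }),
  });
  if (!response.ok) {
    throw new Error(`Token refresh failed: ${response.status}`);
  }
  const { access_token } = await response.json();
  return access_token;
}
```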
Screenshots
Desktop (please complete the following information):
- OS: Windows 11
- Browser: Chrome 120
- Version:
Additional context
Okta is the issuer we use.
I am seeing something similar on 2.21.3, where you have to clear localStorage every time the token expires. Once localStorage is cleared, you will be re-prompted for SSO login to get a new valid token. The API calls return 403 when the token is expired, and I see the screen below.
We are experiencing the same issue. The easiest workaround, although certainly not a convenient one, is to go to https://your-temporal-ui-instance/login, re-authenticate with your SSO provider, and return to the page you need. This bug makes the UI rather unusable 1 hour after re-logging in.
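If it helps anyone, the same workaround can be done from the browser devtools console (note this clears all localStorage for the UI origin, not just the auth entries):

```ts
// Drop the cached (expired) token state, then force a fresh SSO login.
localStorage.clear();
window.location.assign('/login');
```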
The below screenshots are based on requesting https://your-temporal-ui-instance/namespaces/default/workflows.
Here's what happens when the token has expired (1h for most OIDC providers):
With the refreshed token the same page looks like this:
I did some digging in the frontend code and noticed that the requestFromAPI() function by default includes the handleError() request handler, which is supposed to trigger a browser redirect on 401 and 403 responses. A rough sketch of the behavior I mean is below, followed by what I found in each specific case.
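To be clear about what I expect handleError() to do, here is a simplified sketch; it is not the actual source, and the shape of the error object is an assumption:

```ts
// Simplified sketch of a redirect-on-auth-failure handler.
// The real handleError() in the UI codebase may differ; the NetworkError
// shape below is an assumption for illustration.
type NetworkError = { statusCode: number; statusText: string };

export const handleError = (error: NetworkError): void => {
  if (error.statusCode === 401 || error.statusCode === 403) {
    // Send the user back through the OIDC login flow instead of leaving
    // them stuck with an expired token.
    window.location.assign('/login');
    return;
  }
  throw error;
};
```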
- For the /api/v1/namespaces route there is a fetchNamespaces function, which has onError set to a custom handler: https://github.com/temporalio/ui/blob/5bbf32a944f44e69826eb21b89d695814638fbee/src/lib/services/namespaces-service.ts#L57-L61 This doesn't seem to trigger the redirect logic from handleError().
- For the /api/v1/namespaces/$namespace/search-attributes route there is a fetchSearchAttributesForNamespace function, which has no onError handler set at all: https://github.com/temporalio/ui/blob/cdc9f705a4262bf84254349471a3497dfb92b596/src/lib/services/search-attributes-service.ts#L13-L15
- For the /api/v1/cluster-info route there is a fetchCluster function, which has no onError handler set either: https://github.com/temporalio/ui/blob/5bbf32a944f44e69826eb21b89d695814638fbee/src/lib/services/cluster-service.ts#L13-L17 (the sketch after this list contrasts these three call shapes)
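To illustrate the difference between these call shapes, here is a hypothetical comparison; the requestFromAPI signature and option names below are assumptions based on the linked code, not verbatim source:

```ts
// Hypothetical signatures -- the real requestFromAPI and handleError live in
// the UI codebase and may differ from this sketch.
declare function requestFromAPI<T>(
  route: string,
  options?: { onError?: (error: unknown) => void },
): Promise<T>;
declare function handleError(error: unknown): void;

async function compareCallShapes() {
  // 1. Custom onError (fetchNamespaces-style): a 401/403 is handled locally,
  //    so the redirect logic in handleError() never runs.
  await requestFromAPI('/api/v1/namespaces', {
    onError: () => {
      /* e.g. fall back to an empty namespace list */
    },
  });

  // 2. No onError at all (search-attributes / cluster-info style): whether a
  //    401/403 triggers a redirect depends entirely on requestFromAPI's defaults.
  await requestFromAPI('/api/v1/cluster-info');

  // 3. onError wired to handleError (fetchAllWorkflows-style): a 401/403
  //    sends the user back through the login flow.
  await requestFromAPI('/api/v1/namespaces/default/workflows', {
    onError: handleError,
  });
}
```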
Where I can see handleError() being set up, and where it seems to be working, is the fetchAllWorkflows() function: https://github.com/temporalio/ui/blob/5bbf32a944f44e69826eb21b89d695814638fbee/src/lib/services/workflow-service.ts#L111-L130
But perhaps I am totally wrong here and the issue lies somewhere else. The issue was introduced somewhere between v2.20 and v2.21.3 and has persisted ever since.
Perhaps @Alex-Tideman could take a quick look when possible.
Taking a look this morning, thanks for the details and the screenshots @yurkeen!
Here's how these 403s appear in the frontend server logs as requests received from the UI server:
{
"level":"debug",
"ts":"2024-02-06T10:14:11.827Z",
"msg":"attempted incoming TLS connection",
"address":"172.X.Y.194:54796",
"server-name":"temporal-frontend-server.svc.cluster.local",
"logging-call-at":"local_store_tls_provider.go:321"
}
{
"level":"debug",
"ts":"2024-02-06T10:14:11.827Z",
"msg":"returning TLS config for connection",
"address":"172.X.Y.194:54796",
"server-name":"temporal-frontend-server.svc.cluster.local",
"logging-call-at":"local_store_tls_provider.go:378"
}
{
"level":"debug",
"ts":"2024-02-06T10:14:11.828Z",
"msg":"successfully established incoming TLS connection",
"server-name":"temporal-frontend-server.svc.cluster.local",
"name":"",
"logging-call-at":"tls_config_helper.go:85"
}
{
"level":"error",
"ts":"2024-02-06T10:14:11.830Z",
"msg":"Authorization error",
"error":"Token is expired",
"logging-call-at":"interceptor.go:174",
"stacktrace":"...."
}