devise_token_auth
Storing auth tokens outside of users table, or in a separate db?
I would like to set up my API to support changing tokens on every request, but I'm concerned about the performance impact this might have at scale, since storing tokens in a column on the Users table means updating that table on every request.
In Rails you can set up a separate database table for storing sessions, and even move that session store off the default database entirely to a fast key/value datastore like Redis, which has become commonly recommended practice.
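For reference, a minimal sketch of that setup (assuming the activerecord-session_store gem for the table-backed store and the redis-actionpack/redis-rails gems for the Redis-backed one; gem names and options may differ in your stack):

```ruby
# config/initializers/session_store.rb

# Table-backed sessions (activerecord-session_store gem):
Rails.application.config.session_store :active_record_store,
                                       key: "_myapp_session"

# Or a Redis-backed session store (redis-actionpack / redis-rails gems):
# Rails.application.config.session_store :redis_store,
#   servers: ["redis://localhost:6379/0/session"],
#   expire_after: 90.minutes
```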
- Do you see there being any way for devise_token_auth to accommodate storing tokens elsewhere, like in a separate table or database?
- When tokens are changed on each request, what are the performance implications of the current approach? How many simultaneous users could the User's PostgreSQL json column for token storage handle before becoming a bottleneck? 10 requests a second? 20?
Hey @KelseyDH, of course your approach is possible with a pull request.
As for performance: nobody has reported an issue with this approach, but I also suspect many users disable token changing on each request.
Have you tried benchmarking it yourself? It shouldn't be too hard to set up.
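For what it's worth, a rough console benchmark along these lines (assuming a User model with a json `tokens` column, as this gem uses) would give you a ballpark write rate:

```ruby
require "benchmark"

# Rough measure of per-request token writes against the users table.
# Run in a Rails console; assumes at least one User record exists.
user = User.first
n = 1_000

elapsed = Benchmark.realtime do
  n.times do |i|
    # Simulate rotating the token for one client on each request.
    user.tokens = { "client-#{i}" => { "token"  => SecureRandom.hex(32),
                                       "expiry" => 2.weeks.from_now.to_i } }
    user.save!
  end
end

puts format("%.1f token updates/sec", n / elapsed)
```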
Do you see there being any way for devise_token_auth to accommodate storing tokens elsewhere, like in a separate table or database?
In any case, the User table will need to be queried first on each request, using the user's id. If we break the tokens out into a separate table, then we also need to query the tokens table. A query to the User table with a join to a Tokens table will be slower than a query to the User table alone.
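Roughly, the comparison looks like this (the `auth_tokens` association is hypothetical, just for illustration):

```ruby
# Current approach: a single indexed lookup; the tokens json column
# comes back with the same row.
user = User.find_by(uid: uid)
user.tokens[client_id]

# Hypothetical separate Tokens table: the same user lookup plus a join
# (or a second query) before any token can be verified.
user = User.joins(:auth_tokens)
           .find_by(uid: uid, auth_tokens: { client_id: client_id })
```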
If the tokens are stored in Redis, then we would need to query the User table and then check Redis for the user's tokens matching the given client_id. In that case we are making a call out to Redis when we could simply reference data already on the user record, which has to be fetched first anyway.
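In code, that extra hop would look something like this (`redis` is an assumed redis-rb client; the key layout is made up for illustration):

```ruby
# The user row still has to be fetched first...
user = User.find_by(uid: uid)

# ...and then a second network round-trip goes to Redis, instead of
# simply reading user.tokens from the row we already have in memory.
token_digest = redis.hget("tokens:#{user.id}", client_id)
```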
Maybe I'm missing something, but how would either of those approaches be faster than just checking the tokens column on the User record?
In addition to the points above about performance, breaking out the tokens into another table or redis store would increase complexity. The performance gains would need to be significant in order to justify the increased complexity.
How many simultaneous users could the User's PostgreSQL json column for token storage handle before becoming a bottleneck? 10 requests a second? 20?
See the section on batch requests. You will only be able to update the token once per 5 seconds or so; any requests within that timeframe will need to use the same token. The issues are that we need to accommodate concurrent requests that use the same token, and that we can't guarantee the order in which the client receives the responses from the API.
Just to be clear, I'm not saying that the API can only handle one request every 5 seconds. That is just the minimum timeframe to use when changing tokens.
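Concretely, both behaviors are driven by the gem's initializer; the roughly 5-second window above corresponds to the `batch_request_buffer_throttle` setting:

```ruby
# config/initializers/devise_token_auth.rb
DeviseTokenAuth.setup do |config|
  # Rotate the token on every request...
  config.change_headers_on_each_request = true

  # ...but treat requests arriving within this window as a "batch" and
  # let them reuse the same token, so concurrent requests from the same
  # client don't invalidate each other.
  config.batch_request_buffer_throttle = 5.seconds
end
```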
Hi all, just to revive the discussion: we recently switched to Redis for token management, and the reason was the following.

After moving to a replicated database setup for high availability, we observed that database writes are quite costly, which is expected given replication, so you want to keep writes to a minimum while reads are not an issue at all. Our application has many routes that only fetch data, yet because token management was tied to the database, every single request incurred a database write. Even worse, under concurrent requests we saw very high lock contention and transaction abort rates due to the lock being used.

With Redis handling token management, the majority of our routes no longer write to the database at all. We are currently running this feature in beta to see whether any issues come up. I can share the code later on if anyone is interested.
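In the meantime, here is a hypothetical sketch (not the poster's actual code) of what moving token reads and writes into Redis might look like, assuming a redis-rb client available as `REDIS` and BCrypt token digests like devise_token_auth uses:

```ruby
require "bcrypt"

# Hypothetical sketch only: token state lives in Redis, so GET-only
# routes validate tokens without writing to (or locking) the users table.
module RedisTokenStore
  TOKEN_LIFESPAN = 2.weeks

  # Store the BCrypt digest of a token for one client device.
  def self.write(user, client_id, token_digest)
    key = redis_key(user)
    REDIS.hset(key, client_id, token_digest)
    REDIS.expire(key, TOKEN_LIFESPAN.to_i)
  end

  # Validate a presented token; a pure read, no database row is touched.
  def self.valid?(user, client_id, token)
    digest = REDIS.hget(redis_key(user), client_id)
    digest && BCrypt::Password.new(digest).is_password?(token)
  end

  def self.redis_key(user)
    "dta:tokens:#{user.id}"
  end
end
```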