AWS IAM support for logging in
Please add support for logging in via AWS IAM
https://aws.amazon.com/about-aws/whats-new/2018/09/amazon-rds-postgresql-now-supports-iam-authentication/
I've seen a few requests for this recently (also with regard to Redshift) and I'm wondering how to best implement this. It's probably a lot of UI work, so I want to think about how to best integrate this into Postico.
Do you have any suggestions for how the UI should look? Are there any other DB clients that have AWS IAM integration?
As a quick solution, it should be possible to write a shell script that fetches temporary credentials using aws-cli and then uses the 'open' command to connect to the DB with Postico:
#!/bin/bash
PASS=$(some command that fetches a password)
open -a Postico postgres://user:"$PASS"@dbhost:5432/dbname
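For example, for RDS IAM authentication the password-fetching step could look roughly like this (an untested sketch; host, region, username, and database are placeholders, and the token is URL-encoded because it contains characters like ':' and '/'):
#!/bin/bash
# Sketch only: fetch a short-lived RDS IAM auth token and URL-encode it
# so the special characters inside it survive in the postgres:// URL.
PASS=$(aws rds generate-db-auth-token \
        --hostname dbhost --port 5432 --region us-east-1 --username user \
      | python3 -c "import sys, urllib.parse; print(urllib.parse.quote(sys.stdin.read().strip(), safe=''), end='')")
open -a Postico "postgres://user:${PASS}@dbhost:5432/dbname"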
AWS IAM should just use username/password. It is just generated by AWS for you.
I don't think you are understanding the problem. Some folks use a more advanced method than simple username/password to authenticate to Redshift. There is an option to request what is essentially a one-time token using your IAM credentials, which is then used to authenticate to the database. Here is some documentation: https://docs.aws.amazon.com/redshift/latest/mgmt/redshift-iam-access-control-identity-based.html
I'm still not quite sure how to implement IAM authentication. I've read the docs, and it seems pretty straightforward. I guess I'll have to do an API call to get a (temporary) password, and then use that to connect to the PostgreSQL / Redshift server.
But how should it work? How do you imagine the UI should look? Are there any other apps that offer IAM authentication?
OK, so I played around and built a little demo app that takes an AWS Redshift IAM URL, fetches the DB credentials, and then opens the URL with Postico.
It looks like this:
There's a bug in Postico 1.4.3 where passwords containing special characters are parsed incorrectly from URLs. Since randomly generated IAM passwords often contain special characters, this might cause connections to fail. So use the demo app with this build of Postico: Postico 3937.zip
I'm not sure how to integrate this into Postico. Can you try the demo app? Is it useful? What would you like improved?
Looks like I forgot to upload the demo app. It's hard to try a screenshot :)
Here's the link: AWS Playground.zip
For future reference: Redshift IAM URLs are documented here: https://docs.aws.amazon.com/redshift/latest/mgmt/generating-iam-credentials-configure-jdbc-odbc.html
Note: The "AWS Playground" tool above currently only supports URLS of the form jdbc:redshift:iam://identifier.xyz.region.redshift.amazonaws.com
. URLs of the form jdbc:redshift:iam://name:region
are not yet supported (require an extra API call)
@jakob Any updates on this? Definitely would be useful.
By the way, I'm happy to offer suggestions on how this could look, UI wise.
I have a script that generates my temporary password (the password expires after some time). I have a saved favorite connection where I have unchecked Save In Keychain, but it seems to still save the password :(
When that password times out, Postico is basically stuck and I have to close the window, go back to the connection favorites, click edit, and enter a new password. Why doesn't Postico just prompt for the new username/password, which is very common amongst other similar apps?
Is there a way to enter the updated password without backing all the way out and starting over?
One good immediate enhancement would be to prompt the user for the new password when authentication fails while Postico is re-establishing the connection.
This is what my connection favorite looks like:
Don't suppose there is an update on this, is there? Our support team needs to connect to our Redshift DBs using IAM to query around, and right now they are all using SQL Workbench/J. If I can upgrade them to Postico they will be happy campers, but we can't do that until IAM is supported.
This would be a very welcome feature in our environment . . .
Ok I moved to Aurora Serverless and am now in need of this!
Would love this +1
Plz to add support. I am crying.
Any updates on this one?
Hey, our company really needs that, is there any update?
Please add this feature - we really need it!!!!
It's worth noting that even on AWS, IAM users aren't the only non-standard login solution: you can store and rotate passwords in Secrets Manager, which requires a similar credential retrieval step before any connection is initiated.
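For reference, that retrieval step is a single CLI call along these lines (the secret name is a placeholder, and this assumes the secret stores the usual {"username": ..., "password": ...} JSON that RDS-managed rotation uses):
# Sketch: pull the current password out of a Secrets Manager secret.
# "my-db-secret" is a placeholder name.
aws secretsmanager get-secret-value --secret-id my-db-secret \
  --query SecretString --output text | jq -r '.password'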
I think the best way to handle this would be more generally, with support for a pre-connect script. This would allow Postico to work with other one-time-password schemes and other database providers. This feature seems to be ubiquitous in database client libraries, although in that case the interface is a simple and flexible function callback.
What I would want is a script plugin system, where users can write (say, Python) scripts to handle any pre-connection dance their service requires. You could ship scripts which handle the most popular methods, such as AWS IAM authentication.
UI-wise, I would want a submenu under "Connect via SSH" called "Pre-Connection Script". The submenu would be populated with scripts in some folder. Scripts would ideally support two commands: one which lists the required inputs (IAM access key id, IAM secret, AWS region and Secret Name), and one which transforms the inputs into usable Postgres credentials. Choosing a script would replace the standard input fields with ones specified by the script.
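As a rough sketch of that two-command convention (the command names, INPUT_* environment variables, and overall interface here are purely made up for illustration, not anything Postico actually supports):
#!/usr/bin/env bash
# Hypothetical pre-connection script; the interface is illustrative only.
case "$1" in
  list-inputs)
    # Tell the host app which fields to show instead of the standard ones.
    printf '%s\n' "IAM Access Key ID" "IAM Secret Access Key" "AWS Region" "Secret Name"
    ;;
  get-credentials)
    # Read the field values (here assumed to arrive as INPUT_* environment
    # variables) and print usable Postgres credentials on stdout.
    AWS_ACCESS_KEY_ID="$INPUT_IAM_ACCESS_KEY_ID" \
    AWS_SECRET_ACCESS_KEY="$INPUT_IAM_SECRET_ACCESS_KEY" \
    aws secretsmanager get-secret-value \
      --region "$INPUT_AWS_REGION" \
      --secret-id "$INPUT_SECRET_NAME" \
      --query SecretString --output text
    ;;
esac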
I know this would be a lot of work, but it would allow Postico to support most non-standard login systems without you having to build that support yourself.
Support for AWS Secrets Manager would be a great idea :)
I agree with danielbrauer about adding a scripting point. Many companies using AWS will be using something like aws-vault for managing AWS credentials or AWS SSO. This would be pretty difficult to support from inside Postico (e.g. setting environment variables, working directory, Keychain access, etc...)
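For example, with aws-vault the token generation typically has to be wrapped like this (the profile name and connection details are placeholders), which is awkward to replicate from inside a sandboxed GUI app:
# Sketch: aws-vault injects temporary AWS credentials into the subprocess
# environment; "my-profile" and the connection details are placeholders.
aws-vault exec my-profile -- \
  aws rds generate-db-auth-token \
    --hostname mydb.cluster-xyz.us-east-1.rds.amazonaws.com \
    --port 5432 --region us-east-1 --username iamuser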
This works for me, though I'd really like built-in support :)
RDSHOST="abc.xyz.us-east-1.rds.amazonaws.com"
RDSDBNAME="mydatabase"
open -a "Postico 2" "postgres://iamuser:$( \
aws rds generate-db-auth-token --hostname ${RDSHOST} --port 5432 --region us-east-1 --username iamuser \
| python3 -c "import sys; from urllib.parse import quote; sys.stdout.write(sys.stdin.read().replace('%', '%25').replace(':', '%3A').replace('/', '%2F').replace('?', '%3F'))" \
)@${RDSHOST}:5432/${RDSDBNAME}"
Any progress here? This would be a great feature.
Sorry, no updates at the moment.
It sounds like a feature specific to AWS would be too constraining for a lot of you, so the better solution is probably a generic scripting hook. Some time back I already tried to do that, but I had problems reliably starting subprocesses from a sandboxed app.
It's still a feature that I would absolutely love to add to Postico, since it seems to be a blocker for many of you.
In the meantime, it would be really useful if you could share the commands/scripts you would use to generate login credentials, and show some sample output. This would help me design how the feature works.
Hi @jakob, most of us are using this article from AWS to generate temporary IAM credentials and connect to the RDS instance with them. Hope this helps.
Below is an example of actual usage of the rds generate-db-auth-token command:
AWS_PROFILE=${AWS_PROFILE} aws --region ${REGION} rds generate-db-auth-token \
  --hostname $(AWS_PROFILE=${AWS_PROFILE} aws --region ${REGION} rds describe-db-clusters \
      --db-cluster-identifier ${DB_CLUSTER_NAME} --output json | jq -r '.DBClusters[].Endpoint') \
  --port 5432 --username "${IAM_USERNAME}" --output text
This command has 2 nested parts:
- the top one is the generate-db-auth-token call, which actually generates the temporary DB password for ${IAM_USERNAME} on a given DB_HOST;
- the actual hostname of the DB_HOST is retrieved using the nested describe-db-clusters sub-command.
The temporary password produced in the output is a long string of the following format:
<DB_HOST>:5432/?Action=connect&DBUser=<IAM_USERNAME>&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=<AWS_ACCESS_KEY_ID>%2F20230927%2F<REGION>%2Frds-db%2Faws4_request&X-Amz-Date=20230927T023804Z&X-Amz-Expires=900&X-Amz-SignedHeaders=host&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEIr%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCXVzLWVhc3QtMSJHMEUCIGK1ApasYQ%2BIKi8lkjntmRt001KXMExNUNgI%2FejPs6OZAiEA%2B86FCfaxHBR1UuPDXLg6lszdFUHrAP%2BHfg%2F49BWm6mAqwwMIg%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FARABGgwxNTMwNDIyMjA5NDQiDGA8vPo%2FfMr8tVhUBCqXA%2ByEDbhj9Lo09jxKOSpiNBGQGbmON9MNSTkIuNBfZAJz5EzmclgIfRW%2FX33MjGCO5AayZOiI2Whwhwe21oxHDDhLQh7R3Vf74C20mucZuORfe2GDK6nRT77f93BXSeRuZIRUN%2BtdcKrioc%2FtY75D91RfUI0pwQYjt%2B2tyZSxZUeg6EkUaKODplZJrRVoh0L9Tg8eYSNW279eak1LfDi0fJ25dAHgBuXAXrw7P77OctcXK89twX7tFESeUgTwE4xiMaIoJPSGCqUiF%2B6BLxffr4oF8V6vNCtPdQCBp5LARRU2Wtt%2BemERPjJby8WUp37wuyVbGUDpucrlLlG9S46w6X1zIktHow%2F4iPZX5fnIuj6G7o%2BLmRtT77MgDVukV2Ge%2FqBtVu8flB5RR1dmV3W2DXVdoS8elZGJgtvG4%2FZFI%2Br6iY6ZK%2FjOCtGxyWd83%2B0Zyi07xKcRnxhUDKxQIENHVPM%2B9zByV%2BE%2BU5Cjrj0%2B0bq9v%2BvtL2cUPZFspns8zQswnYYlsJTEm7bnD%2FSnSHvYmUO07CwoAhnjMOGKzqgGOqYBVRYhV0%2FH%2BU3h0XthGAGYRrufEjWS6Zu1%2BEKSV4COaqCh5Hf3helXmkxeivbsW2QId2F558JjMviYXcmF3J9vya0GkkBlRrMyPJYfsvoEQ163X6QEclkQS44BLt54WAuYdY%2BSUiiNnvJoZMGPAkC7VU3R6Z9jRxktXBEbwGSxEsHm0IsQ4kpKyPuvY%2B2nvbrzNWl4IzKnJav7Tj%2FWSuHfLhSsbuqX3g%3D%3D&X-Amz-Signature=<AWSV4Auth_SIGNATURE>
When you authenticate to the PostgreSQL instance on RDS, you have to provide this whole string as a user's password.
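For example, with psql this works by passing the token through PGPASSWORD and using the keyword/value connection string format (a sketch; the variables are placeholders, and sslmode=require is needed because RDS IAM authentication only works over SSL):
# Sketch: use the generated token as the password for psql.
TOKEN=$(AWS_PROFILE=${AWS_PROFILE} aws --region ${REGION} rds generate-db-auth-token \
  --hostname "${DB_HOST}" --port 5432 --username "${IAM_USERNAME}")
PGPASSWORD="${TOKEN}" psql \
  "host=${DB_HOST} port=5432 dbname=${DB_NAME} user=${IAM_USERNAME} sslmode=require"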
Please let me know if I can help with providing more information.
The above solutions very nearly work with Postico 2, but as @kurianoff mentioned, the temporary password is a long, URL-like string, which I believe trips up the connection string parsing within Postico.
For example, if I run open -a Postico\ 2 postgres://readonly:$PGPASSWORD@$host.com:5432/$database, Postico interprets readonly as the nickname, readonly as the host, and all other fields are blank. I also tried escaping $PGPASSWORD by passing it through python3 -c "import urllib.parse;print(urllib.parse.quote(input()))" <<< $PGPASSWORD - this resulted in the password becoming part of the nickname and part of the password ending up as the database name in Postico.
I also tried manually copy/pasting one of these temporary passwords into the connection parameters within Postico and it worked as expected, so my temporary solution is to do this:
PGPASSWORD="$(aws rds generate-db-auth-token ... )"
echo "$PGPASSWORD" | tr -d '\n' | pbcopy
open -a "Postico 2" "postgres://readonly@$host.com:5432/$database"
That way Postico opens and the temporary password is on my clipboard. However, this only works the first time because Postico saves the password by default and does not prompt for it on subsequent connections.
It seems these temporary passwords don't play well with connection strings, which is why AWS recommends using a different format with psql: "host=host.com port=5432 dbname=database user=user password=$PGPASSWORD". If Postico could support this connection string format, the above solution would work wonderfully.
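One thing worth noting about the escaping attempt above: urllib.parse.quote leaves '/' unencoded by default (safe='/'), so slashes in the token still end up raw in the URL. Encoding with safe='' escapes those as well, something like this (untested sketch; host and database are placeholders):
# Sketch: percent-encode the entire token, including slashes, before
# embedding it in the URL (assumes Python 3 is available).
PGPASSWORD_ENC=$(printf '%s' "$PGPASSWORD" | \
  python3 -c "import sys, urllib.parse; print(urllib.parse.quote(sys.stdin.read(), safe=''), end='')")
open -a "Postico 2" "postgres://readonly:${PGPASSWORD_ENC}@${host}:5432/${database}"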
@jakob I am using the following for temporary Redshift credentials, albeit in a script with more prompts and error handling:
credentials=$(aws redshift get-cluster-credentials \
--cluster-identifier "$CLUSTER_IDENTIFIER" \
--db-user "$DATABASE_USER" \
--db-name "$DATABASE_NAME" \
--duration-seconds "$DURATION_SECONDS" \
--profile "$AWS_PROFILE"
)
iam_username=$(echo "$credentials" | jq -r '.DbUser' | sed -e 's/%/%25/g' -e 's/:/%3A/g' -e 's/\//%2F/g' -e 's/?/%3F/g')
temporary_password=$(echo "$credentials" | jq -r '.DbPassword' | sed -e 's/%/%25/g' -e 's/:/%3A/g' -e 's/\//%2F/g' -e 's/?/%3F/g')
open -a "$POSTICO_CLIENT" "postgres://${iam_username}:${temporary_password}@${DATABASE_HOST}:${DATABASE_PORT}/${DATABASE_NAME}"
For all those who may be interested (FYI @agpiermarini @jczaplew), I have created a single-backend PostgreSQL proxy called pgpx: https://github.com/kurianoff/pgpx. The proxy injects itself into the PostgreSQL authentication step, runs an AWS CLI command to generate the dynamic RDS IAM password, and substitutes the client-provided password with the dynamic one. Once the connection is authenticated, it works as a dumb proxy between the client (in this case, Postico) and the backend PostgreSQL server.
This lets my users forget about the need to put a new password into Postico every time they run it. They can simply use their beloved PostgreSQL client while benefiting from the security of RDS IAM authentication.
It is not a PostgreSQL connection balancer in any way; it creates a single connection. However, it can easily be extended into a multi-backend proxy, as it is built on the amazing PostgreSQL proxy implementation from Encore.dev.
Just wanted to share. Hope this example could also be helpful for @jakob to implement full RDS IAM authentication support in Postico.
Hi! I've just added a new feature to Postico that should make using IAM a lot easier: a pre-connect shell script. I think I came up with a generic solution that should cover pretty much every use case we discussed here.
Here's a sneak peek at what the feature looks like:
I also wrote a lengthy documentation page about it.
The main thing I wonder is if it is accessible enough -- it does require writing a script, which could be an obstacle.
Please download the latest development build and let me know if you can figure it out.
Thank you very much, @jakob - I just tested the solution, and it works very well!
I haven't tried a JSON scenario, but I can confirm that simply echo'ing the password to stdout does the job.
#!/usr/bin/env bash
AWS_PROFILE="<... profile ...>"
REGION="<... region ...>"
IAM_USERNAME="<... username ...>"
DB_ENDPOINT=$(... retrieve database host name ...)
PG_PASSWORD=$(AWS_PROFILE="${AWS_PROFILE}" aws --region "${REGION}" \
rds generate-db-auth-token --hostname "${DB_ENDPOINT}" --port 5432 --username "${IAM_USERNAME}" \
--output text --no-cli-pager)
echo "${PG_PASSWORD}"
Exactly what was needed! Thanks again 👍