
Incorrect padding errors from Fernet encryption

Open gagejustins opened this issue 6 years ago • 55 comments

No matter what password I use or where (what OS) I run the container, adding an Airflow connection through the CLI returns this error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 171, in get_fernet
    _fernet = Fernet(fernet_key.encode('utf-8'))
  File "/usr/local/lib/python3.6/site-packages/cryptography/fernet.py", line 34, in __init__
    key = base64.urlsafe_b64decode(key)
  File "/usr/local/lib/python3.6/base64.py", line 133, in urlsafe_b64decode
    return b64decode(s)
  File "/usr/local/lib/python3.6/base64.py", line 87, in b64decode
    return binascii.a2b_base64(s)
binascii.Error: Incorrect padding

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 32, in <module>
    args.func(args)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 1151, in connections
    new_conn = Connection(conn_id=args.conn_id, uri=args.conn_uri)
  File "<string>", line 4, in __init__
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/state.py", line 414, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 187, in reraise
    raise value
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/state.py", line 411, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 695, in __init__
    self.parse_from_uri(uri)
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 717, in parse_from_uri
    self.password = temp_uri.password
  File "<string>", line 1, in __set__
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 735, in set_password
    fernet = get_fernet()
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 174, in get_fernet
    raise AirflowException("Could not create Fernet object: {}".format(ve))
airflow.exceptions.AirflowException: Could not create Fernet object: Incorrect padding

BUT: adding the login info through the UI Connections tab works totally fine. I've tried changing passwords but that doesn't help. The command I'm using:

airflow connections -a --conn_id first_conn --conn_uri postgresql://jgage:password@domain:port/schema

Any ideas?

gagejustins avatar Dec 27 '18 19:12 gagejustins
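The "Incorrect padding" message in the traceback above comes straight from Python's base64 module: a Fernet key must be 44 base64 characters that decode to exactly 32 bytes, and a key of the wrong length (empty, unexpanded, or missing its trailing "=") fails before any encryption happens. A minimal sketch to reproduce it outside Airflow (the key value is just an illustrative example):

```shell
# Reproduce the error with plain base64 decoding; no Airflow needed.
python3 - <<'EOF'
import base64, binascii

key = "6cLsuD9kKqr70xN5PKlFgJuGahER3DKmWtyseR8dZIA="  # a well-formed key
print(len(base64.urlsafe_b64decode(key)))             # 32 bytes -> valid

try:
    base64.urlsafe_b64decode(key[:-1])                # trailing '=' lost
except binascii.Error as e:
    print(e)                                          # Incorrect padding
EOF
```

The same decode runs inside Fernet's constructor, which is why Airflow surfaces it as Could not create Fernet object: Incorrect padding.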

The exports in entrypoint.sh weren't working - not sure why

gagejustins avatar Dec 28 '18 14:12 gagejustins

Any idea why this problem was occurring? I'm experiencing the exact same problem and have no clue why the exports aren't working.

SDubrulle avatar Jan 03 '19 07:01 SDubrulle

I don’t know either. I have it so that my Makefile exports the key directly in the shell every time I build a container, but then I can’t view the connections in the UI. Super weird

gagejustins avatar Jan 03 '19 16:01 gagejustins

@gagejustins what did you do to fix it? I am running into exactly the same problem.

darshanmehta10 avatar Jan 04 '19 15:01 darshanmehta10

  1. Look at script/entrypoint.sh - you’ll see that a Python command is being run and exported as an environment variable to create the Fernet key
  2. Copy the Python code and package it into an export statement with the env variable name FERNET_KEY
  3. In config/airflow.cfg, the fernet key is being defined as $FERNET_KEY, so it’s meant to pull from whatever you set the env variable as
  4. Whenever you run a container, run that export statement - I do it through Docker exec in my Makefile

Note: this is definitely not the optimal way to do it, but I haven’t been able to get it to work at all in any other way. I tried putting it at the end of the entrypoint script, I tried running it as a command in the Dockerfile, but to no avail

gagejustins avatar Jan 04 '19 16:01 gagejustins

I’m reopening the issue if both of y’all are having the same problem. My hacky solution wouldn’t work if you’re using multiple containers, and doesn’t let you use the connections tab in the UI, so we need to find a way to fix this

gagejustins avatar Jan 04 '19 16:01 gagejustins

@gagejustins - I'm working on this same issue right now as well. I will let you know what I come up with.

iter-io avatar Jan 05 '19 00:01 iter-io

Please do, so I don't need to be embarrassed when I show my personal Airflow setup to my coworkers 🥇

gagejustins avatar Jan 05 '19 00:01 gagejustins

Here are my logs related to this issue:

/entrypoint.sh: line 5: REDIS_HOST:=redis: command not found
/entrypoint.sh: line 6: REDIS_PORT:=6379: command not found
/entrypoint.sh: line 7: REDIS_PASSWORD:=: command not found
/entrypoint.sh: line 9: POSTGRES_HOST:=postgres: command not found
/entrypoint.sh: line 10: POSTGRES_PORT:=5432: command not found
/entrypoint.sh: line 11: POSTGRES_USER:=airflow: command not found
/entrypoint.sh: line 12: POSTGRES_PASSWORD:=airflow: command not found
/entrypoint.sh: line 13: POSTGRES_DB:=airflow: command not found 

The variable assignments are being executed as commands.

iter-io avatar Jan 05 '19 01:01 iter-io
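A likely cause, sketched below: in POSIX shells the ${VAR:=default} expansion only assigns as a side effect, so it has to be passed to some command, conventionally the no-op ':' builtin. If the ':' is lost or the line is mangled, the shell tries to run the expanded text as a command name, producing exactly these "command not found" errors. The variable name is taken from the log above:

```shell
# Working idiom: ':' is a no-op builtin; the expansion assigns the default.
unset REDIS_HOST
: "${REDIS_HOST:=redis}"
echo "$REDIS_HOST"   # prints: redis

# Broken variant (what the log suggests was executed):
#   REDIS_HOST:=redis
# The shell treats "REDIS_HOST:=redis" as a command name -> "command not found".
```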

Would triple quotes solve the problem?

gagejustins avatar Jan 05 '19 01:01 gagejustins

@gagejustins - I am no longer able to reproduce this error. Not sure exactly what was causing it before.

iter-io avatar Jan 07 '19 19:01 iter-io

Jealous! Also where are your logs stored? Can not for the life of me find them the way this image is set up

gagejustins avatar Jan 07 '19 19:01 gagejustins

I'm getting it too and can't quite figure it out.

PedramNavid avatar Jan 08 '19 03:01 PedramNavid

@puckel any idea what's up here? Happy to fix if you have any insights

gagejustins avatar Jan 08 '19 03:01 gagejustins

What is your value for FERNET_KEY?

rsivapr avatar Jan 08 '19 03:01 rsivapr

Ah, I think there's the problem... guess we need to set a key? The readme says By default docker-airflow generates the fernet_key at startup, but I don't see it being done.

airflow@c522d5b593e5:~$ grep -i fernet airflow.cfg
fernet_key = $FERNET_KEY
airflow@c522d5b593e5:~$ echo $FERNET_KEY

airflow@c522d5b593e5:~$

PedramNavid avatar Jan 08 '19 04:01 PedramNavid

@PedramNavid it's done in the script/entrypoint.sh file

gagejustins avatar Jan 08 '19 04:01 gagejustins

Same issue here: echo $FERNET_KEY inside the container gives nothing, which results in the Incorrect padding exception. I cannot add a connection either with environment variables passed in from docker-compose.yml (they're actually correctly set inside the running airflow container), or by running airflow connections -a --conn_id postgres_staging --conn_uri="postgresql://airflow_user:airflow_password@postgres_stage.airflow.local:5432/airflow_user". Does anyone have a solution yet?

eduard-sukharev avatar Jan 11 '19 14:01 eduard-sukharev

Thanks to @PedramNavid, setting the FERNET_KEY env variable works, but it is a workaround, IMO. Keep in mind that a Fernet key must be 32 url-safe base64-encoded bytes, so openssl rand -base64 32 should generate a safe, valid fernet key.

eduard-sukharev avatar Jan 14 '19 10:01 eduard-sukharev
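A quick sanity check for a generated key, assuming openssl and python3 are available on the host (a sketch, not part of the image): the encoded key should be 44 characters and decode to exactly 32 bytes.

```shell
# Generate a candidate key and verify its length and decoded size.
FERNET_KEY=$(openssl rand -base64 32)
echo "${#FERNET_KEY}"   # 44 characters, including the trailing '='
printf '%s' "$FERNET_KEY" |
  python3 -c "import sys, base64; print(len(base64.urlsafe_b64decode(sys.stdin.read())))"  # 32
```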

Same error message

happyshows avatar Jan 21 '19 21:01 happyshows

Also experiencing this issue 👍

FranciscoCanas avatar Jan 21 '19 21:01 FranciscoCanas

same problem here as well

venuktan avatar Jan 24 '19 03:01 venuktan

~I had the same error on a fresh airflow install from pip~

Update: I just had an old config file without the fernet key. Now it works fine.

Also, I followed those steps: https://airflow.apache.org/howto/secure-connections.html?highlight=fernet

oskar-j avatar Jan 27 '19 19:01 oskar-j

Running openssl rand -base64 32 helped me when updating from 1.9 to 1.10.1

stefpe avatar Jan 28 '19 15:01 stefpe

I had the same issue, it was due to the fact that my Fernet keys were not the same on all the Airflow containers (webserver, scheduler and workers), which are being passed into Airflow via the Docker Env FERNET_KEY. Once I confirmed that all the containers had the same Fernet key this problem was solved.

cepefernando avatar Mar 20 '19 15:03 cepefernando
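The fix above can be sketched as a docker-compose fragment; the service names and image tag here are illustrative, the point being that every Airflow service must read the same key (generate it once and export FERNET_KEY on the host before composing):

```yaml
services:
    webserver:
        image: puckel/docker-airflow
        environment:
            - FERNET_KEY=${FERNET_KEY}
    scheduler:
        image: puckel/docker-airflow
        environment:
            - FERNET_KEY=${FERNET_KEY}
    worker:
        image: puckel/docker-airflow
        environment:
            - FERNET_KEY=${FERNET_KEY}
```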

I have v1.10.1 and airflow initdb for PostgreSQL 11 failed too. I required:

python:

>>> from cryptography.fernet import Fernet
>>> fernet_key = Fernet.generate_key()
>>> print(fernet_key.decode())
somelongkeyval

then bash:

export FERNET_KEY='somelongkeyval'; airflow initdb;

rupert160 avatar Mar 29 '19 00:03 rupert160

In my case, the fernet_key was loaded as '{FERNET_KEY}' when I ran the 'airflow test ...' command (I added some logging in the 'airflow/models/__init__.py' file). It means that OS environment variables are not resolved correctly when 'airflow.cfg' is loaded. So I wrote my fernet_key into 'airflow.cfg' directly:

[core]
...
fernet_key = f0e......
...

And it worked!

socar-thomas avatar May 10 '19 10:05 socar-thomas

HI

In my case, this worked for me in bash:

FERNET_KEY=$(python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)")
export FERNET_KEY=$FERNET_KEY

HenrryVargas avatar Jun 18 '19 08:06 HenrryVargas

If you are looking for a consistent fernet key across executions to run on your host for development, you can use

: "${AIRFLOW__CORE__FERNET_KEY:=${FERNET_KEY:=$(python -c "import platform, hashlib, base64;print(base64.urlsafe_b64encode(hashlib.scrypt(platform.node().encode('utf8'),salt=platform.os.getenv('USER').encode('utf8'), n=2, r=8, p=1, dklen=32)))")}}"

on script/entrypoint.sh and it will create a consistent key.

jaul avatar Aug 12 '19 14:08 jaul

I was running into the same issue on Mac and found out it was my misunderstanding of how the shell works:

The way I interacted with the container was by docker exec -it {my-container-name} bash, which opens another process beside the original process that runs entrypoint.sh

Since it's a separate process, it doesn't have access to the environment variables exported by entrypoint.sh. Therefore if I do echo $AIRFLOW__CORE__FERNET_KEY it returns a null value, which is the reason for the Incorrect padding errors

Now if I do source /entrypoint.sh before running airflow connections, the connection will be successfully added, but that means the fernet_key used will be different from the one in airflow.cfg. Therefore I guess the best way is to manually generate a fernet_key and pass it as an environment variable when you do the docker run, e.g. --env FERNET_KEY={my_key}

Hope this helps

jonathanlxy avatar Sep 19 '19 15:09 jonathanlxy

Had the same issue, followed these steps to generate the Fernet key and replaced it in the airflow.cfg file. Worked out well!

anshajgoel avatar Sep 30 '19 10:09 anshajgoel

i was having no problems with this until i ran airflow resetdb... now it seems i cannot connect to the postgres instance any more and i get this error.

apurvis avatar Oct 22 '19 19:10 apurvis

I'm using a mounted volume in the Postgres container to save connections which I create on the first run. That way I think I'll be able to persist connections across container runs (I assume my understanding of how Airflow saves connections is correct). I spin up my containers (webserver and postgres) with the local executor, create a connection, then bring the containers down and up again. After that, I'm not able to edit the newly created connection. It gives me an invalid token error. Does anyone have a clue?

I've tried to set FERNET_KEY in the docker-compose file as suggested in the readme, but it didn't help.

javidy avatar Oct 23 '19 20:10 javidy

I found a workaround to my problem. The problem was that if I created a connection and restarted the webserver container, the connection would be lost. I had to create my connection over again each time I restarted the containers.

My goal was to create airflow connections once only using UI and make them persist across container restarts. Therefore:

  1. I created a custom volume on Postgres in docker-compose-LocalExecutor.yml to persist metadata
  2. Spinned up my containers first time docker-compose -f docker-compose-LocalExecutor.yml up -d
  3. Created my connection from UI
  4. Restarted containers

After restart, I could not use the new connection as the password could not be decrypted. There was an "invalid token" error. This is because each time you run the containers, entrypoint.sh generates a new fernet_key, and that key didn't match the key Airflow used to encrypt my password. To solve the issue, I had to comment out this line from entrypoint.sh and hardcode the fernet_key generated on the first run (Step 2):

#: "${AIRFLOW__CORE__FERNET_KEY:=${FERNET_KEY:=$(python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)")}}"
: "${AIRFLOW__CORE__FERNET_KEY:=${FERNET_KEY:="myfernetkey="}}"

I guess this could be achieved by hardcoding the fernet_key in the airflow.cfg file as well, but remember that Airflow will use the environment variable over the value in airflow.cfg. So make sure AIRFLOW__CORE__FERNET_KEY is not set if you want to achieve the same using airflow.cfg

javidy avatar Oct 24 '19 18:10 javidy

Hi @JavidY Could you please share steps for the below step

I created a custom volume on Postgres in docker-compose-LocalExecutor.yml to persist metadata

KarthikRajashekaran avatar Nov 04 '19 20:11 KarthikRajashekaran

Hi @KarthikRajashekaran ,

For creating the volume I've modified docker-compose-LocalExecutor.yml: I added a new volumes section under the postgres container, and declared the new volume at the very bottom of the .yml file. The name (source) of the new volume is "airflow_metadata"; you can pick whatever name you want. The destination for the volume is "/var/lib/postgresql/data", which is the default location for PostgreSQL database files. That is where all the metadata is written.

For postgres container in compose file now whole section looks like below.

postgres:
    image: postgres:9.6
    environment:
        - POSTGRES_USER=airflow
        - POSTGRES_PASSWORD=airflow
        - POSTGRES_DB=airflow
    volumes:
        - airflow_metadata:/var/lib/postgresql/data

And declaration of new volume at the end of file is like this:

volumes:
    airflow_metadata:

javidy avatar Nov 04 '19 21:11 javidy

Got an error ERROR: yaml.scanner.ScannerError: mapping values are not allowed here in "./docker-compose-LocalExecutor.yml", line 37, column 34

KarthikRajashekaran avatar Nov 04 '19 21:11 KarthikRajashekaran

@JavidY Please check below

[screenshot]

KarthikRajashekaran avatar Nov 04 '19 21:11 KarthikRajashekaran

I am having this issue on current Airflow 1.10.6. Will appreciate any hints.

smdelacruz avatar Nov 05 '19 06:11 smdelacruz

@KarthikRajashekaran this is what my .yml file looks like: [screenshot]

javidy avatar Nov 05 '19 08:11 javidy

@JavidY Thanks, I was able to spin it up. I followed steps 1-4 as you mentioned.

I added connections and variables in the UI, stopped the containers, and re-ran docker-compose... I lost those connections and variables again

KarthikRajashekaran avatar Nov 05 '19 11:11 KarthikRajashekaran

@KarthikRajashekaran did you rebuild the image after hardcoding fernet key into local entrypoint.sh file?

I can see that you're not using your local image but puckel/docker-airflow:latest. In this case, your webserver container is still using entrypoint.sh from the puckel/docker-airflow:latest image, so your changes to entrypoint.sh are not taken into account.

javidy avatar Nov 05 '19 15:11 javidy

I restarted using docker-compose, then manually added the connections in the UI:

docker-compose -f docker-compose-LocalExecutor.yml up -d

KarthikRajashekaran avatar Nov 05 '19 16:11 KarthikRajashekaran

That is ok. But before adding connections you need to change the entrypoint.sh file as I mentioned in my second comment in this thread. You need to generate a fernet key and hardcode it inside the shell file.

You need to carry out all the steps from my comment to achieve the desired result.

javidy avatar Nov 05 '19 16:11 javidy

@JavidY It worked well .thanks a lot 👍

krajashekaranvonage avatar Nov 05 '19 20:11 krajashekaranvonage

you're welcome @KarthikRajashekaran :)

javidy avatar Nov 06 '19 07:11 javidy

@JavidY I am trying to write to a file as below, but it failed.

Error:

 No such file or directory: '/usr/local/airflow/tmp/snowflake_roles.csv

Part of the code:

TMP_DIRECTORY = os.path.join(os.path.abspath("."), "tmp")
with open(ROLES_PATH, "w") as roles_file, open(
    ROLE_GRANTS_PATH, "w"
)

krajashekaranvonage avatar Nov 06 '19 19:11 krajashekaranvonage

Be sure to copy/paste all the output from python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)", including the trailing "="

Example:

python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)"
6cLsuD9kKqr70xN5PKlFgJuGahER3DKmWtyseR8dZIA=

On a Mac, if you double-click on the key, it will not select the trailing '=' 🗡

zakkg3 avatar Nov 26 '19 09:11 zakkg3

for me setting this in my docker-compose.yaml worked

  webserver:
      build: .
      restart: always
      depends_on:
          - postgres
      environment:
          - LOAD_EX=y
          - EXECUTOR=Local
          - AIRFLOW__CORE__FERNET_KEY="<generated-key>"

SolbiatiAlessandro avatar Dec 22 '19 18:12 SolbiatiAlessandro

I have the same issue while trying to add a new variable using the UI Admin tab -> Variables. The UI showed me the error message Could not create Fernet object: Incorrect padding

In my case, as @eduard-sukharev pointed out, a Fernet key must be 32 url-safe base64-encoded bytes, but apparently I created one with illegal characters :joy:

The solution is simple: I created a new one using openssl rand -base64 32, put it in our secret storage (we use Vault), and did the following:

airflow@cdcf48897012:~$ /entrypoint.sh airflow resetdb

The entrypoint.sh contains the routine of reading the secrets from Vault, so it has to go first.

zachliu avatar Dec 31 '19 20:12 zachliu

To update for folks who may find this from a Google search: this was happening for me because my key was not valid base64 (no = at the end). Changing AIRFLOW__CORE__FERNET_KEY so it had a proper value allowed me to resetdb.

jeremy-page avatar Jan 07 '20 14:01 jeremy-page

I will add this as another alternative if people want to still pull the image and use the SequentialExecutor for quick demos.

docker pull puckel/docker-airflow
YOUR_FERNET_KEY=$(openssl rand -base64 32)
docker run -d -e AIRFLOW__CORE__FERNET_KEY=$YOUR_FERNET_KEY -p 8080:8080 puckel/docker-airflow webserver

MeeshCompBio avatar Oct 01 '20 02:10 MeeshCompBio

@javidy and @zachliu I followed the steps you stated and was able to run /entrypoint.sh airflow db reset, but when I click on Connections I still get the error.

Please I need help resolving this.

gabidoye avatar Mar 15 '22 05:03 gabidoye

Can you send me the error logs


venuktan avatar Mar 15 '22 05:03 venuktan

@venuktan here is the log

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.9/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.9/site-packages/flask_appbuilder/security/decorators.py", line 148, in wraps
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/flask_appbuilder/views.py", line 554, in list
    widgets = self._list()
  File "/usr/local/lib/python3.9/site-packages/flask_appbuilder/baseviews.py", line 1129, in _list
    widgets = self.get_list_widget(
  File "/usr/local/lib/python3.9/site-packages/flask_appbuilder/baseviews.py", line 1028, in get_list_widget
    count, lst = self.datamodel.query(
  File "/usr/local/lib/python3.9/site-packages/flask_appbuilder/models/sqla/interface.py", line 471, in query
    query_results = query.all()
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/query.py", line 3341, in all
    return list(self)
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/loading.py", line 101, in instances
    cursor.close()
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__
    compat.raise_(
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
    raise exception
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/loading.py", line 81, in instances
    rows = [proc(row) for row in fetch]
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/loading.py", line 81, in <listcomp>
    rows = [proc(row) for row in fetch]
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/loading.py", line 602, in _instance
    state.manager.dispatch.load(state, context)
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/event/attr.py", line 322, in __call__
    fn(*args, **kw)
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/mapper.py", line 3378, in _event_on_load
    instrumenting_mapper._reconstructor(state.obj())
  File "/usr/local/lib/python3.9/site-packages/airflow/models/connection.py", line 153, in on_db_load
    if self.password:
  File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/attributes.py", line 358, in __get__
    retval = self.descriptor.__get__(instance, owner)
  File "/usr/local/lib/python3.9/site-packages/airflow/models/connection.py", line 238, in get_password
    return fernet.decrypt(bytes(self._password, 'utf-8')).decode()
  File "/usr/local/lib/python3.9/site-packages/cryptography/fernet.py", line 195, in decrypt
    raise InvalidToken
cryptography.fernet.InvalidToken

gabidoye avatar Mar 15 '22 13:03 gabidoye