Possibility to set global variables
It would simplify things a lot if we could set up variables that are global to the whole script.
Strongly for that. It would follow the DRY principle and simplify a lot of automations!
I agree!
In our use case, `mask` would replace individual scripts of the type shown below, where we need to import global variables from an `.env` file (especially where a service is deployed multiple times). (As noted below, optional variables are also something we could put to good use.)
~~~bash
#!/bin/bash
source .env
DC_CMD="docker-compose -f docker-compose.yml -f docker-compose.override.yml"

case $1 in
  up) ## start docker-compose stack or services
    $DC_CMD up -d $2
    ;;
  down) ## stop docker-compose stack or services
    $DC_CMD down $2
    ;;
  # ...
  backup)
    mkdir -p "${DATA_DIR}/backup"
    BASENAME=backup-$(date +%s)
    docker exec "${SERVICE}_database" backup.sh > "${DATA_DIR}/backup/${BASENAME}.db"
    ;;
esac
~~~
As a workaround, I tried this and a few variants of it (imagine an `.env` with `VAR=abc`):
## echo
> echoes a variable
~~~bash
$MASK init
# also tried: source <($MASK init)
echo $VAR
~~~
## init
~~~bash
source .env
# also tried "export" here
~~~
The issue is, of course, that each `bash` block is executed as its own shell, so the environment is lost once it exits. Not knowing much Rust and looking at the docs, I can't see an obvious way to circumvent this while staying completely shell agnostic. It would be feasible to wrap the entire command source in the shell chosen in the "special command" `init` (or similar), assuming it is appendable. But I can't think of an elegant way that stays as nicely simple as it is now.
(Note that I assume here that I want more capabilities than just adding environment variables. That alone could be done with `process::Command::envs`, but then I could just `source .env` at the beginning of every bash block. In my case, `$DC_CMD` is something I don't want to put into `.env`.)
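The subprocess behaviour can be reproduced outside of mask; each `sh -c` below stands in for one fenced block:

```shell
# Every block runs in its own child shell, so exports made in one block
# do not survive into the next one.
sh -c 'export VAR=abc'               # "init" block: sets VAR, then exits
sh -c 'echo "VAR=${VAR:-unset}"'     # next block: the variable is gone
```

The second command prints `VAR=unset`, which is exactly why sourcing `.env` in an `init` task has no effect on later blocks.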
I've got around this by creating a helper task:
## var (v)
> Some global vars that can be used by other tasks
You can override these by setting the respective env var before running `mask`
~~~sh
THING=${THING:-"world!"}
echo "${!v}"
~~~
## hello-shell
**OPTIONS**
* thing
* flags: -t --thing
* type: string
* desc: Thing
~~~sh
T=${thing:-"$($MASK var THING)"}
echo "Hello, $T"
~~~
## hello-py
**OPTIONS**
* thing
* flags: -t --thing
* type: string
* desc: Thing
~~~python
import os
mask = os.getenv("MASK")
thing = os.getenv("thing", os.popen('{} var THING'.format(mask)).read().strip())  # strip the trailing newline from popen
print("Hello, {}".format(thing))
~~~
The results, using defaults:
~~~
$ mask hello-shell
Hello, world!
$ mask hello-py
Hello, world!
~~~
Override the default using an option:
~~~
$ mask hello-shell -t "You ;)"
Hello, You ;)
$ mask hello-py -t "You ;)"
Hello, You ;)
~~~
Override using an environment variable:
~~~
$ THING="Planet Earth" mask hello-shell
Hello, Planet Earth
$ THING="Planet Earth" mask hello-py
Hello, Planet Earth
~~~
I am not overly keen on this approach, but it solves my problem. I would love to see native support for global vars/envs.
A new workaround which works better for me (note that I specifically don't need to override from an external environment variable):

`manage.sh` (which was the old name of my scripts anyway):

~~~bash
#!/bin/bash
set -a
source .env
DC_CMD="docker-compose -f docker-compose.yml -f docker-compose.override.yml"
mask "$@"   # "$@" preserves quoting of arguments, unlike $*
~~~
`Maskfile.md`:
## up
> Run the services
~~~bash
$DC_CMD up
~~~
## echovar
~~~bash
echo $VAR
# VAR is then from .env
~~~
I then run `./manage.sh echovar` instead of `mask echovar`. You can of course keep a local `mask` script that you call with `./mask`, or make an alias to a custom script that checks whether there is a local `./mask`.
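That "check for a local `./mask`" idea could be a small dispatcher function, roughly like this (a sketch; `run_mask` is a name I made up, not part of mask):

```shell
# Hypothetical dispatcher: use the project-local ./mask wrapper when one
# exists and is executable, otherwise fall back to mask on the PATH.
run_mask() {
  if [ -x ./mask ]; then
    ./mask "$@"
  else
    command mask "$@"
  fi
}
```

Sourced from your shell rc (or aliased), `run_mask up` would then transparently pick up the per-project wrapper when one is present.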
So I currently work around this by sourcing an env file whenever I need it:
## build
~~~bash
source "$MASKFILE_DIR/.env"
echo $WHATEVER_VAR
~~~
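For shell blocks, combining this with the shell's allexport option makes every assignment in the sourced file visible to child processes as well. A sketch, assuming an `.env` of plain `KEY=value` lines:

```shell
# Work in a scratch directory with an example .env file.
cd "$(mktemp -d)"
printf 'WHATEVER_VAR=hello\n' > .env

set -a      # allexport: every assignment from here on is exported
. ./.env
set +a

# Child processes (docker-compose, etc.) now see the variable too:
sh -c 'echo "$WHATEVER_VAR"'    # prints: hello
```

Without `set -a`, a sourced plain assignment is only visible in the current block, not in programs it launches.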
But I do see a need to have this happen automatically, especially for other runtimes where sourcing an env file is not as easy.
I see three ways this could be implemented.
1. Adding a new mask argument such as `--env /path/to/env`, which mask would inject into the shell environment before running the command. If the path given to `--env` is relative, it's treated as relative to `$MASKFILE_DIR`, not relative to whatever directory you might be in. This means it's still important that whenever you call mask inside your maskfile.md to run another command, you use `$MASK <...>` instead, which inherits the `--maskfile` arg as well as this new `--env` arg to ensure it calls mask with the same context (the documentation describes this, which is mainly useful for a global maskfile that you set up an alias for).
2. Parsing the maskfile.md for something like `ENV_PRELOAD: "$MASKFILE_DIR/whatever.env"`, which would tell mask to load that env file into the shell environment before executing any command. (`ENV_PRELOAD` is not an ideal name... probably something better, but I haven't thought it through fully.)
3. Mask automatically looks for a `./maskfile.env` file in the same directory, and if it exists, injects it into the shell environment before executing any command.

I think #3 is the simplest solution and probably results in the best experience: no need to specify mask args, meaning you can call `mask` as normal (no need for a shortcut alias like `alias m="mask --env path/to/env"`).
#2 and #3 would be nice. A lot of software assumes `.env` for environment files, and it would be a missed opportunity to have to keep two files in sync.
I just thought of a 4th option which I believe is better than options 2 & 3 and more flexible.

Mask could look for a top-level code block that uses the `env` language code. If it finds one, it interprets it as a shell script and injects any variables into the environment before executing the command you're running.
# Maskfile Title
~~~env
# You could either source a file that includes env vars...
source "$MASKFILE_DIR/.env"
# And/or write out your vars here...
MY_VAR=value
ANOTHER_VAR=value
~~~
## some_command
~~~bash
# No need to source anything
echo $MY_VAR
~~~
This is much more flexible and explicit than the other options and allows you to both specify env vars and source any file if necessary.
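One way such a block could be applied (a rough sketch of the idea only, not how mask is actually implemented): run the `env` block through a shell with allexport enabled, capture the resulting environment, and hand those pairs to the command's subprocess.

```shell
# Stand-in for the top-level ~~~env block from the proposal above.
env_block='MY_VAR=value
ANOTHER_VAR=value'

# Run the block with allexport on and dump the environment it produced.
captured=$(sh -c "set -a; $env_block; env")

# mask would pass these pairs to the command's process; here we just
# show that they were captured:
echo "$captured" | grep '^MY_VAR='    # prints: MY_VAR=value
```

Because the block is evaluated by a real shell, it can both assign plain variables and `source` other files, which is what makes this option more flexible than a static env file.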