mike
Does Mike work with GitLab Pages?
I'm not actually sure. It probably does? (I've never used Gitlab, so I don't know much about how its pages work.)
At worst, you could manually generate the docs and put them somewhere, and hopefully Gitlab would be happy with that. You might also be able to do it as part of Gitlab's CI to auto-generate docs whenever you push some commits.
I tried to do that but it didn't work 👎
I currently have a gh-pages branch and the CI has a mike installation instruction (pip install mike), but now I'm having problems with CORS

Hmm. Unfortunately, Gitlab isn't my area of expertise, so I'm not sure I can be much help. If anyone else out there knows how to get this working on Gitlab though, I'd be happy to add some documentation explaining the details.
I'm having the same issue trying to deploy my versioned docs with mkdocs-material and mike via GitLab Pages. mike creates a gh-pages branch which contains all my deployed versions and aliases and also the versions.json, but I'm unable to get Pages to deploy the versioned docs. If someone solves this please let me know! Documentation on this would be great!
I'm also trying to use mike with GitLab pages. If I do get it figured out I'd be happy to contribute some docs.
We found a solution that works well for us:
When using mike deploy, the generated version of the site is pushed to the gh-pages branch by default. GitLab expects a 'public' folder for its Pages feature. To serve your docs with GitLab Pages you simply have to move the contents of the gh-pages branch into the public folder. Add a pipeline to the gh-pages branch via a .gitlab-ci.yml file that looks like this:
image: alpine:3.13
pages:
  stage: deploy
  only:
    - gh-pages
  script:
    - mkdir .public
    - cp -r * .public
    - mv .public public
    - ls
  artifacts:
    paths:
      - public
Edit: We now use alpine, since there is no need for python to move the files.
The gh-pages branch looks like this, where each folder contains one version of the docs:
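Roughly, the contents look like the following; the version folders and the latest alias are placeholders, and versions.json (plus an index.html redirect once a default version is set) sits at the root next to the CI file:

.gitlab-ci.yml
1.0/
1.1/
latest/
versions.json
index.html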

Hope this helps.
Hello @hpsem
I'm struggling with adding the .gitlab-ci.yml file to the gh-pages branch. I added that file directly from the web IDE that gitlab.com provides, but I'm getting an error when I try to deploy some changes to the docs from another branch (say some_docs, for instance).
The CI/CD pipeline that runs the mike deploy step raises an error that says:
error: failed to push branch gh-pages to http://cissf:[MASKED]@gitlab.com/group/repo.git: "warning: redirecting to https://gitlab.com/group/repo.git/
To http://gitlab.com/group/repo.git
! [rejected] gh-pages -> gh-pages (non-fast-forward)
error: failed to push some refs to 'http://cissf:[MASKED]@gitlab.com/group/repo.git'
hint: Updates were rejected because a pushed branch tip is behind its remote
hint: counterpart. Check out this branch and integrate the remote changes
hint: (e.g. 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details."
The deploy command that is executed is this:
mike deploy -r http://cissf:[MASKED]@gitlab.com/group/repo.git -b gh-pages --rebase -p 1.23
I used the --rebase option in an attempt to rebase any changes that the remote branch may have, but that didn't work; I get the same result.
So, after all this, what's the best way to add the .gitlab-ci.yml file to the gh-pages branch? Do I need to add it from the WebIDE that gitlab.com provides or is there another way to do it?
Thanks for any advice, cheers!
To add the .gitlab-ci.yml you can treat the gh-pages branch like any other branch. Use git checkout gh-pages, add the file, commit and push. Then switch back to the branch where you keep your docs folder and deploy via mike. Now the pipeline should be triggered by the commit that mike creates when deploying.
Keep in mind that when you use the webIDE to add a file, you need to pull the changes to your local repo before deploying.
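In other words, something like this (a sketch; the branch name main, the commit message and the version number 1.0 are just placeholders for your own setup):

git fetch origin gh-pages
git checkout gh-pages
# create .gitlab-ci.yml with the contents shown above, then:
git add .gitlab-ci.yml
git commit -m "Add GitLab Pages pipeline"
git push origin gh-pages
# switch back to the branch that holds your docs sources and deploy as usual
git checkout main
mike deploy --push 1.0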
Edit: I updated my previous post to make it clearer how this works, hope this helps.
Thanks for the reply, I managed to get it to work. The mistake I was making was that I did not check out and pull the changes to the gh-pages branch in my pipeline before the mike deploy command.
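For anyone hitting the same non-fast-forward error, a sketch of the step that was missing (it mirrors the fetch/checkout pattern from the CI examples later in this thread; the remote/token flags are omitted here):

git fetch origin gh-pages
git checkout -b gh-pages origin/gh-pages   # bring the remote gh-pages state into the runner
git checkout $CI_COMMIT_SHA                # go back to the commit that holds the docs sources
mike deploy -b gh-pages --rebase -p 1.23   # now the push is a fast-forward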
My solution uses semver. I create a branch for each minor version, like 0.12.x.
name: Publish docs via GitHub Pages
on:
  push:
    branches:
      - '*.*.x'
jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - uses: actions/setup-python@v2
        with:
          python-version: "3.9"
      - name: Cache pip
        uses: actions/cache@v2
        with:
          # This path is specific to Ubuntu
          path: ~/.cache/pip
          # Look to see if there is a cache hit for the corresponding requirements file
          key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
          restore-keys: |
            ${{ runner.os }}-pip-
            ${{ runner.os }}-
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Set user
        run: |
          git config --local user.email "41898282+github-actions[bot]@users.noreply.github.com"
          git config --local user.name "github-actions[bot]"
      - name: Extract branch name
        shell: bash
        run: echo "##[set-output name=branch;]$(echo ${GITHUB_REF#refs/heads/})"
        id: extract_branch
      - name: Mike
        run: |
          mike deploy --push ${{ steps.extract_branch.outputs.branch }}
And I manually bump the minor version to latest :) (see the example after the config snippet below)
extra:
  version:
    provider: mike
    default: latest
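The bump itself can be done with mike, for example (a sketch; 0.12.x is just a placeholder matching the branch naming above):

mike deploy --push --update-aliases 0.12.x latest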
I wanted to get close to a native GitLab Pages experience out of the box: CI-only (no local deploys), no manual steps for initially creating a pages branch, and no custom .gitlab-ci.yml in a separate branch, so that developers can use normal git workflows.
This example deploys on tags only and sets each tag as latest. The pages branch is only used by CI as storage for persisting public/ artifacts, which are then checked out in the job itself so they can be consumed by the artifacts: keyword.
Prerequisites:
- the image used for the pages job should have git & mike etc. installed, so it can push automatically (you could also add this to before_script; see the sketch right after this list)
- create a project access token (or use another PAT), with e.g. CI variables PROJECT_BOT_USER (its username) and PROJECT_BOT_TOKEN (the token)
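For the first point, the before_script additions could be as simple as this (a sketch, assuming a Python-based image that already ships git; the package names are examples, swap in whatever your docs site needs):

# install mike plus your mkdocs theme/plugins
pip install mike mkdocs mkdocs-material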
pages:
  stage: deploy
  variables:
    PAGES_BRANCH: pages
    HTTPS_REMOTE: https://${PROJECT_BOT_USER}:${PROJECT_BOT_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git
  before_script:
    - git config user.name $PROJECT_BOT_USER
    - git config user.email [email protected]
    - git fetch origin $PAGES_BRANCH && git checkout -b $PAGES_BRANCH origin/$PAGES_BRANCH || echo "Pages branch not deployed yet."
    - git checkout $CI_COMMIT_SHA
  script:
    - mike deploy --rebase --prefix public -r $HTTPS_REMOTE -p -b $PAGES_BRANCH $CI_COMMIT_TAG
    - mike set-default --rebase --prefix public -r $HTTPS_REMOTE -p -b $PAGES_BRANCH $CI_COMMIT_TAG
    - git checkout $PAGES_BRANCH -- public/
  artifacts:
    paths:
      - public/
  only:
    - tags
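Once this is set up, publishing a new docs version is just a matter of tagging and pushing the tag (the tag name is only an example):

git tag 1.2.3
git push origin 1.2.3   # triggers the pages job, which deploys this tag and sets it as the default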
Room for improvement:
- I'm sure some of the script steps could be cleaned up a bit, let me know
- squashing so there is always only a single commit would make the GitLab graph look much nicer; otherwise this will always show an orphaned pages branch with its own history
- if GitLab implements pushing via CI_JOB_TOKEN, no token setup would be required (upstream: https://gitlab.com/gitlab-org/gitlab/-/issues/223679)
- alternatively, if CI_JOB_TOKEN could fetch previous job artifacts, pushing to branches could be eliminated entirely (and just always take what's already deployed to Pages in public/, perhaps)
If anyone tries this I'd be interested to hear whether it works out for you. The snippet above should just work if you also add git, mike and your mkdocs dependencies and then start tagging your docs (I'm sure it can be extended to allow both tag and master deployments, I just haven't tried).
For anyone still looking into this, may I also propose an alternative to nejch's solution; it's mostly similar, however:
pages:
  stage: deploy
  image: python:latest
  variables:
    PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
    PAGES_BRANCH: gl-pages
    HTTPS_REMOTE: https://gitlab-ci-token:${ACCESS_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git
  before_script:
    - pip install mkdocs-material mike
    - git config user.name $GITLAB_USER_NAME
    - git config user.email $GITLAB_USER_EMAIL
    - git fetch origin $PAGES_BRANCH && git checkout -b $PAGES_BRANCH origin/$PAGES_BRANCH || git checkout $PAGES_BRANCH || echo "Pages branch not deployed yet."
    - git checkout $CI_COMMIT_SHA
  script:
    - mike deploy --rebase --prefix public -r $HTTPS_REMOTE -p -b $PAGES_BRANCH -u $CI_COMMIT_TAG latest
    - mike set-default --rebase --prefix public -r $HTTPS_REMOTE -p -b $PAGES_BRANCH latest
    - git checkout $PAGES_BRANCH -- public/
  artifacts:
    paths:
      - public/
  only:
    - tags
Don't forget to create an access token and save it as a CI/CD variable in settings.
Solution mostly found here; it needed some minor changes to make it work, however.
I guess that just basically adds the dependencies, which as described above will really depend on your pip/Docker image/Poetry setup etc. :) gitlab-ci-token usually implies the CI job token, so I'm not sure that's a safe assumption for the username going forward, but it might be cleaner for now.