Open-Assistant
swagger docs for api's?
Do we have anything in place for building swagger or openapi type docs automatically and having them hosted somewhere?
maybe some github action that can just dump the openapi.json somewhere in the repo?
maybe automating something like this: https://fastapi.tiangolo.com/advanced/extending-openapi/#self-hosting-javascript-and-css-for-docs
just thinking in terms of developer docs where someone can just go and read the api docs without having to go near the app frontend.
Do we have anything in place for building swagger or openapi type docs automatically and having them hosted somewhere?
FastAPI automatically provides Swagger documentation for the backend API. Do you mean to enhance it? Also I think they are now hosting the docs on GitHub Pages. @andrewm4894
I kind of mean: should we not just dump the swagger yaml or json, like this: https://github.com/netdata/netdata/blob/master/web/api/netdata-swagger.yaml
such that any dev can then just go somewhere like this: https://editor.swagger.io/?url=https://raw.githubusercontent.com/netdata/netdata/master/web/api/netdata-swagger.yaml
to see the api and not have to go near some app url like http://web.app.open-assistant.io/docs since that's really the end user facing app. just in case it ends up being too clunky to have to actually go into the app frontend to read up about the backend api's.
i must actually go into the FE and see if i can get to the autogenerated fastapi docs as i'm not even sure if the FE pulls them or how that would work with next.js etc
yep - having an API section next to the Docs section in here is kind of what i mean too.
https://projects.laion.ai/Open-Assistant/
So if we could generate the static files needed and dump them in /doc/docs/api/ and then we could have docusaurus pick them up i'd imagine.
made a backend-dev devcontainer here and manually generated some swagger type docs as part of that https://github.com/LAION-AI/Open-Assistant/pull/608
adding good first issue label as could be a nice one for someone to try: add a GH action to publish the openapi.json file from the backend into the repo in the docs/docs/api/ folder.
It would be on merge with main?
yep this is a good sort of starter example https://github.com/LAION-AI/Open-Assistant/blob/main/.github/workflows/test-api-contract.yaml
would just need to change the steps, and maybe the last step would be to make a PR to merge the updated openapi.json back into the docs/docs/api folder - that's the only slightly complex step. maybe something like this https://github.com/marketplace/actions/add-commit
am sure there are probably already examples of use cases very similar to what we need out there - use repo code to make some artifact, then either auto-commit that artifact back to the repo, or automate a PR so there's still a human in the loop.
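A rough sketch of what that workflow could look like (the export step, branch names, and commit message are placeholders, not a working pipeline - it just uses plain git commands rather than a marketplace action):

```yaml
# Hypothetical workflow: regenerate openapi.json on merge to main and
# commit it back only if it actually changed.
name: publish-openapi
on:
  push:
    branches: [main]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # placeholder: however the schema actually gets regenerated
      - run: python backend/export_openapi.py docs/docs/api/openapi.json
      - run: |
          if ! git diff --quiet docs/docs/api/openapi.json; then
            git config user.name "github-actions"
            git config user.email "github-actions@github.com"
            git commit -am "Update openapi.json"
            git push
          fi
```

The `git diff --quiet` guard means merges that don't touch the API schema produce no extra commit.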
If the json file will go into source control then it might be better to:
- Have a function that generates the output to a given path.
- Have a test that writes it to a tmpdir and diffs with what's in the docs dir, failing on any differences.
- Add a pre-commit hook that fires when there's changes to the pydantic code, and overwrites what's in the docs dir.
Benefits:
- We don't need github to run/test the process. Catches errors earlier.
- Keeps complexity/stovepiping out of build pipelines. Easier to maintain.
- If it can't be generated for whatever reason, we're not left in some weird state with a broken pipeline or out-of-date docs
- Atomic so no chance of race conditions in the process
- Only need one PR/merge commit rather than 2
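A minimal sketch of the generate-and-diff idea above (function names and the docs path are assumptions, not the real repo layout; the schema dict is passed in so the sketch stays self-contained):

```python
import json
import tempfile
from pathlib import Path


def generate_openapi(schema: dict, path: Path) -> None:
    """Write the schema deterministically (sorted keys) so diffs are meaningful."""
    path.write_text(json.dumps(schema, indent=2, sort_keys=True))


def openapi_in_sync(schema: dict, committed: Path) -> bool:
    """Regenerate into a tmpdir and diff against the committed copy."""
    with tempfile.TemporaryDirectory() as tmp:
        fresh = Path(tmp) / "openapi.json"
        generate_openapi(schema, fresh)
        return committed.exists() and committed.read_text() == fresh.read_text()
```

A test could then assert `openapi_in_sync(app.openapi(), Path("docs/docs/api/openapi.json"))`, and the pre-commit hook could call `generate_openapi` on the same path to overwrite it.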
@bitplane im just unsure if it's possible to write such a function from just the code alone. my manual steps were to:
- spin up the backend-dev devcontainer.
- run the run_script.sh script to get the dev backend up and running.
- finally a wget command to pull localhost:8080/api/v1/openapi.json into docs/docs/api
i reckon i could get a GH workflow to do this but not sure if there is some easier or better way to just extract an openapi.json from the code itself. The fastapi docs looked a bit scary and complex and i'm unsure how relevant they are to our codebase (probably just that i'm not super familiar with all the moving parts tbh).
@andrewm4894
I've actually been doing fastapi tests today in the day job :)
This ought to do it, just need to pass the app in:
from fastapi.testclient import TestClient

def get_openapi_json(app):
    # spin up an in-process client and fetch the generated schema
    client = TestClient(app)
    return client.get("/openapi.json").text
@bitplane any idea how i would use that? like some sort of export_openapi.py file that lives next to main.py or ? Was going to have a go at it but unsure of the high level approach i should try to take?
@andrewm yeah something like that. It'd need to be in the API project and import the file with app in it.
If it were me I'd have 4 functions in a file:
- The thing above that returns a string.
- Function to save some data to a given file path.
- Parse some arguments using argparse or whatever is cool nowadays (with args=sys.argv as a parameter and a default location set).
- A def main(args=sys.argv) that calls each of them.
Then have a #!/usr/bin/env python3 at the top and an if __name__ == "__main__": block that runs main(sys.argv). chmod +x the file so you can run it directly.
I'm pretty anal about modularity; functions have tunnel vision, unit tests need no thinking, things like step 2 can be re-used and extended easily (if in future it saves to any URI then we get that here for free). Might be going too far though. Just my 2c anyway, the important thing is to just have a python file that dumps the file to disk.
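The four functions described above could be sketched like this (all names hypothetical; the demo class is a stand-in for the real backend app, which the script would import instead, e.g. `from main import app`):

```python
#!/usr/bin/env python3
"""Sketch of an export_openapi.py that dumps the OpenAPI schema to disk."""
import argparse
import json
import os
import sys


# Toy stand-in for the backend's FastAPI app; anything with .openapi() works.
class _DemoApp:
    def openapi(self):
        return {"openapi": "3.0.2", "info": {"title": "demo"}, "paths": {}}


app = _DemoApp()


def get_openapi_json(app) -> str:
    # A real FastAPI app exposes .openapi(), returning the schema as a dict.
    return json.dumps(app.openapi(), indent=2, sort_keys=True)


def save(data: str, path: str) -> None:
    # Create parent dirs so the default docs path works from a fresh checkout.
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    with open(path, "w") as f:
        f.write(data)


def parse_args(argv):
    parser = argparse.ArgumentParser(description="Export openapi.json")
    parser.add_argument("path", nargs="?", default="docs/docs/api/openapi.json")
    return parser.parse_args(argv)


def main(argv=None) -> None:
    args = parse_args(sys.argv[1:] if argv is None else argv)
    save(get_openapi_json(app), args.path)


if __name__ == "__main__":
    main()
```

Each piece is independently testable, and swapping the demo app for the real one is a one-line change.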
turns out we already actually had the openapi.json file as part of the api contract testing workflow, so just added a step to save it to the right docs folder and a step to autocommit if the file has changed after the tests have run successfully: https://github.com/LAION-AI/Open-Assistant/pull/719