EvalAI-Starters
How to create a challenge on EvalAI?
These scripts add the ability to create challenges in a locally hosted EvalAI environment, which helps with testing and validating challenges locally. The associated documentation has also been added to use...
In `utils.py`, the calls to fetch the repo are chained in two instances:

```
def add_pull_request_comment(github_auth_token, repo_name, pr_number, comment_body):
    ...
    client = Github(github_auth_token)
    repo = client.get_user().get_repo(repo_name)
```

This does...
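For context, a minimal sketch of the chained lookup next to a direct fetch with PyGithub; the direct call assumes `repo_name` arrives in `"owner/name"` form, which is an assumption on my part:

```
from github import Github  # PyGithub

def add_pull_request_comment(github_auth_token, repo_name, pr_number, comment_body):
    client = Github(github_auth_token)

    # Chained form used in utils.py: looks the repository up under the
    # authenticated user's own namespace, so it misses org-owned repos.
    repo = client.get_user().get_repo(repo_name)

    # Direct alternative (assumption: repo_name is "owner/name"):
    # repo = client.get_repo(repo_name)

    pull_request = repo.get_pull(pr_number)
    pull_request.create_issue_comment(comment_body)
```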
I'm trying to run a simple challenge evaluation server using the "create challenge using GitHub" method. I get the following error once I push to the challenge branch, with changes made to the...
To resolve #65, based on https://github.com/actions/setup-python/issues/555#issuecomment-1337036543
Hi, I'm having difficulty retrieving a Docker image that I submitted. I'm following the remote challenge evaluation script, and I am able to retrieve the submission from the queue, which...
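In case it helps others stuck at the same step: once the submission message is off the queue, the image itself still has to be pulled through the Docker daemon. A minimal sketch, assuming the message body is JSON and carries the image reference under a `submitted_image_uri` key (that key name is my assumption, not confirmed from the script):

```
import json

import docker  # Docker SDK for Python (docker-py)

def pull_submitted_image(message_body):
    """Pull the Docker image referenced by a queued submission message."""
    submission = json.loads(message_body)

    # Assumption: the queue message carries the image reference here.
    image_uri = submission["submitted_image_uri"]

    client = docker.from_env()
    # images.pull accepts a full "repository:tag" reference.
    return client.images.pull(image_uri)
```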
This PR adds logger usage examples to the evaluation script and some suggestions to the README.
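As an illustration of what such a logger example might look like (a sketch, not necessarily the exact code in the PR), the standard library logger replaces bare print calls so messages in the worker logs carry timestamps and levels; the `evaluate` signature follows the starter template:

```
import logging

logger = logging.getLogger("evaluation")
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

def evaluate(test_annotation_file, user_submission_file, phase_codename, **kwargs):
    logger.info("Starting evaluation for phase %s", phase_codename)
    # ... compare user_submission_file against test_annotation_file ...
    result = {"result": [{"train_split": {"Metric1": 0.0}}]}  # placeholder metrics
    logger.info("Finished evaluation for phase %s", phase_codename)
    return result
```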
Bumps [numpy](https://github.com/numpy/numpy) from 1.19.4 to 1.22.0.

Release notes (sourced from numpy's releases): NumPy 1.22.0 is a big release featuring the work of 153 contributors spread...
I am trying to create a new challenge. Where should I put the test annotation file? Any help would be highly appreciated.
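For what it's worth, in the starter template the annotation file can live anywhere in the repository; each phase in `challenge_config.yaml` points at it through a `test_annotation_file` path relative to the repository root (the template keeps one under `annotations/`), and EvalAI passes that path to the evaluation script. A quick sketch for sanity-checking the configured paths before pushing, assuming PyYAML and the template's `challenge_phases` layout:

```
import os

import yaml  # PyYAML

# Assumption: phases are listed under "challenge_phases", each with a
# "test_annotation_file" path relative to the repository root.
with open("challenge_config.yaml") as f:
    config = yaml.safe_load(f)

for phase in config.get("challenge_phases", []):
    annotation_path = phase.get("test_annotation_file")
    if annotation_path and not os.path.exists(annotation_path):
        print(f"Missing annotation file: {annotation_path}")
```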
Context: Bi-directional GitHub sync

Deliverables:
- [x] Avoid unnecessary EvalAI calls for the GitHub sync (one possible approach is sketched below).
- [x] Add GitHub token

@KhalidRmb cc: @Ram81
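One way to cut down on redundant sync calls (a sketch of the general idea only; the file name and helpers are hypothetical, not the code from this PR) is to record a digest of the challenge configuration and skip the EvalAI request when it has not changed:

```
import hashlib
import os

DIGEST_FILE = ".last_synced_sha256"  # hypothetical marker file

def config_digest(path="challenge_config.yaml"):
    """SHA-256 of the challenge config, used as a change marker."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def should_sync():
    """True only if the config changed since the last recorded sync."""
    digest = config_digest()
    if os.path.exists(DIGEST_FILE):
        with open(DIGEST_FILE) as f:
            if f.read().strip() == digest:
                return False  # unchanged; skip the EvalAI call
    with open(DIGEST_FILE, "w") as f:
        f.write(digest)
    return True
```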