Quickstart on Saturn Cloud
The goal of this guide is to get you started developing submissions for the competition as quickly as possible. This section will detail the necessary steps to create your first submission.
1. Register on EvalAI
EvalAI is an open-source platform for hosting machine learning competitions. We’re using this tool to evaluate submissions and host the leaderboard. To register for the competition, you must:
- Go to https://eval.ai/ and log in or create an account.
- Navigate to the competition’s challenge page at https://eval.ai/web/challenges/challenge-page/1111 and sign up to participate.
- Make a note of your EvalAI authentication token (available from your profile page); you will need it to configure the CLI in step 3.
2. Register on Saturn Cloud
- Go to https://saturncloud.io/ and log in or create a free account
- Under “Create a Resource” at the top of the page, click on the NeurIPS Openbio template card and click through the pop-up modals to create a new Saturn Cloud project using this template.
- Click on the ▶ button to start a Jupyter server
- Once the server is started, click the “Jupyter Lab” button to access the server
- From the Saturn Cloud dashboard, you should see a blue circle in the lower right corner with a white chat icon inside. Click this button and send a message saying “Can you please upgrade my account for the Open Problems NeurIPS competition?”. Someone at Saturn Cloud should respond shortly, and you should see 100 hours appear under the “Hours remaining” box in the left sidebar.
3. Configure your local environment
First, you need to install a few prerequisites:
sudo apt-get update
sudo apt-get install -y unzip zip # needed to (un)pack the starter kit and the submission bundle
To easily submit your solution to EvalAI, you should install and configure the EvalAI-CLI (see the EvalAI-CLI documentation for more detailed instructions). Please run:
pip install evalai
evalai set_token <your token here>
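As an optional sanity check, you can ask the CLI to list the challenges you participate in; if the token was set correctly, this should return a table rather than an authentication error. Treat the exact flag as a suggestion based on the EvalAI-CLI’s documented options:

evalai challenges --participant # should list your challenges, not an auth error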
4. Grab a starter kit
You can find a set of starter kits for each task below or on the GitHub releases page of the competition codebase. Download the starter kit that is most relevant to you and unzip it in a directory.
Task 1 - Predict Modality
Task 2 - Match Modality
Task 3 - Joint Embedding
Once you’ve chosen your starter kit, download and unzip it:
mkdir starter_kit
cd starter_kit
wget https://github.com/openproblems-bio/neurips2021_multimodal_viash/releases/latest/download/starter_kit-predict_modality-python.zip
unzip starter_kit-predict_modality-python.zip
ls -l # view contents
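The other kits follow the same pattern. Judging from the file name above, the release assets appear to use a starter_kit-<task>-<language>.zip naming scheme; the exact asset name below is an assumption, so verify it on the releases page before relying on it:

# Assumed naming scheme: starter_kit-<task>-<language>.zip
# e.g. the R variant of the same task (verify on the releases page):
wget https://github.com/openproblems-bio/neurips2021_multimodal_viash/releases/latest/download/starter_kit-predict_modality-r.zip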
5. Tweak the starter kit
Running Docker containers is not enabled on Saturn Cloud, as it would pose a security risk to other containers running on the same platform. This means we need to make some minor tweaks to the starter kit. Please run:
echo "" >> scripts/nextflow.config
echo "docker.enabled = false" >> scripts/nextflow.config
sed -i 's#docker info#echo hi#g' scripts/0_sys_checks.sh
sed -i 's#-p docker#-p native#g' scripts/1_unit_test.sh
sed -i 's#-p docker#-p native#g' scripts/2_generate_submission.sh
echo 'echo No local evaluation is possible on Saturn Cloud' > scripts/3_evaluate_submission.sh
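These commands append a line disabling Docker in the Nextflow config, stub out the Docker check in the system checks script, and switch the build scripts from the docker platform to the native platform. An optional, quick way to confirm the tweaks took effect:

tail -n 2 scripts/nextflow.config # should show: docker.enabled = false
grep -n -- '-p native' scripts/1_unit_test.sh scripts/2_generate_submission.sh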
6. Generate your first submission
To make sure your local environment is set up correctly, first run the 2_generate_submission.sh script. This script triggers a Nextflow workflow to:
- Build a viash component of the method in script.py/R using the configuration specified in config.vsh.yaml.
- Sync the training datasets from S3 (s3://openproblems-bio/public/phase1-data) to a local drive (output/datasets/).
- Apply the containerized method on the training datasets.
- Create a submission.zip file that can be uploaded to EvalAI for evaluation on the competition data and registration of the method on the leaderboard.
Please run the following commands:
scripts/0_sys_checks.sh          # check system requirements
scripts/1_unit_test.sh           # unit-test the method component
scripts/2_generate_submission.sh # run the Nextflow workflow and build submission.zip
If this workflow finishes successfully, you’ll see something like this:
$ scripts/2_generate_submission.sh
...
######################################################################
## Generating submission files using nextflow ##
######################################################################
N E X T F L O W ~ version 21.04.1
Pulling openproblems-bio/neurips2021_multimodal_viash ...
Launching `openproblems-bio/neurips2021_multimodal_viash` [intergalactic_roentgen] - revision: a784bb6c4b [main_build]
[f6/c705c5] process > method:method_process (openproblems_bmmc_multiome_phase1) [100%] 2 of 2 ✔
######################################################################
## Submission summary ##
######################################################################
Please upload your submission at the link below:
https://eval.ai/web/challenges/challenge-page/1111/submission
Or use the command below to create a private submission:
> evalai challenge 1111 phase 2278 submit --file submission.zip --large --private
Or this command to create a public one:
> evalai challenge 1111 phase 2278 submit --file submission.zip --large --public
Good luck!
Make note of the outputs generated in the Nextflow step. If all went well, you should see a 100% success rate.
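Before uploading, it can be worth peeking inside the generated bundle. The paths below follow the workflow description above (submission.zip in the starter kit directory, datasets under output/datasets/); adjust them if your layout differs:

unzip -l submission.zip # list the files that will be uploaded
ls output/datasets/     # training datasets synced from S3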
7. Submit to EvalAI
To upload the submission, you have two options:
- Upload the submission via https://eval.ai/web/challenges/challenge-page/1111/submission
- Use the EvalAI-CLI to upload the submission, following the instructions printed by the generate script.
Once your method is submitted, you can navigate to the My Submissions tab for the competition and select the phase that matches your recent submission. Here you will find a table that lists each submission you’ve uploaded for a given phase. Once the “Status” column is marked “Finished”, you can view the results.json file, which reports the method’s performance. Note that it might take up to 5 minutes for your submission to move from “Running” to “Finished”, and that your submission might have to wait in a queue for an undetermined amount of time, depending on the number of submissions being run on the submission server.
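If you prefer the terminal over the web UI, the EvalAI-CLI can also list your submissions and their status. A sketch, using the challenge and phase IDs from the generate script’s output (assuming those IDs match your phase):

evalai challenge 1111 phase 2278 submissions # shows the status of your submissions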
Encountering issues?
If you encounter a problem, please read the FAQ to see whether a solution is already described. If you can’t find a solution to your problem, please reach out on the competition Discord server; see the #troubleshooting or #viash-help channels.