How To Submit
For detailed information about opening and closing dates of the submission phases, please visit the Important Dates page.
The challenge GitHub repository publishes the evaluation code as well as an example algorithm.
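Below is a minimal sketch of what an algorithm container entry point might look like, assuming the common Grand Challenge convention of reading inputs from `/input` and writing predictions to `/output`. The exact interface (mount paths, file names, formats such as `.mha`) is defined by the example algorithm in the challenge repository, which should be used as the actual template; the `predict` function here is a placeholder, not the challenge's method.

```python
from pathlib import Path

import SimpleITK as sitk

INPUT_DIR = Path("/input")    # assumed mount point for the test case images
OUTPUT_DIR = Path("/output")  # assumed mount point for the predicted segmentations


def predict(image: sitk.Image) -> sitk.Image:
    """Placeholder segmentation: replace with the team's actual model."""
    # A simple intensity threshold stands in for a real algorithm here.
    return sitk.BinaryThreshold(image, lowerThreshold=0.0)


def main() -> None:
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    for image_path in sorted(INPUT_DIR.glob("*.mha")):  # file format is an assumption
        image = sitk.ReadImage(str(image_path))
        segmentation = predict(image)
        sitk.WriteImage(segmentation, str(OUTPUT_DIR / image_path.name))


if __name__ == "__main__":
    main()
```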
Preliminary Phase: Release of Training Data
Teams may submit their algorithms through the Grand Challenge platform. The algorithms will be tested against four (4) cases taken from the training set (i.e., patients F012, F027, F056, and F065). Each team can submit up to 2 times per week. This phase is mainly designed to test the Docker images and the algorithms: because the cases come from the training set, teams can inspect the logs and outputs of their algorithm submissions. As submission output, the Grand Challenge evaluation code will report the 95th percentile Hausdorff distance (HD95) and the Dice score. The ranking is explained on the Ranking page.
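For orientation, the sketch below computes the two metrics named above (Dice score and HD95) on binary masks using NumPy and SciPy. It is an illustration only; the authoritative implementation is the evaluation code published in the challenge GitHub repository, and the function names and spacing handling here are assumptions.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt


def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice overlap between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, gt).sum() / denom


def hd95(pred: np.ndarray, gt: np.ndarray, spacing=None) -> float:
    """95th percentile of the symmetric surface distances (HD95),
    in the units given by `spacing` (voxels if spacing is None)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    # Surface voxels = mask minus its erosion.
    pred_surf = pred & ~binary_erosion(pred)
    gt_surf = gt & ~binary_erosion(gt)
    if not pred_surf.any() or not gt_surf.any():
        return float("inf")  # undefined when one of the masks is empty
    # Distance from each surface voxel to the nearest surface voxel of the other mask.
    dist_to_gt = distance_transform_edt(~gt_surf, sampling=spacing)
    dist_to_pred = distance_transform_edt(~pred_surf, sampling=spacing)
    surface_distances = np.concatenate([dist_to_gt[pred_surf], dist_to_pred[gt_surf]])
    return float(np.percentile(surface_distances, 95))
```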
Final Test Phase
Teams are required to submit their algorithm, which will be tested against fifty (50) hidden cases. Each team can submit up to 3 times; of these, only the last submission received will be considered. The metrics used for the evaluation are described on the Ranking page. The three best-performing teams will be awarded as challenge winners (see the Prize page for more details). The winners' algorithms must be publicly released and accompanied by a written report describing their main features.