# Batch Prediction Interface

**Date:** 14-05-2024
Your submitted models can now make batch predictions on the test set, allowing you to fully utilize the multi-GPU setup available during evaluations.
## Changes to Your Code
1. **Add a `get_batch_size()` function:**
   - This function should return an integer in the range `[1, 16]`; the maximum batch size supported at the moment is 16.
   - You can also choose the batch size dynamically (a sketch of one way to do this follows the examples below).
   - This function is a required interface for your model class.
2. **Replace `generate_answer` with `batch_generate_answer`:**
   - Update your code to replace the `generate_answer` function with `batch_generate_answer`.
   - For more details on the `batch_generate_answer` interface, please refer to the inline documentation in `dummy_model.py`.
```python
# Old Interface
def generate_answer(self, query: str, search_results: List[Dict], query_time: str) -> str:
    ...
    return answer

# New Interface
def batch_generate_answer(self, batch: Dict[str, Any]) -> List[str]:
    batch_interaction_ids = batch["interaction_id"]
    queries = batch["query"]
    batch_search_results = batch["search_results"]
    query_times = batch["query_time"]
    ...
    return [answer_1, answer_2, ..., answer_N]
```
- The new function should return a list of answers (`List[str]`) instead of a single answer (`str`).
- The simplest example of a valid submission with the new interface is as follows (an illustrative driver loop showing how a batch might be assembled appears right after it):
```python
from typing import Any, Dict, List

class DummyModel:
    def get_batch_size(self) -> int:
        return 4

    def batch_generate_answer(self, batch: Dict[str, Any]) -> List[str]:
        queries = batch["query"]
        answers = ["i don't know" for _ in queries]
        return answers
```
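To make the batch layout concrete, here is a minimal sketch of how an evaluation loop might chunk the test set and assemble the columnar `batch` dict. The key names match those shown in the interface above; `run_batched_inference` and the `test_set` list of per-sample dicts are illustrative assumptions on our part, not the actual evaluator code.

```python
from typing import Any, Dict, List

def run_batched_inference(model, test_set: List[Dict[str, Any]]) -> List[str]:
    """Illustrative driver: chunks the test set by the model's batch size."""
    batch_size = model.get_batch_size()  # an integer in [1, 16]
    answers: List[str] = []
    for start in range(0, len(test_set), batch_size):
        rows = test_set[start:start + batch_size]
        # Columnar batch dict, keyed exactly as in batch_generate_answer above.
        batch = {
            "interaction_id": [r["interaction_id"] for r in rows],
            "query": [r["query"] for r in rows],
            "search_results": [r["search_results"] for r in rows],
            "query_time": [r["query_time"] for r in rows],
        }
        answers.extend(model.batch_generate_answer(batch))
    return answers
```

With the `DummyModel` above, `run_batched_inference(DummyModel(), test_set)` would return one "i don't know" per test sample.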
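Since `get_batch_size()` is called on your model instance, the value can also be computed at runtime rather than hard-coded. The sketch below shows one hypothetical way to do this; the free-memory heuristic (via `torch.cuda.mem_get_info`) and the 1 GiB-per-sample assumption are ours, purely for illustration, and not part of the official interface.

```python
import torch

class MyBatchedModel:
    """Hypothetical model with a dynamically chosen batch size."""

    def get_batch_size(self) -> int:
        # Illustrative heuristic only: scale the batch size with free GPU
        # memory, assuming (for the sketch) ~1 GiB per in-flight sample,
        # and clamp the result to the supported range [1, 16].
        if torch.cuda.is_available():
            free_bytes, _total_bytes = torch.cuda.mem_get_info()
            return max(1, min(16, free_bytes // (1 << 30)))
        return 1  # conservative fallback when no GPU is visible
```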
## Backward Compatibility
To ensure a smooth transition, the evaluators will maintain backward compatibility with the `generate_answer` interface for a short period. However, we strongly recommend updating your code to use the `batch_generate_answer` interface to avoid any disruptions when support for the older interface is removed in the coming weeks.
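If you want to migrate incrementally before the old interface is removed, one option (a suggestion on our part, not an official utility) is a thin adapter that satisfies the new interface by looping over a legacy single-query `generate_answer`:

```python
from typing import Any, Dict, List

class BatchAdapter:
    """Wraps a legacy single-query model behind the new batched interface."""

    def __init__(self, legacy_model):
        self.legacy_model = legacy_model  # must expose generate_answer(...)

    def get_batch_size(self) -> int:
        return 1  # sequential fallback; raise this once truly batched

    def batch_generate_answer(self, batch: Dict[str, Any]) -> List[str]:
        return [
            self.legacy_model.generate_answer(query, search_results, query_time)
            for query, search_results, query_time in zip(
                batch["query"], batch["search_results"], batch["query_time"]
            )
        ]
```

This forgoes the batching speed-up, but it keeps a legacy model runnable once `generate_answer` support is dropped; raise the batch size only after the wrapped model genuinely processes batches.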