2nd CASTLE Grand Challenge
The second CASTLE Challenge will be held at ACM Multimedia 2026 in Rio de Janeiro, Brazil.
To submit your paper and your results (for participants of the automatic track), please use the submission form on OpenReview.
Timeline
- 20 March 2026: Query Release
- 05 July 2026: Fully-Automated Submission & Paper Deadline
- 16 July 2026: Notification to Authors
- 06 August 2026: Camera-Ready Deadline
Tasks
The CASTLE Challenge features a diverse set of tasks, including event detection, retrieval, and question answering. Future editions will expand the scope, but for this edition, the tasks include:
🔍 Event Instance Search
Given a textual description (in English), participants must identify all timeframes where a specific event occurs. Events should be reported with both a time range and a video ID.
📦 Object Instance Search
Given a textual (in English) or visual (i.e., using an image) example of a physical object, participants must find all occurrences of that object across any of the video streams.
💬 Question Answering
Given a question in natural language (in English), participants must provide an answer. The response should be formulated in natural language and include references to relevant sensor streams and time intervals as supporting evidence.
Evaluation
The challenge will operate across two tracks: fully-automatic and interactive.
⚙️ Fully-Automatic Track
Participants receive queries in advance and generate results using any method they choose. These results are then submitted to the challenge organizers for evaluation. Please see the list of queries below.
🎮 Interactive Track
This track will be evaluated live during the conference. Participants must solve tasks synchronously and interactively within a limited timeframe. This format follows established competitions such as the Video Browser Showdown and the Lifelog Search Challenge.
Queries
🔍 Event Instance Search
- MM26-EIS01: Find instances of somebody pouring tea from the pot into a mug
- MM26-EIS02: Find instances of somebody opening the front door
- MM26-EIS03: Find instances of somebody throwing a paper airplane
- MM26-EIS04: Find instances of somebody speaking with a full mouth
- MM26-EIS05: Find instances of somebody accidentally dropping something
- MM26-EIS06: Find instances of somebody eating a piece of candy
- MM26-EIS07: Find instances of somebody pouring an alcoholic beverage
- MM26-EIS08: Find instances of somebody putting something into their pants pocket
- MM26-EIS09: Find instances of somebody sneezing
- MM26-EIS10: Find instances of leftover food being packaged for later
- MM26-EIS11: Find instances of someone playing the Austrian card game “Schnapsen”
- MM26-EIS12: Find instances of someone laughing
📦 Object Instance Search
- MM26-OIS01: Find a UNO +2 card
- MM26-OIS02: Find the blue whale plush toy
- MM26-OIS03: Find the bird Christmas tree ornament
- MM26-OIS04: Find the potato masher
- MM26-OIS05: Find the tiny plastic toy cowboy hat
- MM26-OIS06: Find a blue or black writing instrument (pen, pencil, marker, etc.) that is not currently in use
- MM26-OIS07: Find the electric vehicle charger
- MM26-OIS08: Find a key or a key chain
- MM26-OIS09: Find any hand tool not used for cooking
- MM26-OIS10: Find a front/ego view of a smartphone
- MM26-OIS11: Find a living non-human animal
- MM26-OIS12: Find a TV remote
💬 Question Answering
- MM26-QA01: Who was the last one who did the reading experiment?
- MM26-QA02: How many people went grocery shopping on Tuesday morning?
- MM26-QA03: What alcoholic sweet did Klaus bring?
- MM26-QA04: How many rounds of chess did Allie play with Klaus?
- MM26-QA05: Who drew a picture of a church behind a bridge?
- MM26-QA06: What did Cathal and Klaus do when they went outside the house in the evening with a torch?
- MM26-QA07: Who was scraping burnt food from a pan into the trash?
- MM26-QA08: Who taught Björn a card game?
- MM26-QA09: Who was sick one morning?
- MM26-QA10: What song were Klaus and Onanong practicing on the guitar?
- MM26-QA11: Who was in charge of cooking the Indian food?
- MM26-QA12: What are the first cookies that Luca baked called?
Submission Format
⚙️ Fully-Automatic Track
For the fully-automatic track, please submit your results together with your paper on OpenReview. Results should be submitted in CSV format, with one file per task type (i.e., three files, one each for the 🔍 Event Instance Search, 📦 Object Instance Search, and 💬 Question Answering tasks), bundled in a ZIP file. The name of each file should indicate the task type. The columns for each file are listed below.
🔍 Event Instance Search
- task: the ID of the task
- day: the day on which the event occurred, as an integer from 1 to 4
- starttime: the start time of the event in HH:MM:SS format
- endtime: the end time of the event in HH:MM:SS format
- source: the perspective/camera the event was recorded from (e.g., Allie, Kitchen, etc.)
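Using the columns above, an Event Instance Search result file might look like the following sketch (the rows are purely illustrative; task IDs are taken from the query list, while days, times, and sources are made up):

```csv
task,day,starttime,endtime,source
MM26-EIS01,1,09:15:00,09:15:42,Kitchen
MM26-EIS02,3,18:02:11,18:02:19,Allie
```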
📦 Object Instance Search
- task: the ID of the task
- day: the day on which the object can be seen, as an integer from 1 to 4
- starttime: the time from which the object can be seen, in HH:MM:SS format
- endtime: the time until which the object can be seen, in HH:MM:SS format
- source: the perspective/camera from which the object can be seen (e.g., Allie, Kitchen, etc.)
💬 Question Answering
- task: the ID of the task
- answer: a natural-language answer to the question, in English
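The three CSV files and the ZIP bundle can be produced with a short script. The sketch below uses only the column names given above; the file names, result rows, and answer text are hypothetical placeholders, not prescribed by the challenge:

```python
# Sketch: bundle the three per-task result CSVs into one ZIP for submission.
# File names are hypothetical examples whose names indicate the task type.
import csv
import io
import zipfile

# Hypothetical example rows; a real submission holds one row per result.
results = {
    "event_instance_search.csv": (
        ["task", "day", "starttime", "endtime", "source"],
        [["MM26-EIS01", 1, "09:15:00", "09:15:42", "Kitchen"]],
    ),
    "object_instance_search.csv": (
        ["task", "day", "starttime", "endtime", "source"],
        [["MM26-OIS02", 2, "14:03:10", "14:05:55", "Allie"]],
    ),
    "question_answering.csv": (
        ["task", "answer"],
        [["MM26-QA09", "An illustrative answer in English."]],
    ),
}

def build_submission_zip(path: str) -> None:
    """Write each task's header and rows as CSV and store them in a ZIP."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as zf:
        for filename, (header, rows) in results.items():
            buf = io.StringIO()
            writer = csv.writer(buf)
            writer.writerow(header)
            writer.writerows(rows)
            zf.writestr(filename, buf.getvalue())

build_submission_zip("submission.zip")
```

The resulting `submission.zip` would then be attached to the OpenReview submission alongside the paper.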
🎮 Interactive Track
The interactive track will be evaluated during a hybrid session at ACM Multimedia 2026. Systems must submit their task solutions to the Distributed Retrieval Evaluation Server via its API.