Real-Time Academic Paper Navigation & Discovery
Instantly ground an academic conference presentation in the resources being discussed.
We'll automatically find the paper, scroll to the relevant sections, and discover external resources.
Frequently Asked Questions
How does it work?
We transcribe the audio in real time. Then, we use state-of-the-art information retrieval models to map the most recent audio to resources such as academic papers or specific paper segments. Once we find a matching resource, we show it to you immediately: currently, we display the paper and outline its most relevant section.
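A minimal sketch of the audio-to-paper mapping step is below, assuming a sentence-transformers bi-encoder over pre-indexed paper segments; the model name, example segments, and confidence threshold are placeholders, not the demo's actual configuration.

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder retrieval model

# Pre-indexed paper segments (in the demo, these come from manually indexed papers).
segments = [
    "We propose a dense retrieval model that ...",
    "Experiments on the benchmark show ...",
]
segment_embeddings = model.encode(segments, convert_to_tensor=True)

def map_transcript_to_segment(recent_transcript: str, threshold: float = 0.5):
    """Return the best-matching paper segment, or None if confidence is too low."""
    query_embedding = model.encode(recent_transcript, convert_to_tensor=True)
    scores = util.cos_sim(query_embedding, segment_embeddings)[0]
    best = int(scores.argmax())
    if float(scores[best]) < threshold:
        return None  # not confident enough to navigate anywhere
    return segments[best]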
Which papers are indexed?
For demonstration purposes, we have manually indexed a handful of research papers: currently, only a subset of those being presented at SIGIR 2025 that were publicly available before the conference. We plan to add many more in the future.
Why is nothing showing up?
If no paper is showing up and you are unmuted, then we are not confident enough in our automatic predictions to show anything. You can trigger the prediction process manually by pressing "Enter" or by selecting a paper from the search bar. If nothing shows up after that, we probably have a system issue. You can also revisit previous navigations with the arrow keys or the buttons at the top of the screen, as sketched below.
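A small sketch of the navigation history behind the arrow keys, assuming a simple back/forward stack; the class and method names are illustrative, not the demo's actual implementation.

class NavigationHistory:
    def __init__(self):
        self._entries = []   # past navigations (paper + section)
        self._position = -1  # index of the currently shown navigation

    def push(self, navigation):
        """Record a new navigation, discarding any 'forward' entries."""
        self._entries = self._entries[: self._position + 1]
        self._entries.append(navigation)
        self._position += 1

    def back(self):
        """Left arrow: go to the previous navigation, if any."""
        if self._position > 0:
            self._position -= 1
        return self._entries[self._position] if self._entries else None

    def forward(self):
        """Right arrow: go to the next navigation, if any."""
        if self._position < len(self._entries) - 1:
            self._position += 1
        return self._entries[self._position] if self._entries else None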
How do I find external resources, such as code or datasets?
Just press "a" (or click the robot button) and we will use an agent (i.e., a chat-based LLM) to search the web using the transcript and the currently navigated paper segment. This may take a few seconds, but the response should hopefully include link(s) to the desired external resources.
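A minimal sketch of this agent step, assuming an OpenAI-style chat completion client; the model name and prompt wording are placeholders, and the web-search tooling the demo uses is omitted here.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def find_external_resources(recent_transcript: str, navigated_segment: str) -> str:
    """Ask a chat-based LLM for links to the resources being discussed."""
    prompt = (
        "A conference talk is in progress. Based on the transcript excerpt and the "
        "paper segment below, list links to the external resources being discussed "
        "(e.g., code repositories, datasets, demos).\n\n"
        f"Transcript excerpt:\n{recent_transcript}\n\n"
        f"Navigated paper segment:\n{navigated_segment}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content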