ReMA Tool

Interactive Assessment Tool for Gaze-based Machine Learning Models in Information Retrieval

CHIIR '22: Proceedings of the 2022 Conference on Human Information Interaction and Retrieval, March 2022 | Pages 332–336 | https://doi.org/10.1145/3498366.3505834

Try the demo 🚀

Abstract

Eye movements have been shown to be an effective source of implicit relevance feedback in information retrieval tasks. They can be used, e.g., to estimate the relevance of read documents and to expand search queries using machine learning. In this paper, we present the Reading Model Assessment tool (ReMA), an interactive tool for assessing gaze-based relevance estimation models. Our tool allows experimenters to easily browse recorded trials, compare the model output to a ground truth, and visualize the gaze-based features at the token and paragraph level that serve as model input. Our goal is to facilitate the understanding of the relation between eye movements and the human relevance estimation process, to understand the strengths and weaknesses of a model at hand, and, eventually, to enable researchers to build more effective models.
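This README does not document the exact gaze-based features that serve as model input. As a rough illustration only (not ReMA's actual feature set or code), token-level features of this kind are commonly derived by mapping fixations onto the bounding boxes of rendered tokens and aggregating per token. A minimal sketch under that assumption, with hypothetical record fields:

```python
# Illustrative sketch only: aggregating fixations into token-level gaze features
# (fixation count, total fixation duration). The features actually used by ReMA
# may differ; all fields and names below are hypothetical.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float           # screen coordinates of the fixation centroid
    y: float
    duration_ms: float  # fixation duration in milliseconds

@dataclass
class Token:
    text: str
    left: float         # bounding box of the rendered token on screen
    top: float
    right: float
    bottom: float

def token_features(tokens: list[Token], fixations: list[Fixation]) -> list[tuple[int, float]]:
    """Return (fixation_count, total_fixation_duration_ms) for each token."""
    counts = [0] * len(tokens)
    durations = [0.0] * len(tokens)
    for fix in fixations:
        for i, tok in enumerate(tokens):
            if tok.left <= fix.x <= tok.right and tok.top <= fix.y <= tok.bottom:
                counts[i] += 1
                durations[i] += fix.duration_ms
                break  # assign each fixation to at most one token
    return list(zip(counts, durations))
```

Paragraph-level features could then be obtained by summing such token-level values over each paragraph's token span.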

About

Written by Pablo Valdunciel, Omair Shahzad Bhatti, Michael Barz and Daniel Sonntag at the Interactive Machine Learning (IML) Department of the German Research Center for Artificial Intelligence (DFKI).

Citation

@inproceedings{10.1145/3498366.3505834,
  author = {Valdunciel, Pablo and Bhatti, Omair Shahzad and Barz, Michael and Sonntag, Daniel},
  title = {Interactive Assessment Tool for Gaze-Based Machine Learning Models in Information Retrieval},
  year = {2022},
  isbn = {9781450391863},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3498366.3505834},
  doi = {10.1145/3498366.3505834},
  abstract = {Eye movements were shown to be an effective source of implicit relevance feedback in information retrieval tasks. They can be used to, e.g., estimate the relevance of read documents and expand search queries using machine learning. In this paper, we present the Reading Model Assessment tool (ReMA), an interactive tool for assessing gaze-based relevance estimation models. Our tool allows experimenters to easily browse recorded trials, compare the model output to a ground truth, and visualize gaze-based features at the token- and paragraph-level that serve as model input. Our goal is to facilitate the understanding of the relation between eye movements and the human relevance estimation process, to understand the strengths and weaknesses of a model at hand, and, eventually, to enable researchers to build more effective models.},
  booktitle = {Proceedings of the 2022 Conference on Human Information Interaction and Retrieval},
  pages = {332–336},
  numpages = {5},
  keywords = {data visualization, information retrieval, reading, interactive model assessment, relevance estimation, eye tracking},
  location = {Regensburg, Germany},
  series = {CHIIR '22}
}
