Advances in machine learning have led to complex models that are difficult or impossible to interpret. This lack of explainability often has multiple causes, such as a large number of parameters, the complexity of the architecture, or the abstract nature of the features used. As education research increasingly relies on such models, this lack of interpretability can further obscure issues pertaining to fairness, accountability, and actionable insights. This in turn can erode trust among stakeholders, such as students, teachers, and administrators.
The growing need for interpretable AI in education, along with the increasing awareness of its challenges and implications, calls for a community of experts to work together to:
develop a shared vision and common vocabulary in trustworthy educational data mining,
disseminate work to raise awareness of the need for interpretable AI in education,
create robust methods for increasing interpretability,
develop evaluation metrics for assessing explanations and model interpretability, and
measure and integrate the perceptions of educational stakeholders (students, teachers, administrators, parents) into trustworthy AI pipelines.
The Human-Centric eXplainable AI in Education (HEXED) Workshop aims to bring together researchers and practitioners from the fields of machine learning and education to discuss the challenges and opportunities of using interpretable machine learning models in education research.
Workshop topics
The workshop will cover a wide range of topics related to interpretable AI in education. These include, but are not limited to:
Enabling trust and transparency in educational models
Evaluating student, teacher, and parent perceptions of AI
The case for intrinsic vs. post-hoc explainability
Ensuring explanation fidelity to the model
Designing evaluation metrics and methods for assessing explanations and/or models
Aligning explanations with teachers’ and students’ needs
Generating actionable explanations as a basis for classroom interventions and personalized learning
Important dates
31 May (extended from 17 May): Abstract submission deadline
07 June (extended from 24 May): Submission deadline for all papers
03 July (extended from 26 June): Notification of acceptance
11 July (extended from 06 July): Camera-ready submission deadline
20 July: HEXED Workshop @ EDM 2025 (in-person in Palermo, Italy and online)
All deadlines are 11:59pm Anywhere on Earth (AoE).
Submission guidelines
All papers must be submitted by 07 June 2025 AoE (Anywhere on Earth).
We invite submissions in the following categories:
Research papers (4-8 pages): New or in-progress work related to human-centric explainability in education. Papers should be clearly positioned with respect to the state of the art, clearly state their contribution, and describe the methodology in detail.
Position papers (4-6 pages): Position papers present technical and critical views and opinions on explainable AI in education, and should offer a novel viewpoint on a problem or a novel solution to one. They need not contain primary research data, but should be substantiated by facts or principled arguments that bring new insights or opinions to a debate.
Encore Papers (no page restrictions): This track is for recently published papers (2022 onwards) relevant to the workshop. Authors submit an abstract and a link or PDF of the recently published work.
Research and Position papers should be formatted using the proper templates available here. All papers should be submitted to the HEXED Workshop on EasyChair.
All submitted papers should be carefully blinded for review. Take care to remove all authors’ names and identifying information (e.g. grant numbers), and refer to any of your prior work in the third person (e.g. “Previously, Smith et al. did … [1]” rather than “In our prior work [1]”).
References do not count towards the page limit. Authors can also include appendices to more clearly describe datasets and tools if necessary, and these do not count toward the page limit.
Accepted papers will be invited to present at a poster session, and presenters may be invited to give spotlight talks.
Attendance
Accepted authors and all other attendees (both in-person and remote) will be required to register for the workshop. Registration will be managed by the EDM 2025 main conference organization at https://educationaldatamining.org/edm2025/. Registration has not yet opened.
Publication
We plan to publish accepted Research and Position papers presented at HEXED through the established proceedings publication platform CEUR-WS. Papers will be linked on the workshop website.