Note: The submission deadline has been extended by one week to 10 May (abstracts) and 17 May (papers).

Call for papers

Workshop aims and scope

Advances in machine learning have led to complex models that are difficult or impossible to interpret. This lack of explainability often has multiple causes, such as a large number of parameters, the complexity of the architecture, or the abstract nature of the features used. As education research increasingly relies on such models, this lack of interpretability can further obscure issues pertaining to fairness, accountability, and actionable insights. This in turn can lead to a lack of trust among stakeholders, such as students, teachers, and administrators.

The growing need for interpretable AI in education, along with the increasing awareness of its challenges and implications, calls for a community of experts to work together to:

  1. develop a shared vision and common vocabulary in trustworthy educational data mining,
  2. disseminate work to raise awareness of the need for interpretable AI in education,
  3. create robust methods for increasing interpretability,
  4. develop evaluation metrics for assessing explanations and model interpretability, and
  5. measure and integrate the perceptions of educational stakeholders (students, teachers, administrators, parents) into trustworthy AI pipelines.

The Human-Centric eXplainable AI in Education (HEXED) Workshop aims to bring together researchers and practitioners from the fields of machine learning and education to discuss the challenges and opportunities of using interpretable machine learning models in education research.

Workshop topics

The workshop will cover a wide range of topics related to interpretable AI in education. These include, but are not limited to:

  • Enabling trust and transparency in educational models
  • Evaluating student, teacher, and parent perceptions of AI
  • The case for intrinsic vs. post-hoc explainability
  • Ensuring explanation fidelity to the model
  • Designing evaluation metrics and methods for assessing explanations and/or models
  • Aligning explanations with teachers’ and students’ needs
  • Generating actionable explanations as a basis for classroom interventions and personalized learning

Important dates

  • 10 May (extended from 3 May): Abstract submission deadline
  • 17 May (extended from 10 May): Submission deadline for all papers
  • 14 June: Notification of acceptance
  • 28 June: Camera-ready submission deadline
  • 14 July: HEXED Workshop @ EDM 2024 (in-person in Atlanta, Georgia and online)

All deadlines are 11:59pm, AoE time (Anywhere on Earth).

Submission guidelines

All papers must be submitted by 17 May 2024, 11:59pm AoE (Anywhere on Earth); abstracts are due by 10 May 2024.

We invite submissions in the following categories:

  • Research papers (4-8 pages): New or in-progress work related to human-centric explainability in education. Papers should be positioned with respect to the state of the art, clearly state their contribution, and describe their methodology in detail.
  • Position papers (4-6 pages): Position papers present technical and critical views and opinions on explainable AI in education, offering a novel viewpoint on a problem or a novel solution to one. They need not contain primary research data, but should be substantiated by facts or principled arguments that provide new insights or opinions to a debate.
  • Encore papers (no page restrictions): This track is for recently published papers (2022 onwards) relevant to the workshop. Authors submit an abstract and a link or PDF of the recently published work.

Submitted papers should be formatted using the EDM submission templates available here, and submitted to the HEXED Workshop on EasyChair. Accepted papers will be invited to present at a poster session, and presenters may be invited to give spotlight talks. References do not count towards the page limit. Authors can also include appendices to more clearly describe datasets and tools if necessary, and these do not count toward the page limit.

All submitted papers should be carefully blinded for review. Take care to remove all authors’ names and identifying information (e.g. grant numbers), and refer to any of your prior work in the third person (e.g. “Previously, Smith et al. did … [1]” rather than “In our prior work [1]”).

Attendance

Accepted authors and all other attendees (both in-person and remote) will be required to register for the workshop. Registration will be managed by the EDM 2024 main conference organization at https://educationaldatamining.org/edm2024/. Registration has not yet opened.

Publication

We plan to publish the Research and Position papers presented at HEXED through an established proceedings publication platform, such as CEUR-WS. Depending on the requirements of the publication platform, authors may have the option of including only an abstract or their full paper. In either case, papers will be linked on the workshop website.