Overview
A fully funded PhD fellowship focusing on explainable natural language understanding (NLU) is available at the University of Copenhagen. The position is part of the ExplainYourself project, which focuses on explainable and robust automatic fact-checking. It is set to commence in Spring 2026, and the successful candidate will join the university's Natural Language Understanding group.
Background & Relevance
Natural language understanding is a crucial area within artificial intelligence, aiming to enable machines to comprehend and interpret human language effectively. As AI systems are increasingly integrated into various applications, the need for transparency and explainability in these systems becomes paramount. This PhD position aims to address these challenges by contributing to research that enhances the interpretability of NLU systems, thus fostering trust and reliability in AI technologies.
Key Details
- Position: PhD Fellowship in Explainable Natural Language Understanding
- Institution: University of Copenhagen
- Project: ExplainYourself
- Start Date: Spring 2026
- Application Deadline: 31 October 2025
- Supervisors: Isabelle Augenstein and Pepa Atanasova
- Funding: ERC Starting Grant
- Links: Natural Language Understanding Group, Project Details, Application
Eligibility & Participation
Candidates must hold a Master's degree to be eligible for this position. The opportunity is aimed at applicants with a strong interest in research on explainable AI and natural language processing.
Submission or Application Guidelines
Interested applicants should follow these steps to apply:
1. Visit the application link provided.
2. Prepare the necessary documents, including a CV and cover letter.
3. Submit your application by the deadline of 31 October 2025.
More Information
The ExplainYourself project is funded by a European Research Council (ERC) Starting Grant, which supports innovative research initiatives led by early-career scientists. The project team will include PhD students and postdoctoral researchers working on explainable AI. The position offers the chance to engage in cutting-edge research and to collaborate with established researchers in the field.
Conclusion
This PhD opportunity at the University of Copenhagen is a strong option for those looking to pursue research in explainable natural language understanding. Interested candidates are encouraged to explore the opportunity further and submit their applications before the deadline. It is a chance to contribute to an area of AI research that is increasingly relevant in today's technology landscape.
Category: PhD & Postdoc Positions
Tags: nlp, explainable ai, university of copenhagen, erc starting grant, natural language understanding, machine learning, research fellowship, phd position, isabelle augenstein, copeNLU, ellis phd programme, automatic fact checking