What’s New?

Dec 2023 The Swiss AI Initiative is launched!
Dec 2023 Talk at EMNLP BlackboxNLP Workshop 2023
Nov 2023 Neuro-Symbolic AI Panel at ISWC 2023
Oct 2023 Talk at Johns Hopkins University
Oct 2023 Talk at University of Maryland
Jan 2023 Panel at Infrarouge
Jan 2023 Talk at IBM Neuro-symbolic AI Workshop
Mar 2022 Talk at EPFL Center for Intelligent Systems
Jan 2022 Talk at IBM Research
Dec 2021 Panel at World Congress of Science & Factual Producers
Nov 2021 Talk at ETH Zurich
Nov 2021 Talk at CIKM Workshop: Knowledge Injection in Neural Networks (KINN)
Nov 2021 Talk at KR Workshop: Knowledge Representation for Hybrid and Compositional AI (KRHCAI)
Sep 2021 Talk at Stanford Graph Learning Workshop
Aug 2021 Talk at IJCAI Workshop: Is Neuro-symbolic SOTA still a myth for NLI? (NSNLI)
Apr 2021 Named to the Forbes 30 under 30 list in Science & Healthcare
Mar 2021 Talk at Microsoft Research
Feb 2021 Talk at AAAI Workshop in Hybrid Artificial Intelligence
Feb 2021 Tutorial on Commonsense Knowledge Acquisition and Representation at AAAI 2021
Nov 2020 Tutorial on Neural Language Generation at EMNLP 2020
Nov 2020 Talk at UCSD Health Informatics Seminar
Nov 2020 Talk at Stanford Cognitive Science Seminar
Jul 2020 Tutorial on Commonsense Knowledge at ACL 2020
Sep 2019 Talk at WeCNLP 2019

Research Interests

My research interests are broadly in natural language processing, deep learning, machine learning, and artificial intelligence, with some projects integrating computer vision and data science. Specifically, my research focuses on how we can design systems that understand the implicit human knowledge underlying language and communication.

Topics that I focus on include:

  • Neural and symbolic representations of knowledge: language models as knowledge bases, automatic knowledge graph construction, neural information retrieval

  • Neuro-symbolic reasoning methods: knowledge graph integration in NLP systems, large-scale pretraining of language and knowledge models, graph neural networks

  • Commonsense knowledge representation and reasoning: learning commonsense knowledge from language and vision, models for commonsense reasoning, applications of commonsense reasoning

  • Narrative understanding: entity representations, entity and state tracking, story generation

  • Language generation: models, decoding algorithms, evaluation metrics

  • Biomedical and social science applications of language and knowledge: understanding clinical notes, biomedical NLP, disinformation detection

EPFL NLP Group

Check out our lab website for more details!

Postdoctoral Scholars

Syrielle Montariol
Debjit Paul
Gail Weiss

PhD Students

Badr AlKhamissi
Deniz Bayazit
Beatriz Borges
Zeming (Eric) Chen
Negar Foroutan
Silin Gao
Yifan Hou
Mete Ismayilzada

Interns

Anna Dai

Alumni

Visiting PhDs

Tianqing Fang
Mike Zhang

Interns

Yu Fei
Khai Loong Aw
Karina Halevy
Fawzia Zeitoun
Spyridon Chalkias

Master's Theses

Michal Bien
Graciana Aad
Axel Marmet
Ghali Chraibi
Alexandre Variengien

Publications

Please see my Google Scholar for an up-to-date list of publications.

(2023). MEDITRON-70B: Scaling Medical Pretraining for Large Language Models. arXiv.

PDF Code Project

(2023). RECKONING: Reasoning through Dynamic Knowledge Encoding. Advances in Neural Information Processing Systems (NeurIPS).

PDF Code

(2023). CRoW: Benchmarking Commonsense Reasoning in Real-World Tasks. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code Dataset Project

(2023). CRAB: Assessing the Strength of Causal Relationships Between Real-world Events. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code

(2023). Towards a Mechanistic Interpretation of Multi-Step Reasoning Capabilities of Language Models. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code

(2023). Breaking the Language Barrier: Improving Cross-Lingual Reasoning with Structured Self-Attention. Findings of EMNLP.

PDF Code

(2023). Discovering Knowledge-Critical Subnetworks in Pretrained Language Models. arXiv.

PDF

(2023). CAR: Conceptualization-Augmented Reasoner for Zero-Shot Commonsense Question Answering. Findings of EMNLP.

PDF Code

(2023). Let Me Teach You: Pedagogical Foundations of Feedback for Language Models. arXiv.

PDF

(2023). Instruction-tuning Aligns LLMs to the Human Brain. arXiv.

PDF

(2023). PeaCoK: Persona Commonsense Knowledge for Consistent and Engaging Narratives. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL). Outstanding Paper Award.

PDF Code Dataset Video

(2023). Mitigating Label Biases for In-context Learning. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Code Video

(2023). DISCO: Distilling Counterfactuals with Large Language Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Code Dataset

(2023). REFINER: Reasoning Feedback on Intermediate Representations. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF

(2023). kogito: A Commonsense Knowledge Inference Toolkit. Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL) - Systems Demonstrations.

PDF Code Video

(2022). ComFact: A Benchmark for Linking Contextual Commonsense Knowledge. Findings of EMNLP.

PDF Code Dataset Video

(2022). Discovering Language-neutral Sub-networks in Multilingual Language Models. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code Poster Video

(2022). Conditional set generation using Seq2seq models. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF

(2022). Deep Bidirectional Language-Knowledge Graph Pretraining. Advances in Neural Information Processing Systems (NeurIPS).

PDF Code

(2022). Memory-Based Model Editing at Scale. Proceedings of the 39th International Conference on Machine Learning (ICML).

PDF Code Project Video

(2022). GreaseLM: Graph REASoning Enhanced Language Models for Question Answering. Proceedings of the 10th International Conference on Learning Representations (ICLR). Spotlight (Top 5%).

PDF Code Video

(2022). Fast Model Editing at Scale. Proceedings of the 10th International Conference on Learning Representations (ICLR).

PDF Code Project

(2022). End-to-End Task-Oriented Dialog Modeling with Semi-Structured Knowledge Management. IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP).

PDF

(2022). Synthetic Disinformation Attacks on Automated Fact Verification Systems. Proceedings of the 36th AAAI Conference on Artificial Intelligence (AAAI).

PDF Code Video

(2021). Conversational Multi-Hop Reasoning with Neural Commonsense Knowledge and Symbolic Logic Rules. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF

(2021). On the Opportunities and Risks of Foundation Models. arXiv.

PDF

(2021). Analyzing Commonsense Emergence in Few-shot Knowledge Models. Proceedings of the 3rd Conference on Automated Knowledge Base Construction (AKBC).

PDF Code

(2021). Edited Media Understanding Frames: Reasoning About the Intents and Implications of Visual Disinformation. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Dataset

(2021). On-the-Fly Attention Modulation for Neural Generation. Findings of ACL.

PDF

(2021). QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering. Proceedings of the 18th Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL).

PDF Code Project

(2021). I'm Not Mad: Commonsense Implications of Negation and Contradiction. Proceedings of the 18th Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL).

PDF Code Dataset

(2021). Discourse Understanding and Factual Consistency in Abstractive Summarization. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL).

PDF

(2021). (Comet-)Atomic 2020: On Symbolic and Neural Commonsense Knowledge Graphs. Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI).

PDF Code

(2021). Dynamic Neuro-Symbolic Knowledge Graph Construction for Zero-shot Commonsense Question Answering. Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI).

PDF

(2020). Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code Slides Video

(2020). Procedural Reading Comprehension with Attribute-Aware Context Flow. Proceedings of the 2nd Conference on Automated Knowledge Base Construction (AKBC). Best Paper Runner-up.

PDF Video

(2020). Commonsense Knowledge Base Completion with Structural and Semantic Context. Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI).

PDF

(2019). COMET: Commonsense Transformers for Automatic Knowledge Graph Construction. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Code Poster Video

(2019). Efficient Adaptation of Pretrained Transformers for Abstractive Summarization. arXiv.

PDF Code

(2019). Counterfactual Story Reasoning and Generation. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code Dataset

(2019). Everything Happens for a Reason: Discovering the Purpose of Actions in Procedural Text. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Dataset Project

(2019). WIQA: A dataset for "What if..." reasoning over procedural text. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Code Dataset Project

(2019). Be Consistent! Improving Procedural Text Comprehension using Label Consistency. Proceedings of the 17th Annual Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL).

PDF Dataset Project

(2018). Simulating Action Dynamics with Neural Process Networks. Proceedings of the 6th International Conference on Learning Representations (ICLR).

PDF Dataset Poster Video

(2018). Discourse-Aware Neural Rewards for Coherent Text Generation. Proceedings of the 16th Annual Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL).

PDF Dataset Poster

(2018). Deep Communicating Agents for Abstractive Summarization. Proceedings of the 16th Annual Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL).

PDF Project Poster

(2018). Modeling Naive Psychology of Characters in Simple Commonsense Stories. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Code Dataset Project Slides

(2018). Learning to Write with Cooperative Discriminators. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Code Project Poster

(2018). Reasoning about Actions and State Changes by Injecting Commonsense Knowledge. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).

PDF Dataset Project

(2016). Learning Prototypical Event Structure from Photo Albums. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL).

PDF Dataset Project

Media

My Thoughts in the News

Le Temps. A Swiss supercomputer dedicated to AI (Dec 2023)

Corriere del Ticino. Is ChatGPT really acquiring traits ever more similar to ours? (Oct 2023)

Mirage News. Making AI work for everyone (Sept 2023)

RTS Forum. Can AIs understand humor? (May 2023)

RTS Infrarouge. Artificial intelligence: the great replacement? (Jan 2023)

Tribune de Genève. Artificial intelligence: Profession? Virtual sports journalist (Jan 2023)

Heidi.news. ChatGPT makes cheating easier: what if that were good news? (Jan 2023)

Communications of the ACM. The Best of NLP (April 2021)

My Research in the News

L’AGEFI. The pharma industry believes in the potential of artificial intelligence at every level (Dec 2023)

GGB. Meditron, EPFL’s new Large Language Model for medical knowledge (Dec 2023)

ICT Journal. Born at EPFL: an open-source LLM specialized in the medical field (Dec 2023)

RTS CQFD. EPFL: Meditron (Dec 2023)

Communications of the ACM. Seeking Artificial Common Sense (Nov 2020)

The Atlantic. The Easy Questions that Stump Computers (May 2020)

Quanta Magazine. Common Sense Comes Closer to Computers (April 2020)

New York Academy of Sciences. Can Researchers Create Commonsense Artificial Intelligence? (June 2019)

The Gradient. NLP’s generalization problem, and how researchers are tackling it (August 2018)

NLP Highlights Podcast. 54 - Simulating Action Dynamics with Neural Process Networks, with Antoine Bosselut (March 2018)

Joining EPFL NLP

If you’re interested in joining the EPFL NLP group, please read the following:

I am…

Looking for a postdoctoral position: Feel free to contact me about potential postdoctoral positions. Also, check out these opportunities for fully funded postdoctoral positions for which I can serve as a co-advisor:
Horizon Europe Swiss Postdoctoral Fellowships
EPFLeaders4impact Postdoctoral Fellowships
Applying to the EPFL EDIC PhD program: I will be taking on new PhD students next year! Apply if you’re interested in joining EPFL to work with me. Before you can be considered for the NLP lab, however, you will have to be admitted to the EDIC program, which handles admissions centrally. Feel free to let me know if you apply, but unfortunately I can’t conduct pre-screenings until applications are in.
An EDIC fellow: I’m happy to supervise rotations provided our research interests align and there’s a good chance that the rotation will lead to a permanent position in the lab.
An EPFL Master’s student: I’m happy to supervise Master’s projects and theses every semester! If you’re interested in doing a project with EPFL NLP, send an e-mail to:
nlp-projects-apply@groupes.epfl.ch
Please attach your CV and transcript, and include [Masters Project] or [Masters Thesis] in your subject line. If you want a sense of what a project in our lab would be about, check out my research interests above or those of my lab members! If you would like to complete an industry PDM, please follow the guidelines presented here.
Looking for a summer internship: If you are a Bachelor’s or Master’s student at another university, please apply through the Summer@EPFL program. If you are looking for a PhD internship, contact me directly.