Selected publications of Yair Lakretz
- Théo Desbordes, Yair Lakretz, Valérie Chanoine, Maxime Oquab, Jean-Michel Badier, Agnès Trébuchon, Romain Carron, Christian-G Bénar, Stanislas Dehaene, and Jean-Rémi King. Dimensionality and ramping: Signatures of sentence integration in the dynamics of brains and deep language models. Journal of Neuroscience, 2023.
- Alexandre Pasquiou, Yair Lakretz, Bertrand Thirion, and Christophe Pallier. Information-Restricted Neural Language Models Reveal Different Brain Regions' Sensitivity to Semantics, Syntax and Context. arXiv preprint arXiv:2302.14389, 2023.
- Stanislas Dehaene, Fosca Al Roumi, Yair Lakretz, Samuel Planton, and Mathias Sablé-Meyer. Symbols and mental programs: a hypothesis about human singularity. Trends in Cognitive Sciences, 2022.
- Christos-Nikolaos Zacharopoulos, Stanislas Dehaene, and Yair Lakretz. Disentangling Hierarchical and Sequential Computations during Sentence Processing. bioRxiv, 2022.
- Yair Lakretz, Théo Desbordes, Dieuwke Hupkes, and Stanislas Dehaene. Causal transformers perform below chance on recursive nested constructions, unlike humans. arXiv preprint arXiv:2110.07240, 2021.
- Yair Lakretz, Théo Desbordes, Jean-Rémi King, Benoît Crabbé, Maxime Oquab, and Stanislas Dehaene. Can RNNs learn Recursive Nested Subject-Verb Agreements? arXiv preprint arXiv:2101.02258, 2021.
- Yair Lakretz, Dieuwke Hupkes, Alessandra Vergallito, Marco Marelli, Marco Baroni, and Stanislas Dehaene. Mechanisms for handling nested dependencies in neural-network language models and humans. Cognition, 104699, 2021.
- Yair Lakretz, Stanislas Dehaene, and Jean-Rémi King. What Limits Our Capacity to Process Nested Long-Range Dependencies in Sentence Comprehension? Entropy, 22(4):446, 2020.
- Yair Lakretz, Dieuwke Hupkes, Alessandra Vergallito, Marco Marelli, Marco Baroni, and Stanislas Dehaene. Exploring processing of nested dependencies in neural-network language models and humans. arXiv preprint arXiv:2006.11098, 2020.
- Oscar Woolnough, Cristian Donos, Patrick S Rollo, Kiefer J Forseth, Yair Lakretz, Nathan E Crone, Simon Fischer-Baum, Stanislas Dehaene, and Nitin Tandon. Spatiotemporal dynamics of orthographic and lexical processing in the ventral visual pathway. Nature Human Behaviour, pages 1-10, 2020.
- Yair Lakretz, German Kruszewski, Théo Desbordes, Dieuwke Hupkes, Stanislas Dehaene, and Marco Baroni. The emergence of number and syntax units in LSTM language models. arXiv preprint arXiv:1903.07435, 2019.
- Yair Lakretz, Théo Desbordes, Dieuwke Hupkes, and Stanislas Dehaene. Can Transformers Process Recursive Nested Constructions, Like Humans? In Proceedings of the 29th International Conference on Computational Linguistics, pages 3226-3232, 2022.
- Yair Lakretz, German Kruszewski, Théo Desbordes, Dieuwke Hupkes, Stanislas Dehaene, and Marco Baroni. The emergence of number and syntax units in LSTM language models. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 11-20, Minneapolis, Minnesota, June 2019. Association for Computational Linguistics.
Disclaimer: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by the authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
Note that this is not an exhaustive list of publications, but only a selection. Contact the individual authors for complete lists of references.
This document was translated from BibTeX by bibtex2html.