Ethan Wilcox

According to our database, Ethan Wilcox authored at least 34 papers between 2018 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Elements of World Knowledge (EWOK): A cognition-inspired framework for evaluating basic world knowledge in language models.
CoRR, 2024

[Call for Papers] The 2nd BabyLM Challenge: Sample-efficient pretraining on a developmentally plausible corpus.
CoRR, 2024

Surprise! Uniform Information Density Isn't the Whole Story: Predicting Surprisal Contours in Long-form Discourse.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

On the Role of Context in Reading Time Prediction.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

Reverse-Engineering the Reader.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

2023
Exhaustivity and Anti-Exhaustivity in the RSA Framework: Testing the Effect of Prior Beliefs.
Cogn. Sci., May 2023

Testing the Predictions of Surprisal Theory in 11 Languages.
Trans. Assoc. Comput. Linguistics, 2023

On the Effect of Anticipation on Reading Times.
Trans. Assoc. Comput. Linguistics, 2023

WhisBERT: Multimodal Text-Audio Language Modeling on 100M Words.
CoRR, 2023

Call for Papers - The BabyLM Challenge: Sample-efficient pretraining on a developmentally plausible corpus.
CoRR, 2023

Controlled Text Generation with Natural Language Instructions.
Proceedings of the International Conference on Machine Learning, 2023

Quantifying the redundancy between prosody and text.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

Language Model Quality Correlates with Psychometric Predictive Power in Multiple Languages.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

Revisiting the Optimality of Word Lengths.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

On the Efficacy of Sampling Adapters.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023

2022
Evidence for Availability Effects on Speaker Choice in the Russian Comparative Alternation.
Proceedings of the 44th Annual Meeting of the Cognitive Science Society, 2022

2021
A Targeted Assessment of Incremental Processing in Neural Language Models and Humans.
CoRR, 2021

Using the Interpolated Maze Task to Assess Incremental Processing in English Relative Clauses.
Proceedings of the 43rd Annual Meeting of the Cognitive Science Society, 2021

A Targeted Assessment of Incremental Processing in Neural Language Models and Humans.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

2020
Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

On the Predictive Power of Neural Language Models for Human Real-Time Comprehension Behavior.
Proceedings of the 42nd Annual Meeting of the Cognitive Science Society, 2020

Investigating Novel Verb Learning in BERT: Selectional Preference Classes and Alternation-Based Syntactic Generalization.
Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, 2020

A Systematic Assessment of Syntactic Generalization in Neural Language Models.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

SyntaxGym: An Online Platform for Targeted Evaluation of Language Models.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2020

2019
Structural Supervision Improves Learning of Non-Local Grammatical Dependencies.
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019

Neural language models as psycholinguistic subjects: Representations of syntactic state.
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019

Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study.
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019

The Role of Prior Beliefs in The Rational Speech Act Model of Pragmatics: Exhaustivity as a Case Study.
Proceedings of the 41st Annual Meeting of the Cognitive Science Society, 2019

What Syntactic Structures block Dependencies in RNN Language Models?
Proceedings of the 41st Annual Meeting of the Cognitive Science Society, 2019

Testing Gender Markedness of Nouns with a Self-Paced Reading Study.
Proceedings of the 41st Annual Meeting of the Cognitive Science Society, 2019

Phonological Cues to Syntactic Structure in a Large-Scale Corpus.
Proceedings of the 41st Annual Meeting of the Cognitive Science Society, 2019

Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations.
Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 2019

2018
RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency.
CoRR, 2018

What do RNN Language Models Learn about Filler-Gap Dependencies?
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 2018
