Emanuele La Malfa

Orcid: 0000-0002-6254-0470

According to our database, Emanuele La Malfa authored at least 14 papers between 2020 and 2024.

Bibliography

2024
Language-Models-as-a-Service: Overview of a New Paradigm and its Challenges.
J. Artif. Intell. Res., 2024

A Notion of Complexity for Theory of Mind via Discrete World Models.
CoRR, 2024

Code Simulation Challenges for Large Language Models.
CoRR, 2024

Deep Neural Networks via Complex Network Theory: A Perspective.
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence, 2024

Graph-enhanced Large Language Models in Asynchronous Plan Reasoning.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

2023
The ARRT of Language-Models-as-a-Service: Overview of a New Paradigm and its Challenges.
CoRR, 2023

Language Model Tokenizers Introduce Unfairness Between Languages.
Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

2022
Emergent Linguistic Structures in Neural Networks are Fragile.
CoRR, 2022

Deep Neural Networks as Complex Networks.
CoRR, 2022

The King Is Naked: On the Notion of Robustness for Natural Language Processing.
Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, 2022

2021
Distilling Financial Models by Symbolic Regression.
Machine Learning, Optimization, and Data Science, 2021

On Guaranteed Optimal Robust Explanations for NLP Models.
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, 2021

Characterizing Learning Dynamics of Deep Neural Networks via Complex Networks.
Proceedings of the 33rd IEEE International Conference on Tools with Artificial Intelligence, 2021

2020
Assessing Robustness of Text Classification through Maximal Safe Radius Computation.
Findings of the Association for Computational Linguistics: EMNLP 2020, 2020
