Victor Quétu

Orcid: 0009-0004-2795-3749

According to our database, Victor Quétu authored at least 13 papers between 2023 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of five.

Timeline

[Timeline chart: publications per year, 2023–2025, by type (book, in proceedings, article, PhD thesis, dataset, other)]


Bibliography

2025
LayerFold: A Python library to reduce the depth of neural networks.
SoftwareX, 2025

2024
Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers.
CoRR, 2024

Memory-Optimized Once-For-All Network.
CoRR, 2024

LaCoOT: Layer Collapse through Optimal Transport.
CoRR, 2024

NEPENTHE: Entropy-Based Pruning as a Neural Network Depth's Reducer.
CoRR, 2024

The Simpler The Better: An Entropy-Based Importance Metric to Reduce Neural Networks' Depth.
Proceedings of the Machine Learning and Knowledge Discovery in Databases. Research Track, 2024

DSD²: Can We Dodge Sparse Double Descent and Compress the Neural Network Worry-Free?
Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence, 2024

2023
Disentangling private classes through regularization.
Neurocomputing, October, 2023

Dodging the Sparse Double Descent.
CoRR, 2023

The Quest of Finding the Antidote to Sparse Double Descent.
Proceedings of the Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2023

Dodging the Double Descent in Deep Neural Networks.
Proceedings of the IEEE International Conference on Image Processing, 2023

Sparse Double Descent in Vision Transformers: Real or Phantom Threat?
Proceedings of the Image Analysis and Processing - ICIAP 2023, 2023

Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?
Proceedings of the IEEE/CVF International Conference on Computer Vision, 2023
