Junze Yin

According to our database, Junze Yin authored at least 18 papers between 2023 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Inverting the Leverage Score Gradient: An Efficient Approximate Newton Method.
CoRR, 2024

Conv-Basis: A New Paradigm for Efficient Attention Inference and Gradient Computation in Transformers.
CoRR, 2024

How to Inverting the Leverage Score Distribution?
CoRR, 2024

Low Rank Matrix Completion via Robust Alternating Minimization in Nearly Linear Time.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

Solving Attention Kernel Regression Problem via Pre-conditioner.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2024

Fast Dynamic Sampling for Determinantal Point Processes.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2024

2023
Local Convergence of Approximate Newton Method for Two Layer Nonlinear Regression.
CoRR, 2023

Revisiting Quantum Algorithms for Linear Regressions: Quadratic Speedups without Data-Dependent Parameters.
CoRR, 2023

The Expressibility of Polynomial based Attention Scheme.
CoRR, 2023

A Unified Scheme of ResNet and Softmax.
CoRR, 2023

A Fast Optimization View: Reformulating Single Layer Attention in LLM Based on Tensor and SVM Trick, and Solving It in Matrix Multiplication Time.
CoRR, 2023

GradientCoin: A Peer-to-Peer Decentralized Large Language Models.
CoRR, 2023

Efficient Alternating Minimization with Applications to Weighted Low Rank Approximation.
CoRR, 2023

Query Complexity of Active Learning for Function Family With Nearly Orthogonal Basis.
CoRR, 2023

Faster Robust Tensor Power Method for Arbitrary Order.
CoRR, 2023

Federated Empirical Risk Minimization via Second-Order Method.
CoRR, 2023

An Iterative Algorithm for Rescaled Hyperbolic Functions Regression.
CoRR, 2023

A Nearly-Optimal Bound for Fast Regression with ℓ∞ Guarantee.
Proceedings of the International Conference on Machine Learning, 2023
