André Uschmajew

According to our database, André Uschmajew authored at least 34 papers between 2010 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
On the approximation of vector-valued functions by volume sampling.
J. Complex., 2025

2024
Time-Varying Semidefinite Programming: Path Following a Burer-Monteiro Factorization.
SIAM J. Optim., March, 2024

2023
Kronecker Product Approximation of Operators in Spectral Norm via Alternating SDP.
SIAM J. Matrix Anal. Appl., December, 2023

Riemannian thresholding methods for row-sparse and low-rank matrix recovery.
Numer. Algorithms, June, 2023

Local convergence of alternating low-rank optimization methods with overrelaxation.
Numer. Linear Algebra Appl., May, 2023

Dynamical low-rank tensor approximations to high-dimensional parabolic problems: existence and convergence of spatial discretizations.
CoRR, 2023

Gauss-Southwell type descent methods for low-rank matrix optimization.
CoRR, 2023

On the approximation of vector-valued functions by samples.
CoRR, 2023

Dynamical low-rank approximation of the Vlasov-Poisson equation with piecewise linear spatial boundary.
CoRR, 2023

2022
A note on overrelaxation in the Sinkhorn algorithm.
Optim. Lett., 2022

A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions.
J. Optim. Theory Appl., 2022

Editorial: High-performance tensor computations in scientific computing and data science.
Frontiers Appl. Math. Stat., 2022

2021
Computing Eigenspaces With Low Rank Constraints.
SIAM J. Sci. Comput., 2021

Existence of dynamical low-rank approximations to parabolic problems.
Math. Comput., 2021

2020
Tensor Networks for Latent Variable Analysis: Novel Algorithms for Tensor Train Approximation.
IEEE Trans. Neural Networks Learn. Syst., 2020

Chebyshev Polynomials and Best Rank-one Approximation Ratio.
SIAM J. Matrix Anal. Appl., 2020

Existence of dynamical low-rank approximations to parabolic problems.
CoRR, 2020

2019
A Gradient Sampling Method on Algebraic Varieties and Application to Nonsmooth Low-Rank Optimization.
SIAM J. Optim., 2019

2018
Alternating Least Squares as Moving Subspace Correction.
SIAM J. Numer. Anal., 2018

On Orthogonal Tensors and Best Rank-One Approximation Ratio.
SIAM J. Matrix Anal. Appl., 2018

2017
A Riemannian Gradient Sampling Algorithm for Nonsmooth Optimization on Manifolds.
SIAM J. Optim., 2017

Perturbation of Higher-Order Singular Values.
SIAM J. Appl. Algebra Geom., 2017

On the interconnection between the higher-order singular values of real tensors.
Numerische Mathematik, 2017

Finding a low-rank basis in a matrix subspace.
Math. Program., 2017

2016
Parallel algorithms for tensor completion in the CP format.
Parallel Comput., 2016

Tensor Networks and Hierarchical Tensors for the Solution of High-Dimensional Partial Differential Equations.
Found. Comput. Math., 2016

Tensor Networks for Latent Variable Analysis. Part I: Algorithms for Tensor Train Decomposition.
CoRR, 2016

2015
Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality.
SIAM J. Optim., 2015

On Convergence of the Maximum Block Improvement Method.
SIAM J. Optim., 2015

2014
Low-Rank Tensor Methods with Subspace Correction for Symmetric Eigenvalue Problems.
SIAM J. Sci. Comput., 2014

Approximation rates for the hierarchical tensor format in periodic Sobolev spaces.
J. Complex., 2014

2013
On Local Convergence of Alternating Schemes for Optimization of Convex Problems in the Tensor Train Format.
SIAM J. Numer. Anal., 2013

2012
Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation.
SIAM J. Matrix Anal. Appl., 2012

2010
Well-posedness of convex maximization problems on Stiefel manifolds and orthogonal tensor product approximations.
Numerische Mathematik, 2010
