Mikael Johansson
Latest
Improved Step-Size Schedules for Proximal Noisy Gradient Methods
Eco-Fedsplit: Federated Learning with Error-Compensated Compression
A Flexible Framework for Communication-Efficient Machine Learning
Compressed Gradient Methods With Hessian-Aided Error Compensation
Convergence Bounds for Compressed Gradient Methods with Memory Based Error Compensation
Distributed learning with compressed gradients
Gradient compression for communication-limited convex optimization
The convergence of sparsified gradient methods
Mini-batch gradient descent: Faster convergence under data sparsity