Sarit Khirirat
Latest
Improved Step-Size Schedules for Proximal Noisy Gradient Methods
First-Order Algorithms for Communication Efficient Distributed Learning
Eco-Fedsplit: Federated Learning with Error-Compensated Compression
Zeroth-order randomized subspace Newton methods
A Flexible Framework for Communication-Efficient Machine Learning
Compressed Gradient Methods With Hessian-Aided Error Compensation
Convergence Bounds for Compressed Gradient Methods with Memory Based Error Compensation
Distributed learning with compressed gradients
Gradient compression for communication-limited convex optimization
The convergence of sparsified gradient methods
Mini-batch gradient descent: Faster convergence under data sparsity