ICML2020

For the full directory of accepted papers, see ICML2020-Accepted List.pdf.

In that PDF, papers highlighted in YELLOW/ORANGE are included in this repository.

Papers highlighted in PINK are interesting papers that are not available online.


List of papers by topic (without replacement, so each paper appears only under its most relevant topic):

Decentralized SGD

  • A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
  • AdaScale SGD: A Scale-Invariant Algorithm for Distributed Training
  • Improving the Sample and Communication Complexity for Decentralized Non-Convex Optimization - A Joint Gradient Estimation and Tracking Approach
  • Is Local SGD Better than Minibatch SGD?
  • On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent
  • On the Noisy Gradient Descent that Generalizes as SGD
  • The Complexity of Finding Stationary Points with Stochastic Gradient Descent

Attacks

  • Min-Max Optimization without Gradients - Convergence and Applications to Adversarial ML
  • Second-Order Provable Defenses against Adversarial Attacks
  • Zeno++

Variance Reduction

  • Almost Tune-Free Variance Reduction

Acceleration

  • Acceleration for Compressed Gradient Descent in Distributed Optimization
  • Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization
  • Momentum Improves Normalized SGD
  • Statistically Preconditioned Accelerated Gradient Method for Distributed Optimization
  • Universal Average-Case Optimality of Polyak Momentum

Federated Learning

  • Communication-Efficient Federated Learning with Sketching
  • Federated Learning with Only Positive Labels
  • From Local SGD to Local Fixed Point Methods for Federated Learning
  • SCAFFOLD: Stochastic Controlled Averaging for Federated Learning

High-Dimensional Statistics

  • High-Dimensional Robust Mean Estimation via Gradient Descent
  • Safe Screening Rules for L0-regression
  • Spectral Graph Matching and Regularized Quadratic Relaxation I - The Gaussian Model
  • WONDER: Weighted One-shot Distributed Ridge Regression in High Dimensions
