Topics

Stochastic optimization / Operations research / Convex optimization / Stochastic gradient descent / Stochastic approximation / Least squares / Regularization / Subgradient method / Mathematical optimization / Numerical analysis / Mathematical analysis


Dual Averaging Method for Regularized Stochastic Learning and Online Optimization
Lin Xiao, Microsoft Research, Redmond, WA

Document Date: 2013-03-25 14:07:10



File Size: 499.88 KB


City

Washington DC / Banff

Company

SIAM Journal / Neural Information Processing Systems / Cambridge University Press / MIT Press / Microsoft

Country

Canada


Facility

Catholic University of Louvain

IndustryTerm

large-scale online learning / sparse online learning / regularized online optimization / online optimization / online prediction / online algorithms / stochastic gradient descent algorithms / iterative shrinkage-thresholding algorithm / sparse solutions / closed-form solution / batch optimization


Organization

Cambridge University / MIT / Catholic University of Louvain / Technion / Center for Operations Research and Econometrics

Person

Lin Xiao / Y. Singer / T. Chandra

ProvinceOrState

Alberta / Massachusetts

PublishedMedium

Machine Learning / Journal of Machine Learning Research

Technology

machine learning / online algorithms / stochastic gradient descent algorithms / iterative shrinkage-thresholding algorithm
