OPT2019
We welcome you to participate in the 11th OPT Workshop on Optimization for Machine Learning. This year's OPT workshop will be run as an independent event, co-located with NeurIPS in Vancouver. It will take place on December 14th, concurrently with the NeurIPS workshops, making it easy for attendees to move between the two venues.
We are looking forward to an exciting OPT 2019!
Location: Exchange Hotel Vancouver
Call for Participation
Important Dates
- Deadline for submission of papers: September 30, 2019, Anywhere on Earth (October 1, 5:00 AM Pacific time)
- Notification of acceptance: October 24, 2019 (TBC)
- Camera-ready papers due: December 1, 2019
- Workshop date: December 14, 2019
Invited Talks
- John Duchi (Stanford University)
- Swati Gupta (Georgia Tech)
- Mert Gürbüzbalaban (Rutgers University)
- Yin-Tat Lee (University of Washington)
- Mengdi Wang (Princeton University)
Overview
Optimization lies at the heart of many machine learning algorithms and attracts great interest in our community. Indeed, this close relationship between optimization and ML is the key motivation for the OPT series of workshops. We aim to foster discussion, discovery, and dissemination of state-of-the-art research in optimization relevant to ML.
We invite participation in the 11th International Workshop on "Optimization for Machine Learning", to be held as an independent event, co-located with NeurIPS. We invite high-quality submissions for presentation as spotlight talks or posters during the workshop. We are especially interested in participants who can contribute theory, algorithms, applications, or implementations with a machine learning focus.
All accepted contributions will be listed on the workshop webpage (though there are no archival proceedings) and are expected to be presented as posters during the workshop. A few submissions will additionally be selected for contributed talks or short spotlight presentations.
The main topics include, but are not limited to:
- Nonconvex Optimization:
  - Local and global optimality theory
  - The role of overparameterization
  - Architecture-dependent optimization techniques
  - The interface of generalization and optimization
  - Convex-concave decompositions, D.C. programming
  - Approximation algorithms
  - Other topics in nonconvex optimization
- Stochastic, Parallel and Online Optimization:
  - Large-scale learning, massive data sets
  - Distributed and decentralized algorithms
  - Distributed optimization algorithms and parallel architectures
  - Optimization using GPUs, streaming algorithms
  - Decomposition for large-scale, message-passing, and online optimization
  - Stochastic approximation
- Algorithms and Techniques (application oriented):
  - Global and Lipschitz optimization
  - Algorithms for nonsmooth optimization
  - Linear and higher-order relaxations
  - Polyhedral combinatorics applications to ML problems
- Combinatorial Optimization:
  - Optimization in graphical models
  - Structure learning
  - MAP estimation in continuous and discrete random fields
  - Clustering and graph partitioning
  - Semi-supervised and multiple-instance learning
  - Other discrete optimization models and algorithms
- Other optimization techniques:
  - Hashing-based optimization, sketching techniques
  - Optimization in statistics, statistical/computational tradeoffs
  - Optimization on manifolds, metric spaces; optimal transport
  - Polynomials, sums-of-squares, moment problems
  - Optimization techniques for reinforcement learning
  - Numerical optimization
- Optimization software:
  - Integration with deep learning software, accelerator hardware, and systems
  - Crucial implementation details (architecture, language, etc.)
Submission Instructions:
- Submission website: CMT
- Page limit: 4 pages (excluding references)
- Submission format: No special style is required for submissions. Authors may use the NeurIPS style file, but are also free to use other styles as long as they use a standard font size (11 pt), a single column, and 1-inch margins. Formatting instructions for accepted papers will follow.
- Supplementary material: may be included, but should be limited to a reasonable amount (at most 20 pages in addition to the main submission). Reviewers are not required to read the supplementary material, so the paper should be self-contained.
- The submission must be sufficiently anonymized for double-blind reviewing.
- Dual submissions:
- We discourage dual submission to concurrent NeurIPS workshops; please choose the workshop best suited to your submission.
- Extended abstracts of papers under review at other conferences/journals can be submitted provided the conference/journal in question permits it (if in doubt, please check with them first). Accepted papers will be posted on the webpage, but the workshop does not have archival proceedings.
Submission Instructions for Camera-Ready Version:
- Submission website: CMT
- Style file: the OPT2019 style file must be used.
- Page limit: 5 pages (excluding references and supplementary material).
- Camera-Ready deadline: December 1, 2019
Please note that at least one author of each accepted paper must be available to present the paper at the workshop.
Looking forward to another great OPT workshop!
Organizing Committee:
- Alexandre d’Aspremont (ENS, INRIA)
- Mark Schmidt (UBC)
- Suvrit Sra (MIT)
- Sebastian Stich (EPFL)
with senior advisory support from Arkadi Nemirovskii.