Optimization Seminar


Spring Semester 2016

Title: Slime Molds, Linear Programming and Sparse Recovery
Speaker, Affiliation: Prof. Dr. Nisheeth Vishnoi, EPF Lausanne, CH
Date, Time: 29 February 2016, 16:30-17:30
Location: HG G 19.1
Abstract: Inspired by the computational abilities of the slime mold, several researchers have proposed dynamical systems aimed at solving fundamental problems such as min-cost flow and, more generally, linear programming. In this talk, I will present some of our recent work on Physarum dynamics, proving that these dynamics work and, surprisingly, can be interpreted both as an interior point method in linear programming and as an iteratively reweighted least squares method in sparse recovery, which implies the convergence of these methods. Together, these results not only shed some light on how nature might be solving instances of perhaps the most complex problem in P, but also suggest new algorithmic approaches to them.
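As a concrete illustration of the Physarum dynamics described in the Vishnoi abstract, here is a minimal sketch of the discretized dynamics on a shortest-path instance, the simplest special case of min-cost flow. The instance, step size, and iteration count are my own illustrative choices, not material from the talk:

```python
import numpy as np

# Toy instance: shortest 0 -> 2 path. The two-edge path 0-1-2 has total
# length 2, shorter than the direct edge 0-2 of length 3.
edges = [(0, 1), (1, 2), (0, 2)]
lengths = np.array([1.0, 1.0, 3.0])
n, m = 3, len(edges)

# Signed incidence matrix B (n x m): +1 at the tail, -1 at the head.
B = np.zeros((n, m))
for j, (u, v) in enumerate(edges):
    B[u, j], B[v, j] = 1.0, -1.0

b = np.zeros(n)
b[0], b[2] = 1.0, -1.0   # unit demand from node 0 to node 2
x = np.ones(m)           # Physarum "mass" on each edge
h = 0.5                  # discretization step

for _ in range(200):
    C = np.diag(x / lengths)                   # conductances c_e = x_e / l_e
    L = B @ C @ B.T                            # graph Laplacian
    p = np.linalg.lstsq(L, b, rcond=None)[0]   # node potentials (L is singular)
    q = C @ B.T @ p                            # induced electrical flow
    x = (1 - h) * x + h * np.abs(q)            # discretized update dx/dt = |q| - x

print(np.round(x, 3))  # mass concentrates on the shortest path's edges
```

Each step solves an electrical-flow (Laplacian) system and moves the edge masses toward the absolute flow values; the mass on the longer direct edge dies out, leaving the indicator of the shortest path.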
Title: Describing Integer Points in Polyhedra
Speaker, Affiliation: Prof. Dr. Volker Kaibel, Otto von Guericke Universität, Magdeburg, Germany
Date, Time: 7 March 2016, 16:30-17:30
Location: HG G 19.1
Abstract: Linear mixed-integer models are fundamental in treating combinatorial problems via Mathematical Programming. In this lecture we discuss how small such formulations can be made for different problems. It turns out that for several problems, including, e.g., the traveling salesman problem and the spanning tree problem, the use of additional variables is essential for the design of polynomial-size integer programming formulations. In fact, we prove that their standard exponential-size formulations are asymptotically minimal among the formulations based on incidence vectors only. We also treat bounds for general sets of 0/1-points and briefly discuss the role of rationality of coefficients in formulations. (Joint work with Stefan Weltge)
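To make the role of additional variables in the Kaibel abstract concrete, the following toy comparison (my own illustration, not material from the talk) counts constraints in the standard TSP formulation over incidence variables only, which needs one subtour-elimination constraint per vertex subset, against the compact Miller-Tucker-Zemlin (MTZ) formulation, which adds $n$ ordering variables:

```python
from itertools import combinations

def subtour_constraints(n):
    # Incidence-vector formulation: one subtour-elimination constraint per
    # vertex subset S with 2 <= |S| <= n - 1 (exponentially many).
    return sum(1 for k in range(2, n) for _ in combinations(range(n), k))

def mtz_constraints(n):
    # MTZ extended formulation: u_i - u_j + n * x_ij <= n - 1 for all
    # ordered pairs of non-depot nodes i != j, i.e. (n - 1)(n - 2) constraints.
    return (n - 1) * (n - 2)

for n in [8, 12, 16, 20]:
    print(n, subtour_constraints(n), mtz_constraints(n))
```

Already at $n = 20$ the subset-based count exceeds a million constraints while the extended formulation stays quadratic, which is the size gap the abstract's minimality result addresses.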
Title: Sparse Fourier Transform in $O(k\log N)$ Samples and Sublinear Time
Speaker, Affiliation: Prof. Dr. Michael Kapralov, EPF Lausanne
Date, Time: 17 May 2016, 17:15-18:00
Location: HG G 19.2
Abstract: We give an algorithm for computing the Sparse Fourier Transform of a length-$N$ signal in dimension $d$ using $O_d(k\log N)$ measurements of the input signal and $k\log^{O(d)} N$ runtime. This matches the optimal sample complexity of a recent algorithm due to [IK14], which required $\Omega(N)$ runtime, while using only $k \log^{O(1)} N$ runtime in any fixed dimension. The result of [IK14] achieved the optimal sample complexity bound by structuring the iterative refinement process common in Sparse FFT algorithms so that different iterations can rely on the same set of measurements of the input signal, taken at the beginning of the algorithm. A crucial component of the approach was a powerful brute-force (i.e. $\Omega(N)$ time) decoding primitive that ensured that the execution path of the algorithm is dominated by a simple $\ell_\infty$ norm reduction process and that the updates to the signal were confined to a small set of coordinates independent of the measurements taken, allowing measurement reuse. A recent work of the author [Kap16] gave a refinement of these ideas (via $\ell_1$ as opposed to $\ell_\infty$ norm reduction) that allowed measurement reuse in sublinear time. However, the result of [Kap16] could not confine the updates to a fixed set of coordinates, requiring an $\Omega(\log\log N)$ loss in sample complexity to correct for this. In this work we remove this deficiency and achieve an optimal result. The main technical contribution is an analysis of a noisy recovery process that runs for a large number of iterations without requiring fresh randomness at any stage.
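A small sketch of the kind of primitive sparse FFT algorithms are built from (my own illustration under a strong assumption: a noiseless, exactly 1-sparse spectrum): a length-$N$ signal whose Fourier transform is a single spike can be decoded from just two time samples, which hashing and filtering techniques then extend to general $k$-sparse signals:

```python
import numpy as np

N = 1024
f_true, a = 137, 2.5  # hidden frequency and amplitude (illustrative values)

# Signal with a single spectral spike: x[t] = a * exp(2*pi*i * f * t / N).
x = lambda t: a * np.exp(2j * np.pi * f_true * t / N)

# The phase of x[1] / x[0] encodes the frequency; |x[0]| is the amplitude.
ratio = x(1) / x(0)
f_hat = int(round((np.angle(ratio) * N / (2 * np.pi)) % N))
a_hat = abs(x(0))
print(f_hat, a_hat)  # recovers 137 and 2.5 from two samples
```

The two-sample trick uses $O(1)$ measurements regardless of $N$; the algorithms in the abstract face the much harder task of making such decoding robust to noise and collisions while reusing one fixed set of measurements across iterations.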

Note: you can subscribe to the iCal/ics calendar.

Organizers: Robert Weismantel, Rico Zenklusen
