We are delighted to have with us four distinguished plenary speakers:
- Olgica Milenkovic, Associate Professor, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
- Stephen J. Wright, Professor, Computer Sciences Department, University of Wisconsin-Madison
- Yannis Paschalidis, Professor, College of Engineering, Boston University
- Dean Foster, Professor, Statistics Department, University of Pennsylvania (currently at Amazon.com)
Olgica Milenkovic, Associate Professor, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign.
Coding Techniques for Emerging DNA-Based Storage Systems
Abstract.
Despite the many advances in traditional data recording techniques, the surge of Big Data platforms
and energy conservation concerns have imposed new challenges on the storage community: identifying
extremely high-volume, non-volatile, and durable recording media. The potential of macromolecules
for ultra-dense storage was recognized as early as the 1960s, when the celebrated physicist Richard
Feynman outlined his vision for nanotechnology in the talk "There's Plenty of Room at the Bottom."
Among known macromolecules, DNA is unique insofar as it lends itself to implementations of
non-volatile recording media of outstanding integrity (one can still recover the DNA of species
extinct for more than 70,000 years) and extremely high storage capacity (a human cell, with a mass
of roughly 3 pg, hosts DNA encoding about 6.4 GB of information).
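As a back-of-envelope reading of the capacity figure above (the counting convention is an assumption on my part, not stated in the abstract): a diploid human cell carries roughly 6.4 billion nucleotides, so at one byte per nucleotide one arrives at the quoted 6.4 GB, while the information-theoretic content at log2(4) = 2 bits per nucleotide is a quarter of that:

```python
# Illustrative arithmetic only; the 6.4-billion-nucleotide figure and the
# byte-per-base reading of the abstract's 6.4 GB are assumptions.
diploid_bases = 6.4e9   # ~6.4 billion nucleotides in a diploid human cell
bits_per_base = 2       # four symbols (A, C, G, T) -> log2(4) = 2 bits

info_gb = diploid_bases * bits_per_base / 8 / 1e9   # 2 bits per base -> 1.6 GB
bytes_gb = diploid_bases / 1e9                      # 1 byte per base -> 6.4 GB
print(info_gb, bytes_gb)
```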
Biography.
Olgica Milenkovic is a professor of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign (UIUC) and a Research Professor at the Coordinated Science Laboratory. She obtained her Master's degree in Mathematics in 2001 and her PhD in Electrical Engineering in 2002, both from the University of Michigan, Ann Arbor. Prof. Milenkovic heads a group focused on addressing interdisciplinary research challenges spanning algorithm design and computing, bioinformatics, coding theory, machine learning, and signal processing. Her scholarly contributions have been recognized by multiple awards, including the NSF Faculty Early Career Development (CAREER) Award, the DARPA Young Faculty Award, the Dean's Excellence in Research Award, and several best paper awards. In 2013, she was elected a UIUC Center for Advanced Study Associate and Willett Scholar. In 2015, she became a Distinguished Lecturer of the IEEE Information Theory Society. From 2007 until 2015, she served as Associate Editor of the IEEE Transactions on Communications, Transactions on Signal Processing, and Transactions on Information Theory. In 2009, she was the Guest Editor-in-Chief of a special issue of the IEEE Transactions on Information Theory on Molecular Biology and Neuroscience.
Stephen J. Wright, Professor, Computer Sciences Department, University of Wisconsin-Madison.
Fundamental Optimization Methods in Data Analysis
Abstract.
Optimization formulations and algorithms are vital tools for solving problems in data analysis. There has been particular interest in some fundamental, elementary optimization algorithms that were previously thought to have only niche appeal. Stochastic gradient, coordinate descent, and accelerated first-order methods are three examples. We outline applications in which these approaches are useful, discuss their basic properties, and survey some recent developments in the analysis of their convergence behavior.
Biography.
Stephen J. Wright is the Amar and Balinder Sohi Professor of Computer Sciences at the University of Wisconsin-Madison. His research is on computational optimization and its applications to many areas of science and engineering. Prior to joining UW-Madison in 2001, Wright was a Senior Computer Scientist at Argonne National Laboratory (1990-2001) and a Professor of Computer Science at the University of Chicago (2000-2001). He has served as Chair of the Mathematical Optimization Society and as a Trustee of the Society for Industrial and Applied Mathematics (SIAM), and he is a Fellow of SIAM. In 2014, he won the W.R.G. Baker Award from the IEEE. Wright is the author or coauthor of widely used text/reference books in optimization, including "Primal-Dual Interior-Point Methods" (SIAM, 1997) and "Numerical Optimization" (2nd edition, Springer, 2006, with J. Nocedal). He has published widely on optimization theory, algorithms, software, and applications. Wright is Editor-in-Chief of the SIAM Journal on Optimization and has served as editor-in-chief or associate editor of Mathematical Programming (Series A), Mathematical Programming (Series B), SIAM Review, SIAM Journal on Scientific Computing, and several other journals and book series.
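To give a flavor of the first of the three methods named in the abstract, here is a minimal sketch of stochastic gradient descent on a least-squares problem (the data, step size, and iteration counts are illustrative choices, not material from the talk):

```python
import numpy as np

# Stochastic gradient descent for min_x (1/2n) * ||A x - b||^2:
# each update uses the gradient of a single randomly chosen term.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true                  # noiseless, so the minimizer is x_true

x = np.zeros(d)
step = 0.01                     # small constant step size
for epoch in range(200):
    for i in rng.permutation(n):            # one random pass over the rows
        grad_i = (A[i] @ x - b[i]) * A[i]   # gradient of the i-th summand
        x -= step * grad_i

print(np.linalg.norm(x - x_true))           # residual shrinks toward zero
```

Each update touches only one row of A, which is what makes the method attractive when n is very large.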
Yannis Paschalidis, Professor, College of Engineering, Boston University.
Predictive Health Analytics
Abstract.
In 2014, the United States spent $3 trillion on health care, equivalent to 17.2% of US GDP. About
one third of this amount ($971.8 billion) is attributed to hospital care. Evidently, even modest
efforts to prevent hospitalizations and/or streamline care in a hospital setting matter.
Biography.
Yannis Paschalidis is a Professor and Distinguished Faculty Fellow of Electrical and Computer
Engineering, Systems Engineering, and Biomedical Engineering at Boston University. He is the
Director of the Center for Information and Systems Engineering (CISE). He obtained a Diploma (1991)
from the National Technical University of Athens, Greece, and an M.S. (1993) and a Ph.D. (1996)
from the Massachusetts Institute of Technology (MIT), all in Electrical Engineering and Computer
Science. He has been at Boston University since 1996. His current research interests lie in the
fields of systems and control, networks, applied probability, optimization, operations research,
computational biology, medical informatics, and bioinformatics.
Dean Foster, Professor, Statistics Department, University of Pennsylvania (currently at Amazon.com).
Linear Methods for Large Data
Abstract.
Using random matrix theory, we now have some very easy-to-understand and fast methods for computing low-rank representations of matrices. I have been using these methods as a hammer to improve several statistical methods, and I'll discuss several of these in this talk. First, I'll show how these ideas can be used to speed up regression. Then I'll turn to using them to construct new linear features motivated by CCA (canonical correlation analysis). Finally, I'll use these methods to get a fast way of estimating a hidden Markov model (HMM).
Biography.
Dean has pioneered two areas in game theory: stochastic evolutionary game dynamics and calibrated
learning. In both cases he worked on the theory necessary to show convergence to equilibrium. The
calibrated learning strategies he developed grew out of his work on individual sequences. In his
work with Rakesh Vohra, he coined the notions of no-internal-regret learning and calibration. It is
these learning rules that can be shown to converge to correlated equilibrium.
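The random-matrix-based low-rank machinery mentioned in the abstract above can be sketched with a standard Gaussian range finder (a generic illustration of this family of methods, not the speaker's own implementation; function and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def randomized_low_rank(A, k, oversample=10):
    """Approximate rank-k SVD of A via a Gaussian sketch of its range."""
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)          # orthonormal basis for range(A @ Omega)
    B = Q.T @ A                             # small (k + oversample) x n matrix
    U_b, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_b)[:, :k], s[:k], Vt[:k, :]

# An exactly rank-5 matrix: the sketch should recover it almost perfectly.
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 300))
U, s, Vt = randomized_low_rank(A, k=5)
err = np.linalg.norm(A - U @ (s[:, None] * Vt)) / np.linalg.norm(A)
print(err)
```

The expensive operations are two passes over A (the sketch A @ Omega and the projection Q.T @ A); the SVD itself is done on a small matrix, which is what makes the approach fast for large data.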