Dissipation of Information in Channels with Input Constraints
Date: 2014-06-25
Time: 3:30 p.m.
Location: Holmes Hall 389
Speaker: Yury Polyanskiy, Assistant Professor, Electrical Engineering and Computer Science, MIT
One of the basic tenets of information theory, the data processing inequality, states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates on the amount of such contraction are known, e.g., Dobrushin's coefficient for the total variation distance. This work investigates channels with an average input cost constraint. It is found that while the contraction coefficient typically equals one, the information nevertheless dissipates. A certain non-linear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed. Applications in stochastic control, uniqueness of Gibbs measures, and noisy circuits are discussed. Based on joint work with Y. Wu (UIUC).
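As a minimal numerical sketch of the unconstrained case mentioned in the abstract, the snippet below checks the total variation contraction of a binary symmetric channel, for which Dobrushin's coefficient is known to be |1 − 2ε|. The channel, input distributions, and crossover probability are illustrative choices, not taken from the talk.

```python
import numpy as np

def tv(p, q):
    """Total variation distance between two probability vectors."""
    return 0.5 * np.abs(p - q).sum()

# Binary symmetric channel with crossover probability eps,
# written as a row-stochastic matrix W[x, y] = P(Y = y | X = x).
eps = 0.1
W = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

# Two arbitrary input distributions on {0, 1}.
P = np.array([0.9, 0.1])
Q = np.array([0.2, 0.8])

# Data processing inequality with contraction:
#   TV(PW, QW) <= eta * TV(P, Q),
# where eta = |1 - 2*eps| is Dobrushin's coefficient for the BSC.
eta = abs(1 - 2 * eps)
contraction = tv(P @ W, Q @ W) / tv(P, Q)
# For the BSC the ratio equals eta exactly, for any pair of inputs.
```

For a 2-input channel this bound is tight, which is why the output divergence strictly shrinks whenever ε > 0; the talk concerns the subtler situation where input constraints force the analogous coefficient to one.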
Yury Polyanskiy is an assistant professor of Electrical Engineering and Computer Science and a member of LIDS at MIT. Yury received the M.S. degree in applied mathematics and physics from the Moscow Institute of Physics and Technology, Moscow, Russia, in 2005 and the Ph.D. degree in electrical engineering from Princeton University, Princeton, NJ, in 2010. From 2000 to 2005 he led the development of embedded software in the Department of Surface Oilfield Equipment, Borets Company LLC (Moscow). Currently, his research focuses on basic questions in information theory, error-correcting codes, wireless communication, and fault-tolerant and defect-tolerant circuits. Dr. Polyanskiy won the 2013 NSF CAREER award and the 2011 IEEE Information Theory Society Paper Award. In 2012 he was selected to hold the Robert J. Shillman (1974) Career Development Professorship of EECS.