My research focuses on communications and information processing for networks. Particular areas of interest include information theory, coding, compression, and statistical inference. A list of my publications can be found here.

Here is a sample of my work:

Optimal Source and Channel Coding via Sparse Linear Regression.

Codes based on high-dimensional linear regression were recently introduced for communication over channels with additive Gaussian noise. These codes achieve rates approaching the channel capacity with computationally efficient encoding and decoding. In recent work, we have used the sparse regression framework to design codes for lossy data compression. These codes attain the optimal compression rate (the rate-distortion function) for Gaussian sources with low-complexity encoding and decoding algorithms. We have also shown that these source and channel codes can be combined to yield fast, rate-efficient codes for several canonical models in network information theory.
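
For concreteness, here is a minimal sketch of the sparse regression codebook (Python, with toy parameter values; an illustration only, not the construction analyzed in the papers below). A codeword is x = Aβ, where A is an n × LM Gaussian design matrix and β has L sections of length M with exactly one non-zero entry per section, so the message is the choice of non-zero location in each section.

    import numpy as np

    # Toy SPARC parameters (illustrative values only).
    n, L, M = 64, 8, 16            # codeword length, number of sections, section size
    rate = L * np.log2(M) / n      # rate in bits per channel use

    rng = np.random.default_rng(0)
    A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, L * M))   # Gaussian design matrix

    def encode(message, power=1.0):
        """Map a message (one column index per section) to the codeword x = A @ beta."""
        beta = np.zeros(L * M)
        for section, idx in enumerate(message):
            beta[section * M + idx] = np.sqrt(n * power / L)   # one non-zero per section
        return A @ beta

    message = rng.integers(0, M, size=L)     # L indices, one per section
    x = encode(message)
    print(round(rate, 2), "bits/channel use; empirical power:", round(float(np.mean(x**2)), 2))

Channel decoding and compression encoding then amount to searching for the sparse β whose codeword Aβ best matches the received (or source) sequence; the papers below give low-complexity algorithms for that step.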


  • Slides from SPCOM talk, IISc Bangalore, July 2014.
  • "Sparse Regression Codes: Recent Results and Future Directions", Proc. 2013 IEEE Information Theory Workshop (invited). [PDF] [Slides]
  • "Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding", IEEE Transactions on Information Theory, vol. 60, no. 6, pp. 3254-3264, June 2014. [PDF] [Slides from ISIT '12]
  • "Lossy Compression via Sparse Linear Regression: Computationally Efficient Encoding and Decoding", IEEE Transactions on Information Theory, vol. 60, no. 6, pp. 3265-3278, June 2014. [PDF] [Slides from ITA '13]
  • "Sparse Regression Codes for Multi-terminal Source and Channel Coding", Allerton 2012. [PDF] [Slides]


Codes for Deletion and Insertion Models.

The problem of synchronization from insertions and deletions is important in several applications, such as file sharing, online editing, and distributed storage. My work in this area includes a) designing low-complexity codes to efficiently correct synchronization errors, and b) characterizing fundamental limits, i.e., bounds on the capacity of channels that introduce deletions and insertions.
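
The basic setup can be illustrated with a toy model (Python sketch, with made-up deletion and insertion probabilities): node A holds a string x, node B holds a copy y that has passed through a channel which deletes and inserts symbols at random, and the goal is for B to reconstruct x using as little communication or redundancy as possible.

    import numpy as np

    rng = np.random.default_rng(1)

    def deletion_insertion_channel(x, p_del=0.1, p_ins=0.05):
        """Toy i.i.d. model: each bit of x is deleted w.p. p_del; before each bit,
        a uniformly random bit is inserted w.p. p_ins."""
        y = []
        for bit in x:
            if rng.random() < p_ins:
                y.append(rng.integers(0, 2))   # insertion of a random bit
            if rng.random() >= p_del:          # bit survives deletion
                y.append(bit)
        return np.array(y)

    x = rng.integers(0, 2, size=1000)          # file held at node A
    y = deletion_insertion_channel(x)          # desynchronized copy held at node B
    print(len(x), len(y))                      # lengths differ: B must resynchronize
                                               # using a small amount of help from A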


  • Slides from talk at the Banff Workshop on Interactive Information Theory, Jan. 2012.
  • Slides from talk at ITW, Seville, Sep. 2013.
  • "Low-Complexity Interactive Algorithms for Synchronization from Deletions, Insertions, and Substitutions" [PDF]
  • "Achievable Rates for Channels with Deletions and Insertions", IEEE Transactions on Information Theory, vol. 59, no.11, pp. 6990-7013, November 2013. [PDF]


Feedback & Feed-forward.

Feedback is an important resource that is often available in both wireless and wired communication, but the question of how to optimally use feedback in a multi-terminal setting is not well understood. Broadly speaking, feedback induces correlation between the distributed transmitters and receivers in the network. We have developed coding schemes that effectively harness this correlation for multiple-access and broadcast channels with feedback; these schemes significantly improve on the best-known rates for these channels. In my Ph.D. thesis, I also investigated the role of feed-forward in lossy data compression, for both point-to-point and multi-terminal models.
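
A classical numerical example of this effect is the two-user Gaussian multiple-access channel with noiseless feedback (Ozarow's region). The sketch below is only an illustration of how feedback-induced correlation raises achievable rates, not the discrete memoryless schemes studied in the papers below: it compares the no-feedback sum capacity with the sum rate achievable when feedback is used to correlate the two transmitters.

    import numpy as np

    def sum_capacity_no_feedback(P1, P2):
        # Sum capacity of the two-user Gaussian MAC (unit noise variance), no feedback.
        return 0.5 * np.log2(1 + P1 + P2)

    def sum_rate_with_feedback(P1, P2):
        # Ozarow's feedback region: for transmitter correlation rho,
        #   R1 <= 0.5*log2(1 + P1*(1 - rho^2)),  R2 <= 0.5*log2(1 + P2*(1 - rho^2)),
        #   R1 + R2 <= 0.5*log2(1 + P1 + P2 + 2*rho*sqrt(P1*P2)).
        # Maximize the sum rate over rho by a simple grid search.
        best = 0.0
        for rho in np.linspace(0, 1, 10001):
            individual = (0.5 * np.log2(1 + P1 * (1 - rho**2))
                          + 0.5 * np.log2(1 + P2 * (1 - rho**2)))
            joint = 0.5 * np.log2(1 + P1 + P2 + 2 * rho * np.sqrt(P1 * P2))
            best = max(best, min(individual, joint))
        return best

    P1 = P2 = 10.0
    print(sum_capacity_no_feedback(P1, P2))   # about 2.20 bits/channel use
    print(sum_rate_with_feedback(P1, P2))     # strictly larger with feedback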


  • Slides from talk at HP Labs, Palo Alto, March 2010.
  • Slides from talk at ISIT, Austin, July 2010.
  • Slides from thesis defense. (Source coding with feed-forward)
  • "An Achievable Rate Region for the Broadcast Channel with Feedback", IEEE Transactions on Information Theory, vol. 59, no.10, pp. 6175-6191, October 2013. [PDF]
  • "A New Achievable Rate Region for the Discrete Memoryless Multiple-Access Channel with Feedback", IEEE Transactions on Information Theory, vol. 57, pp. 8038-8054, December 2011. [PDF]
  • "Source coding with feedforward: Rate-distortion theorems and error exponents for a general source", IEEE Transactions on Information Theory, vol. 53, pp. 2154-2179, June 2007. [PDF]
  • "Achievable rates for multiple descriptions with feed-forward", IEEE Transactions on Information Theory, vol. 57, pp. 2270-2277, April 2011. [PDF]


Codes for Data Storage.

Many non-volatile memory technologies, such as Phase Change Media and Flash, allow rewrites: a memory cell can be written multiple times until a desirable output value is obtained. Since rewrites consume extra power and shorten the lifetime of the memory, there is a basic trade-off: what is the storage capacity of the memory subject to a fixed rewrite budget? Further, how do we design codes that optimally exploit the rewrite option? In collaboration with researchers from the Memory Technologies group at IBM, we have developed efficient coding schemes for some basic rewritable channel models.
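
As a toy illustration of the trade-off (a hypothetical uniform-noise write model with a naive per-cell rewrite policy, not the coding schemes from the papers below): suppose writing level x leaves the cell at a value uniform on [x, x + a]. Finer target intervals store more bits per cell, but need more rewrites on average before the read-back value lands in the target.

    import numpy as np

    rng = np.random.default_rng(2)

    def write_cell(target_interval, a=0.5, max_writes=50):
        """Toy rewritable cell: writing level x yields a value uniform on [x, x+a).
        Rewrite (aiming at the left edge of the target interval) until the read-back
        value lands inside the target; return the number of writes used."""
        lo, hi = target_interval
        for writes in range(1, max_writes + 1):
            y = lo + a * rng.random()          # write level x = lo, read back y
            if y < hi:                         # success: y is inside [lo, hi)
                return writes
        return max_writes

    def average_writes(bits_per_cell, a=0.5, trials=2000):
        """Store bits_per_cell bits by targeting one of 2^bits equal sub-intervals."""
        levels = 2 ** bits_per_cell
        width = a / levels                     # finer targets <=> higher storage rate
        counts = []
        for _ in range(trials):
            lo = rng.integers(0, levels) * width
            counts.append(write_cell((lo, lo + width), a=a))
        return np.mean(counts)

    for b in range(1, 5):
        print(b, "bits/cell -> average writes about", round(average_writes(b), 2))

With this naive policy the expected number of writes grows as 2^b for b bits per cell, which is exactly the kind of cost that coding across cells is designed to reduce.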


  • "Rewritable storage channels with hidden state", IEEE JSAC, vol. 32, no. 5, pp. 815-824, May 2014. [PDF]
  • "Coding Strategies for the uniform noise rewritable channel with hidden state", ISIT 2012. [PDF] [Slides]