My research broadly focuses on communications and information processing for networks. Particular areas of interest include network information theory, coding, and statistical inference and learning. A list of my publications can be found here. If you are interested in a Ph.D. or postdoctoral position at Cambridge, please read this.


Here is a sample of my work:

Optimal Source and Channel Coding via Sparse Linear Regression.

Codes based on high-dimensional linear regression were recently introduced for communication over Gaussian channels. These codes achieve rates approaching the channel capacity with computationally efficient encoding and decoding. In recent work, we have used the sparse regression framework to design codes for data compression. These codes are the first to attain the optimal compression rate (the rate-distortion function) for Gaussian sources with low-complexity encoding and decoding algorithms. We have also shown that the source and channel codes constructed above can be combined to yield fast, rate-efficient codes for a variety of network problems.
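
As a rough illustration of the code structure (a toy sketch with made-up parameters, not the exact construction or power allocation analyzed in the papers): a sparse regression codeword has the form x = Aβ, where A is an n × ML design matrix and β has L sections of M entries each, with a single nonzero entry per section, so a message is simply a choice of one column from each section.

    import numpy as np

    # Toy sketch of a sparse regression (SPARC) codeword x = A @ beta with a
    # random Gaussian design matrix A and a flat power allocation.  The
    # parameter values are illustrative only.
    n, M, L = 256, 32, 16           # block length, columns per section, sections
    rate = L * np.log2(M) / n       # rate in bits per channel use

    rng = np.random.default_rng(0)
    A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, M * L))   # design matrix

    # A message picks one column in each of the L sections; the nonzero values
    # are scaled so the codeword meets an average power constraint P.
    P = 1.0
    message = rng.integers(0, M, size=L)       # L symbols, each in {0, ..., M-1}
    beta = np.zeros(M * L)
    beta[np.arange(L) * M + message] = np.sqrt(n * P / L)

    x = A @ beta                               # transmitted codeword
    print(f"rate = {rate:.2f} bits/use, empirical power = {np.mean(x ** 2):.2f}")

For channel coding, decoding then amounts to a sparse regression problem: estimate the section-wise sparse β from a noisy observation of Aβ.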


  • "Sparse Regression Codes: Recent Results and Future Directions", Proc. 2013 IEEE Information Theory Workshop (invited). [PDF] [Slides]
  • "Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding"
    [PDF] [Slides from ISIT '12]
  • "Lossy Compression via Sparse Linear Regression: Computationally Efficient Encoding and Decoding" [PDF] [Slides from ITA '13]
  • "Sparse Regression Codes for Multi-terminal Source and Channel Coding", Allerton 2012. [PDF] [Slides]


Codes for Deletion and Insertion Models.

The problem of synchronization from insertions and deletions is important in several applications, such as file sharing, online editing, and distributed storage. My work in this area includes designing low-complexity codes to efficiently correct synchronization errors as well as computing fundamental limits, i.e., bounds on the capacity of channels that introduce deletions and insertions.
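
As a concrete toy model (my own illustration, not a scheme from the papers below): each transmitted bit may be deleted with some probability, and a spurious random bit may be inserted after it, so the receiver's copy is misaligned with the original, and the edit distance between the two strings is a rough proxy for how much information is needed to resynchronize.

    import random

    # Toy channel with deletions and insertions: each input bit is deleted with
    # probability p_del, and a uniformly random bit is inserted after it with
    # probability p_ins.  Function names and parameters are illustrative only.
    def deletion_insertion_channel(bits, p_del, p_ins, rng):
        out = []
        for b in bits:
            if rng.random() >= p_del:         # the bit survives the channel
                out.append(b)
            if rng.random() < p_ins:          # a spurious bit is inserted after it
                out.append(rng.randint(0, 1))
        return out

    def edit_distance(a, b):
        # Minimum number of insertions/deletions/substitutions turning a into b.
        d = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            prev, d[0] = d[0], i
            for j, cb in enumerate(b, 1):
                prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (ca != cb))
        return d[len(b)]

    rng = random.Random(0)
    x = [rng.randint(0, 1) for _ in range(1000)]
    y = deletion_insertion_channel(x, p_del=0.05, p_ins=0.02, rng=rng)
    print(len(x), len(y), edit_distance(x, y))

The quadratic-time edit-distance computation above is only a diagnostic; the goal of the codes and interactive algorithms in the papers below is to resynchronize with much less communication and computation.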


  • Slides from talk at ITW, Seville, Sep. 2013.
  • Slides from talk at the Banff Workshop on Interactive Information Theory, Jan. 2012.
  • "Achievable Rates for Channels with Deletions and Insertions", IEEE Transactions on Information Theory, vol. 59, no.11, pp. 6990-7013, November 2013. [PDF]
  • "Efficient Interactive Algorithms for File Synchronization under General Edits", Proc. 51st Allerton Conf. on Communication, Control, and Computing , 2013. [PDF]
  • "Interactive Low-Complexity Codes for Synchronization from Deletions and Insertions", Proc. 48th Annual Allerton Conference on Communication, Control, and Computing, Sep. 2010. [PDF]


Feedback & Feed-forward.

Feedback is an important resource available in both wireless and wired networks, but how to exploit it in a multi-terminal setting is not well understood. Broadly speaking, feedback induces correlation between the distributed transmitters and receivers in the network. We have developed coding schemes that effectively leverage this correlation in multiple-access and broadcast settings. These schemes significantly improve on the best-known rates for these channels. In my Ph.D. thesis, I also investigated the role of feed-forward in lossy data compression, for both point-to-point and multi-terminal models.
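
A toy simulation of the correlation-inducing effect (illustrative only, not one of the achievable-rate schemes in the papers below): on a two-user Gaussian multiple-access channel Y = X1 + X2 + Z with noiseless output feedback, if each sender simply adds a scaled copy of the previous channel output to its next input, the two inputs become correlated even though the senders never observe each other directly.

    import numpy as np

    # Two-user Gaussian MAC Y = X1 + X2 + Z with noiseless output feedback.
    # Each sender adds alpha times the previous output to its fresh input; the
    # shared feedback term correlates X1 and X2.  Purely illustrative strategy.
    rng = np.random.default_rng(0)
    n, alpha = 100_000, 0.3

    m1 = rng.normal(size=n)          # fresh message-bearing inputs
    m2 = rng.normal(size=n)
    z = rng.normal(size=n)           # channel noise

    x1, x2 = np.empty(n), np.empty(n)
    y_prev = 0.0
    for t in range(n):
        x1[t] = m1[t] + alpha * y_prev   # both senders reuse the fed-back output
        x2[t] = m2[t] + alpha * y_prev
        y_prev = x1[t] + x2[t] + z[t]

    print("empirical correlation between X1 and X2:", np.corrcoef(x1, x2)[0, 1])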


  • Slides from talk at HP Labs, Palo Alto, March 2010.
  • Slides from talk at ISIT, Austin, July 2010.
  • Slides from thesis defense. (Source coding with feed-forward)
  • "An Achievable Rate Region for the Broadcast Channel with Feedback", IEEE Transactions on Information Theory, vol. 59, no.10, pp. 6175-6191, October 2013. [PDF]
  • "A New Achievable Rate Region for the Discrete Memoryless Multiple-Access Channel with Feedback", IEEE Transactions on Information Theory, vol. 57, pp. 8038-8054, December 2011. [PDF]
  • "Source coding with feedforward: Rate-distortion theorems and error exponents for a general source", IEEE Transactions on Information Theory, vol. 53, pp. 2154-2179, June 2007. [PDF]
  • "Achievable rates for multiple descriptions with feed-forward", IEEE Transactions on Information Theory, vol. 57, pp. 2270-2277, April 2011. [PDF]


Codes for Data Storage.

Many non-volatile memory technologies, such as phase-change memory and flash, use 'rewrites': while storing data, a memory cell can be written multiple times until a desirable output is obtained. Since rewrites consume extra power and shorten the lifetime of the memory, there is a basic trade-off: what is the storage capacity of the memory subject to a fixed rewrite budget? Further, how do we design codes that optimally exploit the rewrite option? In collaboration with researchers from the Memory Technologies group at IBM, we have developed efficient coding schemes for some basic rewritable channel models.
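
A toy sketch of the trade-off (assuming a simple uniform-noise write model; the hidden-state models in the papers below are more refined): writing level x stores x + U with U uniform on [0, a], and rewriting until the stored value lands in a target interval of width w < a lets levels be spaced about w apart instead of a apart, gaining roughly log2(a/w) bits per cell at the cost of about a/w writes on average.

    import random

    # Toy rewrite policy for a uniform-noise write model: writing level x stores
    # x + U with U ~ Uniform[0, a].  Rewriting until the stored value falls in a
    # target bin of width w < a gives finer levels (more bits per cell) at the
    # cost of roughly a / w writes per cell.  Purely illustrative.
    def write_with_rewrites(target_lo, w, a, rng):
        writes = 0
        while True:
            writes += 1
            stored = target_lo + rng.uniform(0.0, a)   # one write attempt
            if stored < target_lo + w:                 # accept if inside the bin
                return stored, writes

    rng = random.Random(0)
    a, w, trials = 1.0, 0.25, 10_000
    avg = sum(write_with_rewrites(0.0, w, a, rng)[1] for _ in range(trials)) / trials
    print(f"target bin width {w}: about {avg:.2f} writes per cell on average")

Shrinking the bin width w increases the number of distinguishable levels, and hence the storage rate, but the expected number of rewrites grows as a/w, which is the capacity-versus-rewrite-budget tension described above.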


  • "Rewritable storage channels with hidden state", IEEE JSAC, May 2014 [PDF]
  • "Coding Strategies for the uniform noise rewritable channel with hidden state", ISIT 2012. [PDF] [Slides]