I am a Reader in Information Theory in the Probability and Statistics Group in the School of Mathematics at Bristol University. I help run the Centre for Doctoral Training in Communications, which offers 10 fully funded PhD places per year. I was previously a Max Newman Research Fellow at the Statistical Laboratory of Cambridge University, a Clayton Fellow of Christ's College, and an Associate Director of the Heilbronn Institute for Mathematical Research.
Much of my research applies Information Theory to understand limit theorems in probability. Recently this has involved the thinning of discrete random variables, including Poisson and compound Poisson variables, studied in terms of maximum entropy, monotonicity and approximation. I also research interference alignment in interference networks, studying both its sum capacity and its practicality.
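To give a flavour of the thinning operation mentioned above: the α-thinning of a count N keeps each of the N points independently with probability α, and a classical fact is that thinning a Poisson(λ) variable gives a Poisson(αλ) variable. The sketch below checks this by simulation; the helper name `thin` and the parameter values are purely illustrative, not taken from my papers.

```python
import numpy as np

def thin(counts, alpha, rng):
    """Alpha-thinning: each of the `counts` units survives independently w.p. alpha."""
    return rng.binomial(counts, alpha)

rng = np.random.default_rng(0)
lam, alpha, n = 5.0, 0.4, 200_000

N = rng.poisson(lam, size=n)        # samples of N ~ Poisson(lam)
thinned = thin(N, alpha, rng)       # alpha-thinned samples

# The thinned variable should look Poisson(alpha * lam): its empirical
# mean and variance should both be close to alpha * lam = 2.0.
print(thinned.mean(), thinned.var())
```

With a large sample, both printed values land close to αλ, consistent with the thinned distribution being Poisson(αλ).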
I have published a book based on my earlier research, entitled Information Theory and the Central Limit Theorem. Here is a list of known errata. I have also published on the Poincaré constant, match lengths, quantum data compression and the entropy power inequality. My preprints and academic links are available on this site.