What's Next?

With information theory mastered, you are ready for:

  • Machine Learning — cross-entropy loss, mutual information feature selection, variational autoencoders
  • Statistics — maximum likelihood, Bayesian inference, the connection between KL divergence and likelihood ratios
  • Cryptography — Shannon's perfect secrecy, one-time pads, and entropy in key generation
  • Signal Processing — source coding, channel coding, and the Shannon-Hartley theorem
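As a bridge to the machine-learning and statistics items above, here is a minimal sketch, using two made-up discrete distributions `p` and `q`, of the identity H(p, q) = H(p) + D_KL(p ‖ q) that connects cross-entropy loss to KL divergence:

```python
import math

# Hypothetical example distributions over the same three-outcome support.
p = [0.5, 0.25, 0.25]   # "true" distribution
q = [0.4, 0.4, 0.2]     # model's distribution

# Shannon entropy of p, in bits.
entropy = -sum(pi * math.log2(pi) for pi in p)

# Cross-entropy of q relative to p: expected code length when
# encoding samples from p with a code optimized for q.
cross_entropy = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

# KL divergence: the extra bits paid for using q instead of p.
kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

# The identity linking the quantities: H(p, q) = H(p) + D_KL(p || q)
assert abs(cross_entropy - (entropy + kl)) < 1e-12
```

Minimizing cross-entropy loss with respect to the model `q` is therefore equivalent to minimizing D_KL(p ‖ q), since H(p) is fixed by the data.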