In this module, we will study information entropy. Information entropy provides a way to quantify randomness mathematically: the higher the randomness, the greater the information entropy, and when the outcome is deterministic, that is, when there is no randomness, the information entropy is zero. An important application of entropy is to quantify the strength of the key, the secret information that serves as the input driving a cryptosystem. This is not the only use of information entropy; entropy is also useful in data compression and in other security contexts, such as anomaly detection in network traffic analysis and in software executables. But our main use of entropy in this course will be to quantify the strength of the key, or the secret, that drives the cryptosystem.

In such a cryptosystem, the key, or the secret, provides an information asymmetry between the key holders, the legitimate parties, and the non-key holders, including the attackers. This information asymmetry also affects the computation: with the key, decryption becomes deterministic, which corresponds to zero entropy, but without the key, decryption is random. In such a system, information entropy can be used to quantify the difficulty the attacker faces in deriving the key.

In this module, you will define randomness and describe the difference between deterministic and random processes. You will review examples of randomness and entropy. And you will mathematically compute the information entropy of a given random distribution and describe how entropy is a function of the number of possible outcomes, or the alphabet size, and of the occurrence distribution.
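The entropy computation described above, Shannon entropy H = -Σ p·log2(p) over the outcome probabilities p, can be sketched in Python. This is a minimal illustration (the function name `shannon_entropy` is ours, not from the course material); it shows that a deterministic outcome has zero entropy and that a uniform distribution maximizes entropy for a given alphabet size:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Zero-probability outcomes are skipped, since the p*log2(p)
    term tends to 0 as p approaches 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A deterministic outcome (no randomness) has zero entropy.
print(shannon_entropy([1.0]))        # 0.0

# A fair coin: maximum entropy for two outcomes is 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is less random, so its entropy is below 1 bit.
print(shannon_entropy([0.9, 0.1]))

# Uniform distribution over 256 outcomes (e.g., one random byte):
# entropy is log2(256) = 8 bits.
print(shannon_entropy([1 / 256] * 256))   # 8.0
```

Note how the result depends on both factors named above: the alphabet size (more possible outcomes allow more entropy) and the occurrence distribution (the further from uniform, the lower the entropy).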