Logarithms Explained
A logarithm answers the question: to what exponent must I raise this base to get this number? Logarithms are the inverse of exponentiation, and they appear in pH scales, earthquake magnitudes, sound levels, compound interest, and throughout information theory.
1 What a Logarithm Is
log_b(x) = y means b^y = x. Read as "log base b of x equals y." The logarithm asks: to what power must I raise b to get x?
Example: log₂(8) = 3 because 2³ = 8. log₁₀(1000) = 3 because 10³ = 1000. log₅(25) = 2 because 5² = 25.
Exponentiation: base^exponent = result. Logarithm: log_base(result) = exponent. They're inverse operations. Just as subtraction undoes addition, logarithms undo exponentiation.
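A short Python sketch (standard library only, sample values chosen for illustration) makes the inverse relationship concrete:

```python
import math

# If b**y == x, then log base b of x is y.
# math.log(x, base) computes the logarithm of x in the given base.
x = 2 ** 10              # 1024
y = math.log(x, 2)       # to what power must 2 be raised to get 1024?
print(y)                 # ≈ 10

# Round trips in both directions (up to floating-point rounding):
print(math.log(5 ** 3, 5))    # ≈ 3, since 5**3 = 125
print(2 ** math.log(8, 2))    # ≈ 8, exponentiation undoes the log
```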
2 Log Notation and the Three Forms
Every logarithmic equation has an equivalent exponential form: log_b(x) = y and b^y = x say the same thing. For example, log₃(81) = 4 is equivalent to 3⁴ = 81, and converting between the two forms is often the first step in solving a log equation.
Common log: log(x) without a base means log₁₀(x). Used in pH, decibels, earthquake scales.
Natural log: ln(x) means log_e(x) where e ≈ 2.71828. Used in calculus, exponential growth, information theory.
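Python's math module has dedicated functions for both notations; a minimal sketch:

```python
import math

# Common log (base 10):
print(math.log10(1000))       # ≈ 3
print(math.log10(0.001))      # ≈ -3

# Natural log (base e) is the default for math.log:
print(math.log(math.e))       # ln(e) ≈ 1
print(math.log(1))            # ln(1) = 0
print(math.exp(math.log(42.0)))  # e**(ln x) recovers x, ≈ 42
```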
3 Logarithm Rules
Product rule: log_b(xy) = log_b(x) + log_b(y). Quotient rule: log_b(x/y) = log_b(x) − log_b(y). Power rule: log_b(xⁿ) = n·log_b(x). Change of base: log_b(x) = log_k(x)/log_k(b) for any base k, which is how calculators compute logs in arbitrary bases.
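Each of the standard rules (product, quotient, power, change of base) can be checked numerically; a sketch with arbitrary sample values:

```python
import math

a, b, n = 6.0, 2.0, 3.0

# Product rule: log(a*b) = log(a) + log(b)
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))
# Quotient rule: log(a/b) = log(a) - log(b)
assert math.isclose(math.log(a / b), math.log(a) - math.log(b))
# Power rule: log(a**n) = n * log(a)
assert math.isclose(math.log(a ** n), n * math.log(a))
# Change of base: log_b(a) = ln(a) / ln(b)
assert math.isclose(math.log(a, b), math.log(a) / math.log(b))
print("all four logarithm rules check out")
```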
4 Natural Log and e
e ≈ 2.71828 is Euler's number, the base of natural logarithms. It arises naturally (hence the name) in continuous growth and decay problems and in continuously compounded interest, and it is the unique base for which d/dx(eˣ) = eˣ.
ln(x) = log_e(x). Properties: ln(e) = 1, ln(1) = 0, ln(eˣ) = x, e^(ln x) = x.
The natural log is essential in calculus: ∫(1/x)dx = ln|x| + C. This is why e appears throughout advanced mathematics — it makes calculus clean.
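A quick numerical sketch (Python standard library) shows e emerging from continuous compounding as the limit of (1 + 1/n)ⁿ, and ln and exp acting as inverses:

```python
import math

# Continuous compounding: (1 + 1/n)**n approaches e as n grows.
for n in (1, 10, 1000, 1_000_000):
    print(n, (1 + 1 / n) ** n)   # climbs toward e ≈ 2.71828...

# ln and e**x undo each other (up to floating-point rounding):
print(math.exp(math.log(7.5)))   # ≈ 7.5
print(math.log(math.exp(2.0)))   # ≈ 2.0
```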
5 Real-World Applications
pH scale: pH = −log₁₀[H⁺]. The negative log compresses the enormous range of hydrogen ion concentrations (from 1 down to 10⁻¹⁴ mol/L) into the familiar 0-14 scale. Each pH unit is a factor of 10 in concentration.
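The pH formula is a one-liner; a sketch (the function name is chosen here for illustration):

```python
import math

def ph(h_concentration):
    """pH = -log10 of the hydrogen ion concentration in mol/L."""
    return -math.log10(h_concentration)

print(ph(1e-7))   # pure water: ≈ 7
print(ph(1e-2))   # strong acid: ≈ 2
print(ph(1e-12))  # strong base: ≈ 12
```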
Earthquakes: the Richter scale is logarithmic. Each whole-number step is a factor of 10 in measured wave amplitude, so a magnitude 7 earthquake produces 10× the ground motion of a magnitude 6, and 100× that of a magnitude 5.
Decibels: sound intensity in dB = 10·log₁₀(I/I₀). Human hearing spans a factor of 10 trillion in intensity — logarithms compress this to 0-130 dB.
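The decibel formula can be sketched directly, assuming the conventional reference intensity I₀ = 10⁻¹² W/m² (the approximate threshold of human hearing):

```python
import math

I0 = 1e-12  # reference intensity in W/m^2 (assumed standard threshold of hearing)

def decibels(intensity):
    """Sound level in dB: 10 * log10(I / I0)."""
    return 10 * math.log10(intensity / I0)

print(decibels(1e-12))  # threshold of hearing: ≈ 0 dB
print(decibels(1e-5))   # roughly conversational range: ≈ 70 dB
print(decibels(10))     # top of the scale: ≈ 130 dB
```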
Information theory: entropy (information content) = −Σ p·log(p). Shannon's information entropy underlies data compression, cryptography, and machine learning.
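Shannon entropy in bits uses log base 2; a minimal sketch (function name chosen here, zero probabilities skipped by convention):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1 bit
print(entropy([0.25] * 4))   # fair 4-way choice: 2 bits
print(entropy([0.9, 0.1]))   # biased coin: less than 1 bit
```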