Moore’s Law refers to the observation that the number of transistors on a microchip, a rough proxy for computing power, doubles approximately every two years while the cost per unit of computing falls. First articulated by Gordon Moore, who went on to co-found Intel, in 1965 and revised to its familiar two-year cadence in 1975, the principle has explained decades of rapid growth in digital technology, enabling exponential increases in processing speed, memory capacity, and efficiency.
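The doubling can be written as a simple exponential growth formula, a rough sketch assuming a fixed two-year doubling period and an initial transistor count N_0:

N(t) = N_0 \cdot 2^{t/2}

where t is the number of years elapsed. For example, a chip with 1 billion transistors would be projected to reach roughly 4 billion transistors after four years.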