A microprocessor is a programmable digital electronic component that incorporates the functions of a central processing unit (CPU) on a single semiconductor integrated circuit (IC). The first microprocessors emerged when designers reduced the CPU's word size from 32 bits to 4 bits, so that the transistors of its logic circuits would fit onto a single part. One or more microprocessors typically serve as the CPU in a computer system, embedded system, or handheld device.

Microprocessors made possible the advent of the microcomputer in the mid-1970s. Before this period, electronic CPUs were typically built from bulky discrete switching devices (and later from small-scale integrated circuits), each containing the equivalent of only a few transistors. By integrating the processor onto one or a very few large-scale integrated circuit packages (containing the equivalent of thousands or millions of discrete transistors), the cost of processing power was greatly reduced. Since its advent in the mid-1970s, the microprocessor has become the most prevalent implementation of the CPU, nearly completely replacing all other forms. See History of computing hardware for pre-electronic and early electronic computers.
The evolution of microprocessors has largely followed Moore's law, which suggests that the complexity of an integrated circuit, with respect to minimum component cost, doubles roughly every 24 months. This prediction has generally held true since the early 1970s. From their humble beginnings as the drivers for calculators, the continued increase in power has led to the dominance of microprocessors over every other form of computer; every system from the largest mainframes to the smallest handheld computers now uses a microprocessor at its core.
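The doubling relationship described above can be illustrated with a short numerical sketch. The snippet below is only an approximation of the trend, not a statement of any actual product roadmap: it assumes a strict 24-month doubling period and uses the Intel 4004's roughly 2,300 transistors (1971) as a baseline.

```python
# Rough illustration of Moore's law as a doubling of IC complexity every 24 months.
# Baseline assumption: the Intel 4004 (1971) with approximately 2,300 transistors.
BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2.0

def projected_transistors(year: int) -> float:
    """Project transistor count under a strict 24-month doubling."""
    elapsed_years = year - BASE_YEAR
    return BASE_TRANSISTORS * 2 ** (elapsed_years / DOUBLING_PERIOD_YEARS)

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Running this shows the order-of-magnitude growth implied by the trend: a few thousand transistors in the early 1970s grows to tens of millions by the turn of the century, which is broadly consistent with the history sketched in this section.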