History of Computers: Microchips and microprocessors

Computers used to fill up a room – and were less powerful than your mobile phone. Find out how they slimmed down.

The circuits of the earliest electronic computers were put together like the electrics in your house, with individual components joined together by wires. This way of building computers severely limited how many components could fit in a given space, which stopped computers from becoming smaller or more powerful.

This changed with the invention of the integrated circuit, also known as the microchip. An integrated circuit builds the components and the connections between them directly onto a single piece of semiconductor material. It’s almost like printing the entire circuit onto the chip.

This advance led to the development of the microprocessor. A microprocessor is a single microchip which contains the entire central processing unit (CPU) – the part of the computer that actually carries out the instructions contained in a program.

By removing the need for separate, wired-together components, the microprocessor made it possible to build much smaller computers. Since its creation, improvements in manufacturing technology have made it possible to put ever more components onto an integrated circuit at ever lower cost. This is described by an idea called Moore’s Law, which says that the number of transistors that can be put cheaply on a chip doubles every two years. This trend has held since the integrated circuit was invented in 1958.
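To give a rough sense of what “doubles every two years” means in practice, the short Python sketch below projects transistor counts forward from an assumed starting point. The starting year, starting count and sample years are illustrative assumptions, not figures from this article.

```python
# Rough illustration of Moore's Law: a quantity that doubles every two years.
# The starting year and transistor count below are assumptions chosen for
# illustration, not data from the article.

def projected_transistors(year, start_year=1971, start_count=2300, doubling_years=2):
    """Project a transistor count assuming it doubles every `doubling_years` years."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"about {projected_transistors(year):,.0f} transistors")
```

Even from a modest assumed starting count of a few thousand, repeated doubling reaches millions of transistors within a couple of decades and billions within a few more, which is why this trend has had such a dramatic effect on computing.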
