Why are computers getting smaller (& smaller)?

Computers in the 1950s were literally the size of a house. Today, your vastly more powerful smartphone fits in your jeans pocket. How could anything shrink in size while growing in power? The big breakthrough was the invention of the transistor, a tiny device that controls electronic signals. Like a nerve cell in the human brain, a transistor works with other transistors to store and process information in computing devices (and other gadgets). Transistors replaced the much larger vacuum tubes of the 1950s and were later packed by the thousands onto silicon microchips. Many consider the transistor the greatest invention of the 20th century.

In 1965, Gordon E. Moore, who went on to co-found the Intel Corporation, predicted that the number of transistors that could fit on a microchip would double every two years. Known as Moore's Law, his prediction held true for decades. In 1971, computer makers could fit only about 4,000 transistors on a chip; by 2011, they could cram in over 2.5 billion. Today, engineers are searching for the transistor's successor.
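To see how quickly a two-year doubling adds up, here is a quick back-of-the-envelope check in Python. It starts from the article's 1971 figure of about 4,000 transistors and doubles it every two years out to 2011; the function name and the strict every-two-years schedule are illustrative assumptions, not something from the original prediction.

```python
# Back-of-the-envelope check of Moore's Law using the article's figures.
# Assumption: about 4,000 transistors in 1971, doubling exactly once
# every two years, projected forward to 2011.

def projected_transistors(start_year: int, start_count: int, target_year: int) -> int:
    """Project a transistor count forward, one doubling per two years."""
    doublings = (target_year - start_year) // 2  # 40 years -> 20 doublings
    return start_count * 2 ** doublings

count_2011 = projected_transistors(1971, 4_000, 2011)
print(f"Projected transistors on a chip in 2011: {count_2011:,}")
# Prints about 4.2 billion -- the same order of magnitude as the
# 2.5+ billion transistors the article says chips actually reached.
```

Twenty doublings multiply the starting count by over a million, which is why a figure in the thousands in 1971 lands in the billions forty years later.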


Picture Credit : Google