1961: Expensive progress
As the microchip became more widely available, the United States Air Force began using it to build missiles, and NASA used it in the Apollo program. At this stage, a single microchip cost US$31.
1965: Moore’s Law
Gordon E. Moore, who would later co-found Intel, observed that the number of transistors on a microchip doubled roughly every two years, while the cost of computers was halved. This observation, which became known as Moore’s Law, suggested that computers would become cheaper even as their capabilities increased.
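As a rough, back-of-the-envelope illustration of what that doubling implies (a hypothetical sketch, not drawn from Moore’s paper; the function name and starting count are chosen here for illustration):

```python
# Illustration of Moore's Law: the transistor count doubles roughly
# every two years, so after t years the count grows by a factor of 2**(t/2).

def transistors_after(years: float, start_count: int = 2_300) -> int:
    """Project a transistor count forward, doubling every two years."""
    return round(start_count * 2 ** (years / 2))

# Starting from ~2,300 transistors (roughly the Intel 4004 of 1971),
# a decade of doubling every two years gives about a 32x increase.
for years in (2, 4, 6, 8, 10):
    print(years, transistors_after(years))
```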
1971: Lower costs with mass supply chain production
Six years on, Moore’s prediction was borne out. Thanks to the American government’s investment, mass production drove the cost of a single microchip down to US$1.25.
“It was the government that created the large demand that facilitated mass production of the microchip,” explained Fred Kaplan, author of 1959: The Year Everything Changed.
1986: Managing costs with the Semiconductor Agreement
Moore had not, however, accounted for how competing international interests and trade wars would affect microchip manufacturing. The Semiconductor Agreement between the USA and Japan fixed manufacturing prices so that competition along the supply chain did not get out of hand.
1998: First microchip is implanted into a human
The first microchip-to-human implant took place at the end of the 20th century. Professor Kevin Warwick, Cybernetics Director at the University of Reading, became the first human in history to have a microchip implanted into his body.
After one week the microchip was removed. While it was in place, Warwick said, smart-card-activated doors opened for him and lights blinked on around him.

