Happy returns: the microchip, the integrated circuit used in almost all electronic gadgets and equipment nowadays, has turned 50 today.
The first working microchip was demonstrated by Jack Kilby of Texas Instruments on September 12, 1958. It consisted of a strip of germanium carrying a transistor and other components, all glued to a glass slide.
In fact, the story dates back to July 1958, when Kilby was a newly hired engineer at Texas Instruments and was not yet entitled to a summer vacation.
He therefore spent the summer working on a problem in circuit design commonly known as the "tyranny of numbers", and finally concluded that manufacturing the circuit components together in a single piece of semiconductor material could provide a solution.
On September 12, he presented his findings to the management of Texas Instruments: he showed them a piece of germanium wired to an oscilloscope, pressed a switch, and the oscilloscope displayed a continuous sine wave, proving that his integrated circuit worked and that he had solved the problem.
Kilby's rough device, measuring seven-sixteenths of an inch by one-sixteenth of an inch, revolutionised electronics, and the world.
The microchip virtually created the modern computer industry, and the Internet would be unthinkable without it. Modern communications, transport, medicine, manufacturing and commerce are all based on the remarkable processing power of microchips.
"Integrated circuits are so woven into our lives that it would be hard to imagine a world without them. The integrated circuit is the engine of the information age," Chief of Research at Gartner Jim Tully was quoted by 'The Times' as saying.


