
The tale of chip war: Part 4

Intel's revolutionaries: A new age of silicon

Mahmud Hossain

In 1968, student protests in Berkeley and communist unrest in Beijing shook the world. Meanwhile, the Palo Alto Times published a small but groundbreaking piece of news titled "Founders leave Fairchild: Build their electronics company."

Bob Noyce and Gordon Moore, fed up with Fairchild's corporate meddling and dissatisfied with limited stock options, decided to build something new. It wasn't a rebellion, but a way to shake up the system. They founded Intel, believing that the transistor would one day be the most widely manufactured product in history, and that trillions of microscopic elements would power the future. They envisioned a world where semiconductors would not only support society but also become its backbone, and where small cities like Palo Alto and Mountain View would become global technology hubs.

Within just two years, Intel had introduced its first major product: the DRAM (dynamic random access memory) chip. Until then, computer memory had relied on magnetic cores, tiny magnetic rings strung on a grid of wires that stored data in the direction of their magnetization. As demand for digital memory grew, the limitations of this old method became apparent.

That's when IBM engineer Robert Dennard came up with a solution. He proposed pairing a transistor with a capacitor: the capacitor would hold an electrical charge to represent a 1 or a 0 in a digital system. Because the capacitor leaks its charge and must be refreshed repeatedly, the memory was called 'dynamic'. The DRAM chip remains at the heart of computer memory to this day.
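To see why Dennard's design is called 'dynamic', here is a toy Python sketch of a single one-transistor, one-capacitor cell. The leak rate and read threshold are invented for illustration; real DRAM stores charge in tiny capacitors and must be refreshed continually, typically every few tens of milliseconds.

```python
# Toy model of Dennard's one-transistor, one-capacitor memory cell.
# The leak rate and threshold below are made up for illustration.

class DRAMCell:
    THRESHOLD = 0.5  # below this fraction of full charge, the cell reads as 0

    def __init__(self):
        self.charge = 0.0  # fraction of full capacitor charge

    def write(self, bit: int) -> None:
        # Writing a 1 charges the capacitor; writing a 0 drains it.
        self.charge = 1.0 if bit else 0.0

    def leak(self, steps: int = 1) -> None:
        # The capacitor gradually loses its charge over time.
        for _ in range(steps):
            self.charge *= 0.8

    def read(self) -> int:
        return 1 if self.charge > self.THRESHOLD else 0

    def refresh(self) -> None:
        # 'Dynamic' memory: periodically read the bit and rewrite it at full
        # strength, before the charge decays past the threshold.
        self.write(self.read())


cell = DRAMCell()
cell.write(1)
cell.leak(steps=2)   # charge has decayed to 0.64, still reads as 1
cell.refresh()       # restores full charge, preserving the bit
print(cell.read())   # -> 1
```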

The most attractive aspect of the DRAM chip was that it could be etched onto silicon rather than assembled by hand. Intel bet big on DRAM because it was small, fast, reliable, and cheap. They believed that, as Moore's Law predicted, transistor density would double roughly every two years, so Intel would always stay ahead of the competition.
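As a quick illustration of the bet Intel was making, the sketch below compounds that two-year doubling from an arbitrary starting point of 1,000 transistors in 1970; the figures it prints are simple arithmetic, not historical transistor counts.

```python
# Moore's Law as Intel bet on it: transistor density doubling roughly every
# two years. The 1,000-transistor starting point is an arbitrary round number
# chosen for illustration, not a historical figure.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count assuming one doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return round(start_count * 2 ** doublings)

for year in (1970, 1980, 1990, 2000):
    print(year, f"{projected_transistors(1000, 1970, year):,}")
# 1970 1,000
# 1980 32,000
# 1990 1,024,000
# 2000 32,768,000
```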

Unlike logic chips, DRAM chips did not have to be designed separately for each device. They were generic, meaning they could be used in anything from calculators to computers. That reduced production costs and established Intel's dominance in the sector.

But Bob Noyce took on a new challenge. In 1969, a Japanese calculator company asked Intel to design 12 custom chips for its new calculator, a job that would require 24,000 transistors. Noyce handed the task to an engineer named Ted Hoff, an expert in neural networks and computer architecture. Hoff saw a new possibility in the work: instead of making 12 custom chips, what if a single general-purpose chip could do any job through software? This idea became the beginning of another revolution.

From this idea was born the world's first microprocessor, the Intel 4004: a programmable computer on a single chip. Although other companies had previously built microprocessors in secret for military projects, Intel was the first to bring one to the commercial market. It could be used in all kinds of products, and it started the computing revolution. Intel was already the most advanced chip manufacturer in the world; now it had a product that could radically change every industry.
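The power of Hoff's insight is that the hardware stays fixed while the software changes. The toy interpreter below makes that concrete; its two-instruction 'machine' is invented for this example and has nothing to do with the 4004's real instruction set.

```python
# One general-purpose engine whose behaviour comes from software, not wiring.
# The tiny instruction set here is made up for illustration and is unrelated
# to the actual Intel 4004 architecture.

def run(program, x):
    """Execute a list of (opcode, operand) pairs against a single accumulator."""
    acc = x
    for op, arg in program:
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        else:
            raise ValueError(f"unknown opcode: {op}")
    return acc

# The same 'chip' does a different job depending on the program it is handed.
to_fahrenheit = [("MUL", 1.8), ("ADD", 32)]   # acts like a temperature converter
double_plus_three = [("MUL", 2), ("ADD", 3)]  # acts like a simple calculator

print(run(to_fahrenheit, 25))      # 77.0
print(run(double_plus_three, 10))  # 23
```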

Caltech professor Carver Mead understood better than anyone what this innovation would mean for the future. He was close to Gordon Moore and worked as a consultant to Intel. He coined the term 'Moore's Law' and became one of the leading thinkers of the digital age. In 1972, Mead predicted: "Within the next ten years, automation will be present in every aspect of our lives in some way." He envisioned phones, cars, and home appliances all carrying a microprocessor or small computer.

This was not just a technological revolution, but also a social and political transformation. It was clear that whoever created and controlled computing systems would become the powerhouses of the future. Knowledge of software and hardware would be the key to future dominance.

Humanity had moved beyond the industrial age and entered the digital age, relying on silicon chips and binary code. At that juncture, Intel engineers were not just technologists; they were revolutionaries.

(Adapted and abridged from Chapter 9 ('The Transistor Salesman') of Chris Miller's acclaimed book 'Chip War')
