Intel: The Godfather of Modern Computers

At the heart of your phone, tablet and computer
lies the microprocessor, a tiny chip home to billions of transistors capable of processing
an immense amount of information. Without the microprocessor, modern technology
could not exist, which is why this week we’ll be looking at the company that started
it all, Intel. On December 23, 1947, after two years of restless labor at Bell
Laboratories, John Bardeen, Walter Brattain and William Shockley unveiled the transistor, their greatest invention. Shockley,
an entrepreneurial fellow, realized what a fortune he could make from this new technology. In 1956 he moved to the west coast, establishing
the first silicon device company in what came to be known as Silicon Valley. He couldn’t convince any of his former colleagues
at Bell Labs to leave with him and so he resorted to hiring fresh university graduates. In an ironic twist of fate, just one year
later eight of his brightest employees got together and left the company in the same
way that he had left Bell Laboratories. Under the patronage of industrialist Sherman
Fairchild, the “Traitorous Eight”, as they were called, founded Fairchild Semiconductor. Much to Shockley’s dismay, Fairchild became
one of the leaders of the industry while his own venture failed. In 1959 one of the original “Traitorous
Eight”, Robert Noyce, invented the monolithic integrated circuit, a practical refinement of the device Jack Kilby had demonstrated at Texas Instruments a year earlier. Like the transistor before it, the integrated
circuit was a technology with huge potential, and he knew that. In 1968 he left Fairchild to start his own
company and he was joined by his colleague and fellow ‘traitor’ Gordon Moore, who
had famously postulated Moore’s law. To fund their venture they went to Arthur
Rock, the acclaimed investor who had arranged their original deal with Sherman Fairchild
a decade earlier. With $3 million of initial capital and the
creative portmanteau of integrated electronics, Noyce and Moore founded Intel on July 18,
1968. Behind their venture was the ambitious plan
to build large-scale integrated semiconductor memories. Back then, they were ten times more expensive
than standard magnetic core memories, which were much slower and less efficient. Nine months after its creation, Intel had
developed its first product: the 3101 Schottky bipolar memory. It was one of the world’s first solid-state memory
device and it was capable of storing a whopping 64 bits. One year later, Intel became pioneers in dynamic
random access memory, or DRAM, by creating the first commercially available DRAM chip,
the 1103. Its success marked the beginning of the end
for magnetic memory and established DRAM as the main working memory of modern computers. Intel’s reputation grew quickly, and not
just in the United States. A Japanese calculator company called Busicom
had reached out to Intel in 1969 with a request to build integrated circuits for their calculators. While working on this project, Intel engineer
Ted Hoff figured out a way to build a central processing unit onto a single chip, a design that Federico Faggin and Stanley Mazor turned into working silicon. By cramming 2,300 transistors onto a one-eighth-
by one-sixth-inch chip, the tiny device matched the power of the ENIAC, a computer
the size of a room. Almost by accident, Intel had created the foundation
of modern computing, the microprocessor. They called it the 4004 and started selling
it in 1971. A year later, Intel unveiled the 8008, an
8-bit microprocessor. Intel’s first general-purpose microprocessor,
the 8080, came in 1974 and it essentially became the industry standard, finding its
way into almost every cash register, calculator and traffic light of its day. Interestingly enough, the 8080 was designed
for almost everything except computers. At the time, computers were manufactured entirely
in-house, with a single company building its own terminals, compilers, and operating systems. The 8080, however, became so popular that
computer manufacturers eventually began designing entire systems around
it. In 1978 Intel released the 8086, a 16-bit
processor that would eventually become Intel’s saving grace. Up until that point Intel’s revenues were
coming almost entirely from their DRAM division, but Japan’s rising semiconductor industry
was quickly eating away at their profits. Intel’s only way forward was microprocessors,
and they went all in by partnering up with IBM. We’ve already covered IBM in a previous
video, but just to recap, in the early 1980s IBM were struggling to catch up with the rise
of the personal computer. At first, IBM doubted that ordinary people would want
a computer of their own, but once the market took off anyway, IBM’s bureaucracy
made developing their own PC a nightmare. They ended up partnering with Intel for their
processor and with Microsoft for their operating system, which allowed them to develop their
IBM PC in just under a year. It was released in 1981 and it became the
dominant personal computer of its time, establishing Intel as the chief supplier of processors. The IBM PC used the 8088, a variant of the 8086 with an 8-bit external bus,
and although IBM eventually lost the personal computer market to cheap compatible copycats,
Intel remained at the heart of every personal computer made over the next decade. The legacy of the 8086 remains to this day,
as the vast majority of modern computers are based on its derivative x86 architecture. During the 1980s Intel emerged as the most
profitable hardware supplier to the rising PC industry. They reached $1 billion of revenue in 1983,
and the same amount as net income just nine years later. In 1993 Intel released the Pentium line, their
fifth generation of processors. For this generation Intel started building
dedicated motherboards alongside its processors, a move that kept them ahead of their competition
and doubled their net income that year to $2.3 billion. Throughout the 90s Intel continued to develop
more powerful processors, more or less in accordance with Moore’s law. In 1998 Intel branched out into the value-PC
market by releasing the cheap, low-performance Celeron line. The new millennium, however, would be a much
more difficult time for Intel. The dot-com crash and fierce competition from
AMD saw Intel fall below 80% market share for the first time in decades. The situation became so bad that in 2001 Intel’s
profits had slumped by a stunning 87%. By that point it became clear that racing
to build faster and faster processors wasn’t the way to go, especially when most people
were using their computers just to read their email or browse the web. Intel shifted their focus accordingly, building
a more efficient, less power-hungry line called Centrino. Released in 2003, the Centrino wasn’t actually
a processor but a complete platform: a processor bundled with a chipset and a wireless network adapter. It worked extremely well on portable computers
just around the time when laptops were finally starting to take off, lifting Intel back to
the top of the industry. In line with their new philosophy, Intel began
developing multi-core processors, releasing their first dual-core in 2005. In general, the past few generations have
been split into three main categories based on processing power: i3, i5, and i7. Up until 2016, Intel were operating on
a “Tick-Tock” model, alternating between shrinking the current microarchitecture
onto a smaller, more efficient process (the “tick”) and releasing an entirely new microarchitecture (the “tock”), roughly every 12 to 18 months. The performance of the last two generations
hasn’t improved that much though, and Intel have also attracted a lot of antitrust litigation. In 2009 the European Union fined Intel
€1.06 billion (roughly $1.45 billion) for paying computer manufacturers to favor its processors over AMD’s. Similar accusations have sprung up in the
US, Japan and South Korea. Despite the lawsuits, Intel’s business has
been going great, and they’ve been able to branch out into various other tech markets,
usually through acquisitions. Among other things they’re working on solid-state
drives, machine learning and autonomous vehicles. Some of these projects are more successful
than others, but it’s unlikely that they’ll be replacing Intel’s main microprocessor
business any time soon. Thanks for watching and a big thank you to
all of our patrons for supporting this video! Be sure to subscribe if you haven’t already
and to check out the full Behind the Business playlist for the interesting stories of other
companies. Once again, thanks a lot for watching, and
as always: stay smart.
