

A detailed history of the processor

Pentium II (1997)

Intel made some major changes to the processor scene with the release of the Pentium II. With the Pentium MMX and Pentium Pro both out in the market in a strong way, Intel wanted to bring the best of both into one chip. As a result, the Pentium II is something like the child of a Pentium MMX mother and a Pentium Pro father, though, as in real life, it doesn't necessarily combine the best of its parents. The Pentium II is optimized for 32-bit applications and includes the MMX instruction set, which was nearly a standard by this time. The chip uses the dynamic execution technology of the Pentium Pro, allowing the processor to predict coming instructions and accelerate work flow: it analyzes program instructions and reorders them into the sequence that can run the quickest. The Pentium II has 32KB of L1 cache (16KB each for data and instructions) and 512KB of L2 cache on the package. The L2 cache runs at half the speed of the processor, not at full speed. Nonetheless, moving the L2 cache off the motherboard and into the processor package itself boosts performance.
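The MMX instruction set mentioned above speeds up multimedia work by packing several small integers into one wide register and operating on all of them with a single instruction. As a rough illustration (plain Python standing in for the hardware), here is a scalar simulation of a packed 16-bit add in the style of MMX's PADDW instruction:

```python
# Toy simulation of an MMX-style packed add: a 64-bit MMX register
# holds four 16-bit integers, and one instruction adds all four lanes.
# (Illustrative only -- real MMX does this in hardware, not Python.)

def pack16(lanes):
    """Pack four 16-bit values into one 64-bit word."""
    word = 0
    for i, v in enumerate(lanes):
        word |= (v & 0xFFFF) << (16 * i)
    return word

def paddw(a, b):
    """Packed add of 16-bit lanes, wrapping per lane like PADDW."""
    result = 0
    for i in range(4):
        lane_a = (a >> (16 * i)) & 0xFFFF
        lane_b = (b >> (16 * i)) & 0xFFFF
        result |= ((lane_a + lane_b) & 0xFFFF) << (16 * i)
    return result

a = pack16([1, 2, 3, 4])
b = pack16([10, 20, 30, 40])
summed = paddw(a, b)
lanes = [(summed >> (16 * i)) & 0xFFFF for i in range(4)]
print(lanes)  # [11, 22, 33, 44] -- four adds from one "instruction"
```

Four additions happen per "instruction" instead of one, which is exactly the win MMX offers software written to use it.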

One of the most noticeable changes in this processor is the package style. Almost all Pentium-class processors use the Socket 7 interface to the motherboard, and Pentium Pros use Socket 8. The Pentium II, however, makes use of "Slot 1". The package type of the P2 is called Single Edge Contact (SEC): the chip and its L2 cache reside on a card that attaches to the motherboard via a slot, much like an expansion card, and the entire package is surrounded by a plastic cartridge. In addition to departing from Socket 7, Intel also patented the new Slot 1 interface, effectively barring the competition from making chips for the new Slot 1 motherboards. This move, no doubt, demonstrates why Intel moved away from Socket 7 to begin with: they couldn't patent it.

The original Pentium II was code-named "Klamath". It ran on a paltry 66MHz bus and ranged from 233MHz to 300MHz. In 1998, Intel did some slight reworking of the processor and released "Deschutes", built on a 0.25-micron process and allowing a 100MHz system bus. The L2 cache was still separate from the actual processor core and still ran at only half speed; Intel would not rectify this until the release of the Celeron A and Pentium III. Deschutes ran from 333MHz up to 450MHz.

Celeron (1998)

About the time Intel was releasing the improved P2s (Deschutes), they decided to tackle the entry-level market with a stripped-down version of the Pentium II: the Celeron. To decrease costs, Intel removed the L2 cache entirely, dropped the Pentium II's support for dual processors, and ditched the plastic cartridge, leaving simply the processor on a Slot 1-style card. This, no doubt, reduced the cost of the processor quite a bit, but performance suffered noticeably: removing the L2 cache from a chip seriously hampers it, and on top of that the Celeron was still limited to the 66MHz system bus. As a result, competing chips at the same clock speeds could still outperform it. What was the point?

Intel corrected the mistake with the next edition of the Celeron, the Celeron 300A, which came with 128KB of L2 cache on board. The cache was on-die, meaning it ran at full processor speed, not half speed like the Pentium II's. That was great news for Intel users, because a Celeron with full-speed cache often performed better than a Pentium II with 512KB of cache running at half speed. Add the fact that the 300A tolerated bus speeds well above its official 66MHz, and the chip became well known in overclocking enthusiast circles: the cheap part you could buy and crank up to compete with the more expensive stuff.
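The 300A's overclocking appeal comes down to simple arithmetic: core clock equals front-side bus times multiplier, and with the 300A's fixed 4.5x multiplier, bumping the bus from 66MHz to 100MHz meant a 50% jump in clock speed. A quick worked sketch:

```python
# Core clock = front-side bus x multiplier.
# The Celeron 300A shipped with a 4.5x multiplier on a ~66MHz bus;
# boards that ran the bus at 100MHz pushed it to 450MHz.
def core_clock(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

stock = core_clock(66.6, 4.5)       # ~300 MHz as shipped
overclocked = core_clock(100, 4.5)  # 450 MHz on a 100MHz bus
print(round(stock), overclocked)    # 300 450.0
```

A 300MHz chip running at 450MHz, with full-speed cache, is why the 300A earned its reputation.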

The Celeron was available in two formats. The original Celerons used the patented Slot 1 interface, but Intel later switched over to a PPGA (Plastic Pin Grid Array) format, also known as Socket 370. The new interface reduced manufacturing costs and made it cheaper to convert Socket 7 board designs: motherboard manufacturers found it easier to swap a Socket 7 socket for a Socket 370 socket, leaving the rest of the board more or less the same, than to change a design over to a slotted board. Slot 1 Celerons ranged from the original 266MHz up to 433MHz, while Celerons of 300MHz and up were available in Socket 370.

AMD K6-2 & K6-III (1998)

AMD was a busy little company while Intel was playing around with its Pentium IIs and Celerons. In 1998, AMD released the K6-2. The "2" signals enhancements made to the proven K6 core, with higher clock speeds and higher bus speeds (AMD was probably also taking a page out of the Pentium "2" book). The most notable new feature of the K6-2 was the addition of 3DNow! technology. Just as Intel created the MMX instruction set to speed up multimedia applications, AMD created 3DNow! as an additional 21 instructions on top of the MMX instruction set; software designed to use the 3DNow! instructions gets even more of a multimedia boost. With 3DNow!, a larger L1 cache, and continued Socket 7 compatibility, the K6-2 gained ranks in the market without too much trouble.

The K6-III processor was basically a K6-2 with 256KB of on-die, full-speed L2 cache. When used with Socket 7 boards that had cache of their own, the motherboard cache effectively became L3 cache. The chip could compete well with the Pentium II and even early Pentium IIIs. To eke out the full potential of the processor core, though, AMD fine-tuned the processor's limits, leaving the K6-2 and K6-III a bit picky: the split-voltage requirements were pretty rigid, and as a result AMD held a list of "approved" boards that could tolerate such fine control over the voltages. Processor cooling was also an important issue with these chips due to the increased heat. In that regard, they were a bit like the Cyrix 6x86MX processors.
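The benefit of stacking a third cache level can be sketched with a textbook average-memory-access-time calculation. The hit rates and cycle counts below are invented purely for illustration, not measured K6-III figures:

```python
# Toy average-memory-access-time (AMAT) model for a tri-level cache
# setup like a K6-III on a cache-equipped Socket 7 board.
# All hit rates and cycle counts are made-up illustrative numbers.
def amat(levels, memory_cycles):
    """levels: list of (hit_rate, access_cycles), nearest level first."""
    time = 0.0
    reach_prob = 1.0  # probability an access gets this far down
    for hit_rate, cycles in levels:
        time += reach_prob * cycles        # every access that gets here pays the lookup
        reach_prob *= (1.0 - hit_rate)     # misses continue to the next level
    return time + reach_prob * memory_cycles

# L1 on-die, L2 on-die at full speed, L3 = motherboard cache, then DRAM.
tri_level = amat([(0.95, 1), (0.80, 4), (0.50, 12)], 60)
no_l3     = amat([(0.95, 1), (0.80, 4)], 60)
print(round(tri_level, 2), round(no_l3, 2))
```

Even a slow board-level cache that catches only half the remaining misses shaves cycles off the average access, which is why the "free" L3 on existing Socket 7 boards was a nice bonus.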

Pentium III (1999)

Intel released the Pentium III "Katmai" processor in February of 1999, running at 450MHz on a 100MHz bus. Katmai introduced the SSE instruction set, essentially an extension of MMX that again improved performance in 3D applications designed to use the new ability. Also dubbed MMX2, SSE contained 70 new instructions, capable of operating on four single-precision floating-point values at once. The original Pentium III ran on a slightly improved P6 core, so the chip was well suited to multimedia applications. The chip saw controversy, though, when Intel decided to include an integrated "processor serial number" (PSN) on Katmai. The PSN was designed to be readable over a network, even the internet. The idea, as Intel saw it, was to increase the level of security in online transactions. End users saw it differently: as an invasion of privacy. After taking a PR black eye and getting some pressure from their customers, Intel eventually allowed the tag to be turned off in the BIOS. Katmai eventually saw 600MHz, but Intel quickly moved on to Coppermine.
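SSE's "four at once" means a single instruction applies the same operation to all four single-precision lanes of a 128-bit register. A rough scalar simulation of that idea, modeled loosely on SSE's ADDPS and MULPS instructions:

```python
# Scalar simulation of SSE-style packed operations: one "instruction"
# applies the same op to all four single-precision lanes of a register.
# (Illustrative sketch; real SSE does this in hardware in one step.)
import struct

def to_single(x):
    """Round a Python float to single precision, as SSE lanes are."""
    return struct.unpack('f', struct.pack('f', x))[0]

def addps(a, b):
    """Packed single-precision add across four lanes (like ADDPS)."""
    assert len(a) == len(b) == 4
    return [to_single(x + y) for x, y in zip(a, b)]

def mulps(a, b):
    """Packed single-precision multiply across four lanes (like MULPS)."""
    assert len(a) == len(b) == 4
    return [to_single(x * y) for x, y in zip(a, b)]

print(addps([1.0, 2.0, 3.0, 4.0], [0.5, 0.5, 0.5, 0.5]))
# [1.5, 2.5, 3.5, 4.5]
```

Where MMX packs integers, SSE packs floats, which is exactly what 3D geometry and other multimedia math needed.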

In late 1999, Intel released the Pentium III "Coppermine". While Katmai had 512KB of L2 cache, Coppermine had half that, at only 256KB. But the cache was located directly on the CPU die rather than on the daughtercard as in previous Slot 1 processors, which made the smaller cache a non-issue: performance actually benefited. Coppermine also took on a 0.18-micron process and the newer Single Edge Contact Cartridge 2 (SECC2) package, in which the surrounding cartridge covered only one side of the card. What's more, Intel again saw the logic that had taken the Celeron to Socket 370, and eventually released socketed versions of Coppermine as well. Coppermine also supported a 133MHz front-side bus. It proved to be a performance chip, found its way into a great many PCs, and eventually saw 1+ GHz.




30 thoughts on “A detailed history of the processor”

James says:
Credit to callum instone or as i call him King CPU love u callum
roger crouch says:
This article lacks credibility. The first chip of the series was the 8080, then the 8085 was made (the 5 indicating it only needed +5v and ground instead of +-5 and +12) https://www.quora.com/What-is-the-difference-between-8085-and-8086 So the only true thing that can be said about the 8086 was that it was 16bit 8080 processor with improved IC features and more command set.
Mike Spooner says:
From certain perspectives, the “first chip of the series” was the 4004 (1971), or perhaps the 8008 (1972), the 4040 (1974), the 8080 (1974), or…

In fact, the 8080 external interface was distinctly different from the 8086, in idea, not just width – for example, 8080 pin 21 (DMA acknowledge).

The 8086 was (almost) binary compatible with the 8080 for “regular programs” ie: not ones that twiddled ports nor relied on specific interrupt/trap behaviour.

So where do you draw the line? Where does Bob draw it? Where does Fiona draw it? All in different places, I suspect.

The author obviously chose to draw their line at the 8086, probably because delving back beyond the original IBM PC machines might not be worthwhile given a presumed intended audience…

amandu benard says:
i love the notes they are precise and straight to key needed aspects thank you very much
slimm says:
thanks for the notess
prajjwol says:
what is the significances of the number like 8086 in the processor
jake norfield says:
8085
Mary Alice Thauvette says:
This article was posted 23-Mar-01. That was nine years ago. It is time to update the article. Or, at least change the title of the last section from :1999 – Present” to “1999 – March 2001”
chelle-marie says:
that is great i loved the little joke:

“The following chips are considered the dinosaurs of the computer world. PC’s based on these processors are the kind that usually sit around in the garage or warehouse collecting dust. They are not of much use anymore, but us geeks don’t like throwing them out because they still work. You know who you are.”

sounds just like my tech teacher becouse he is always complaining about how things have changed and shows us pictures from back when computers still used tapes and how he used to get paid to change the tapes every two hours for a hospitle

steven says:
Really your services are good we like it please keep it up.

Adam

Dec 15, 2016
