TechJunkie is a BOX20 Media Company

A detailed history of the processor

AMD APU’s (2011 – present)

AMD launched a new line of processors called the Accelerated Processing Unit (APU). It is, of course, a line of 64-bit processors, but it's innovative because it's designed to act as a CPU and GPU on a single chip (so you'd have your regular CPU, but also an on-die GPU). The first generation of APUs, announced in 2011, consisted of Llano and Brazos. The former was designed for high-performance situations, while the latter was geared towards low-power devices. Trinity and Brazos-2 were announced in 2012: Trinity as a high-performance solution and Brazos-2, again, as a low-power offering. In the summer of 2013, Kabini and Temash were announced for low-power hardware, and Kaveri, the third-generation high-performance core, followed in early 2014.

The AMD APU started out as just a project, the AMD Fusion project, which began in 2006 when AMD set out to create a system-on-a-chip (SoC) that combined the CPU with an on-die GPU.

There's a lot of neat technology embedded in it, including out-of-order execution and the AVX instruction set, and the chips came on both the FM1 and FM2 sockets. It wouldn't necessarily be surprising if you hadn't heard of the AMD APU before, but despite that, it's likely many tech enthusiasts and average gamers use the chip every day. Both Sony's PlayStation 4 and Microsoft's Xbox One use custom versions of third-generation low-power APUs.

AMD FX (2011 – present)

And then you have the AMD FX microprocessor. It's most definitely not a successor to the AMD APUs, but a line sold alongside them that competes directly with Intel's Sandy Bridge and Ivy Bridge architectures. The AMD FX processors are geared more towards the high-performance market, while the AMD APUs cover a wider range of markets (low power and high performance).

One major difference from Intel's Sandy Bridge and Ivy Bridge is that the AMD FX processors don't have integrated graphics; the integrated GPU is something AMD is keeping to its APU line. The AMD FX is built on the 32nm process, and AMD actually calls the FX series the first native 8-core desktop processor. As far as sockets go, AMD, for the most part, uses a single socket, the AM3+, for the FX series. Some other things worth mentioning are that the FX series supports the FMA instruction set and OpenCL.

When the FX line initially launched in 2011, it was built on the Bulldozer microarchitecture; in 2012, the Piledriver architecture succeeded it. Both of these architectures use a modular design that puts two cores on a single module. Another successor is coming in 2017: the Zen microarchitecture. It will use the 14nm process, feature simultaneous multithreading (SMT, AMD's counterpart to Intel's Hyper-Threading), and employ the AM4 socket, which adds support for DDR4 RAM.

You can get a significant amount of performance out of the AMD FX series. All of the CPUs in this series are unlocked and overclockable, allowing you to seriously push the clock speed on these processors. For instance, using liquid nitrogen for cooling, the AMD FX-8370 set a world record for clock speed: 8722.78MHz, or a little over 8.7GHz.

Since the FX series chips are high-performance processors, they also have a high TDP: up to a whopping 220W.

Intel offers some serious power with its current line of Core i7 processors, but the AMD FX series takes the cake for the highest-performance chips for consumer PCs. The drawback is that there's no onboard GPU, but when you're seeking power like this, you'd probably rather have a dedicated video card anyway. It'll certainly be interesting to see what 2017 and beyond brings with the competition between AMD's upcoming Zen microarchitecture and Intel's Kaby Lake and Cannonlake architectures.

Closing

And that wraps up the timeline of the many different processors out there, at least as of this writing. Processor technology is an interesting subject, and if you read about the different CPUs, you'll notice the trend of them getting smaller yet more powerful. It'll no doubt be interesting to see what we have another 10 or 20 years down the road.

Keep in mind that this is a timeline we plan on keeping updated, so as new CPU generations release, be sure to check back here for new information!

30 thoughts on “A detailed history of the processor”

Pingback: die besten 4k fernseher 65 zoll 2018 im vergleich idealo magazin
James says:
Credit to callum instone or as i call him King CPU love u callum
roger crouch says:
This article lacks credibility. The first chip of the series was the 8080, then the 8085 was made (the 5 indicating it only needed +5v and ground instead of +-5 and +12) https://www.quora.com/What-is-the-difference-between-8085-and-8086 So the only true thing that can be said about the 8086 was that it was 16bit 8080 processor with improved IC features and more command set.
Mike Spooner says:
From certain perspectives, the “first chip of the series” was the 4004 (1971), or perhaps the 8008 (1972), the 4040 (1974), 8080 (1974), or…

In fact, the 8080 external interface was distinctly different from the 8086, in idea, not just width – for example, 8080 pin 21 (DMA acknowledge).

The 8086 was (almost) binary compatible with the 8080 for “regular programs” ie: not ones that twiddled ports nor relied on specific interrupt/trap behaviour.

So where do you draw the line? Where does Bob draw it? Where does Fiona draw it? All in different places, I suspect.

The author obviously chose to draw their line at the 8086, probably because delving back beyond the original IBM PC machines might not be worthwhile given a presumed intended audience…

amandu benard says:
i love the notes they are precise and straight to key needed aspects thank you very much
slimm says:
thanks for the notess
prajjwol says:
what is the significances of the number like 8086 in the processor
jake norfield says:
8085
Mary Alice Thauvette says:
This article was posted 23-Mar-01. That was nine years ago. It is time to update the article. Or, at least change the title of the last section from :1999 – Present” to “1999 – March 2001”
chelle-marie says:
that is great i loved the little joke:

“The following chips are considered the dinosaurs of the computer world. PC’s based on these processors are the kind that usually sit around in the garage or warehouse collecting dust. They are not of much use anymore, but us geeks don’t like throwing them out because they still work. You know who you are.”

sounds just like my tech teacher becouse he is always complaining about how things have changed and shows us pictures from back when computers still used tapes and how he used to get paid to change the tapes every two hours for a hospitle

steven says:
Really your services are good we like it please keep it up.
Pingback: History of processor!!! - Raymond.CC Forum
Pingback: The World’s Most Powerful Telescopes and Best Observatories | Travel in the Sky



Adam

Dec 15, 2016
