A detailed history of the processor
Intel Core 2 (2006)
The Intel Core 2 is a brand encompassing a variety of 64-bit x86-64 CPUs, including single-core, dual-core and quad-core processors based on Intel’s Core microarchitecture. The Core 2 brand covered a lot of different CPUs, but to give you an idea, you had the Solo (a single-core CPU), the Duo (a dual-core CPU), the Quad (a quad-core CPU) and, later on, the Extreme (a dual- or quad-core processor aimed at hardware enthusiasts).
The Intel Core 2 line was among Intel’s first mainstream multi-core processors. This was a necessary route for Intel to take, as a true multi-core processor is essentially a single component with two or more independent processing units, often referred to as cores. With multiple cores like this, Intel was able to increase overall speed for programs, thereby opening the path to the more demanding programs we see today. That’s not to say Intel or AMD are responsible for today’s demanding programs, but without their high-end processors and breakthroughs in technology, we really wouldn’t have hardware that can run those programs.
Core 2 branded processors came with a lot of neat technology. For instance, you had Intel’s own virtualization technology, 64-bit architecture, low power consumption, and the SSE4 instruction set (Streaming SIMD Extensions 4).
AMD Phenom & Phenom II (2007)
AMD began the Phenom family of processors in 2007. It was a 64-bit desktop processor line based on AMD’s K10 microarchitecture. The Phenom family is an interesting one. AMD actually considered the quad-core Phenoms (AMD made dual-core and triple-core versions of the Phenom as well) to be the first processors with a true quad-core design. This is because all of the Phenom’s cores are on the same die. If you look at Intel’s Core 2 Quad processor, it features a multi-chip module design instead.
There were some issues with early Phenom processors where the system would lock up in extremely rare instances. This was because of a flaw discovered in the translation lookaside buffer (TLB). Pretty much all early versions of the Phenom processor were affected, as the flaw wasn’t fixed until the B3 stepping of the Phenom in 2008. The processors without the bug also had an “xx50” model number (so, there would be a “50” at the end of the model number, indicating that this was a processor without the bug).
After these issues, AMD eventually went ahead and launched a successor at the end of 2008, the Phenom II. The Phenom II comes in a lot of versions. AMD made dual-core, triple-core and quad-core variants in early 2009, while an improved quad-core model and a hex-core model arrived around early to mid 2010. Again, it’s based on the K10 microarchitecture, but it’s also built on the 45nm semiconductor manufacturing process. The Phenom II initially launched on Socket AM2+, but Socket AM3 versions launched in early 2009 with DDR3 support.
The Phenom II is a really neat processor. Just a year before, the Phenom launched with a meager L3 cache size of 2MB. The Phenom II tripled that, bringing it up to 6MB. It also has the SSE4a instruction set. Black Editions of AMD’s Phenom II CPUs also offered some crazy overclocking potential. At CES 2009 in Las Vegas, in a public demonstration, a Phenom II was able to achieve a whopping 6.5GHz overclock. In a separate instance, a group called LimitTeam was able to achieve 7.127GHz.
30 thoughts on “A detailed history of the processor”
In fact, the 8080 external interface was distinctly different from the 8086, in idea, not just width – for example, 8080 pin 21 (DMA acknowledge).
The 8086 was (almost) binary compatible with the 8080 for “regular programs” ie: not ones that twiddled ports nor relied on specific interrupt/trap behaviour.
So where do you draw the line? Where does Bob draw it? Where does Fiona draw it? All in different places, I suspect.
The author obviously chose to draw their line at the 8086, probably because delving back beyond the original IBM PC machines might not be worthwhile given a presumed intended audience…
“The following chips are considered the dinosaurs of the computer world. PC’s based on these processors are the kind that usually sit around in the garage or warehouse collecting dust. They are not of much use anymore, but us geeks don’t like throwing them out because they still work. You know who you are.”
Sounds just like my tech teacher because he is always complaining about how things have changed and shows us pictures from back when computers still used tapes and how he used to get paid to change the tapes every two hours for a hospital.