1999 vs. 2009 Then And Now – The CPU
In a few short months it's going to be 2009, and a ton of stuff has changed in the world of computing over the past almost-ten years. Some of the modern advancements have proven to be notable improvements, while others still produce the same crapola they did nearly ten years ago.
In this installment we'll be looking at something everyone has in their computer: the Central Processing Unit, better known by its abbreviation, the CPU.
In the last article written about this on PCMech (which was a really long time ago), microprocessors were discussed up to the 386, so we'll pick up from the 486 and work through to the present.
~ ~ ~
486
The Intel 80486 processor, also known as the i486 or simply the 486, was introduced in 1989 and is not all that different from the 386 in the way it works. The 486 added a few extra instructions, but the bigger story was performance: the on-chip floating-point unit and the enhanced bus made the 486 a powerhouse processor, far superior to the 386. Upgrading was a no-brainer when these chips came to market.
Intel made this processor for a very long time afterward, long after it disappeared from desktop PCs, manufacturing it for embedded systems (smaller devices that do not require powerful processors). Production of the 486 finally wound down in 2007, with the last chips shipping at the end of September of that year.
Pentium (586, 686, 786, 886)
The Pentium processor is named as such for one simple reason: you cannot trademark a number. Since a plain "586" could be used by any competitor, Intel chose a name that loosely represents the number 5 by putting "pent" at the beginning of the product name.
For example, the Pentastar is a Chrysler automotive logo, named as such because it is a star with 5 points, hence the “pent” in the title.
Because the Pentium processor started with model number 586, “pent” was used to represent the 5 in 586.
Bear in mind, however, that the Pentium name went on to be used for many Intel microprocessors well after the 586.
The first Pentium processors were released in 1993 in 60MHz and 66MHz offerings. Honestly, most people did not see the need to upgrade at the time, since the 486 could still do the job well (remember, this was before Windows 95).
Most people purchased new computers or upgraded their existing systems after 1995; by that year Intel had 120MHz and 133MHz Pentium processors at the ready.
The timeline of Pentium processors goes like this (from 1997 forward):
- 1997 – Pentium MMX, Pentium II
- 1998 – Celeron
- 1999 – Pentium III
- 2000 – Pentium 4, Celeron II
- 2006 – Intel Core, followed a few months later by Core 2
AMD
AMD has had its fair share of offerings over the years as well, following along closely with Intel.
Traditionally, AMD processors cost less than comparable Intel parts, which is what makes them initially attractive to prospective buyers. In addition, there are more than a few people who swear by AMD as the "only processor they'll ever use". The choice of which to go with is always left to the buyer (you). When building a PC – even to this day – going with AMD will usually give you similar performance at a lower cost.
In addition, AMD has had several firsts ahead of Intel. See list below for details.
- 1996 – AMD-K5
- 1997 – AMD-K6
- 1998 – AMD-K6-2
- 1999 – AMD-K6-III and AMD Athlon; the Athlon series becomes the first seventh-generation microprocessor for Microsoft Windows computing
- 2000 – AMD Duron introduced; AMD first to break 1000MHz with the Athlon processor
- 2001 – AMD Athlon MP introduced
- 2003 – Opteron/Athlon 64 introduced
- 2004 – Athlon XP-M introduced (low-power by design and slow, but worthy of note)
- 2005 – AMD introduces the dual-core Opteron, billed as the world's first x86 dual-core processor, and the Athlon 64 X2
- 2007/2008 – Phenom
What’s changed, what hasn’t
What has changed the most with processors is not necessarily the speed but rather how many tasks they can perform at once. Multi-core technology is being pushed hard by all processor manufacturers as "the way to go", so instead of seeing 5GHz single-core processors, a 2.5GHz dual-core processor can theoretically perform the same tasks – and do them better when software takes advantage of multi-threading.
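To make the multi-threading point a bit more concrete, here's a rough sketch in Python (my own illustration, not anything from Intel or AMD) of splitting one CPU-bound job across however many cores the machine has; the sum-of-squares worker and the numbers are made up purely for the example.

```python
# A toy CPU-bound job (summing squares) split across all available cores.
# Each chunk runs in its own process, so a dual-core chip works two chunks at once.
from multiprocessing import Pool, cpu_count

def sum_of_squares(bounds):
    """Worker: sum the squares of the integers in [start, stop)."""
    start, stop = bounds
    return sum(n * n for n in range(start, stop))

if __name__ == "__main__":
    total = 10_000_000
    cores = cpu_count()                       # e.g. 2 on a dual-core CPU
    step = total // cores
    # Last chunk absorbs any remainder so every number is counted exactly once.
    chunks = [(i * step, total if i == cores - 1 else (i + 1) * step)
              for i in range(cores)]

    with Pool(processes=cores) as pool:       # one worker process per core
        partial_sums = pool.map(sum_of_squares, chunks)

    print(f"Used {cores} cores, total = {sum(partial_sums)}")
```

The catch, of course, is that the software has to be written this way; a program that never splits its work up sees no benefit from the second core.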
Intel has already produced a test-bed 80-core processor – and it worked. This is a fantastic achievement. Will we ever see 80-core processors on our desktops? Maybe, but not for many years. However it would be realistic to see 16-core processors on new home computers before 2015.
What hasn’t changed is that first-generation technology has still proven to be buggy or “par” at best. Whenever a new type of processor is introduced it usually isn’t widely supported. So even if you have the latest/greatest thing, it might take six months to a year before the software (including your operating system) catches up.
A general rule of thumb is not to buy into first-generation technology. A good example of this is the Core 2 series from Intel. The first desktop release was code-named "Conroe", the second "Allendale". The Allendale parts are the more desirable of the two because their L2 cache is native to the die rather than partially disabled. Allendale was released shortly after Conroe, so it was worth waiting for.
As of this writing, the newest of the Core 2 series is the Yorkfield, a dual-die quad-core design and the fastest of the lot.
Final Notes
As noted above, at this point it's not necessarily about speed but about how much a processor can handle at once, i.e. its multitasking ability.
When shopping for a processor, it's highly recommended to buy one that offers both good clock speed and strong multi-core multitasking. The end result will be a processor you can keep for at least 3 to 5 years before it becomes obsolete.
With Intel you will spend more, but at present it is the better company to go with as far as your purchase is concerned. In addition, Intel's platform is more widely supported than AMD's.
If cost savings are all you’re looking for, AMD will serve you well.
2 thoughts on “1999 vs. 2009 Then And Now – The CPU”
Core2 was a major change from previous P4/Pentium-M/Core. Suggest that 2006 is an appropriate date of changeover instead of 2008.
The higher the frequency the smaller the antenna needed to radiate that frequency. That’s the reason why the highest frequencies outside of any chip on the motherboard are generally limited to megahertz: If they were any higher, the connective tracks on or between the layers of the motherboard would radiate the power away as radio-waves before it ever reached the next component. If the in-chip frequencies became too high then even the connections inside the chip would act as antennae and the chip itself would cease to function, regardless of the design of the transistors themselves.
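A quick back-of-the-envelope figure to go with that (my own Python illustration, not part of the original comment): the wavelength is simply the speed of light divided by the frequency, and a conductor around a quarter of a wavelength long already makes a respectable antenna.

```python
# Wavelength = speed of light / frequency.
# A quarter-wave length is roughly where a conductor starts radiating efficiently.
C = 299_792_458  # speed of light, metres per second

for f_ghz in (0.1, 1.0, 5.0):
    wavelength_cm = C / (f_ghz * 1e9) * 100
    print(f"{f_ghz:4.1f} GHz: wavelength ~{wavelength_cm:6.1f} cm, "
          f"quarter-wave ~{wavelength_cm / 4:5.1f} cm")
```

At 5GHz the quarter-wave length is only about a centimetre and a half, which is exactly the scale of motherboard traces and on-package wiring, so the point about everything turning into an antenna holds up.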
Secondly, what happens when you put a dinner into a microwave oven? It cooks, yes? What's happening is that high-frequency microwaves are bombarding the food and agitating the (water) molecules, causing them to heat up. (The frequency used in household ovens is around 2.45 gigahertz.)
The above example of food heating is a secondary effect on something near the source of microwave emission, rather than the source itself.
When you're talking gigahertz, the higher the frequency, the greater the heat generated. Go figure. The cost of fabricating a chip small enough to function at frequencies around 5GHz, as well as the cooling system it would require, doesn't even bear thinking about: it's just totally impractical.
So once you're getting above around 4GHz you're starting to fight a losing battle. Logically, if you can't go upwards you go outwards. Think outside the box like AMD did: add another core operating at an identical frequency on the same die and you theoretically, loosely speaking, have twice the frequency without actually having twice the frequency, if you catch my drift?
(In actual fact it’s not quite that simple: The overall performance gain works out at somewhere just above 1.7 times rather than double; but I’m not going to type a load of complex calculus-laden quantumlinear algorithms here to prove a point, even if I could remember them.)
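For what it's worth, nothing calculus-laden is needed to land in that ballpark; Amdahl's law alone gets you there. Assuming (purely for illustration) that about 80% of a program's work can actually run in parallel, two cores give roughly the 1.7x figure mentioned above:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
#   p = fraction of the work that can run in parallel (an assumed value here)
#   n = number of cores
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

print(f"2 cores: {amdahl_speedup(0.8, 2):.2f}x")   # ~1.67x, close to the 1.7x above
print(f"4 cores: {amdahl_speedup(0.8, 4):.2f}x")   # ~2.50x, diminishing returns
```

The serial part of the work never gets faster, which is why doubling the cores never quite doubles the performance.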
Finally; correct me if I’m wrong, but wasn’t the Intel Core2 series initially produced in 2007?