Where Did AMD Go Wrong

Blastwave


I’ve been a PC gamer all my life. I have fond memories of my dad and me building my first PC. It was a 486DX2 with 8 megs of RAM and a Sound Blaster 16 sound card. I spent hours playing Doom, Duke Nukem 3D, and Raptor on that machine. It was glorious, and it cemented me as a lifelong PC gamer.

Over the years I built other rigs, got into game modding, and dabbled in overclocking. Back in the early days of the internet it was much harder to get quality information on hardware than it is today. You can imagine my awe when one of my friends built a PC with an AMD processor. “What’s that? A new Pentium?” I would ask. You see, I didn’t have any way of knowing about the larger hardware world beyond Intel at the time. It had never even occurred to me that someone else made processors. I remember going to his house after school and being amazed at this weird and alien hardware. It was an Athlon, now known as the Athlon Classic: a slot-based processor not unlike the interface Intel used for the Pentium II and early Pentium III. That day opened up a new world for me, and for the better part of the next decade I was running an AMD system.

AMD History 101

AMD is best known for its golden age from the late 90s to the mid-2000s, but it’s almost as old as Intel, having been formed less than a year later, in May of 1969. Also like Intel, AMD was founded by former Fairchild employees. It didn’t take long for them to get into the new microprocessor market created by Intel, and they soon began making reverse-engineered clones of Intel chips like the 8080. In those days the processor wars hadn’t started yet, and Intel and AMD were on excellent terms. Customers like IBM needed second sources of chips to keep up with the demand for PCs, and AMD was able to secure an x86 license, something nearly impossible to do today.

As time went on, AMD moved from making straight clones of Intel chips to designing its own. Things started off simply enough: they would release a clone of the 8086 with a higher clock speed, but eventually their designs became more and more distinct from stock Intel offerings. Some of this was due to AMD stretching its legs in the market, but some of it was also due to Intel’s changing attitude. As the PC market grew and became the de facto standard for consumer computers, AMD and Intel’s relationship started to sour. AMD had licensed Intel’s 386 and 486 architectures, but Intel wouldn’t license anything newer. Then AMD lost the ability to use Intel sockets and buses and had to design its own standards with the introduction of the Athlon. Finally, after a slew of back-and-forth lawsuits, AMD and Intel reached an uneasy truce and acquiesced to coexisting in the market.

The Golden Years

The introduction of the Athlon is where AMD began a multiyear period of growth and dominance over Intel. In the years prior, AMD had invested heavily in R&D and talent. In 1996, AMD bought fellow x86 chipmaker NexGen, which had fallen on hard times. The NexGen team brought a lot of innovative ideas with them, not least of which was the clever use of an internal RISC processor core: x86 instructions were decoded into RISC-like “micro-operations” that the core would then execute. This allowed AMD to make a chip that was compatible with the Pentium even without a Pentium license, while retaining full control over the design of the processor architecture without being limited by x86.
AMD continued to recruit industry talent with the acquisition of nearly the entire DEC Alpha team. In 1998, DEC was sold to Compaq, and most of its microprocessor engineers went to AMD instead. In the process of gaining the Alpha team, AMD also gained access to the EV6 bus used by the Alpha processors. This was crucial for AMD, who now had to design their own motherboards and chipsets and could no longer piggyback on Intel platforms.



The Athlon came as a major shock to Intel. It was the first time AMD was competitive against Intel’s flagship products.

Hiring the Alpha team proved to be the correct decision. The EV6 bus was much faster than the 100MHz GTL+ bus Intel was using at the time, and the Alpha team brought with them expertise in high-end server processors, a level at which AMD had yet to compete with Intel. The Athlon was, of course, a massive success. AMD was no longer stuck competing with Intel purely on price/performance but on raw performance. The Athlon was neck and neck with the Pentium III, and both Intel and AMD started the now infamous race to 1GHz. Intel technically got there first, but ran into massive yield and reliability problems in the process. 1GHz Pentium IIIs were rare, and the 1.13GHz models were unreliable. AMD, on the other hand, had a much easier time hitting the 1GHz mark and was able to keep up with demand.

The Athlon architecture proved to scale well, and AMD was able to update the chips with die shrinks, more cache, and higher clock speeds several times. By 2003, the original Athlon family, now called the Athlon XP, had scaled from 500MHz to 2.33GHz and from 250nm down to 130nm. At that point, however, the Athlon was running out of steam. Higher clock speeds were generating more and more heat, and the EV6 bus simply couldn’t compete with the quad-pumped bus Intel introduced with the Pentium 4. A new design was needed.

Enter the K8, a new 64-bit redesign of the K7 led by Jim Keller, one of the former DEC engineers. The K8, codenamed Hammer, brought many improvements with it. The memory controller was now on-die with the CPU itself, which lowered latency and allowed the memory controller to operate at the CPU’s clock speed instead of the FSB speed. Speaking of the FSB, the EV6 bus of the K7 was replaced with the new HyperTransport bus, which was twice as fast as the best EV6 implementation.

These features, combined with many lower-level improvements to the chip architecture, gave the Athlon 64 a healthy performance margin over the Pentium 4. To compete, all Intel could do was continue to raise clock speeds. At first this seemed like the right approach; the Pentium 4 was designed for high clocks, after all. But Intel’s optimistic road maps proved to be unfeasible. The Pentium 4 never broke 3.8GHz at stock, a far cry from the lofty 10GHz Intel initially claimed. While overclockers have been able to push it past 8GHz, that was with exotic cooling.

Losing Their Grip

AMD’s sales were at their highest during this period. The K8 was a massive hit and proved to be the architecture of the future. The Pentium 4 was a technological dead end, and Intel scrambled to replace the failed NetBurst architecture with something better. What they came up with was the Core architecture.

Core was based on the Pentium M, which had taken over mobile duties from the Mobile Pentium 4 after the latter proved too hot and power-hungry for laptops. Intel’s Israel team had taken the Tualatin Pentium III and upgraded it with the Pentium 4’s bus as well as an improved decoder, a better branch unit, and a bigger cache. The result was a sales success that spawned a new sub-market for laptops: the thin-and-light.

In July of 2006, Intel released the Core 2 Duo for the desktop. The new processor turned the tables on the K8, and for the first time in many years Intel had a processor that was faster clock for clock than AMD’s. The gap was so wide that reviews showed only the top-of-the-line FX-62 could hold its own against even the weakest 1.8GHz Core 2 Duo.

AMD needed a response, and they needed one fast. Core 2 had beaten the K8 without breaking a sweat, and Intel was going all out on the new architecture. There was one major problem, though. That same year, AMD had purchased video card maker ATI. The acquisition was a drawn-out and messy affair, and it took far longer than anticipated to fully integrate ATI into AMD. Due to the cost and difficulties surrounding the purchase, AMD was unable to give their processor division the full attention it needed when it needed it the most.

At the end of 2007, AMD’s new Phenom chips debuted to lukewarm reviews. They did compete better against the Core 2, but not nearly to the degree needed. Under the hood, the reason was apparent: AMD had made only fairly minor improvements over the K8, whereas Intel had drastically overhauled the Pentium M to become Core. AMD had another problem as well; Intel was gaining a significant lead in fabrication process technology. This meant Intel could produce chips with smaller transistors, which meant smaller dies, lower power consumption, and less heat. In January of 2008, Intel released a 45 nanometer die shrink of their Core 2 Duo line. AMD wasn’t able to do the same until 2009.


The Fall

The K8’s successors were not strong performers compared to Intel.

The K10 family soldiered on with an update to 45nm in 2009, but by that point Intel had moved on to its new generation of Core chips, codenamed Nehalem. In 2011, AMD moved on from the K10 to their entirely new Bulldozer architecture. Bulldozer was a new direction for AMD, focusing on highly parallel designs: it eschewed Intel’s strategy of high per-thread performance and instead went for more threads. This proved to be a massive miscalculation. While multithreaded programs were becoming more and more common, AMD was creating chips that had more threads than existing games and applications could use.
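The miscalculation is easy to see with a back-of-the-envelope Amdahl’s law calculation (my own illustration, with a made-up parallel fraction, not figures from AMD or any benchmark). If a game of that era could only parallelize, say, 40% of its work, piling on threads quickly stops paying off:

```python
# Amdahl's law: speedup from N threads is capped by the serial fraction
# of the workload. The 0.4 parallel fraction below is an assumed,
# illustrative number for an early-2010s game, not a measured one.

def amdahl_speedup(parallel_fraction, n_threads):
    """Upper bound on speedup for a workload that is only
    `parallel_fraction` parallelizable, run on `n_threads` threads."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

for n in (1, 2, 4, 8):
    print(f"{n} threads -> {amdahl_speedup(0.4, n):.2f}x")
# 8 threads buy only ~1.54x over 1 thread: most of Bulldozer's extra
# hardware sits idle on lightly-threaded software.
```

Strong per-thread performance helps every program; extra threads only help the parallel slice, which for games of the day was small. That asymmetry is why Phenom II and Intel’s chips could beat Bulldozer in practice.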

The benchmarks showed the grim truth. Not only did Intel easily outperform AMD’s efforts, but even the older Phenom II chips performed better with existing software. It’s one thing to lose to your competitor, but it’s something else entirely to lose to your own older chips. AMD’s one saving grace has been, ironically enough, overclocking. Where Intel held the GHz crown in the Pentium 4 days, their newer Core-based chips have not overclocked as well; in fact, the smaller Intel’s fabrication process gets, the harder its chips are to overclock. AMD, on the other hand, has held the last four overclocking world records. It’s a strange twist of fate that AMD, always behind Intel on raw clock speed in the past, now makes the highest-clocked chips.


The Future


AMD is currently in a bad way. As it stands, the Q1 and Q2 financial reports show shrinking earnings, falling by $100 million each quarter. This stems from an ever-larger gap between them and Intel and stronger competition from Nvidia on the graphics front. For the first time in years, AMD’s revenue is under $1 billion, and it continues to fall. To anyone who has kept up with the enthusiast PC market, this is not surprising. AMD simply is not competitive with Intel. I haven’t seriously considered an AMD build in almost 10 years.

All hope is not, however, lost. Jim Keller, the architect behind the K8, returned to AMD in 2012. Keller has a reputation in the industry as a miracle worker who jumps from company to company, designing new, successful architectures and overhauling engineering cultures. Less than a month ago he left AMD, his job seemingly done. By all accounts, AMD’s new architecture, dubbed Zen, is complete and will be available sometime next year.

Zen is a huge gamble for AMD. The future of the company as a legitimate competitor to Intel rides on its success. If Zen succeeds, then AMD will once again have a chance to repair its battered reputation and will see a much-needed influx of cash. If Zen fails, then AMD will likely be consigned to the annals of PC history alongside fellow chipmakers like NexGen, Cyrix, VIA, and Transmeta. The future of AMD isn’t the only thing riding on Zen, however. If AMD were to fall, then Intel’s current trend of minuscule yearly improvements would continue. These improvements, in the face of no competition, have been so small that Sandy Bridge, released in 2011, is still sufficient for gaming. Without AMD around, we can only expect the processor market to stagnate even more. Intel, for all the good and bad they do, is at its best when forced to release killer products, like it did when Core first hit.

For the last year or so I have noticed a common call by PC gamers to buy AMD chips, if for no other reason than to prevent the bleak, dystopian PC future I described. I cannot in good conscience recommend spending money on inferior hardware. But at the same time, I hope for the success of Zen. Not just to fulfill some sense of narrative justice, or for the sake of nostalgia, but because of the horrible implications of a world without AMD.


SOURCE: Where Did AMD Go Wrong
 

Mohsin Z

Oh man, that's really thoughtful. I wish AMD could get back on its feet and fight this uphill battle against Intel. More competition is always better for consumers, and a world with no competition would be one I don't want to live in. These minuscule improvements from Intel are really hurting the enthusiasts. For the sake of us (we poor consumers of a poor country), pray for AMD.
 