The duopoly of x86

How a market dynamic manufactured by strategy and circumstance created a less innovative space, and how other players have woken up to it and begun working to change it.

Yatharth Sood
9 min read · Jul 17, 2020
A modern Intel x86 CPU inside a motherboard socket. Credit: wccftech

The x86 Instruction Set Architecture is the fundamental building block behind the computer you are on at the moment (if you are not reading this on a phone, that is). Even if your device is not powered by an x86 processor, it probably exists because of x86's presence, its peculiarities and, more likely, because you cannot use x86 for yourself. Of course, you can buy something that uses x86, like the laptop this very article is written on, but the processor inside it comes from only one of two corporations: Intel or Advanced Micro Devices. Why is that, and what has this exclusivity led to?

Let us have a proper introduction. As proper as it can be without making you lose your minds.

or interest, to be honest.

Intel 8008 processor, a very early microprocessor from 1972. Credit: Jonas Svidras — Pexels

A computing device needs a processor to run basic instructions. It does the thinking, or computation, for you, and it is typically the “heart” of the computer. Its requirements determine how the rest of the system is laid out, what other components it will have, and how it balances its size and power needs against the tasks it is built for and the speed at which it performs them. It is quite the queen bee of the component list.

Now, this processor is quite a complicated device; you can do a master’s in my discipline (Electrical Engineering) specialising in designing them. But to cut to the chase: a processor has to be designed to suit an “architecture”, which controls the manner in which it executes instructions. This makes the architecture a highly important endeavour, one that major corporations invest in heavily to gain an edge over the competition and to achieve efficiency in the processor’s applications. That is the most concise explanation of computer architecture, and it will serve the purposes of this article quite well. For further information, I can point you in some directions:

Simple English Wikipedia: Computer Architecture.

Techquickie (YouTube): Coding Communication & CPU Microarchitectures as Fast As Possible
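As a quick hands-on aside: you can ask your own machine which instruction set architecture it reports, straight from Python’s standard library. A minimal sketch; the exact string returned varies by operating system.

```python
import platform

# Ask the OS which instruction set architecture this machine reports.
# Typical values: "x86_64" or "AMD64" on Intel/AMD machines,
# "arm64" or "aarch64" on ARM-based ones (like the upcoming Apple Macs).
arch = platform.machine()
print(arch)
```

If you run this on the laptop this article was written on, you would see an x86 identifier; on a recent phone or an ARM single-board computer, an ARM one.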

Where were we again?

IBM PC 5150 with peripherals. Credit: Wikimedia Commons

The late 20th century saw the development of the Personal Computer, a computing device that prioritises cost-effectiveness and practicality over throwing everything possible at specific tasks for maximum speed, the latter being something only enterprises and researchers aspire to. There was also a need for a system compatible with more software than just what it shipped with, something we almost take for granted today. This led Intel to develop the 8086, whose instruction set evolved into the x86 architecture. A close variant, the 8088, powered the original IBM PC, the personal computer of the last century! From there, the personal computer and the -86 derivatives became synonymous. Third parties that could easily write software, design useful peripherals and provide support made the PC a platform similar to what we have today, multiplying its success and making the 8086 extremely significant in the development of personal computers. Intel could not keep up with the demand for processors, so they started licensing other manufacturers to design their own 8086 clones. Apparently, now, everyone could have a piece of the x86 pie! Right?

Past x86 manufacturers with their respective fate in parentheses. From Wikipedia: List of x86 Manufacturers

More than ten separate companies made x86 processors at various points between 1981 and the late 1990s, but as the title of this article states, there are just two manufacturers now*. What happened in the interim? The PC, even without hindsight, was such a competitive and “happening” space. Anyone would be foolish to give all that up, right?

*Deliberately excluding VIA Technologies/Zhaoxin, as they are still at a nascent stage of development and only operate in the Chinese market. I hope to cover them in a later story as well.

Surely, no one would come up with a world-beater of a CPU now!

A 1996 Pentium. Credit: Wikimedia Commons

The answer to this suppression of competition is the Pentium: Intel’s fifth iteration of the 8086 line, and the first trademarked name for its 8086-based processors. It became an instant success, as it was reasonably priced and quite powerful. The Pentium’s many successors, with even more features, dominated the 1990s, while much of the competition pulled out and simply put Pentiums in their own PCs instead. The only significant competition to the Pentium by the mid-90s came from Cyrix and, of course, AMD.

The Cyrix 6x86. Credit: Appaloosar (Wikimedia Commons)

Cyrix is the smaller story to cover here, as they fell out of relevance soon after, though not without a fight. Their most notable chip, the 6x86, was a Pentium “clone” with claimed higher performance than the Pentium of its time, and Cyrix tried to charge a premium on that very claim. Their focus, however, was on integer operations, as they predicted that office applications would rely on those more than on floating-point ones. But (and this could be a small article of its own) the game Quake, with its floating-point-heavy code, ran terribly on Cyrix processors, and the Pentium won again. After rounds of litigation between Cyrix and Intel over alleged design infringements, Cyrix merged with National Semiconductor, ran into financial trouble, and eventually stepped aside, making way for Intel and AMD to become the juggernauts of the 2000s, leapfrogging each other in innovation and performance.

Duopoly times!

With Cyrix gone, we enter the 2000s. I was actually alive for this period, although at that age I wouldn’t have been great at keeping track of the Intel vs. AMD fight brewing along. But the whole battle has been chronicled by many people on the Internet, so we can talk about it in 2020.

AMD’s Opteron, the first CPU with AMD64 features. Credit: Wikimedia Commons

Intel’s x86 was now being used under license by AMD, and AMD alone was making CPUs that competed directly in the PC space (PC being separate from Apple’s Macintosh line), as they were the only other manufacturer big and established enough to go toe-to-toe with Intel. The battle was intense, with a major win for AMD coming in 2000, when they announced a 64-bit extension of the x86 architecture, AMD64 or x86-64, clearing the way for x86 well into the future with minimal teething problems. Intel’s attempt at the same failed miserably, as they tried to build an entirely new 64-bit architecture from scratch: the ill-fated IA-64. AMD then licensed x86-64 to Intel, creating the cross-licensing deal that underpins the duopoly of the x86 architecture.
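The practical upshot of AMD64 is wider registers and, above all, wider pointers, which lifted the 32-bit memory ceiling. As a small illustration, you can check whether your own Python runtime is a 64-bit build (a rough sketch: it reflects the interpreter build, not the CPU on its own).

```python
import struct
import sys

# Pointer width, in bits, of this interpreter build:
# 64 on an x86-64 (or other 64-bit) build, 32 on a 32-bit one.
pointer_bits = struct.calcsize("P") * 8
print(pointer_bits)

# Equivalent check: sys.maxsize is 2**63 - 1 on 64-bit builds
# and 2**31 - 1 on 32-bit builds.
print(sys.maxsize > 2**32)
```

On virtually any PC sold today, this reports 64 bits, which is exactly the AMD64 legacy described above.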

What does this kind of a market landscape really lead to?

or, more correctly, what has it led to?

Apple, just last month at WWDC 2020, announced a new roadmap for their Mac lineup: they are moving away from x86 in favour of their own chips built on the ARM architecture, tried and tested on the iPhone and iPad. The reasons are better power efficiency, potentially better performance, and more control over their hardware, allowing greater synergy with their software and with their own, highly acclaimed microprocessors. A change this big hadn’t happened to the Mac since it moved to x86 under the leadership of Steve Jobs in the mid-2000s, when Apple started making the Intel-based Macs that are on sale today.

The top-end variant of the 2018 15" MacBook Pro was notorious for its thermal throttling issues, which can be seen here as the “Frequency” graph spikes up and down consistently under load to hold the processor at 90 degrees. Credit: Dave2D (Dave Lee — YouTube)

The reason for this move away from x86 can be clearly attributed to the roadblock Intel had become for the Mac: Apple’s product could only be as good as Intel’s. A key part of the story is the decline AMD faced in the early 2010s, after Intel handily beat their products, before AMD finally started to bounce back in 2017. Because we were in a duopoly, Intel faced essentially no competition for a significant part of the last decade in virtually every price segment of the CPU market (consumer AND enterprise, though we focus on consumer here), which led them to let their whole lineup stagnate until AMD’s Ryzen forced them to pick up the enormous slack in 2017. By then, the damage was probably done. Apple, in all likelihood, grew tired of waiting more than two years to market a product that could be justified as an upgrade, given the plateauing performance gains Intel’s processors showed generation after generation. The move also addresses the embarrassing thermal issues MacBooks have faced since 2018, caused by power-hungry Intel processors stuffed into extremely compact designs whose thermal solutions favoured aesthetics over temperatures. Apple has shown they can do silicon excellently in their mobile devices, and the transition is being made in a very determined fashion, with a new Mac running Apple processors planned for release by the end of 2020. Clearly, they have been preparing this for a while; it is no small undertaking to design, manufacture and announce a product that is this much of a paradigm shift for a company as big as Apple and a brand as big as the Mac.

Older Apple silicon, all used in iPhone models since the very beginning of the iPhone. Apple believes that the same kind of hardware they use in their iPhones can be used in newer Mac systems for more seamlessness between their devices and for greater integration with macOS. Credit: Apple WWDC 2020

Apple’s move is actually a courageous one, unlike their removal of the headphone jack (which, admittedly, has managed to become the norm), as they had to carve a path through the mess that compatibility would become: new macOS software will need to be built for Apple’s ARM chips, while existing x86 apps will have to be emulated or translated to keep working on Macs further into the future.

Apple showing an under-development Apple-based Mac computer in their WWDC 2020 live broadcast in June. Note the “Processor” field showing an Apple processor.

Apple’s move is also significant with respect to their competition. They always have the most eyes on them, which means their change impacts the wider PC industry too. We will see alternatives to Apple’s ARM implementations running Windows, since Windows already supports ARM, and app developers will scramble to add ARM compatibility to their apps. All of this will spearhead an alternative to x86 that, as time will tell, may prove viable for more and more consumers.

Apple considers the transition to their own processors in Macs their “fourth transition”. The previous three were the move to PowerPC (the predecessor of the Intel-based Macs), the software move to Mac OS X, and the most recent one, the move from PowerPC to Intel in the mid-2000s, which is the main inspiration for how they are handling the move away from Intel.


This is clearly not an exhaustive look into the duopoly of x86 (which I seek to cover in other “stories”), but it establishes the basics. The concerns around a monopoly carry over to this duopoly just as well: stifled competition, and potential competitors deterred from entering the space by the investment required and by proprietary technology usable only under license from the duopoly’s beneficiaries. The consequences are already being addressed by a major player like Apple, because another problem this market dynamic created was the difficulty that corporations dependent on Intel’s and AMD’s products had in innovating on their own. The x86 duopoly has been a roadblock for the development of computers in general for over two decades, and it will be interesting to see how we circumvent it for the advancement of computing technology, especially for regular consumers like many of us.

Note: This is a mostly cursory look at the situations and the industry as a whole, but I wish to analyse this better in subsequent stories, alongside other related topics as well.




Business enquiries: yatharthsood00 (at) gmail (dot) com.