As the computer hardware market consolidates further with joint ventures such as the one between Asus (huge production capacity & name recognition) and Gigabyte (gigantic channel presence), the CPU & GPU markets are starting to merge as well. I'm quite sure most of you have heard about it ad nauseam: AMD buys ATI. This was a very bold move for a corporation that was already shaky after its thorough defeat in the last generation of CPUs. Soon you will be unable to find ATI Radeons; they should be renamed to AMD Radeons by the end of this year. Most people will not give it much thought. A rose by any other name, right? However, this goes quite a bit further than simply acquiring assets.
I will try to lay out my opinions as logically as possible, but you'll have to excuse me if any quirks of the mind slip by unexplained.
So, here we are. AMD buys ATI. AMD had been in quite a lot of trouble recently, since its latest generation of CPUs was clearly inferior to the Intel counterpart (AM2 vs Conroe). Now, that isn't big news in itself, as it has happened plenty of times before. This time, however, people questioned AMD's survival, as it seems doomed to limp one development cycle behind for the time being. A price war was announced, but it was public knowledge that Intel's cash reserves dwarf anything AMD can bring to the table. Not only did AMD lose on available cash, its production capacity was already strained. The profit margins on AMD CPUs were smaller than Intel's to begin with, so Intel could make deeper price cuts without cutting into its own flesh. AMD was destined to lose the price war before it even began. Quite a lot of investors hit the panic button & jumped ship: Intel was one of the top 'hot properties' on Wall Street, while AMD stock was being dumped.
Then rumours started floating around in the hardware community: AMD was to buy ATI. Every site talked about it, but almost no one could actually believe it. People were baffled by the bad business logic behind it. But was it bad business logic? A desperate attempt?
People were baffled for an endless list of short-term reasons. ATI was king of the hill at the moment, since it provided the best FPS in Crossfire. Yet Nvidia was still the public favorite. Why? Crossfire. ATI's Crossfire was far less popular than Nvidia's SLI. No one really cared about the few extra frames per second you could get on ATI's latest Crossfire rig. Now, that sounds strange, since the consumer graphics market is fueled by fanboys and every frame counts for them. However, there was a pretty big problem with the Crossfire platform: AM2 motherboards with Crossfire were ghosts. Sure, they were 'available', but pretty much no one had ever seen one in a store. In essence, AMD bought a graphics (and chipset) firm with whom it seemed unable to work well.
AMD, however, had worked well in the past with Nvidia, VIA, SiS & the like, all of them players in the chipset & graphics market. They wouldn't be too glad to see AMD grabbing hold of a rival, and Nvidia would be the biggest problem, as it was the main supplier of both chipsets & graphics cards going into AMD PCs. But trouble also stirs on the other side of the spectrum. ATI's best friend was Intel, which had proven a big fan of both its regular graphics cards and the Crossfire platform. They were on the verge of big deals on that front, but I think you can kiss those goodbye now. Not to mention that this is probably bad for the industry as a whole, since diversity & competitive pressure are among the driving forces behind innovation.
Surely, AMD & ATI couldn't be stupid enough to miss all this. So what exactly could they have in mind, if the short-term prospects are so grim? As you may have surmised from my wording, the long term could be profitable. It is no secret that we are headed for multi-core processors. What does this have to do with ATI, or even GPUs? Bear with me for a moment.
Multi-core processors (and when I say multi- or mini-core I'm not talking about 2 or 4 cores, but 16, 32 or even more) are usually a bunch of little CPUs replicated & strapped together, sharing their L2 and/or L3 cache. They can handle a lot of tasks, working either together or separately, and one of those tasks could potentially be taking over the job of the GPU. Sun has been developing multi-core processors for years now, trying to satisfy a niche market & gain more market share. The Cell processor was originally supposed to handle the GPU calculations as well, but that idea was later scrapped as not yet feasible. Intel just concluded three mini-core research projects & is continuing one of them, a project Sony seems to be following closely because of its possible uses in future consoles. It wouldn't be so strange if AMD were trying to acquire GPU knowledge, would it? That GPU knowledge would then have to be integrated into the multi-core CPU's pipelines.
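To make the idea a bit more concrete, here's a minimal, purely hypothetical sketch of a "software GPU": the per-pixel shading work for one frame is cut into horizontal bands and farmed out to however many cores the CPU happens to have. The shadePixel/shadeBand names and the gradient "shader" are invented for illustration; a real renderer would obviously be enormously more complex.

```cpp
// Toy illustration only: a "software GPU" that splits per-pixel shading work
// across CPU cores, the way a many-core CPU might absorb GPU-style workloads.
#include <algorithm>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

constexpr int kWidth  = 640;
constexpr int kHeight = 480;

// A trivial stand-in for a pixel shader: produce a simple colour gradient.
uint32_t shadePixel(int x, int y) {
    uint8_t r = static_cast<uint8_t>(255 * x / kWidth);
    uint8_t g = static_cast<uint8_t>(255 * y / kHeight);
    return (uint32_t(r) << 16) | (uint32_t(g) << 8) | 0x40;
}

// Each core gets one horizontal band of the framebuffer to fill.
void shadeBand(std::vector<uint32_t>& fb, int yBegin, int yEnd) {
    for (int y = yBegin; y < yEnd; ++y)
        for (int x = 0; x < kWidth; ++x)
            fb[y * kWidth + x] = shadePixel(x, y);
}

int main() {
    std::vector<uint32_t> framebuffer(kWidth * kHeight);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    int bandHeight = (kHeight + cores - 1) / cores;
    for (unsigned c = 0; c < cores; ++c) {
        int yBegin = c * bandHeight;
        int yEnd   = std::min(kHeight, yBegin + bandHeight);
        if (yBegin >= yEnd) break;
        workers.emplace_back(shadeBand, std::ref(framebuffer), yBegin, yEnd);
    }
    for (auto& w : workers) w.join();   // all bands done: one "frame" rendered
    return 0;
}
```

The point is only that this kind of work parallelises almost perfectly across identical little cores, which is exactly the shape a 16- or 32-core CPU would have.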
Not only are multi-core CPUs more capable of dealing with varied tasks, they are also easier to develop: you design one small, simpler core & duplicate it. This means that, if they wanted, they'd be able to churn out a completely new generation of CPUs each year & still make a lot of variations on the theme aimed at niche markets.
But is this all possible? I doubt we'll be seeing any high-end graphics coming from a multi-core CPU anytime soon; there are a lot of problems in the way. One of the biggest is memory. System memory has always lagged behind video RAM: we saw GDDR3 on video cards before DDR2 had become mainstream, and GDDR4 is now ready to hit the market (it'll take a while to show up in products, but the design is finished). Bringing system memory up to video-memory speeds would be a difficult & costly undertaking.
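To put a rough number on that gap, here's a back-of-the-envelope comparison of peak theoretical bandwidth. The figures are only meant to be ballpark-typical of current parts and are assumed for illustration: dual-channel DDR2-800 on a 2x64-bit bus versus GDDR3 running at 1600 MT/s on a 256-bit bus.

```cpp
#include <cstdio>

// Peak theoretical bandwidth in GB/s = transfers per second * bus width in bytes.
double peakBandwidthGBs(double megaTransfersPerSec, int busWidthBits) {
    return megaTransfersPerSec * 1e6 * (busWidthBits / 8.0) / 1e9;
}

int main() {
    // Figures assumed for illustration, roughly matching parts on the market.
    double sysRam = peakBandwidthGBs(800.0, 128);   // dual-channel DDR2-800, 2x64-bit
    double vidRam = peakBandwidthGBs(1600.0, 256);  // GDDR3 @ 1600 MT/s, 256-bit bus
    std::printf("system RAM : %.1f GB/s\n", sysRam); // ~12.8 GB/s
    std::printf("video RAM  : %.1f GB/s\n", vidRam); // ~51.2 GB/s
    return 0;
}
```

Even ignoring latency and the fact that the CPU shares that bus with everything else, system memory ends up with roughly a quarter of the bandwidth a high-end video card enjoys.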
Why not buy Nvidia then, since Nvidia has always been a reliable partner of AMD? Too expensive, & the corporate cultures of the two don't match at all.
Why doesn't Intel do the same? Intel has the money to research this internally; it simply has no need to buy a big-name outsider to acquire the necessary knowledge.
So was AMD right to buy ATI? Fuck if I know...
How do I see the (far) future? Well, I have to admit I think it's possible we'll see the disappearance of the dedicated GPU. (This would also mean the death of the PPU.)
It's possible we'll see an accelerated development cycle, leading to a new generation each year. Computers could ship with one multi-core CPU by default & a second open socket for later upgrades, with the CPU(s) balancing the normal workload & the graphics/physics calculations at the same time.
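Purely as a thought experiment, a game on such a machine might treat the "GPU" and "PPU" work as just more jobs thrown at spare cores each frame, something along these lines (every function name here is invented for the sketch):

```cpp
// Hypothetical sketch: one multi-core CPU splitting each frame between normal
// game logic and graphics/physics work, instead of handing the latter to a
// dedicated GPU/PPU.
#include <future>

void updateGameLogic()  { /* AI, input, sound, networking ... */ }
void simulatePhysics()  { /* what a PPU would otherwise do    */ }
void renderFrameOnCpu() { /* what a GPU would otherwise do    */ }

int main() {
    for (int frame = 0; frame < 3; ++frame) {            // a few demo frames
        // Farm graphics & physics out to spare cores...
        auto physics  = std::async(std::launch::async, simulatePhysics);
        auto graphics = std::async(std::launch::async, renderFrameOnCpu);
        // ...while the remaining cores handle the usual workload.
        updateGameLogic();
        physics.get();
        graphics.get();                                   // frame complete
    }
    return 0;
}
```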
Of course, this is all just idle speculation & horse shit, but it's always nice to think about the future. How do you see the future of the (game) computer?