Posts Tagged ‘x86-64’

AMD’s Kaveri Could Be This Generation’s x86-64.

The story of AMD’s rise to glory on the back of Intel’s failures is well known. Intel, filled with the hubris that can only come from holding a dominant market position for as long as they had, thought the world could be brought into the 64bit era on the back of their brand new platform: Itanium. The cost of adopting that platform was high, however, as it made no attempt at backwards compatibility, forcing you to revamp your entire software stack to take advantage of it (and the benefits of doing so were highly questionable). AMD, seeing the writing on the wall, instead developed their x86-64 architecture, which not only delivered 64bit support but went so far as to outclass the then-current generation of Intel processors in 32bit performance. Intel was left fighting an uphill battle to catch up with AMD, although the past few years have seen Intel dominate AMD in almost every metric, with the one exception of performance per dollar at the low end.

That could be set to change, however, with AMD announcing their new processors, dubbed Kaveri:

AMD Kaveri CPU-GPU Overview

On the surface Kaveri doesn’t seem too different from the regular processors you’ll see on the market today, sporting an on-die graphics card alongside the core compute units. As the above picture shows, however, the amount of die space dedicated to that GPU is far greater than on any other chip currently on the market, and the transistor count, a cool 2.1 billion, is a testament to this. Beyond that it starts to look more and more like a traditional quad core CPU with an integrated graphics chip, something few would get excited about, but the real power of AMD’s new Kaveri chips comes from the architectural changes that underpin this insanely complex piece of silicon.

The integration of GPUs onto CPUs has been standard practice for some years now, with 90% of chips shipping with an on-die graphics processor. For all intents and purposes the only distinction between these and discrete units is their location within the computer, as they’re essentially identical at the functional level. There are some advantages to being so close to the CPU (chiefly the latency that’s eliminated by not having to communicate over the PCIe bus) but integrated GPUs are still typically inferior due to the limited die space that can be dedicated to them. This was especially true of the generations previous to the current one, which weren’t much better than the integrated graphics chips that shipped with many motherboards.

Kaveri, however, brings with it something that no other CPU has managed before: a unified memory architecture.

Under the hood of every computer is a whole cornucopia of different kinds of memory, each with its own specific purpose. Traditionally the GPU and the CPU each have their own discrete pools: the CPU has its system RAM (which is typically what people refer to) and the GPU has a similar pool of its own. Integrated graphics would typically take advantage of system RAM, reserving a section of it for their own use. In Kaveri the distinction between the CPU’s and the GPU’s memory is gone, replaced by a unified view where either processing unit is able to access the other’s. This might not sound particularly impressive but it’s by far one of the biggest changes to come to computing in recent memory, and AMD is undoubtedly the pioneer in this realm.

A GPU’s power comes from its ability to rapidly process highly parallelizable tasks, rendering and number crunching being the classic examples. Traditionally, however, GPUs are constrained by how fast they can talk to the more general purpose CPU, which is responsible for feeding them tasks and interpreting the results. Those exchanges usually involve costly copy operations over slow interconnects in your PC, drastically reducing the effectiveness of a GPU’s power. Kaveri CPUs suffer from no such limitation, allowing seamless communication between the GPU and the CPU so that both can perform tasks and share results without the traditional overhead.
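To make that copy overhead concrete, below is a minimal sketch in OpenCL host code, which was the portable way of driving these GPUs at the time; AMD’s HSA-specific tooling isn’t shown, the context, queue and kernel are assumed to already exist, and this illustrates the general technique rather than being Kaveri-specific code. The first function shows the traditional copy-in/copy-out flow; the second asks the runtime (via CL_MEM_USE_HOST_PTR) to let the GPU work on the host’s memory directly, which is exactly the kind of sharing a unified memory architecture makes cheap.

```c
/* Sketch: traditional copy-based GPU dispatch vs a host-shared buffer.
 * Assumes ctx, queue and kernel were created beforehand; error handling
 * is omitted for brevity. */
#include <CL/cl.h>
#include <stdlib.h>

#define N (1024 * 1024)

/* Discrete-GPU style: the data crosses the interconnect twice. */
static void run_with_copies(cl_context ctx, cl_command_queue queue, cl_kernel kernel)
{
    float *host = malloc(N * sizeof(float));
    /* ... fill host[] with input data ... */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, N * sizeof(float), NULL, NULL);
    /* Copy host -> device over the (PCIe) interconnect. */
    clEnqueueWriteBuffer(queue, buf, CL_TRUE, 0, N * sizeof(float), host, 0, NULL, NULL);
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    size_t global = N;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    /* Copy device -> host to get the results back. */
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, N * sizeof(float), host, 0, NULL, NULL);
    clReleaseMemObject(buf);
    free(host);
}

/* Shared-memory style: the GPU reads and writes the host's pages. */
static void run_shared(cl_context ctx, cl_command_queue queue, cl_kernel kernel)
{
    float *host = malloc(N * sizeof(float));
    /* ... fill host[] with input data ... */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR,
                                N * sizeof(float), host, NULL);
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    size_t global = N;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    /* The spec still requires mapping the buffer before touching it from
     * the CPU again, but on unified-memory hardware this is bookkeeping,
     * not a bulk copy. */
    float *view = clEnqueueMapBuffer(queue, buf, CL_TRUE, CL_MAP_READ, 0,
                                     N * sizeof(float), 0, NULL, NULL, NULL);
    /* ... read results through view ... */
    clEnqueueUnmapMemObject(queue, buf, view, 0, NULL, NULL);
    clReleaseMemObject(buf);
    free(host);
}
```

Whether the second path is truly zero-copy depends on the runtime (AMD’s, for instance, wants suitably aligned allocations), but the structural difference is the point: the explicit transfers disappear, and with them most of the overhead described above.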

The one caveat at this point is that software needs to be explicitly coded to take advantage of this unified architecture. AMD is working extremely hard to get low level tools to support it, meaning that programs should eventually be able to benefit without much hassle, but it does mean that the Kaveri hardware is arriving long before the software that can take advantage of it. It’s sounding a lot like an Itanium moment, for sure, but as long as AMD makes good on its promise of working with tools developers (whilst retaining the required backwards compatibility) this has the potential to be another coup for AMD.

If the results from the commercial units are anything to go by then Kaveri looks very promising. Sure, it’s not a performance powerhouse, but it certainly holds its own against the competition and I’m sure that once the tools catch up you’ll start to see benchmarks demonstrating the power of a unified memory architecture. That may be a year or two away, but rest assured this is likely the future of computing, and every other chip manufacturer in the world will be rushing to replicate what AMD has created here.


So Long Itanium, You Will Not Be Missed.

I’ve worked with a lot of different hardware in my life, from the old days of tinkering with my Intel 80286 through to esoteric Linux systems running on DEC tin, until I, like everyone else in the industry, settled on x86-64 as the de facto standard. Among the various platforms I was happy to avoid (including such lovely things as Sun SPARC) was Intel’s Itanium range, as its architecture was so foreign from anything else that whatever you were trying to do, outside of building software specifically for that platform, was doomed to failure. The only time I ever came close to seeing it deployed was on the whim of a purchasing manager who needed guaranteed 100% uptime, at least until they realised the size of the cheque they’d need to sign to get it.

Oh Shiny Itanium

If Intel’s original dream was to be believed then this post would be coming to you care of their processors. Back when Itanium was first developed everything was still stuck in the world of 32bit and the path forward wasn’t looking particularly bright. Itanium was meant to be the answer: with Intel’s brand name and global presence behind it, we would hopefully see all applications migrate to the latest and greatest 64bit platform. However the complete lack of backwards compatibility with existing software meant that adopting it was a troublesome exercise, which was a death knell for any kind of consumer adoption. Seeing this, AMD swooped in with their dually compatible x86-64 architecture, which proceeded to spread to all the places Itanium couldn’t, eventually forcing Intel to adopt the standard in their consumer line of hardware.

Itanium refused to die, however, finding a home in the niche high end market thanks to its redundancy features and solid performance for optimized applications. Even so, the number of vendors supporting the platform dwindled from their already low numbers, eventually leaving HP as the only real supplier of Itanium hardware in the form of their NonStop server line. It wasn’t a bad racket for them to keep up though: the total Itanium market was something on the order of $4 billion a year across only 55,000 servers shipped annually, so you can see how much of a premium those machines attract. Still, the IT workers of the world have long wondered when Itanium would finally bite the dust, and it seems that day is about to come.

HP has just announced that it will be transitioning its NonStop server range from Itanium to x86, effectively putting an end to the only sales channel Intel had left for the platform. What will replace it is still up in the air but it’s safe to assume it will be another Intel chip, likely one from their Xeon line, which shares many of the features that made Itanium attractive without the incompatible architecture. Current Itanium hardware is likely to stick around almost indefinitely, however, given the places it has managed to find itself in, much to the dismay of system administrators everywhere.

In terms of accomplishing its original vision Itanium was an unabashed failure, never finding the consumer adoption it so desired and never becoming the herald of the 64bit era. Commercially though it was something of a success, thanks to the features that made it attractive to the high end market, but even then it accounted for only a small fraction of total worldwide server sales, barely enough to make it a viable platform for anything but wholly custom solutions. The writing was on the wall when Microsoft said that Windows Server 2008 would be the last version to support it, and now with HP bowing out the death clock for Itanium has begun ticking in earnest, even if the final death knell won’t come for the better part of a decade.


All Your Consoles Are Belong To x86.

Ever since the first console was released they have been kept at arm’s length from the greater world of computing. Initially this was just a difference in inputs, as consoles were primarily games machines and thus did not require a fully fledged keyboard, but over time they grew into purpose built systems. This is something of a double edged sword: whilst a tightly controlled hardware platform allows developers to code against a fixed set of specifications, it also usually meant that every platform was unique, with a learning curve for developers every time a new system came out. Sony was particularly guilty of this, as the PlayStation 2 and 3 were both notoriously difficult to code for; the latter especially, given its Cell processor’s unusual combination of a single general purpose core and a set of specialized coprocessors.

PlayStation 4 Xbox 360 Orbis Durango

There was no real indication that this trend was going to stop either, as all of the current generation of consoles use some non-standard variant of a comparably esoteric processor. Indeed the only console in recent memory to attempt to use a more standard processor, the original Xbox, was succeeded by the PowerPC driven Xbox 360, which would make you think that the current industry standard of x86 processors just wasn’t suited to the console environment. Taking into account that the WiiU came out with a PowerPC CPU, it seemed logical that the next generation would continue this trend, but there’s a sea change on the horizon.

Early last year rumours started circulating that the next generation PlayStation, codenamed Orbis, was going to be sporting an x86 based processor, whilst the next generation Xbox, Durango, was most likely going to continue with a PowerPC CPU. As it turns out that isn’t the case and Durango will in fact be sporting an x86 processor (well, if you want to be pedantic, an x86-64 one, also known as x64). This means it’s highly likely that code built on the Windows platform will be portable to Durango, making the Xbox the launchpad for the final screen in Microsoft’s Three Screens idea. It also means that nearly all major gaming platforms will share the same underlying architecture, which should make cross platform releases far easier than they have been.

News just in also reveals the specifications of the PlayStation 4, confirming the x86 rumours. It brings with it some rather interesting news as well: AMD looks to be the CPU/GPU manufacturer of choice for the next generation of consoles.

There’s no denying that AMD has had a rough couple of years, with their most recent quarter posting a net loss of $473 million. They’re not alone in this either, as Intel has been dealing with sliding revenue figures as the mobile sector heats up and demand for ARM based processors, which neither of the two big chip manufacturers provide, skyrockets. Indeed Intel has stated several times that they’re shifting their strategy to try and capture that sector of the market, their most recent announcement being that they won’t be building motherboards any more. AMD seems to have lucked out in securing the CPU for the Orbis (and whilst I can’t find a definitive source it looks like their processor will be in Durango too) and the GPU for both consoles, which will guarantee them a steady stream of income for quite a while to come. Whether or not this will be enough to reinvigorate the struggling chip maker remains to be seen but there’s no denying that it’s a big win for them.

The end result, I believe, will be an extremely fast maturation of the development frameworks available for the next generation of consoles thanks to their x86 base. This means we’re likely to see titles making the most of the hardware much sooner than we have on other platforms, thanks to the ubiquity of the underlying architecture. That will be both a blessing and a curse: whilst the first couple of years will see some really impressive titles, past that point there might not be a whole lot of room left for optimization. This ignores the GPU of course, where there always seem to be better ways of doing things, though even it will be quickly outpaced by its newer brethren. Combine this with the availability of the SteamBox and we could see PCs making a comeback as the gaming platform of choice once the consoles start showing their age.


Windows 8: First Step to the Realization of Three Screens.

The last two years have seen a major shake up in the personal computing industry. Whilst I’m loath to admit it, Apple was the one leading the charge here, redefining the smart phone space and changing the way many people did the majority of their computing by creating the wildly successful niche of curated computing (read: tablets). It is then inevitable that many subsequent innovations from rival companies are seen as reactions to Apple’s advances, even if the steps a given company is taking are towards a much larger and broader goal than competing in the same market.

I am, of course, referring to Microsoft’s Windows 8, which was demoed recently.

There’s been quite a bit of news about the upcoming release of Windows 8, with many leaked screenshots and even leaked builds giving us a lot of insight into what we can expect of the next version of Windows. For the most part the updates didn’t seem like anything revolutionary, although things like portable desktops and a more integrated web experience were looking pretty slick. Still, Windows 7 was far from revolutionary either, but the evolution from Vista was more than enough to convince people that Microsoft was back on the right track, and the adoption rates reflect that.

However the biggest shift coming with Windows 8 was known long before it was demoed: Windows 8 will run on ARM and other System on a Chip (SoC) devices. It’s a massive deviation from Microsoft’s current platform, which is wholly x86/x86-64 based, and it confirms Microsoft’s intention to bring the full Windows experience to tablets and other low power/portable devices. The recent demo of the new operating system confirmed this, with Windows 8 sporting both the traditional desktop interface we’re all familiar with and a more finger friendly version that takes its design cues from the Metro interface seen on all Windows Phone 7 devices.

The differences between these two interfaces don’t stop at the input devices they were optimized for, either. Whilst all Windows 8 devices will be capable of running, in the traditional desktop interface, the huge back catalog of software developed for Windows over the past few decades, the new tablet optimized interface relies on applications built using HTML5 and JavaScript. This is arguably done to make them far more platform independent than their traditional Windows application cousins which, whilst most likely able to run since .NET will be ported to the ARM and SoC architectures, won’t have been designed for the tablet environment. They’ll still be usable in a pinch, of course, but you’d still want to rewrite them if a large number of your users were moving to the tablet/smartphone platform.

Looking at all these changes you can’t help but think they were done in reaction to Apple’s dominance of the tablet space with the iPad. It’s true that a lot of the innovations Microsoft has made with Windows 8 mirror what Apple has achieved in the past year or so, but since Windows 8 has been in development for much longer than that, not all of them can be credited to Microsoft playing the me-too game. Realistically it’s far more likely that many of these innovations are Microsoft’s first serious attempt at realizing their three screens vision, and many of the changes in Windows 8 support this idea.

A lot of critics think the idea of bringing a desktop OS to a tablet form factor is doomed to failure. The evidence supporting that view is strong, too, since tablets running Windows 7 (or any other desktop OS, for that matter) haven’t enjoyed even a fraction of the success of the dedicated tablet OSes. However I don’t believe that Microsoft is simply making a play for the tablet market with Windows 8; what they’re really doing is providing a framework for building user experiences that remain consistent across platforms. The idea of being able to complete any task whether you’re on your phone, your TV or a dedicated computing device (which can be a tablet) is what is driving Microsoft to develop Windows 8 the way they are. Windows Phone 7 was their first step into this arena, its UI has been widely praised for its usability and design, and Microsoft’s commitment to bringing it to Windows 8 shows that they are trying to blur the lines that currently exist between the three screens. The potential for .NET applications to run on x86, ARM and other SoC platforms seals the deal: there is little doubt that Microsoft is working towards a ubiquitous computing platform.

Microsoft’s execution of this plan is going to be vital for their continued success. Whilst they still dominate the desktop market, it’s being ever so slowly eroded by the bevy of curated computing platforms that do everything users need them to do and nothing more. We’re still a long way from everyone outright replacing their PCs with tablets and smart phones, but the writing is on the wall for a sea change in the way we all do our computing. Windows 8 is shaping up to be Microsoft’s way of re-establishing themselves as the tech giant to beat and I’m sure the next year is going to be extremely interesting for fans and foes alike.