Posts Tagged 'architecture'

Intel and Micron Announce 3D XPoint Memory.

The never-ending quest to satisfy Moore's Law means we're always looking for ways to make computers faster and cheaper. Primarily this focuses on the brain of the computer, the Central Processing Unit (CPU), which in most modern computers is now home to transistors numbering in the billions. All the other components haven't been resting on their laurels either, as shown by the radical improvements in speed from things like Solid State Drives (SSDs), high-speed interconnects and graphics cards that are just as jam-packed with transistors as any CPU. One aspect that's been relatively stagnant, however, is RAM which, whilst increasing in speed and density, has only seen iterative improvements since the introduction of the first Double Data Rate (DDR) standard. Today Intel and Micron have announced 3D XPoint, a new technology that sits somewhere between DRAM and NAND in terms of speed.

[Image: 3D XPoint die]

Details on the underlying technology are a little scant at the moment, however what we do know is that instead of storing information by trapping electrons, like all current memory does, 3D XPoint (pronounced "cross point") stores bits via a change in resistance of the memory material. If you're like me you'd probably think this was some kind of phase change memory, however Intel has stated that it's not. What they have told us is that the technology uses a lattice structure which doesn't require transistors to read and write cells, allowing them to dramatically increase the density, up to 128Gb per die. This also comes with the added benefit of being much faster than the current NAND technologies that power SSDs, although slightly slower than current DRAM, albeit with the added advantage of being non-volatile.
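As a rough mental model, you can think of a grid of cells where each cell sits at the intersection of a wordline and a bitline and is read by sensing its resistance rather than a stored charge. The little Python sketch below is purely illustrative: the resistance values, threshold and array size are all invented, since Intel hasn't published the real parameters.

```python
# Toy model of reading and writing a resistive crossbar array.
# All numbers here are made up for illustration; real 3D XPoint
# cell parameters are not public.
import random

ROWS, COLS = 4, 4
LOW_R, HIGH_R = 1_000, 1_000_000  # ohms: low resistance = 1, high = 0
THRESHOLD = 100_000

# Each cell sits at the intersection of a wordline (row) and a
# bitline (column) and stores its state as a bulk resistance
# rather than as trapped charge.
array = [[random.choice([LOW_R, HIGH_R]) for _ in range(COLS)]
         for _ in range(ROWS)]

def write_bit(row, col, bit):
    """Writing flips the cell's resistance in place; no per-cell
    access transistor, just the selected row/column pair."""
    array[row][col] = LOW_R if bit else HIGH_R

def read_bit(row, col):
    """Select one wordline and one bitline, then sense whether the
    cell's resistance falls above or below the threshold."""
    return 1 if array[row][col] < THRESHOLD else 0

write_bit(0, 0, 1)
print(read_bit(0, 0))  # -> 1
```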

Unlike most new memory technologies, which often purport to be replacements for one type of memory or another, Intel and Micron are positioning 3D XPoint as an addition to the current architecture. Essentially your computer has several types of memory, each used for a specific purpose. There's memory directly on the CPU which is incredibly fast but very expensive, so there's only a small amount of it. The second type is RAM, which is still fast but can be had in greater amounts. The last is your long term storage, either in the form of spinning rust hard drives or an SSD. 3D XPoint would sit between the last two, providing a kind of high speed cache that could hold onto often-used data that's then persisted to disk. Funnily enough the idea isn't that novel, things like the Xbox One use a similar architecture, so there's every chance it might end up happening.
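To make the tiering idea concrete, here's a minimal Python sketch of a write-through cache sitting between RAM and disk, with the fast non-volatile tier holding the most recently used blocks. The class, capacities and block names are all my own invention for illustration; this is a toy model of the concept, not how an actual XPoint controller would work.

```python
from collections import OrderedDict

class TieredStore:
    """Toy write-through cache: a small fast tier (standing in for a
    3D XPoint layer) in front of a large slow tier (the disk).
    Capacities and names are made up purely for illustration."""

    def __init__(self, cache_blocks=4):
        self.cache_blocks = cache_blocks
        self.fast = OrderedDict()  # fast tier, kept in LRU order
        self.disk = {}             # slow persistent tier

    def write(self, block_id, data):
        self.disk[block_id] = data     # write through so disk is never stale
        self._promote(block_id, data)

    def read(self, block_id):
        if block_id in self.fast:      # fast path: served from the cache
            self.fast.move_to_end(block_id)
            return self.fast[block_id]
        data = self.disk[block_id]     # slow path: fetch from disk...
        self._promote(block_id, data)  # ...and keep it in the fast tier
        return data

    def _promote(self, block_id, data):
        self.fast[block_id] = data
        self.fast.move_to_end(block_id)
        if len(self.fast) > self.cache_blocks:
            self.fast.popitem(last=False)  # evict the least recently used

store = TieredStore()
store.write("home.html", b"<html>...</html>")
assert store.read("home.html") == b"<html>...</html>"  # hits the fast tier
```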

The reason this is exciting is that Intel and Micron are already going into production with these new chips, opening up the possibility of a commercial product hitting our shelves in the very near future. Whilst integrating it in the way they've described in the press release would take much longer, due to the change in architecture, there's a lot of potential for a new breed of SSDs based on this technology. They might be an order of magnitude more expensive than current SSDs, however there are applications where you can't have too much speed, and for those 3D XPoint could be a welcome addition to the storage stack.

Considering the numerous technological announcements we've seen from other large vendors that haven't amounted to much, it's refreshing to see something that could hit the market in short order. Whilst Intel and Micron are still staying mum on the details I'm sure the next few months will see more information make its way to us, hopefully closely followed by demonstrator products. I'm very interested to see what kind of tech is powering the underlying cells, as a non-phase-change, resistance-based memory would be truly novel and, once production hits at-scale levels, could fuel another revolution akin to the one we saw with SSDs all those years ago. Needless to say I'm definitely excited to see where this is heading and I hope Intel and Micron keep us in the loop on new developments.

So Long Itanium, You Will Not Be Missed.

I've worked with a lot of different hardware in my life, from the old days of tinkering with my Intel 80286 through to esoteric Linux systems running on DEC tin, until I, like everyone else in the industry, settled on x86-64 as the de facto standard. Among the various platforms I was happy to avoid (including such lovely things as Sun SPARC) was Intel's Itanium range, as its architecture was so foreign from anything else that whatever you were trying to do, outside of building software specifically for that platform, was doomed to failure. The only time I ever came close to seeing it deployed was on the whim of a purchasing manager who needed guaranteed 100% uptime, at least until they realised the size of the cheque they'd need to sign to get it.

[Image: Oh Shiny Itanium]

If Intel's original dream was to be believed then this post would be coming to you care of their processors. You see, back when Itanium was first developed everything was still stuck in the world of 32-bit and the path forward wasn't looking particularly bright. Itanium was meant to be the answer to this: with Intel's brand name and global presence behind it we would hopefully see all applications migrate to the latest and greatest 64-bit platform. However the lack of native backwards compatibility with existing software meant adopting it was a troublesome exercise, which was a death knell for any kind of consumer adoption. Seeing this, AMD swooped in with their backwards-compatible x86-64 architecture, which proceeded to spread to all the places Itanium couldn't, forcing Intel to adopt the standard in their consumer line of hardware.

Itanium refused to die, however, finding a home in the niche high end market thanks to its redundancy features and solid performance for optimized applications. Still, the number of vendors supporting the platform dwindled from their already low numbers, eventually falling to HP being the only real supplier of Itanium hardware in the form of their NonStop server line. It wasn't a bad racket for them to keep up though: the total Itanium market was something on the order of $4 billion a year, and with only 55,000 servers shipped per year that works out to roughly $73,000 per server, so you can see how much of a premium they attract. Still, the IT workers of the world have long wondered when Itanium would finally bite the dust and it seems that day is about to come.

HP has just announced that it will be transitioning its NonStop server range from Itanium to x86, effectively putting an end to the only sales channel Intel had for the platform. What will replace it is still up in the air but it's safe to assume it will be another Intel chip, likely one from their Xeon line, which shares many of the features Itanium had without the incompatible architecture. Current Itanium hardware is likely to stick around almost indefinitely, however, due to the places it has managed to find itself in, much to the dismay of system administrators everywhere.

In terms of accomplishing its original vision Itanium was an unabashed failure, never finding the consumer adoption it so desired and never becoming the herald of 64-bit architecture. Commercially though it was something of a success, thanks to the features that made it attractive to the high end market, but even then it made up only a small fraction of total worldwide server sales, barely enough to keep it viable for anything but wholly custom solutions. The writing was on the wall when Microsoft said that Windows Server 2008 R2 would be the last version to support it, and now with HP bowing out the death clock for Itanium has begun ticking in earnest, even if the final death knell won't come for the better part of a decade.


All Your Consoles Are Belong To x86.

Ever since the first console was released they have been at arm's length from the greater world of computing. Initially this was just a difference in inputs, as consoles were primarily games machines and thus didn't require a fully fledged keyboard, but over time they grew into purpose built systems. This is something of a double edged sword: whilst a tightly controlled hardware platform allows developers to code against a fixed set of specifications, it also means that every platform is unique, so there's a learning curve for developers every time a new system comes out. Sony was particularly guilty of this as the PlayStation 2 and 3 were both notoriously difficult to code for; the latter especially, given the Cell processor's unusual combination of a general purpose core and a set of specialised coprocessors.

[Image: PlayStation 4 / Xbox 360 / Orbis / Durango]

There was no real indication that this trend was going to stop either, as all of the current generation of consoles use some non-standard variant of a comparatively esoteric processor. Indeed the only console in recent memory to attempt a more standard processor, the original Xbox, was succeeded by the PowerPC-driven Xbox 360, which would make you think that the current industry standard of x86 processors just wasn't suited to the console environment. Taking into account that the Wii U came out with a PowerPC CPU it seemed logical that the next generation would continue this trend, but there's a sea change on the horizon.

Early last year rumours started circulating that the next generation PlayStation, codenamed Orbis, was going to be sporting an x86-based processor but the next generation Xbox, Durango, was most likely going to continue with a PowerPC CPU. As it turns out this isn't the case and Durango will in fact be sporting an x86 processor (well, if you want to be pedantic it's x86-64, or x64). This means it's highly likely that code built on the Windows platform will be portable to Durango, making the Xbox the launchpad for the final screen in Microsoft's Three Screens idea. It also means that nearly all major gaming platforms will share the same underlying architecture, which should make cross-platform releases far easier than they have been.

News just in also reveals the specifications of the PlayStation 4, confirming the x86 rumours. It also brings with it some rather interesting news: AMD looks to be the CPU/GPU manufacturer of choice for the next generation of consoles.

There's no denying that AMD has had a rough couple of years, with their most recent quarter posting a net loss of $473 million. It's not unique to them either, as Intel has been dealing with sliding revenue figures as the mobile sector heats up and demand for ARM-based processors, which neither of the two big chip manufacturers provides, skyrockets. Indeed Intel has stated several times that they're shifting their strategy to try and capture that sector of the market, their most recent announcement being that they won't be building motherboards any more. AMD seems to have lucked out in securing the CPU for the Orbis (and whilst I can't find a definitive source it looks like their processor will be in Durango too) and the GPU for both, which will guarantee them a steady stream of income for quite a while to come. Whether or not this will be enough to reinvigorate the chip giant remains to be seen but there's no denying that it's a big win for them.

The end result, I believe, will be an extremely fast maturation of the development frameworks available for the next generation of consoles thanks to their x86 base. What this means is that we're likely to see titles making the most of the hardware much sooner than we have on previous platforms, thanks to the ubiquity of the underlying architecture. This will be both a blessing and a curse: whilst the first couple of years will see some really impressive titles, past that point there might not be a whole lot of room left for optimization. This ignores the GPU of course, where there always seem to be better ways of doing things, but even it will be quickly outpaced by its newer brethren. Combine this with the availability of the SteamBox and we could see PCs making a comeback as the gaming platform of choice once the consoles start showing their age.


Apple Eyeing ARM For Their Desktop Line.

The name of the game for all large technology companies is platform unification and domination, with each of them vying to become the platform that consumers choose. Microsoft has been on a long and winding road to accomplishing this since they first talked about it 3 years ago, and Apple has been flirting with the idea ever since it started developing its iOS line of products, with features like the App Store making their way back into OSX. Neither of them is really there yet, as Windows 8/WinRT are still nascent and require a lot more application development before the platform can be considered unified, and there is still a wide schism between iOS and OSX that Apple hasn't really tried to bridge.

Predominantly that's because Apple understands that they're two different markets and their current product strategy doesn't really support bridging them. The iOS space is pretty much a consumer playground: whilst you can do some of the things that Apple's previous OS was known for on there, it's far from being the creative platform that OSX was (and still is, to some extent). Indeed attempts to bridge their previous products with more consumer orientated versions have been met with heavy criticism from their long time fans, and their failure to provide meaningful product updates to their creative powerhouse, the Mac Pro, has also drawn the ire of many creative professionals.

If I'm honest I didn't really think that Apple would turn their back on the creative niche that is arguably responsible for making them what they are today. It's understandable from the company's point of view to focus your attention on the most profitable sectors, much like games developers do with the whole console priority thing, but it feels much like the time when Apple still considered itself a player in the enterprise space, only to quietly withdraw from it over the course of a couple of years. Whilst there isn't much evidence to support this idea, the latest rumours circulating that they may be considering a switch to ARM for their desktop line don't help to dispel it.

ARM, for the uninitiated, is a processor design company based out of Cambridge that's responsible for approximately 95% of the processors that power today's smartphones. They are unquestionably the kings of the low power space, with many of their designs achieving incredible efficiencies, which is what enables your phone to run for hours instead of minutes. Whilst they may no longer supply the designs for the chips that power Apple's current line of iOS products, their technology is still the basis for them. Suffice to say, if you've got any piece of mobile technology it's likely there's some kind of ARM processor in there, and it's the reason why Microsoft chose it as their second platform for the WinRT framework.

Apple switching platforms is nothing new as they made the switch to x86/Intel back in 2006. Part of the reason back then was that PowerPC, made by IBM, was not able to keep pace with the rapid improvements in performance that Intel was making; the other part was the performance-per-watt of Intel's processors, which is arguably why AMD wasn't considered. Apple's direction has changed considerably since then and their focus is now squarely aimed at portable experiences, which are far better served by the low power processors ARM can deliver. For things like the MacBook and the Air lower power means longer battery life, probably the key metric by which these portable computers are judged.

There's no doubt that Apple would be able to make the transition, however I'm not sure the cost to them, in both real and intangible terms, would be worth it. Forgetting all the technical challenges in getting your third parties to re-architect their applications, the unfortunate fact is that ARM doesn't have a processor capable of performing at the same level as Intel's current line. This means that for creative applications like photo/video editing, graphic design and the like, the current software suites simply wouldn't be viable on the ARM platform. Since the transition is a ways off it's possible that ARM might design some kind of high power variant to satisfy this part of the market, but traditionally that's not their focus, and since the desktop sector is one of Apple's smallest revenue generators I can't see them wanting to bother.

This is not to say that it would be a bad move for Apple at large, however. Having a consistent architecture across their entire line of products is something no other company would be able to achieve, and it would be an absolute boon to those seeking a ubiquitous experience across all their devices. It would also be a developer's wet dream, as you could make cross-platform applications far more easily than you can with other platforms. Considering that Apple makes the majority of its money from ARM-based platforms it doesn't come as much of a surprise that they might be considering a move to it, even if that comes at the cost of the creative sector that brought them back from the graveyard all those years ago.

I don't usually comment on Apple rumours, mostly because they're usually just a repeat of the same thing over and over again, but this one caught my attention because if it turns out to be true it will mark Apple's final step away from its roots. Whilst creative professionals may lament the disappearance of a platform they've been using for over 2 decades, the saving grace is that the Windows equivalents of all their programs have reached feature parity. Apple will then continue down the consumer electronics road it has been on for the past 10+ years, and where it goes from there is anyone's guess.

A Photographic Jaunt.

I had a rather fun and interesting weekend in terms of photography. As part of my whole pursuing-my-passions business I've set about trying to better myself as a photographer, and part of that is setting myself a photographic challenge each week (or as close to that as I can manage). The first one was something I was already comfortable with, landscapes, and flush with victory from that I decided to take on something I haven't really tackled seriously before: architecture. I had a few locations here in Canberra scouted out, so after swinging by the computer fair to pick up a new router I jumped straight in, looking for that unique view of some Canberran architecture that'd catch my eye.

To put it simply, the day didn't go quite as I had expected. I figured buildings would be much like landscapes: big things that don't move or complain, so they'd make for easy photographic pickings. It's completely the opposite, of course, as landscapes are usually shot from a great distance whereas with buildings and architecture you usually don't have the luxury of distance, especially if you're in the middle of a city like I was. I haven't had the chance to fully review all the pictures I took but suffice to say none of them really impressed me after I took them, so I definitely know there's room for improvement.

During my journey, however, I made a quick sojourn up to the iconic Parliament House, as no photographic trip focused on architecture would be complete without a picture or two of it. As I approached I noticed a group of people out the front, many holding signs, with a loudspeaker amplifying the words of a lone spokesman. Intrigued, I approached them and from what I could tell (many of the signs and speeches were in Arabic, I believe) they were protesting the current asylum seeker legislation. Figuring this would be a good time to hone my photojournalistic skills, which didn't exist prior to this, I started snapping pictures. No one complained about having their picture taken, but it did bring back some horrible memories of stories from fellow photographers who'd had bad experiences doing the same thing.

Generally speaking, if you're on public property you have every right to take a picture of what you see, especially if what you're photographing is on public property as well. There have been numerous cases of people being harassed by police when taking photos of them (there were police at this protest too, but I didn't want to invite trouble by photographing them) but you're well within your rights to do that as well. There are of course exceptions to these rules, as the link describes, but for the most part as long as you're sensible about what kinds of pictures you're taking you won't be in any legal trouble.

Still, it's always something that niggles at the back of my head and I think that's one of the reasons I've shied away from any kind of photography in a public place. Even though I know I'm in the right legally, I can't shake the feeling that I'll get accosted by people when taking photos near them, even if I'm not pointing the camera directly at them. I did get a couple of looks (the Canon 60D with a 24-105mm F/4L lens and a 480 EX II Speedlite can be a rather imposing beast to look at) but no one really seemed to care that I was taking pictures, so that feeling is probably just my introverted side coming out more than anything else. Maybe my next challenge should be street photography to work that part out.

I don't have anything to show you just yet: I don't like to shoot in JPG + RAW because it seems redundant, and the codec pack I've got for viewing RAWs doesn't work on the x64 version of Windows 7. I've bought myself a copy of Lightroom though, so once I get that installed and get comfortable with the interface you'll be relentlessly spammed with all the photographs from the weekend, and I might update this post with a few choice shots from my little jaunt. Whilst it might not have been the most pleasurable experience (it was rather cold) it was definitely a learning one, and it's something I'll be looking to repeat in the not too distant future.