Posts Tagged 'processor'

Carbon Nanotubes Break Barriers for Moore’s Law.

In the last decade there's been a move away from raw CPU speed as an indicator of performance. Back when single cores were the norm it was an easy way to judge which CPU would be faster in a general sense, but the switch to multiple cores threw this into question. Partly this comes from architecture decisions and software's ability to make use of multiple cores, but it also came hand in hand with a stalling of CPU clock speeds. This is mostly a limitation of current technology, as faster switching means more heat, something most processors could not handle more of. This could be set to change however, as research out of IBM's Thomas J. Watson Research Center proposes a new way of constructing transistors that overcomes that limitation.
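For the curious, the "faster switching means more heat" problem falls out of the standard dynamic power relation for CMOS chips, P = C × V² × f: power (and thus heat) grows linearly with clock frequency. A quick sketch, using illustrative ballpark numbers of my own rather than anything from the research:

```python
# Rough illustration of why clock speeds stalled: dynamic CMOS power
# scales as P = C * V^2 * f (switched capacitance, supply voltage, frequency).
# The capacitance and voltage below are illustrative ballpark values only.
def dynamic_power_watts(capacitance_farads, voltage_volts, frequency_hz):
    return capacitance_farads * voltage_volts**2 * frequency_hz

C, V = 1e-8, 1.2  # ~10 nF of effective switched capacitance, 1.2 V supply
for ghz in (2, 3, 4, 5):
    print(f"{ghz} GHz -> {dynamic_power_watts(C, V, ghz * 1e9):.0f} W")
# Power grows linearly with frequency, and all that extra heat has to go somewhere.
```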

Carbon Nanotube Transistors

Current day processors, whether they be the monsters powering servers or the small ones ticking away in your smartwatch, are all constructed through a process called photolithography. In this process a silicon wafer is covered in a photosensitive chemical and then exposed to light through a mask. This is what imprints the CPU pattern onto the blank silicon substrate, creating all the circuitry of a CPU. This process is what allows us to pack billions upon billions of transistors into a space little bigger than your thumbnail. However it has its limitations, related to things like the wavelength of light used (shorter wavelengths, i.e. higher frequencies, are needed for smaller features) and the purity of the substrate. IBM's research takes a very different approach, instead using carbon nanotubes as the transistor material and creating features by aligning and placing them rather than etching them in.
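That wavelength limitation can be made concrete with the textbook Rayleigh criterion for lithography. A minimal sketch, where the k1 factor and numerical aperture are typical ballpark figures of mine, not numbers from the article:

```python
# Rayleigh criterion: smallest printable feature CD = k1 * wavelength / NA.
# Values are typical for 193 nm ArF immersion lithography, for illustration.
def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.25):
    return k1 * wavelength_nm / numerical_aperture

print(f"{min_feature_nm(193, 1.35):.1f} nm")
# ~35.7 nm: the feature size scales with wavelength, which is why
# smaller features demand shorter wavelengths of light.
```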

Essentially what IBM does is take a heap of carbon nanotubes, which in their native form are a large unordered mess, and align them on top of a silicon wafer. When the nanotubes are placed correctly they form a transistor. Additionally the researchers have devised a method of attaching electrical connectors onto these newly formed transistors in such a way that their electrical resistance is independent of their width. What this means is that the traditional limitation, where shrinking the connectors drove up resistance and with it heat, has been decoupled from size, allowing them to greatly reduce the size of the connectors and potentially allowing for a boost in CPU frequency.

The main issue such technology faces is that it is radically different from the way we manufacture CPUs today. There's a lot of investment in current lithography-based fabs and this method likely can't make use of that investment. So the challenge these researchers face is creating a scalable method with which they can produce chips based on this technology, hopefully in a way that can be adapted for use in current fabs. This is why you're not likely to see processors based on this technology for some time; probably not for another 5 years at least, according to the researchers.

What it does show though is that there is potential for Moore’s Law to continue for a long time into the future. It seems whenever we brush up against a fundamental limitation, one that has plagued us for decades, new research rears its head to show that it can be tackled. There’s every chance that carbon nanotubes won’t become the new transistor material of choice but insights like these are what will keep Moore’s Law trucking along.

The Real Winner of the Console Wars: AMD.

In the general computing game you'd be forgiven for thinking there are 2 rivals locked in an even contest for dominance. Sure, there are 2 major players, Intel and AMD, and whilst they are direct competitors there's no denying that Intel is the Goliath to AMD's David, trouncing them in almost every way possible. Of course if you're looking to build a budget PC you really can't go past AMD's processors, as they provide an incredible amount of value for the asking price, but there's no denying that Intel has been the reigning performance and market champion for the better part of a decade now. However the next generation of consoles has proved to be something of a coup for AMD and it could be the beginning of a new era for the beleaguered chip company.

Both of the next generation consoles, the PlayStation 4 and Xbox One, utilize an almost identical AMD Jaguar chip under the hood. The reasons for choosing it seem to align with Sony's previous architectural idea for Cell (i.e. having lots of cores working in parallel rather than fewer working faster) and AMD is the king of cramming more cores into a single consumer chip. The reasons for going with AMD over Intel likely stem from the fact that Intel isn't too crazy about doing custom hardware, so the requirements that Sony and Microsoft had for their own versions of Jaguar could simply not be accommodated. Considering how big the console market is this would seem like something of a misstep by Intel, especially judging by the PlayStation 4's day one sales figures.

If you hadn't heard, the PlayStation 4 managed to move an incredible 1 million consoles on its first day of launch, and that was limited to the USA. The Nintendo Wii by comparison took about a week to move 400,000 consoles, and it even had a global launch window to beef up the sales. Whether the trend will continue now that the Xbox One was released just yesterday is something we'll have to wait to see, but regardless every one of those consoles being purchased contains an AMD CPU, and AMD is walking away with a healthy chunk of change from each one.

To put it in perspective, out of every PlayStation 4 sale (and by extension every Xbox One as well) AMD is taking away a healthy $100, which means that in that one day of sales AMD generated some $100 million for itself. For a company whose quarterly revenue is around the $1.5 billion mark this is a huge deal, and if the Xbox One launch is even half that AMD could have seen $150 million in the space of a week. If the previous console generation is anything to go by (roughly 160 million consoles between Sony and Microsoft) AMD is looking at a revenue stream of some $16 billion over the next 8 years, or roughly $2 billion a year. Whilst that's still a far cry from the kinds of revenue that Intel sees on a monthly basis it's a huge win for AMD and something they will hopefully be able to use to leverage themselves more in other markets.
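Running the numbers quoted above (all of them rough estimates, mine as much as anyone's):

```python
# Back-of-envelope check of the figures in this post (all rough estimates).
per_console = 100                 # AMD's estimated take per console, USD
print(per_console * 1_000_000)    # day-one PS4 sales: $100,000,000

last_gen_consoles = 160_000_000   # rough combined Sony + Microsoft sales last generation
total = per_console * last_gen_consoles
print(f"${total:,} over ~8 years, ~${total // 8:,} a year")
# ~$16 billion total, or roughly $2 billion a year
```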

Whilst I may have handed in my AMD fanboy badge after many deliriously happy years with my watercooled XP1800+ I still think they’re a brilliant chip company and their inclusion in both next generation consoles shows that the industry giants think the same way. The console market might not be as big as the consumer desktop space nor as lucrative as the high end server market but getting their chips onto both sides of the war is a major coup for them. Hopefully this will give AMD the push they need to start muscling in on Intel’s turf again as whilst I love their chips I love robust competition between giants a lot more.


So Long Itanium, You Will Not Be Missed.

I've worked with a lot of different hardware in my life, from the old days of tinkering with my Intel 80286 through to esoteric Linux systems running on DEC tin, until I, like everyone else in the industry, settled on x86-64 as the de facto standard. Among the various platforms I was happy to avoid (including such lovely things as Sun SPARC) was Intel's Itanium range, as its architecture was so different from anything else that whatever you were trying to do, outside of building software specifically for that platform, was doomed to failure. The only time I ever came close to seeing it deployed was on the whim of a purchasing manager who needed guaranteed 100% uptime, until they realised the size of the cheque they'd need to sign to get it.

If Intel's original dream was to be believed then this post would be coming to you care of their Itanium processors. You see, back when it was first developed everything was still stuck in the world of 32bit and the path forward wasn't looking particularly bright. Itanium was meant to be the answer to this: with Intel's brand name and global presence behind it we would hopefully see all applications migrate to the latest and greatest 64bit platform. However the complete lack of backwards compatibility with currently developed software meant adopting it was a troublesome exercise, which was a death knell for any kind of consumer adoption. Seeing this, AMD swooped in with their backwards-compatible x86-64 architecture, which proceeded to spread to all the places that Itanium couldn't, forcing Intel to adopt the standard in their consumer line of hardware.

Itanium refused to die however, finding a home in the niche high end market due to its redundancy features and solid performance for optimized applications. However the number of vendors supporting the platform dwindled from their already low numbers, with HP eventually becoming the only real supplier of Itanium hardware in the form of their NonStop server line. It wasn't a bad racket for them to keep up though: the total Itanium market was something on the order of $4 billion a year, and with only 55,000 servers shipped per year you can see how much of a premium they attract. Still, all the IT workers of the world have long wondered when Itanium would finally bite the dust and it seems that day is about to come.

HP has just announced that it will be transitioning its NonStop server range from Itanium to x86, effectively putting an end to the only sales channel that Intel had for the platform. What will replace it is still up in the air but it's safe to assume it will be another Intel chip, likely one from their Xeon line, which shares many of the features that Itanium had without the incompatible architecture. Current Itanium hardware is likely to stick around for an almost indefinite amount of time however, due to the places it has managed to find itself in, much to the dismay of system administrators everywhere.

In terms of accomplishing its original vision Itanium was an unabashed failure, never finding the consumer adoption that it so desired and never becoming the herald of 64bit architecture. Commercially though it was something of a success thanks to the features that made it attractive to the high end market, but even then it was only a small fraction of total worldwide server sales, barely enough to make it a viable platform for anything but wholly custom solutions. The writing was on the wall when Microsoft said that Windows Server 2008 was the last version to support it, and now with HP bowing out the death clock for Itanium has begun ticking in earnest, even if the final death knell won't come for the better part of a decade.


Apple Eyeing ARM For Their Desktop Line.

The name of the game for all large technology companies is platform unification and domination, with each of them vying to become the platform that consumers choose. Microsoft has been on a long and winding road to accomplishing this since they first talked about it 3 years ago, and Apple has been flirting with the idea ever since it started developing its iOS line of products, with features like the App Store making their way back into OSX. Neither of them is really there yet, as Windows 8/WinRT is still nascent, requiring a lot more application development before the platform can be considered unified, and there is still a wide schism between iOS and OSX that Apple hasn't really tried to bridge.

Predominantly that's because Apple understands that they're two different markets and their current product strategy doesn't really support bridging them. The iOS space is pretty much a consumer playground: whilst you can do some of the things that Apple's previous OS was known for on there, it's far from being the creative platform that OSX was (and still is, to some extent). Indeed attempts to bridge their previous products with more consumer orientated versions have been met with heavy criticism from their long time fans, and their failure to provide meaningful product updates to their creative powerhouse, the Mac Pro, has also drawn the ire of many creative professionals.

If I'm honest I didn't really think that Apple would turn their backs on the creative niche that is arguably responsible for making them what they are today. It's understandable from the company's point of view to focus your attention on the most profitable sectors, much like games developers do with the whole console priority thing, but it feels like the time when Apple still considered itself a player in the enterprise space, only to quietly withdraw from it over the course of a couple of years. Whilst there isn't much evidence to support this, the latest rumours circulating that they may be considering a switch to ARM for their desktop line don't help to dispel the idea.

ARM, for the uninitiated, is a processor design company based out of Cambridge that's responsible for approximately 95% of all the processors that power today's smartphones. They are unquestionably the kings of the low power space, with many of their designs able to achieve incredible efficiencies, which is what enables your phone to run for hours instead of minutes. Whilst they may no longer be the designer of the chips that power Apple's current line of iOS products, their technology is still the basis for them. Suffice to say, if you've got any piece of mobile technology it's likely that there's some kind of ARM processor in there, and it's the reason why Microsoft chose it as their second platform for the WinRT framework.

Apple switching platforms is nothing new, as they made the switch to x86/Intel back in 2006. The reasons back then were partly that PowerPC, made by IBM, was not able to keep pace with the rapid improvements in performance that Intel was making, and partly the performance-per-watt of Intel's processors, which is arguably why AMD wasn't considered. Apple's direction has changed considerably since then and their focus is much more squarely aimed at portable experiences, which are far better served by the low power processors that ARM can deliver. For things like the MacBook and the Air lower power means longer battery life, probably the key metric by which these portable computers are judged.

There's no doubt that Apple will be able to make the transition, however I'm not sure that the cost to them, both in real and intangible terms, would be worth it. Forgetting all the technical challenges in getting all your third parties to re-architect their applications, the unfortunate fact is that ARM doesn't have a processor capable of performing at the same level as Intel's current line. This means that for creative applications like photo/video editing, graphic design and the like, the current software suites will simply not be viable on the ARM platform. Since the transition is a ways off it's possible that ARM might be able to design some kind of high power variant to satisfy this part of the market, but traditionally that's not their focus, and since the desktop sector is one of Apple's smallest revenue generators I can't see them wanting to bother.

This is not to say that this would be a bad move for Apple at large, however. Being able to have a consistent architecture across their entire line of products is something that no other company would be able to achieve, and it would be an absolute boon to those seeking a ubiquitous experience across all their devices. It would also be a developer's wet dream, as you could make cross-platform applications far more easily than you can with other platforms. Considering that Apple makes the majority of its money from ARM based platforms it doesn't come as much surprise that they might be considering a move to it, even if that's at the cost of the creative sector that brought them back from the graveyard all those years ago.

I don't usually comment on Apple rumours, mostly because they're just a repeat of the same thing over and over again, but this one caught my attention because, if it turns out to be true, it will mark Apple's final step away from its roots. Whilst the creative professionals may lament the disappearance of a platform they've been using for over 2 decades, the saving grace will be the fact that the Windows equivalents of all their programs are at feature parity. Apple will then continue down the consumer electronics road that it has been on for the past 10+ years, and where it will go from there is anyone's guess.

The New iPad: A Better Screen…and That’s About It.

Well another year has gone by since my last post on the iPad, so that must mean it's time for Apple to release another one. The tech media has been all abuzz about what Apple had in store for us today (like there was any doubt) ever since Apple sent out invites to the event that, as of writing, is still taking place. Speculation has been running rampant as to what will be included in the next version and what will be left by the wayside. Not wanting to disappoint their fans, Apple has announced the next version of the iPad (strangely bereft of any nomenclature denoting its version) and it's pretty much met expectations.

Usually I'd chuck a photo of the device up here for good measure, but the new iPad is basically identical to the last one as far as looks go, being only slightly thicker and heavier than its predecessor. Honestly there's little room for innovation in looks as far as tablets go, just look at any other tablet for comparison, so it's no surprise that Apple has decided to continue with the original design. This might come as a disappointment to Apple fans out there, but there's at least one defining feature that will visually set the new iPad apart from its predecessors.

That feature is the screen.

If you cast your mind back a year (or just read the first linked post) you'll notice that rumours of a retina level screen for the iPad have been circulating for quite some time. At the time many commented that such a resolution would be quite ludicrous, like near the resolution of Apple's 30″ cinema displays kind of ludicrous. Sure enough the now current generation of iPad sports a 2048 by 1536 resolution display which gives it a PPI of 264, double that of the iPad 2. Whilst everyone is calling this a "retina" level display it's actually short of that mark, as the screen in the iPhone 4S sports 326 PPI, about 23% more pixels per inch. The display will still look quite incredible, hell even monitors with lower resolutions and ten times the area manage to look great, but calling it a retina display is at best disingenuous.
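For the curious, those density figures are easy to check yourself; a quick sketch assuming the iPad's usual 9.7-inch diagonal:

```python
# Pixel density: diagonal pixel count divided by the screen's diagonal size.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

ipad = ppi(2048, 1536, 9.7)
print(f"{ipad:.0f} PPI")      # ~264 PPI for the new iPad
print(f"{326 / ipad:.2f}x")   # the iPhone 4S's 326 PPI is ~1.24x denser
```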

Of course to power that monster of a screen Apple has had to upgrade the processor. The new chip is dubbed the A5X and sports a dual core CPU and a quad core graphics chip. As always Apple is keeping the gritty details a closely guarded secret, but it's safe to assume that it sports a faster clock rate and more RAM than its predecessor. I wouldn't be surprised if it was something along the lines of 1.2GHz with 1024MB of RAM, as that would put it on par with many other devices currently on the market. We'll have to wait for the teardowns to know for sure though.

Apart from that there's little more that's changed with the new iPad. The camera is slightly better, being able to take 5MP stills and film 1080p video. Whilst you won't find Siri on it yet, you will now be given the option of doing speech-to-text on the iPad. That's pretty much it for what's new, and whilst I wouldn't think that'd be a compelling reason to upgrade from the 2, I'm sure there will be many who do exactly that.

I'll be honest with you, I've been eyeing off an iPad for quite some time now. I'd had my eye on many an Android tablet for a while, but the fact remains that the iPad has the greatest tablet ecosystem, and for the use cases I have in mind (read: mostly gaming) there's really no competition. The new iPad then, whilst not worth the upgrade in my opinion, has reached a feature level where it represents good value for those looking to enter the tablet market. If you're just looking for a general tablet however there are many other options which would provide far more value, bar the insanely high resolution screen.

Apple's yearly release schedule seems to be doing wonders for them and the new iPad will not likely be an exception to that. Past the screen and new processor there's really nothing new about the now current generation iPad, but I can see many people justifying their purchase based on those two things alone. The really interesting thing to watch from now on will be how Apple goes about developing their ecosystem, as whilst the iPad can boast the best tablet experience, Google's not too far behind, just waiting for a chance at the crown.

Samsung Galaxy S3 Specs Revealed, Nerd Chills Had.

7 months down the line and I'm still a big fan of my Samsung Galaxy S2. It's been a great phone, combining a large screen with a slim, lightweight shell that I sometimes have to check for to remind myself that it's still in my pocket. It's surprisingly resilient as well, having taken more than a couple of drops from pretty decent heights and come out the other end with only minor scuffs and nary a scratch on the screen. Sadly I can't say as much for the battery life, as it seems the more apps I pile on there the worse it gets, but I can't really blame the phone for my app hoarding ways.

However I always knew that this relationship would be temporary, I mean how could it not be? It started with geek wanderlust, and as with all relationships that start like that it was inevitable that my eyes would begin to wander, and so they have with this announcement:

…Ladies and gentlemen, here is the Samsung Galaxy S III:

  • 1.5GHz quad-core Samsung Exynos processor
  • 4.8-inch “full HD” 1080p resolution with 16:9 aspect ratio display
  • A 2-megapixel front-facing camera and an 8-megapixel rear camera
  • Ceramic case
  • 4G LTE
  • Android 4.0

I'll spare you the photoshopped Galaxy S2 images that are doing the rounds but suffice to say those specs are pretty darn amazing. They're also fairly plausible given Samsung's research into the component technologies and current trends for both carriers and the Android platform. The detail that caught my eye however was the ceramic case, as that's not a material you'd usually expect to see on a mobile phone, plastic and glass being the only 2 real choices. There could be reasoning behind it though, and if my suspicions are correct it's due to the crazy amount of tech they've stuffed under the hood.

Traditionally ceramics are pretty poor heat conductors, which is why they make for good mugs and insulation materials. However there are quite a few advanced ceramics that are capable of moving heat just as efficiently as most metals, some even better. Now anyone who has a dual core smart phone knows how hot the buggers get when you're using them for an extended period, and since most phones are plastic that heat tends to stick around rather than dissipate. The ceramic case could then be an attempt to mitigate the heat problems that will come with the quad core processor and larger screen. This also has the potential to make the phone somewhat more brittle however (ceramics don't flex, they shatter) so it will be interesting to see how Samsung compensates for that.
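To give a sense of scale, here are ballpark thermal conductivities from standard materials tables (my figures, nothing to do with whatever Samsung may actually use):

```python
# Ballpark thermal conductivities in W/(m*K), from standard materials tables.
# Advanced ceramics like aluminium nitride rival metals; everyday ceramics
# and plastics don't come close.
conductivity = {
    "ABS plastic (typical phone shell)": 0.2,
    "porcelain (your coffee mug)": 1.5,
    "alumina ceramic": 30,
    "aluminium nitride ceramic": 180,
    "aluminium": 237,
    "copper": 400,
}
for material, k in sorted(conductivity.items(), key=lambda kv: kv[1]):
    print(f"{material:36s} {k:6.1f}")
```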

With just those few details though I’m already excited for Samsung’s next instalment in their flagship line of smart phones. Their last 2 iterations of the Galaxy S line have gone from strength to strength, firmly cementing themselves as the number one Android handset manufacturer. The Galaxy S3 looks to continue this trend with specifications that are sure to tempt even the most recent purchasers of the S2. I know I’ll find it hard to resist and I’m thankful that it probably won’t be out for a little while longer.

I don’t think my wallet would appreciate buying 2 phones within 7 months of each other 😉