Posts Tagged 'ibm'

Carbon Nanotubes Break Barriers for Moore’s Law.

In the last decade there's been a move away from raw CPU speed as an indicator of performance. Back when single cores were the norm it was an easy way to judge, in a general sense, which CPU would be faster than another; the switch to multiple cores threw this into question. Partly this comes down to architecture decisions and software's ability to make use of multiple cores, but it also came hand in hand with a stall in CPU clock speeds. This is mostly a limitation of current technology, as faster switching means more heat, something most processors can't handle much more of. That could be set to change, however, as research out of IBM's Thomas J. Watson Research Center proposes a new way of constructing transistors that overcomes that limitation.

Carbon Nanotube Transistors

Current day processors, whether they're the monsters powering servers or the small ones ticking away in your smartwatch, are all constructed through a process called photolithography. In this process a silicon wafer is covered in a photosensitive chemical and then exposed to light through a mask; this is what imprints the CPU pattern onto the blank silicon substrate, creating all the circuitry of a CPU. It's a process that allows us to pack billions upon billions of transistors into a space little bigger than your thumbnail. However it has its limitations, related to things like the wavelength of light used (shorter wavelengths are needed for smaller features) and the purity of the substrate. IBM's research takes a very different approach, instead using carbon nanotubes as the transistor material and creating features by aligning and placing them rather than etching them in.

Essentially what IBM does is take a heap of carbon nanotubes, which in their native form are a large unordered mess, and align them on top of a silicon wafer. When the nanotubes are placed correctly, like they are in the picture shown above, they form a transistor. Additionally the researchers have devised a method of attaching electrical contacts to these newly formed transistors in such a way that their electrical resistance is independent of their width. What this means is that the traditional limitation of increasing heat with increased frequency is decoupled, allowing them to greatly reduce the size of the contacts and potentially enabling a boost in CPU frequency.

The main issue such technology faces is that it's radically different from the way we manufacture CPUs today. There's a lot of investment in current lithography-based fabs and this method likely can't make use of that investment. So the challenge these researchers face is creating a scalable method with which they can produce chips based on this technology, hopefully in a way that can be adapted for use in current fabs. This is why you're not likely to see processors based on this technology for some time; probably not for another 5 years at least, according to the researchers.

What it does show though is that there is potential for Moore’s Law to continue for a long time into the future. It seems whenever we brush up against a fundamental limitation, one that has plagued us for decades, new research rears its head to show that it can be tackled. There’s every chance that carbon nanotubes won’t become the new transistor material of choice but insights like these are what will keep Moore’s Law trucking along.

An Artificial Brain in Your Pocket.

Artificial neural networks, a computational framework that mimics biological learning processes using statistics and large data sets, are behind many of the technological marvels of today. Google is famous for employing some of the largest neural networks in the world, powering everything from their search recommendations to their machine translation engine. They're also behind numerous other innovations like predictive text inputs, voice recognition software and recommendation engines that use your previous preferences to suggest new things. However these networks aren't exactly portable, often requiring vast data centers to produce the kinds of outputs we expect. IBM is set to change that with their TrueNorth architecture, a truly revolutionary idea in computing.

The DARPA SyNAPSE 16-chip board

The chip, 16 of which are shown above mounted on a DARPA SyNAPSE board, is most easily thought of as a massively parallel processor comprising some 4,096 cores. Each of these cores contains 256 programmable neurons, totalling around 1 million per chip. Interestingly, whilst the chip's transistor count is on the order of 5.4 billion, which for comparison is just over double that of Intel's current offerings, it uses a fraction of the power you'd expect it to: a mere 70 milliwatts. That kind of power consumption means that chips like these could make their way into portable devices, something no one would really expect with a transistor count that high.

But why, I hear you asking, would you want a computerized brain in your pocket?

IBM's TrueNorth chip is essentially the second half of the two part system that is a neural network. The first step to creating a functioning neural network is training it on a large dataset; the larger the set, the better the network's capabilities. This is why large companies like Google and Apple can create useable products out of them: they have huge troves of data with which to train them. Then, once the network is trained, you can set it loose upon new data and have it give you insights and predictions, and that's where a chip like TrueNorth comes in. Essentially you'd use a big network to form the model and then imprint it on a TrueNorth chip, making it portable.
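To make that split concrete, here's a minimal sketch in plain Python/NumPy (nothing TrueNorth-specific, and all the file and variable names are my own illustration): a tiny network is trained on the "data centre" side, its weights are frozen to disk, and the "device" side only loads those frozen weights and runs inference.

```python
import numpy as np

# --- "Data centre" side: train a tiny model on a toy dataset ---
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))             # 1000 samples, 4 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # a simple learnable rule

w, b = np.zeros(4), 0.0
for _ in range(500):                        # plain logistic regression via gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y) / len(y))
    b -= 0.1 * np.mean(p - y)

np.savez("trained_model.npz", w=w, b=b)     # freeze the trained weights

# --- "Device" side: load the frozen model and only run inference ---
frozen = np.load("trained_model.npz")

def predict(sample):
    """Inference only: no training data and no further updates needed."""
    score = sample @ frozen["w"] + frozen["b"]
    return 1.0 / (1.0 + np.exp(-score)) > 0.5

print(predict(np.array([1.5, 0.5, 0.0, 0.0])))  # -> True
```

The heavy lifting (and the big dataset) stays in the first half; the second half is small, fixed and cheap to run, which is the part a chip like TrueNorth would carry around in your pocket.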

The implications of this probably wouldn't be immediately apparent for most, as the services would likely retain the same functionality, but it would eliminate the requirement for an always-on Internet connection to support them. This could open up a new class of smart devices with capabilities that far surpass anything we currently have, like a pocket translator that works in real time. The biggest issue I see to its adoption though is cost, as a transistor count that high doesn't come cheap: you're either relying on cutting edge lithography or accepting significantly reduced wafer yields. Both of these lead to high priced chips, likely even more expensive than current consumer CPUs.

Like all good technology, however, this one is a little way off from finding its way into our hands: whilst the chip exists, the software stack required to use it is still under active development. That might sound like a small thing, but this chip behaves in a way that's completely different to anything that's come before it. Once that's been settled the floodgates can be opened to the wider world and then, I'm sure, we'll see a rapid pace of innovation that could spur on some wonderful technological marvels.

Chef Watson: Big Data Might Finally be Usable.

The promise of Big Data has been courting many a CIO for years now, the allure being that all the data they have on everything can be fed into some giant engine that will then spit out insights for them. However, like all things, the promise and the reality are vastly different beasts, and whilst there are examples of Big Data providing never before seen insights it hasn't really revolutionized industries in the way other technologies have. A big part of that is that Big Data tools aren't push button solutions, requiring a deep understanding of data science in order to garner the insights you seek. IBM's Watson however is a much more general purpose engine, one that I believe could potentially deliver on the promises that its other Big Data compatriots have made.


The problem I see with most Big Data solutions is that they're not generalizable, i.e. a solution that's developed for a specific data set (say a logistics company wanting to know how long it takes a package to get from one place to another) will likely not be applicable anywhere else. This means that whilst you have the infrastructure and capability to generate insights, the investment required to attain them needs to be reapplied every time you want to look at the data in a different way, or whenever you have other data that requires similar insights to be derived from it. Watson on the other hand falls more into the category of a general purpose data engine, one that can ingest all sorts of data and provide meaningful insights, even for things you wouldn't expect, like helping to author a cookbook.

The story behind how that came about is particularly interesting as it shows what I feel is the power of Big Data without needing a data science degree to exploit it. Essentially Watson was fed over 9000 (ha!) recipes from Bon Appétit's database, which was then supplemented with the knowledge it has around flavour profiles. It then used all this information to derive new combinations that you wouldn't typically think of, which were then handed back to the chefs to prepare. Compared to traditional recipes the ingredient lists that Watson provided were much longer and more involved, however the results (which should be mostly attributed to the chefs preparing them) were well received, showing that Watson did provide insight that would otherwise have been missed.

That'd just be an impressive demonstration of data science if it wasn't for the fact that Watson is now being used to provide similar levels of insight across a vast number of industries, from medicine to online shopping to even matching remote workers with employers seeking their skills. Whilst it's far short of what most people would class as a general AI (it's more akin to a highly flexible expert system over the data it's provided), Watson has shown that it can be fed a wide variety of data sets and can then be queried in a relatively straightforward way. It's that last part that I believe is the secret sauce to making Big Data usable, and it could be the next big thing for IBM.

Whether or not they can capitalize on that though is what will determine if Watson becomes the one Big Data platform to rule them all or simply an interesting footnote in the history of expert systems. Watson has already proven its capabilities numerous times over, so fundamentally it's ready to go; the responsibility now resides with IBM to make sure it gets into the right hands to be developed further. Watson's presence is growing slowly but I'm sure a killer app isn't too far off.

IBM’s Watson has an API, and It’s Answering Questions.

In a world where Siri can book you a restaurant and Google Now can tell you when you should head for the gate at the airport, it can feel like the AI future that many sci-fi fantasies envisioned is already here. Indeed to some extent it is, many aspects of our lives are now farmed out to clouds of servers that make decisions for us, but those machines still lack a fundamental understanding of, well, anything. They're what are called expert systems: algorithms trained on data to make decisions in a narrow problem space. The AI future that we're heading towards is going to be far more than that, one where those systems actually understand data and can make far better decisions based on that understanding. One of the first steps towards this is IBM's Watson, and its creators have done something amazing with it.

IBM Watson

Whilst it's currently only open to partner developers, IBM has created an API for Watson, allowing you to pose it a question and receive an answer. There's not a lot of information around what data sets it currently understands (the example is in the form of a Jeopardy! question) but their solution documents reference a Watson Content Store which, presumably, has several pre-canned training sets to get companies started with developing solutions. Indeed some of the applications that IBM's partner agencies have already developed suggest that Watson is quite capable of digesting large swaths of information and providing valuable insights in a relatively short timeframe.

I'm sure many of my IT savvy readers are seeing the parallels between Watson and a lot of the marketing material that surrounds anything with the buzzword "Big Data". Indeed many of the concepts of operation are similar: take big chunks of data, throw them into a system and then hope that something comes out the other end. However Watson's API suggests something that's far more accessible, dealing in natural human language and providing evidence to back up the answers it gives you. Compare this to Big Data tools, which often require you to either learn a certain query language or create convoluted reports, and I think Watson has the ability to find widespread use while Big Data keeps its buzzword status.
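To illustrate why that accessibility matters, here's what a natural language Q&A call could look like. The endpoint, payload fields and response shape below are hypothetical, since the partner-only API isn't something I've had access to, but the principle is the one described above: plain English in, answers with supporting evidence out, no query language required.

```python
import requests

# Hypothetical endpoint and payload; the real partner API may well differ.
WATSON_URL = "https://example.ibm.com/watson/v1/question"

payload = {
    "question": "What diseases present with fever, joint pain and a rash?",
    "evidence": True,   # ask for supporting passages, not just an answer
    "items": 3,         # top three candidate answers
}

resp = requests.post(WATSON_URL, json=payload, timeout=30)
resp.raise_for_status()

for answer in resp.json().get("answers", []):
    print(answer["text"], answer["confidence"])
    for passage in answer.get("evidence", []):
        print("  source:", passage["title"])
```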

For me the big applications for something like this are in places where curating domain specific knowledge is a long, time consuming task. Medicine and law both spring to mind as there are reams of information available to power a Watson based system, and those fields could most certainly benefit from having easier access to those vast treasure troves. It's pretty easy to imagine a lawyer looking for all the precedents set against a certain law, or a doctor asking for all diseases matching a list of symptoms, both queries answered with all the evidence to boot.

Of course it remains to be seen if Watson is up to the task, as whilst its prowess on Jeopardy! was nothing short of amazing I've yet to see any of its other applications in use. The partner applications do look very interesting, and should hopefully be the proving grounds that Watson needs, but until it starts seeing widespread use all we really have to go on is the result of a single API call. Still I think it has great potential, and hopefully it won't be too long before the wider public can get access to some of Watson's computing genius.

IBM Isn’t the Solution to Your Enterprise Woes, Apple.

There's no question that Apple was the primary force behind the Bring Your Own Device (BYOD) movement. It didn't take long for every executive to find themselves with an iPad in their hands, wondering why they had to use their god damn BlackBerry when the email experience on their new tablet was so much better. Unfortunately, as is the case with most Apple products, the enterprise integration was severely lacking and the experience suffered as a result. Today the experience is much better, although that's mostly the result of third party vendors developing solutions rather than Apple developing the capability themselves. It seems that after decades of neglecting the enterprise Apple is finally ready to make a proper attempt at it, although in the most ass backwards way possible.

Tim Cook and Virginia Rometty

Today Apple announced that it would be partnering with IBM in order to grow their mobility offerings starting with a focus on applications, cloud services and device supply and support. IBM is going to start off by developing 100 “industry specific” enterprise solutions, essentially native applications for the iPhone and iPad that are tailored for specific business needs. They’ll also be growing their cloud offering with services that are optimized for iOS with a focus on all the buzzwords that surround the BYOD movement (security, management, analytics and integration). You’ll also be able to source iOS devices from IBM with warranty backing by Cupertino, enabling IBM to really be your one stop shop for all things Apple related in the enterprise.

At a high level this would sound like an amazing thing for anyone who’s looking to integrate Apple products into their environment. You could engage IBM’s large professional services team to do much of the leg work for you, freeing you from worrying about the numerous issues that come from enabling a BYOD environment. The tailored applications would also seem to solve a big pain point for a lot of users as the only option most enterprises have available to them today is to build their own, a significantly costly endeavour. Plus if you’re already buying IBM equipment their supply chain will already be well known to you and your financiers, lowering the barrier to entry significantly.

Really it does sound amazing, except for the fact that this partnership is about 5 years late.

Ever since everyone wanted their work email on an iPhone there have been vendors working on solutions to integrate non-standard hardware into the enterprise environment. The initial solutions were, frankly, more trouble than they were worth, but today there's a myriad of applications available for pretty much every use case you can think of. Indeed pretty much every single thing that this partnership hopes to achieve is already possible today, not at some undetermined time in the future.

That's not to mention that IBM is also the last name you'd think of when it comes to cloud services, especially when you consider how much business they've lost as of late. The acquisition of SoftLayer won't help them much in this regard as they're building up an entirely new capability from scratch which, by definition, means their offering will be behind everything else that's currently available. They might have the supply chains and capital to be able to ramp up to public cloud levels of scalability, but they're doing it several years after everyone else, in a problem space that is pretty much completely solved.

The only place I can see this partnership paying dividends is in organisations which have yet to adopt any kind of BYOD or mobility solution which, honestly, are few and far between these days. This isn't an emerging market that IBM is getting in on at the ground floor; it's a half decade old issue that's had solutions from numerous vendors for some time now. Any large organisation, which has been IBM's bread and butter since time immemorial, will already have solutions in place for this. Transitioning them away from those is going to be costly and I doubt IBM will be able to provide the requisite savings to make it attractive. Smaller organisations likely don't need the level of management that IBM is looking to provide and probably don't have a working relationship with Big Blue anyway.

Honestly I can't see this working out at all for IBM and it does nothing to improve Apple's presence in the enterprise space. The problem space is already well defined, with solid solutions available from multiple vendors, many of which already have numerous years of use in the field. The old adage of never getting fired for buying IBM has long been irrelevant and this latest foray into a field where their experience is questionable will do nothing to bring it back. If they do manage to make anything of this I will be really surprised, as entering a market this late in the piece rarely works out well, even if you have mountains of capital to throw at it.

The Cloud Wars Are About to Begin.

With virtualization now being as pervasive an idea in the datacentre as storage area networks or underfloor cooling, the way has been paved for the cloud to make its way there as well, and has been for quite some time now. There are now many commercial off the shelf solutions that allow you to incrementally implement the multiple levels of the cloud (IaaS -> PaaS -> SaaS) without the need for a large operational expenditure in developing the software stack at each level. The differentiation now comes from things like added services, geographical location and pricing, although even that is already turning into a race to the bottom.

The big iron vendors (Dell, HP, IBM) have noticed this and, whilst they could still sustain their current business quite well by providing the required tin to the cloud providers (the compute power is shifted, not necessarily reduced), they're all starting to look at creating their own cloud solutions so they can continue to grow their business. I covered HP's cloud solution last week after the HP Cloud Tech Day, but recently there's been a lot of news coming out regarding the other big players, both from the old big iron world and the more recently established cloud providers.

First cab off the rank was Dell, who are apparently gearing up to make a cloud play. Now if I'm honest that article, whilst it contains a whole lot of factual information, felt a little speculative to me, mostly because Dell hasn't tried to sell me on the cloud idea when I've been talking to them recently. Still, after doing a small bit of research I found that not only are Dell planning to build a global network of datacentres (where global usually means everywhere but Australia), they also announced plans to build one in Australia just on a year ago. Combining this with their recent acquisition spree, which included companies like Wyse, it seems highly likely that this will be the backbone of their cloud offering. What that offering will be is still up for speculation, but it wouldn't surprise me if it was yet another OpenStack solution.

Mostly because Rackspace, probably the second biggest general cloud provider behind Amazon Web Services, just announced that their cloud will be compatible with the OpenStack API. This comes hot on the heels of another announcement that both IBM and Red Hat would become contributors to the OpenStack initiative, although there's no word yet on whether they have a view to implementing the technology in the future. Considering that both HP and Dell are already showing their hands with their upcoming cloud strategies, it would seem that becoming OpenStack contributors will be the first step towards some form of IBM cloud. They'd be silly not to, given their share of the current server market.

Taking all of this into consideration it seems that we're approaching a point of convergence in the cloud computing industry. I wrote early last year that one of the biggest drawbacks of the cloud was its proprietary nature, and it seems the big iron providers noticed that this was a concern. The reduction in vendor lock-in lowers the barriers to entry for many customers significantly and provides a whole host of other benefits, like being able to take advantage of disparate cloud providers for service redundancy. As I said earlier, the differentiation between providers will then predominantly come from value-add services, much like it did for virtualization in the past.
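As a rough sketch of that portability, consider the following example using the openstacksdk client (the provider names, endpoints and credentials are all placeholders): the very same code lists servers on two different OpenStack-compatible clouds, which is exactly the kind of thing vendor lock-in used to prevent.

```python
import openstack

# Two OpenStack-compatible providers; only the endpoints/credentials differ.
# All values below are placeholders for illustration.
clouds = [
    dict(auth_url="https://identity.provider-a.example/v3",
         project_name="demo", username="me", password="secret",
         region_name="region-a", user_domain_name="Default",
         project_domain_name="Default"),
    dict(auth_url="https://identity.provider-b.example/v3",
         project_name="demo", username="me", password="secret",
         region_name="region-b", user_domain_name="Default",
         project_domain_name="Default"),
]

for creds in clouds:
    conn = openstack.connect(**creds)      # same client, different provider
    for server in conn.compute.servers():  # same API call against both clouds
        print(creds["region_name"], server.name, server.status)
```

The value-add services will differ between providers, but the plumbing stays the same, which is what makes redundancy across providers practical.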

This is the beginning of the cloud war, where all the big players throw their hats into the ring and duke it out for our business. It's a great thing for both businesses and consumers as the quality of products will increase rapidly and prices will continue on a downhill trend. It's quite an exciting time, one akin to the virtualization revolution that started almost a decade ago. As always I'll be following these developments keenly, as the next couple of years will be something of a proving ground for all cloud providers.

Deep Blue, Watson and The Evolution of AI.

I'm not sure why but I get a little thrill every time I see something completely automated that used to require manual intervention from start to finish. It's probably because the more automated something is the more time I have to do other things, and there's always that little thrill in watching something you built trundle along its way, even if it falls over part way through. My most recent experiment in this area was crafting a rudimentary trainer for Super Meat Boy to get me past a nigh on impossible part of the puzzle, co-ordinating the required key strokes with millisecond precision and ultimately wresting me free of the death grip that game held on me.
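For the curious, a trainer like that can be as simple as replaying a timed script of key presses. Below is a minimal sketch using the pynput library; the keys and timings are placeholders rather than the actual Super Meat Boy sequence, but the idea of scripted, sub-second input is the same.

```python
import time
from pynput.keyboard import Controller, Key

keyboard = Controller()

# Each entry: (key, seconds to hold it down, pause before pressing it).
# These values are placeholders, not the real sequence from the game.
SEQUENCE = [
    (Key.right, 0.850, 0.000),
    (Key.space, 0.120, 0.300),
    (Key.right, 0.400, 0.050),
]

time.sleep(3)  # a few seconds to switch focus to the game window
for key, hold, pause in SEQUENCE:
    time.sleep(pause)      # wait for the right moment
    keyboard.press(key)
    time.sleep(hold)       # hold the key for the required duration
    keyboard.release(key)
```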

The world of AI is an extension of the automation idea, using machines to perform tasks that we would otherwise have to do ourselves. The concept has always fascinated me as, more and more, we're seeing various forms of AI creeping their way into our everyday lives. However most people won't recognize them as AI simply because they're routine, yet in reality many of the functions these weak AIs perform used to be in the realms of science fiction. We're still a long way from having a strong AI like we're used to seeing in the movies, but that doesn't mean many facets of it aren't already in widespread use today. Most people wouldn't think twice when a computer asks them to speak their address, but go back only a few decades and that would have been classed as the realm of strong AI, not the expert system it has evolved into today.

What's even more interesting is when we create machines that are more capable than ourselves at performing certain tasks. The most notable example (thus far) of a computer being able to beat a human at a non-trivial task is Deep Blue, the chess playing computer that managed to beat the world chess champion Kasparov, albeit under dubious circumstances. Still, the chess board is a limited problem set, and whilst Deep Blue was a supercomputer in its time, today you'd find as much power hidden under the hood of your PlayStation 3. IBM's research labs have been no slouch in developing Deep Blue's successor, and it's quite an impressive beast.

Watson, as it has come to be known, is the next step in the evolution of AIs performing tasks that were previously the sole domain of humans. The game of choice this time around is Jeopardy!, a game show whose answers are in the form of a question and which makes extensive use of puns and colloquialisms. Jeopardy! represents a unique challenge to AI developers as it involves complex natural language processing, searching immense data sets and creating relationships between disparate sources of information to finally culminate in an answer. Watson can currently determine whether or not it can answer a question within a couple of seconds, but that's thanks to the giant supercomputer that's backing it up. The demonstration round showed Watson was quite capable of playing with the Jeopardy! champions, winning the round with a considerable lead.
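That "can I answer this?" step is worth spelling out, as it's what separates Watson from a plain search engine. The sketch below is not IBM's actual algorithm, just a toy illustration of the principle: candidate answers are scored by several evidence sources, the scores are combined, and the system only buzzes in if the combined confidence clears a threshold.

```python
# Illustrative only: a toy version of confidence-gated answering.
# Each scorer stands in for a different evidence source (keyword match,
# answer type checking, passage support, ...); the weights are made up.

CANDIDATES = {
    "Toronto": {"keyword": 0.4, "answer_type": 0.2, "passage": 0.3},
    "Chicago": {"keyword": 0.7, "answer_type": 0.9, "passage": 0.8},
}
WEIGHTS = {"keyword": 0.3, "answer_type": 0.3, "passage": 0.4}
BUZZ_THRESHOLD = 0.65

def combined_confidence(scores):
    """Weighted sum of the individual evidence scores."""
    return sum(WEIGHTS[name] * value for name, value in scores.items())

best, confidence = max(
    ((cand, combined_confidence(scores)) for cand, scores in CANDIDATES.items()),
    key=lambda pair: pair[1],
)

if confidence >= BUZZ_THRESHOLD:
    print(f"Buzz: {best} (confidence {confidence:.2f})")
else:
    print("Stay silent: not confident enough to risk an answer")
```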

What really interested me in this though was the reaction from other people when I mentioned Watson to them. It seemed that a computer playing Jeopardy! (and beating the human players) wasn't really a big surprise at all, in fact it was expected. This says a lot about how we humans view computers, as most people expect them to be able to accomplish anything, despite the limitations that are obvious to us geeks. I'd say this has to do with the ubiquity of computers in our everyday lives and how much we use them to perform rudimentary tasks. The idea that a computer is capable of beating a human at anything isn't a large stretch of the imagination if you treat them as mysterious black boxes, but it still honestly surprised me to learn this is how many people think.

Last night saw Watson play its first real game against the Jeopardy! champions and whilst it didn't repeat its performance of the demonstration round it did tie for first place. The second round is scheduled to air sometime tomorrow (Australia time) and whilst I've not yet had a chance to watch the entire round I can't tell you how excited I am to see the outcome. Either way the realm of AI has taken another step forward towards the ultimate goal of creating intelligence born not out of flesh but silicon, and whilst some might dread the prospect I for one can't wait and will follow all developments with bated breath.

The Changing Face of the Modern Geek.

Just as the IT industry continues to reinvent itself every 10 years, so too, it appears, do the people in that industry. Whilst the term IT is relatively new compared to many other trades it has still managed to acquire a stereotype. What's interesting however is how the image of the typical IT geek has progressed over the past few decades, from a lab worker to something completely and utterly different.

The IBM 7030 Stretch

Image courtesy of the Computer History Museum.

In the early days of large computational clusters many technicians would look like this: well dressed and with an almost business-like demeanour. It was part of the culture back then, as many of these systems were built for large universities or corporations, and with big dollars being shelled out for them (this was the IBM 7030 Stretch, which would cost around $100 million in today's dollars) it was kind of expected. I think that's why the next generation of geeks set the trend for the couple of decades that followed.

A young Bill Gates

Image courtesy of Microsoft.

A young Bill Gates shows what would become the typical image conjured up in everyone's heads when the word geek or nerd was uttered, for a long time to come. The young, tall and skinny people who immersed themselves in computers were the faces of our IT community for a long time, and I think this is when those thick rimmed glasses became synonymous with our kind. It was probably around this time that geeks became associated with a tilt towards social awkwardness, something that many people still joke about today. What's really interesting though are the next few steps I've seen in the changing geek image.

Jerry Yang and David Filo

Image courtesy of JustTheLists.

Jerry Yang and David Filo were the first of a generation of what most people call Internet pioneers. Whilst I can't find a direct link to it, Yahoo had a bit of a reputation for a very casual work environment, with t-shirts and sandals the norm. It probably came from their success in moving straight out of university and into the corporate world, where they grew their own business culture. This kind of thing flowed on to many of the other successful Internet companies like Google, who lavishes their employees with almost everything they will ever need.

Tom Anderson

Image Courtesy of Robert Scoble.

Tom Anderson, one of the co-founders of MySpace, is not what you'd call your typical geek, holding a degree in Arts and a masters in Film. You'd struggle to find him even associated with such titles, yet he's behind one of the largest technical companies on the Internet. Truly the face of the modern geek aspires to something more like Tom Anderson than it does to a young Bill Gates.

I found this interesting because of the company that I keep. We all love computer games and the latest bits of tech, but you'd be hard pressed to find among us anyone you could really call your stereotypical geek. I think this is indicative of the maturity that the IT industry has acquired. The term IT Professional no longer conjures up the idea of a basement dwelling console hacker with thick glasses; rather it gives the impression you'd expect of a professional in any industry, something which carries with it a decent chunk of respect.

I guess the next step is when we start seeing Joe the IT Professional used in political campaigns.