Posts Tagged ‘computer’

HP’s “The Machine” Killed, Surprising No One.

Back in the day it didn’t take much for me to get excited about a new technology. The rapid progressions we saw from the late 90s through to the early 2010s had us all fervently awaiting the next big thing as it seemed nearly anything was within our grasp. The combination of getting older and being disappointed a certain number of times hardened me against this optimism and now I routinely attempt to avoid the hype for anything I don’t feel is a sure bet. Indeed I said much the same about HP’s The Machine last year and it seems my skepticism has paid dividends although I can’t say I feel that great about it.


For the uninitiated, HP's The Machine was going to be the next revolutionary step in computing. Whilst the mockups would be familiar to anyone who's seen the inside of a standard server the components were going to be anything but, incorporating such wild technologies as memristors and optical interconnects. What put it above many other pie in the sky concepts (among which I include things like D-Wave's quantum computers, as the jury is still out on whether or not they provide a quantum speedup) is that it was based on real progress HP had made in many of those spaces in recent years. Even that wasn't enough to break through my cynicism, however.

And today I found out I was right, god damnit.

The reasons cited were ones I was pretty sure would come to fruition, namely the fact that no one has been able to commercialize memristors at scale in any meaningful way. Since The Machine was supposed to be almost solely based on that technology it should be no surprise that it's been canned on the back of that. So instead of being the moonshot style project that HP announced last year it's now going to be some form of technology demonstrator platform, ostensibly to draw software developers across to this new architecture in order to get them to build on it.

Unfortunately this will likely end up being not much more than a giant server with a silly amount of RAM stuffed into it, 320TB to be precise. Whilst this may attract some people to the platform out of curiosity I can't imagine that anyone would be willing to shell out the requisite cash on the hopes that they'd be able to use a production version of The Machine sometime down the line. It would be like the Sony Cell processor all over again, except instead of costing you maybe a couple of thousand dollars to experiment with it you'd be in the tens of thousands, maybe hundreds of thousands, just to get your hands on some experimental architecture. HP might attempt to subsidise that but considering the already downgraded vision I can't fathom them throwing even more money at it.

HP could very well turn around in 5 or 10 years with a working prototype to make me look stupid and, honestly, if they did I would very much welcome it. Whilst predictions about Moore's Law ending happen at an inverse rate to them coming true (read: not at all) it doesn't mean there aren't a few ceilings on the horizon that will need to be addressed if we want to continue this rapid pace of innovation. HP's The Machine was one of the few ideas that could've pushed us ahead of the curve significantly and its demise is, whilst completely expected, still a heart-wrenching outcome.

A Distant Ancestor of the Programmable Computer.

Ask any computer science graduate about the first programmable computer and the answer you'll likely receive is the Difference Engine, a conceptual design by Charles Babbage. Whilst the design wasn't entirely new (that honour goes to J. H. Müller, who wrote about the idea some 36 years earlier) he was the first to obtain funding to create such a device, although he never managed to get it to work despite blowing the equivalent of $350,000 in government money trying to build it. Still, modern day attempts at creating the engine within the tolerances of the time period have shown that such a device would have worked had he created it.

But Babbage's device wasn't created in a vacuum, it built on the wealth of mechanical engineering knowledge from the decades that preceded him. Whilst there was nothing quite as elaborate as his Analytical Engine there were some marvellous pieces of automata, ones that are almost worthy of the title of programmable computer:

http://www.youtube.com/watch?v=FUa7oBsSDk8

The fact that this was built over 240 years ago says a lot about the ingenuity contained within it. Indeed the fact that you're able to code your own message into The Writer, using the set of blocks at the back, is what elevates it above other machines of the time. Sure there were many other automata that were programmable in some fashion, usually by changing a drum, but this one allows configuration on a scale that they simply could not achieve. Probably the most impressive thing about it is that it still works today, something which many machines of today will not be able to claim in 240 years' time.

Whilst a machine of this nature might not be able to lay claim to the title of first programmable computer you can definitely see the similarities between it and its more complex cousins that came decades later. If anything it's a testament to the additive nature of technological developments, each one building upon the foundations of those that came before it.

The Shadow IT Department and Its Influence on Corporate IT’s Future.

If there's one thing that us system administrators loathe more than dealing with users it's dealing with users who have a bit of IT smarts about them. On the surface they're the perfect user, able to articulate their problems and requirements aptly so we have to spend considerably less time fulfilling their requests. However more often than not they're also the ones attempting to circumvent safeguards and policies in order to get a system to work the way they want it to. They're also the ones who will push for much more radical changes to systems, since they will have already experimented with such things at home and will want to replicate them in their work environment.

Collectively such people are known as shadow IT departments.

Such departments are a recent phenomenon with a lot of credit (or blame) being levelled at those of my generation, the first to grow up as digital natives. Since the vast majority of us have used computers and the Internet from an early age we've come to expect certain things to be available to us when using them and don't appreciate it when they are taken away. This doesn't gel too well with the corporate world of IT where lockdowns and restrictions are the norm, even if they're for the user's benefit, and thus they seek to circumvent such restrictions, causing endless headaches for their system administrators. Still they're a powerful force for driving change in the workplace, enough so that I believe these shadow IT departments are shaping the future of corporate environments and the technologies that support them.

Most recently I've seen this occurring with mobility solutions, a fancy way of saying tablets and phones that users want to use on the corporate network. Now it's hard to argue with a user that doing such a thing isn't technically feasible, but in the corporate IT world bringing uncontrolled devices onto your network is akin to throwing a cat into a chicken coop (i.e. no one but the cat benefits and you're left with an awful mess to clean up). Still all it takes is one of the higher ups to request such a thing for it to become a mandate for the IT department to implement. Unfortunately for us IT guys the technology du jour doesn't lend itself well to being tightly controlled by a central authority, so most resort to hacks and workarounds in order to make it work as required.

As the old saying goes the unreasonable person is the one who changes the world to suit themselves and therefore much of the change in the corporate IT world is being made by these shadow IT departments. At the head of these movements are my fellow Gen Y and Zers who are struggling with the idea that what they do at home can’t be replicated at work:

“The big challenge for the enterprise space is that people will expect to bring their own devices and connect in to the office networks and systems,” Henderson said. “That change is probably coming a lot quicker than just five years’ time. I think it will be a lot sooner than that.”

Dr Keiichi Nakata, reader in social informatics at Henley Business School at the University of Reading, who was also at the roundtable, said the university has heard feedback from students who have met companies for interviews and been “very surprised” that technologies they use every day are not being utilised inside those businesses.

It's true that the corporate IT world is a slow moving beast when compared to the fast paced consumer market and companies aren't usually willing to wear the risk of adopting new technologies until they've proven themselves. Right now any administrator being asked to do something like "bring your own computer" will likely tell you it's impossible, lest you open yourselves up to being breached. However technologies like virtualization are making it possible to create a standard work environment that runs practically everywhere and I think this is where a bring your own device world becomes possible.

Of course this shifts the problem from the IT department to the virtualization product developer, but companies like VMware and CITRIX have both already demonstrated the ability to run full virtual desktop environments on smart phone level hardware. Using such technologies users would be able to bring in almost any device, which would then be loaded with a secure working environment, enabling them to complete the work they are required to do with the device they choose. This would also allow IT departments to become a lot more flexible with their offerings since they wouldn't have to spend so much time supporting the underlying infrastructure. Of course there are many other issues to consider (like asset life cycles, platform vetting, etc.) but a future where your work environment is independent of the hardware is not so far fetched after all.

The disconnect between what's possible with IT and what is the norm in corporate environments has been one of those frustrating curiosities that has plagued my IT career. Of course I understand that the latest isn't always the greatest, especially if you're looking for stability, but the lack of innovation in the corporate space has always been one of my pet peeves. With more and more digital natives joining the ranks however the future looks bright for a corporate IT world that's not too unlike the consumer one we're all used to, possibly one that even innovates ahead of it.

 

The Build, The Results and The Tribulations.

So last week saw me pick up the components that would form my new PC, the first real upgrade I have bought in about 3 years. Getting new hardware is always an exciting experience for someone like me which is probably why I enjoy being in the datacenter so much these days, with all that new kit that I get to play with. I didn’t really have the time to build the PC until the weekend though and so I spent a good 5 days with all the parts laid out on the dining table beside me, begging me to put them together right now rather than waiting. My resolve held however and Saturday morning saw me settle down with a cup of coffee to begin the longest build I’ve ever undertaken.

I won't go over the specifications again since I've already mentioned them a dozen times elsewhere but this particular build had a few unique challenges that you don't see in regular PCs. For starters this would be my first home PC to have a RAID set in it, comprising four 1TB Seagate drives held in a drive bay enclosure. Secondly the CPU would be watercooled using a Corsair H70 fully sealed system and, since I hadn't measured anything, I wasn't 100% sure I'd be able to fit it where I thought I could. Lastly with all these drives, watercooling and other nonsense the number of power cables required also posed a unique challenge as I wasn't 100% sure I could get them all to fit in my mid-sized tower.
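For anyone wondering what a 4 drive RAID 10 set actually buys you, here's a quick back-of-the-envelope sketch in Python. The per-drive throughput figure is my own rough assumption for a 1TB 7200RPM disk of that era, not a measurement from this build:

```python
# Back-of-the-envelope RAID 10 maths. The per-drive read speed below is an
# assumed ballpark figure, not a benchmark of the actual Seagate drives.
drives = 4
drive_size_tb = 1.0
drive_seq_read_mbps = 120

mirrored_pairs = drives // 2                 # RAID 10 = a stripe across mirrored pairs
usable_tb = mirrored_pairs * drive_size_tb   # mirroring halves the raw capacity
# Striping scales sequential reads with the number of pairs; real arrays can do
# better still since reads may be served from either half of each mirror.
seq_read_mbps = mirrored_pairs * drive_seq_read_mbps

print(f"Usable capacity: {usable_tb:.0f}TB of {drives * drive_size_tb:.0f}TB raw")
print(f"Ballpark sequential read: ~{seq_read_mbps}MB/s")
```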

The build started off quite well as I was able to remove the old components without issue and give the case a good clean before installing bits and pieces in it. The motherboard, CPU and RAM all went together quite easily as you'd expect but when it came time to affix the mounting bracket for the watercooling I hit a bit of a stumbling block. You see the motherboard I purchased does you the favor of having the old style LGA775 mounting holes, letting you use old style coolers on the newer CPUs. This is all well and good but since the holes are only labelled properly on one side attempting to line up the backing plate with the right holes proved to be somewhat of a nightmare, especially considering that when it did line up it was at a rather odd angle. Still it mounted and sat flush to the motherboard so there were no issues there.

The next challenge was getting all the hard drives in. Taking off the front of my case to do a dry fit of the drive bay extension showed that there was a shelf right smack bang in the middle of the 4 bays. No problem though, it looked to just be screwed in; however upon closer inspection it turned out that the screws in the front could only be accessed with a right angle screwdriver, since the holes needed for a regular driver hadn't been drilled. After several goes with a driver bit and a pair of pliers I gave up and got the drill out, leaving aluminium shavings all over the place and the shelf removed. Thankfully the drive bay extender mounted with no complaints at all after that.

Next came the fun part, figuring out where the hell the watercooling radiator would go. Initially I had planned to put it at the front of the case but the hosing was just a bit too short. I hadn't bought any fan adapters either so mounting it on the back would've been a half arsed effort with cable ties and screws in the wrong places. After fooling around for a while I found that it actually fit quite snugly under the floppy drive bays, enough so that it barely moved when I shook the case. This gave me the extra length to get to the CPU whilst still being pretty much at the front of the case, although it also meant I could only attach one of the fans since part of the radiator was mere millimeters away from the end of the graphics card.

With everything all put together and wired up it was now the moment of truth: I took a deep breath and pressed the power button. After a tense couple of milliseconds (it seemed like forever) the computer whirred into life and I was greeted with the BIOS screen. Checking around in the BIOS though revealed that it couldn't see the 4 drives I had attached to the external SATA 6Gbps controller so I quickly booted into the Windows installer to make sure they were all there. They did in fact come up, and after a furious 2 hours of prodding around I found that the external controller didn't support RAID at all, only the slower ports did. This was extremely disappointing as it was pretty much the reason why I got this particular board, but figuring that the drives couldn't saturate the old SATA ports anyway I hooked them up and was on my merry way, with the Windows install being over in less than 10 minutes.

I've been putting the rig through its paces over the past week and I must say the biggest improvement in performance comes solely from the SSD. The longest part of the boot process is the motherboard initializing the 3 different controllers, with Windows loading in under 30 seconds and being usable instantly after logging in. I no longer have to wait for things to load; every program opens pretty much instantaneously. The RAID array is none too shabby either with most games loading in a fraction of the time they used to.

Sadly with all games being optimized for consoles these days the actual performance improvement in nearly every game I've thrown at it has been very minimal. Still, Crysis 2 with all the settings at their maximum looks incredibly gorgeous and I can't seem to make it chug even on the biggest multi-player maps. The new mouse I bought (Logitech G700) is quite an amazing bit of kit too and the TRON keyboard my wife got me for my birthday just adds to the feeling that I'm using a computer from the future. Overall I'm immensely satisfied with it and I'm sure it'll prove its worth once I throw a few more programs at it.

Speaking of which, I can’t wait to code on that beasty.

 

OCZ Vertex 3: Don’t Play With My Heart (Or The SSD Conundrum).

My main PC at home is starting to get a little long in the tooth, having been ordered back in the middle of 2008 and only receiving upgrades of a graphics card and a hard drive since then. Like all PCs I've had it suffered a myriad of problems that I usually just put up with until I stumbled across a workaround, but I think the vast majority of them can be traced to a faulty motherboard (it can't take more than 4GB of RAM or it won't POST) and a batch of faulty hard drives (that would randomly park the heads, causing it to freeze). At the time I had the wonderful idea of buying the absolute latest so I could upgrade cheaply for the next few years, but thanks to the consolization of games I found that wasn't really necessary.

To be honest it’s not even really necessary now either, with all the latest games still running at full resolution and most at high settings to boot. I am starting to lag on the technology front however with my graphics card not supporting DirectX 11 and everything but the RAM being 2 generations behind (yes, I have a Core 2 Duo). So I took it upon myself to build a rig that combined the best performance available of the day rather than trying to focus on future compatibility. Luckily for me it looks like those two are coinciding.

Because, like any good geek, I love talking shop when it comes to building new PCs, here are the specs of the potential beast in the making:

  • Intel Core i7 2600K
  • Asrock P67 Motherboard
  • Corsair Vengeance 1600MHz DDR3 16GB
  • Radeon HD6950
  • 4 x 1TB Seagate HDD in RAID 10
  • OCZ Vertex 3 120GB

The first couple of choices I made for this rig were easy. Hands down the best performance out there is with the new Sandy Bridge i7 chips, with the 2600K being the top of the lot thanks to its unlocked multiplier and hyperthreading, which chips below the 2600 lack. The choice of graphics card was a little harder as whilst the Radeon comes out leagues ahead on a price to performance ratio the NVIDIA cards still had a slight performance lead overall, but hardly enough to justify the price. Knowing that I wanted to take advantage of the new SATA 6Gbps range of drives that were coming out my motherboard choice was almost made for me, as the Asrock P67 seems to be one of the few that has more than 4 of those ports available (it has 6, in fact).

The choice of SSD however, whilst extremely easy at the time, became more complicated recently.

You see back in the initial pre-production review round the OCZ Vertex 3 came out shooting, blasting away all the competition in a seemingly unfair comparison to its predecessors. I was instantly sold, especially considering the price was looking to be quite reasonable, around the $300 mark for a 120GB drive. Sure I could opt for the bigger drive and dump my most frequently played games on it, but in reality a RAID 10 array of SATA 6Gbps drives should be close enough without having to overspend on the SSD. As with any pre-production review I made sure to keep my ear to the ground just in case something changed once they started churning them out.

Of course, something did.

The first production review that grabbed my attention was from AnandTech, renowned for their deep understanding of SSDs and for producing honest and accurate reviews. The results for my drive size of choice, the 120GB, were decidedly mixed on a few levels, with it falling down in several places where the 240GB version didn't suffer any such problems. Another review confirmed the figures were in the right ballpark, although it unfortunately lacked a comparison to the 240GB version. The reasons behind the performance discrepancies are simple: whilst the drives are functionally the same, the differences come from the number of NAND chips used to build them. The 240GB version has double the number of chips of the 120GB version, which allows for higher throughput and additionally grants the drive a larger scratch space that it can use to optimize its performance¹.
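To make the parallelism point concrete, here's a toy model of how throughput scales with NAND die count. The per-die bandwidth, die counts and controller ceiling are assumptions for illustration only, not the real figures for the Vertex 3:

```python
# Toy model of SSD sequential throughput vs NAND die count. All the figures
# below are illustrative assumptions, not the Vertex 3's actual specifications.
def estimated_throughput_mbps(num_dies, per_die_mbps=40, controller_limit_mbps=550):
    # Writes are interleaved across dies, so bandwidth scales with die count
    # until the controller / SATA interface becomes the bottleneck.
    return min(num_dies * per_die_mbps, controller_limit_mbps)

for capacity_gb, dies in [(120, 8), (240, 16)]:
    print(f"{capacity_gb}GB (~{dies} dies assumed): "
          f"~{estimated_throughput_mbps(dies)}MB/s")
```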

So of course I started to rethink my position. The main reason for getting a real SSD over something like the PCIe bound RevoDrive was that I could use it down the line as a jumbo flash drive if I wanted to and I wouldn’t have to sacrifice one of my PCIe lanes to use it. The obvious competitor to the OCZ Vertex 3 would be something like the Intel 510 SSD but the reviews haven’t been very kind to this device, putting it barely in competition with previous generation devices.

After considering all my options I think I'll still end up going with the OCZ Vertex 3 at the 120GB size. Whilst it might not be the king of performance in every category it does provide tremendous value when compared to a lot of other SSDs and it will be in another league when compared to my current spinning rust hard drive. Once I get around to putting this new rig together you can rest assured I'll put the whole thing through its paces, if at the very least to see how the OCZ Vertex 3 stacks up against the numbers that have already been presented.

¹Ever wondered why some SSDs are odd sizes? They are in fact good old fashioned binary sizes (128GB and 256GB respectively); however the drive reserves a portion of that (8GB and 16GB) to use as scratch space to write and optimize data before committing it. Some drives also use it as a buffer for when flash cells become unwritable (flash cells don't usually die, you just can't write to them anymore) so that the drive's capacity doesn't degrade.
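As a quick worked version of that footnote (the capacity split is as stated above, the percentage arithmetic is just mine):

```python
# Over-provisioning arithmetic from the footnote: raw binary capacity minus
# the advertised capacity equals the space the drive keeps for itself.
for raw_gb, advertised_gb in [(128, 120), (256, 240)]:
    reserved_gb = raw_gb - advertised_gb
    print(f"{advertised_gb}GB drive: {reserved_gb}GB reserved "
          f"({reserved_gb / raw_gb:.1%} of the raw {raw_gb}GB)")
```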

Deep Blue, Watson and The Evolution of AI.

I'm not sure why but I get a little thrill every time I see something that used to require manual intervention from start to finish being completely automated. It's probably because the more automated something is the more time I have to do other things, and there's always that little thrill in watching something you built trundle along its way, even if it falls over part way through. My most recent experiment in this area was crafting a rudimentary trainer for Super Meat Boy to get me past a nigh on impossible part of the puzzle, co-ordinating the required keystrokes with millisecond precision and ultimately wresting me free of the death grip that game held on me.
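I never published the trainer itself, but the core idea was nothing more exotic than replaying a hard-coded input sequence with tight timing. Here's a minimal sketch of that approach in Python using the pynput library; the keys and delays are invented placeholders, not the actual Super Meat Boy inputs:

```python
# Minimal "trainer" sketch: replay a hard-coded input sequence with
# millisecond-level delays. The keys and timings below are placeholders,
# not the inputs actually used for Super Meat Boy.
import time
from pynput.keyboard import Controller, Key

keyboard = Controller()

# (key, seconds_to_hold, pause_before_next_input)
sequence = [
    (Key.right, 0.350, 0.010),
    (Key.space, 0.120, 0.005),
    (Key.right, 0.200, 0.015),
    (Key.space, 0.090, 0.000),
]

time.sleep(3)  # a few seconds to switch focus to the game before playback starts
for key, hold, pause in sequence:
    keyboard.press(key)
    time.sleep(hold)       # hold the key for the required duration
    keyboard.release(key)
    time.sleep(pause)      # gap before the next input (sleep is only roughly ms-accurate)
```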

The world of AI is an extension of the automation idea, using machines to perform tasks that we would otherwise have to do ourselves. The concept has always fascinated me as more and more we're seeing various forms of AI creeping their way into our everyday lives. Most people won't recognize them as AI simply because they're routine, but in reality many of the functions these weak AIs perform used to be in the realms of science fiction. We're still a long way from having a strong AI like we're used to seeing in the movies but that doesn't mean many facets of it aren't already in widespread use today. Most people wouldn't think twice when a computer asks them to speak their address, but go back only a few decades and that would have been classed as the realm of strong AI, not the expert system it has evolved into today.

What's even more interesting is when we create machines that are more capable than ourselves at performing certain tasks. The most notable example (thus far) of a computer being able to beat a human at a non-trivial task is Deep Blue, the chess playing computer that managed to beat the world chess champion Kasparov, albeit under dubious circumstances. Still, the chess board is a limited problem set and whilst Deep Blue was a supercomputer in its time, today you'd find as much power hidden under the hood of your PlayStation 3. IBM's research labs have been no slouch in developing Deep Blue's successor, and it's quite an impressive beast.

Watson, as it has come to be known, is the next step in the evolution of AIs performing tasks that have previously been the sole domain of humans. The game of choice this time around is Jeopardy, a gameshow whose answers are in the form of a question and which makes extensive use of puns and colloquialisms. Jeopardy represents a unique challenge to AI developers as it involves complex natural language processing, searching immense data sets and creating relationships between disparate sources of information, all culminating in an answer. Watson can currently determine whether or not it can answer a question within a couple of seconds, but that's thanks to the giant supercomputer that's backing it up. The demonstration round showed Watson was quite capable of playing with the Jeopardy champions, winning the round with a considerable lead.

What really interested me in this though was the reaction from other people when I mentioned Watson to them. It seemed that a computer playing Jeopardy (and beating the human players) wasn’t really a big surprise at all, in fact it was expected. This was telling about how us humans view computers as most people expect them to be able to accomplish anything, despite the limitations that are obvious to us geeks. I’d say this has to do with the ubiquity of computers in our everyday lives and how much we use them to perform rudimentary tasks. The idea that a computer is capable of beating a human at anything isn’t a large stretch of the imagination if you treat them as mysterious black boxes but it still honestly surprised me to learn this is how many people think.

Last night saw Watson play its first real game against the Jeopardy champions and whilst it didn't repeat its performance of the demonstration round it did tie for first place. The second round is scheduled to air sometime tomorrow (Australia time) and whilst I've not yet had a chance to watch the entire round I can't tell you how excited I am to see the outcome. Either way the realm of AI has taken another step forward towards the ultimate goal of creating intelligence born not out of flesh, but silicon, and whilst some might dread the prospect I for one can't wait and will follow all developments with bated breath.

Shit’s Breaking Everywhere, Captain.

So it turns out that my blog has been down for the last 2 days and I, in my infinite wisdom, failed to notice this. It seems like no matter how I set this thing up it will end up causing some problem that inevitably brings the whole server to its knees, killing it quietly whilst I go about my business. Now this isn't news to anyone who's read my blog for any length of time, but it eerily coincided with my main machine "forgetting" its main partition, leaving me with no website and a machine that refused to boot.

Realistically I'm a victim of my own making since my main machine is getting a bit long in the tooth (almost 3 years now by my guess) but even before it hit the 6 month mark I was having problems. Back then it was some extremely obscure issue that only seemed to crop up in certain games, where I couldn't get more than 30 seconds into playing them before the whole machine froze and repeatedly played the last second of sound until I pulled the plug on it. That turned out to be RAM requiring more volts than it said it did and everything seemed to run fine until I hit a string of hard drives that magically forgot partitions (yes, in much the same fashion as my current one did). Most recently it has taken to hating having all of its RAM slots filled even though both of them work fine in isolation. Maybe it's time this bugger went the way of Old Yeller.

Usually a rebuild isn't much of a hassle for someone like me. It's a pain to be sure but the payoff at the end is a much leaner and meaner rig that runs everything faster than it did before. This time around however it also meant configuring my development environment again whilst making sure that none of my code had suffered in the apparent partition failure. I'm glad to say that, whilst it did kill a good couple of hours I was otherwise planning to spend lazing about, I have got everything functional again and no code was harmed in the exercise.

You might be wondering why the hell I'm bothering to post this then, since it's such a non-event. Well for the most part it's to satisfy that part of me that likes to blog every day (no matter how hard I try to quell him) but it's also to make sure the whole thing is running again and that Google is aware that my site hasn't completely disappeared. So for those of you who were expecting something more I'm deeply sorry, but until the new year comes along I'm not sure how much blogging I'm going to be doing, let alone any of the well thought out pieces that I tend to hit at least a couple of times a week 😉

Three Screens and Platform Agnosticity.

For all their faults Microsoft have done some really great work and brought a lot of innovative ideas to fruition. Sure their strategy of embrace, extend and extinguish rightfully earnt them the reputation of being an evil company but despite this they've continued to deliver products that are really head and shoulders above the competition. From emerging technologies like the Surface to dragging others kicking and screaming into the world of online game consoles Microsoft has shown that when they want to they can innovate just like anyone else can. One of those innovations that, in my opinion, hasn't received the press it should is the idea of the Three Screens and a Cloud form of computing which Microsoft started talking about almost 2 years ago.

The idea itself is stunningly simple: the computing experience across the three main screens a user has (their computer, phone and TV) should be connected and ubiquitous. Whilst I still detest using the term cloud computing for anything (it feels like magic hand waving) the idea of all these screens being connected to a persistent cloud back end unlocks potential for innovation on quite a large scale. The devil is in the details of course and whilst such an idea is something to behold the actual implementation is what will show whether Microsoft knows what it's talking about, rather than just drumming up some hype with a new industry buzz term.

Microsoft is already making headway into implementing this idea with their Live Mesh range of online services. I've been a big user of their remote desktop features, which have allowed me to remote in from anywhere with a web browser. They've also been hard at work making their Office products more accessible through Office Web Apps, which provide a pretty good experience especially considering they're free. Their current strategy for getting on the TV seems to be centered around improving the Xbox Live experience and integrating it with the upcoming mobile platform Windows Phone 7. Time will tell if they'll be able to draw all of these platforms together to fully realise the Three Screens idea, but they're well on their way to delivering such a service.

Stepping away from Microsoft’s work the idea of a computing experience being agnostic to the platform you’re on has been a fascination of mine for quite some time. You see I make my money based around virtualization which has its roots in the idea of removing dependency on a platform from the software. More recently I’ve been diving into the world of virtual desktops which give you the novel ability of taking your desktop session with you, needing only a USB key for the user. There are quite a few companies offering products that implement this idea but more recently some have taken it a step further like CITRIX and their search for a Nirvana Phone. Realistically I see no reason why you couldn’t interact with that same session directly on the phone or on a TV if you so desired, getting us dangerously close to realising the Three Screens idea.

Although Microsoft is credited with the soundbite that captures this idea they're not the only ones working towards unifying the computing experience across platforms. Google has made serious inroads into the mobile sector and just this year announced that they would be coming to the last screen they had missed, the TV. They've also taken the first steps towards integrating the phone and computer experiences with the Chrome to Phone extension for their browser. Whilst Apple had been somewhat lax about their foray into TV they have since revamped the idea to be more like their other iOS based products, signalling that they too are looking to unify the user experience.

At its heart the notion of Three Screens is one of freedom and ubiquity, with users' data empowered through its ability to transcend the restrictions that once plagued it. The true realisation of this idea is still yet to be seen but I know that the unification of the three key computing platforms is not far away. With so many big players vying for dominance I'm sure we'll see a platform war the likes of which we've never seen before and hopefully from that the best products and services will arise victorious, to the benefit of us all.

Computer Fail.

As I’ve said before us IT guys have the most interesting problems when it comes to our personal computers. Mine decided last night, in the middle of doing stuff for the wedding on Monday no less, that it would give up the ghost and stop working completely. No amount of cajoling or begging would bring my computer back from its silent grave and I was relegated to trying to recover my files hastily in case the drives were on the way out.

Turns out either the hard drive itself or the controller on my motherboard decided that the master boot record and master file table needed to die, and proceeded to oblige even though I had done nothing to provoke it. I had had problems with it freezing in the past but since there was no data corruption I put it down to spurious Windows chicanery, and thought nothing more of it. This assumption has cost me around 3 hours of my life, something which I'm not keen to repeat again in the near future.

This post will be a short and sweet one as the time I usually dedicate to writing out a thoughtful post has been taken away by said computer fail. I will say one thing though: the free file recovery software Recuva is worth its weight in gold, as it was able to scan my drive and recover all the files in a fraction of the time of any other utility I've used before. Everything else I tried took at least 15 minutes to get the folder structure right and then couldn't recover anything past a few measly files. I was able to get a full 21GB off my drive without too much hassle using Recuva, and I'm now just a format and reconfigure away from having a working machine again.

My shopping list now includes a 2TB RAID 1/0 array, because I never want to go through this crap again. 🙂