
Batman Arkham Origins: We’ve Got To Stop Meeting Like This, Bats.

Prior to the release of Batman: Arkham Asylum you’d have been hard pressed to find anyone who’d heard of Rocksteady Studios, primarily because they only had one title to their name before that: Urban Chaos: Riot Response, which wasn’t badly received but which you’d struggle to find anyone who’d actually played. Their following two instalments using the Batman IP catapulted them to fame, however, and their success led to them being acquired by Time Warner shortly before the release of Arkham City. The most recent instalment in this series, Batman: Arkham Origins, comes to us not from the venerable Rocksteady but from Warner Bros Games Montreal, a development house that’s familiar with the series (they worked on the Wii U port of Arkham City). Combine that with the Joker no longer being voiced by Mark Hamill and fans of the series were decidedly nervous, as there was no telling how this game would pan out.

Batman Arkham Origins Review Screenshot Wallpaper Title Screen

Arkham Origins takes place long before the world established in the previous two games, going back to the beginning when Bruce Wayne is just starting his journey as the caped crusader of Gotham City. He’s been at it long enough to attract the attention of some of the city’s more nefarious criminals, and this has resulted in Black Mask, a notorious underworld dealer who’s eluded conviction thanks to the numerous businesses he runs, putting a bounty on Batman’s head. He’s also invited 8 different assassins to go after the bounty, including many of Batman’s long time rivals. Of course Bruce can’t sit idly by and potentially let others be put in danger for his sake, and so begins a long Christmas Eve spent putting the beat down on Gotham’s worst.

Visually Arkham Origins is a small step up from its predecessor, the primary limitation on progressing any further being that it’s still releasing on the current console generation. In all honesty it still looks fantastic, with every environment packing an incredible amount of detail. I’m also somewhat thankful for this as my PC hardware is starting to get a little long in the tooth and, whilst Arkham Origins looked great, there were times when it began to noticeably slow down. That wasn’t a frequent occurrence, however, even in the outdoor scenes where you could see far off into the distance.

Batman Arkham Origins Review Screenshot Wallpaper Opening Scenes

Just like the 2 Arkham titles before it, Origins keeps the core game play and style the same whilst adding in additional challenges, enemies and tactics to keep it feeling fresh. You’ll still spend most of your time beating the ever-loving crap out of various types of enemies, the challenge ratcheting up every so often with the introduction of new enemy types requiring different techniques to take them down. You still have the option of being a silent predator at times, swooping through an area and taking out multiple enemies without being seen. Finally the core puzzle mechanics make a comeback, albeit with a new mode to make things a little more interesting.

Combat, as always, is fast paced and meaty, with every hit you land having a really satisfying feel to it. I always seem to start off feeling rather uncoordinated, getting my combos interrupted all the time by simply not noticing incoming attacks, but it doesn’t take long before I’m hitting huge multipliers and laying waste to everyone. One thing that has always irritated me is the initial lack of a way to take out large groups once you’ve knocked them all down: you can do a ground takedown on them, but all too often that results in you losing your combo string, as it seems you can’t counter whilst in the middle of one. Later on you’ll unlock some better ways of dealing with them, and after that combat starts to feel a lot more fluid.

Batman Arkham Origins Review Screenshot Wallpaper Crime Alley

One criticism that I’ll level at it, and this has been true of the whole series, is that as you progress through the story the number of different things you can do during combat starts to become a little overwhelming. Pretty often you’ll find yourself facing a knife wielder, a guy with a riot shield and probably a tough enemy that needs to be stunned before you can do anything. These require no fewer than 3 different methods of taking them out and, when combined with the dozen or so quick fire gadgets, you end up having to remember so many things that you’ll eventually just settle on a couple. It all becomes somewhat moot with the introduction of the shock gloves, however, after which all you have to focus on is getting enough charge in them so you can lay the smack down on everything around you.

The stealth sections feel largely unchanged, although that could primarily be due to the fact that I didn’t invest many points in that skill tree until very late in the game. They’re still fun and somewhat challenging, especially the ones that have unique mechanics like the Deadshot encounter, but if you were looking for a markedly different or revamped experience you’re not going to find it. There’s also the possibility that I just wasn’t paying attention to some of the prompts and missed some new opportunities, but I didn’t really have any problems accomplishing anything (unlike, say, the Mr Freeze battle in Arkham City).

Batman Arkham Origins Review Screenshot Wallpaper Stuck in the walls

The detective mode/puzzles remain largely the same, albeit making use of some of the new mechanics granted by gadgets that weren’t present in the previous titles. There’s also the addition of the crime scene mode, which you use to reconstruct crimes to figure out how they happened and to track down the people responsible. For the most part it works well, however it’s not made entirely clear when you have to move to a new section to continue the investigation, or what the expected behaviour is, so at first it was a little confusing. Since it’s largely the same core mechanic it still functions well, even if it doesn’t feel as fresh as other aspects of the game.

The real problem with Arkham Origins, however, is that whilst it retains the essence of what made the Arkham series so good it’s marred by numerous bugs and glitches, many of them completely game breaking. The screenshot above depicts one of them: upon using certain abilities with knock back you can cement enemies into a wall or other object. They then become unreachable and, whilst I was able to dislodge them after trying every gadget I had (I eventually found I needed to get them on an edge and then attempt to stun them so they’d fall backwards out of the box), it was an incredibly frustrating experience. That’s not to mention one part on the Penguin’s ship where all the external doors simply refused to work, making the opening noise but not letting me through. This broke my trust in all the game’s mechanics, so I spent the vast majority of the game wondering whether I had completed a challenge successfully or had just encountered another game breaking issue. I’m not alone in thinking this either, as my searches into the issue revealed a scarily long list of bugs and, even after the game has been out for this long, there’s no patch in sight.

Batman Arkham Origins Review Screenshot Wallpaper Joker Beatdown

This, combined with the fact that Arkham Origins isn’t too different from City in terms of overall play style, is probably the reason there’s been such an abysmal reaction to it. I did my best to avoid any reviews prior to playing it, however I unwittingly found out that Destructoid gave it 3.5 out of 10 and, whilst I don’t agree with that score overall, I understand the reasoning that went into it. Whilst I feel that Arkham Origins isn’t a bad game it is certainly the weakest of the series, showing very clearly that Warner Bros Montreal has a lot to learn before they can deliver a title that can be considered on par with the rest of the Arkham series. Whether they’ll get the chance to do so in light of the current reaction to Arkham Origins remains to be seen.

As for the story, I felt it was a great introduction to the relationship between Batman and the Joker; whilst their relationship has been explored in depth in other mediums it was great to see how the rivalry began. The grab bag of other characters thrown in as assassins was unfortunately less well done, as it just felt like a convenient way to include them without needing a coherent reason for them to be there. This was only exacerbated by the fact that they either had long, drawn out encounters (like Enigma) which just weren’t that fun to pursue or ones so short (like Anarky) that you really didn’t have time for them to develop.

Batman Arkham Origins Review Screenshot Wallpaper Eternal Vigilance

Should we judge Batman: Arkham Origins without knowledge of the titles that came before it, it would be easy to heap praise on it. The combat is engaging and satisfying, the exploration of the relationship between the Joker and Batman is intriguing and the world is filled with a level of detail that few games manage to achieve. However its lineage set a high bar to live up to, and the fact that it’s not different enough from Arkham City, combined with the numerous game breaking bugs, means that Arkham Origins is the weakest of all the titles. I certainly enjoyed my time with it but there’s no mistaking that the developers behind it have their work cut out for them if they want to live up to the Rocksteady brand.

Rating: 7.0 / 10

Batman: Arkham Origins is available on PC, Xbox 360, PlayStation 3 and Wii U right now for $59.99, $78, $78 and $78 respectively. Game was played on the PC with 13 hours of play time and 26% of the achievements unlocked.

A Distant Ancestor of the Programmable Computer.

Ask any computer science graduate about the first programmable computer and the answer you’ll likely receive is the Difference Engine, a conceptual design by Charles Babbage. Whilst the design wasn’t entirely new (that honour goes to J. H. Müller, who wrote about the idea some 36 years earlier) he was the first to obtain funding to create such a device, although he never managed to get it to work despite blowing the equivalent of $350,000 in government money trying to build it. Still, modern day attempts at creating the engine with the tolerances of the time period have shown that such a device would have worked had he created it.

But Babbage’s device wasn’t created in a vacuum; it built on the wealth of mechanical engineering knowledge from the decades that preceded him. Whilst there was nothing quite as elaborate as his Analytical Engine there were some marvellous pieces of automata, ones that are almost worthy of the title of programmable computer:

http://www.youtube.com/watch?v=FUa7oBsSDk8

The fact that this was built over 240 years ago says a lot about the ingenuity contained within it. Indeed the fact that you’re able to code your own message into The Writer, using the set of blocks at the back, is what elevates it above other machines of the time. Sure there were many other automata that were programmable in some fashion, usually by changing a drum, but this one allows configuration on a scale that they simply could not achieve. Probably the most impressive thing about it is that it still works today, something which many machines of our time won’t be able to claim in 240 years’ time.
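Thought of in modern terms, those blocks are effectively a program tape: rearrange them and you change the output without touching the mechanism at all. Here’s a rough sketch of the analogy (the glyph table and message are entirely illustrative, not a model of the actual cam mechanism):

def run_writer(blocks):
    # Each replaceable block mechanically selects one glyph "subroutine",
    # much like an instruction on a program tape selecting a routine.
    glyphs = {c: f"<pen strokes for {c!r}>" for c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ "}
    for block in blocks.upper():
        print(glyphs.get(block, "<blank>"))

# Re-"programming" the machine is just rearranging the blocks.
run_writer("HELLO")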

Whilst a machine of this nature might not be able to lay claim to the title of first programmable computer you can definitely see the similarities between it and its more complex cousins that came decades later. If anything it’s a testament to the additive nature of technological developments, each one building upon the foundations of those that came before it.

A Ray of Hope for the NBN.

The resignation of the National Broadband Network board was an expected move given the current government’s high level of criticism of the project. While I, and many other technically inclined observers, disagreed with the reasons cited for Turnbull’s request for their resignations, I understood that if we (the general public) wanted the NBN delivered the way we wanted it then it was a necessary move, one that would allow the Liberal party to put their stamp on the project. However what followed seemed to be the worst possible outcome, one that could potentially see the NBN sent down the dark FTTN path, dooming Australia to remain an Internet backwater for the next few decades.

They hired ex-Telstra CEO Ziggy Switkowski.

For anyone who lived through his tenure as the head of Australia’s largest telecommunications company, his appointment to the head of the NBN board was a massive red flag. The implementation of data caps and a whole host of other misdeeds that have plagued Australia’s Internet industry since his time in office would be reason enough for outrage, but the real crux of the matter is that since his ousting at Telstra he hasn’t been involved in the telecommunications industry for a decade. Whatever experience he had is now long out of date and, whilst I’m thankful that his tenure as head of the board is only temporary (until a new CEO is found), the fact that he has approved other former Telstra executives to the NBN board shows that even a small amount of time there could have dire implications.

Simon Hackett Internode

News came yesterday, however, that Turnbull has appointed Simon Hackett, of Internode fame, to the NBN board. In all honesty I never expected this to come through as, whilst there were a few grass roots campaigns pushing for it, I didn’t think they’d have the required visibility to make it happen. Hackett is a well known name in the Australian telecommunications industry, however, and it’s likely that his reputation was enough for Turnbull to consider him for the position. Best of all he’s been a big supporter of the FTTH NBN from the get go and with this appointment will be able to heavily influence the board’s decisions about the future of Australia’s communication network.

Whilst I was always hopeful that a full review of the feasibility of the NBN would come back with resounding support for a FTTH solution, this appointment makes such an outcome almost certain. Of course Turnbull could still override it, but with his staunch stance of going with the review’s decision it’s highly unlikely he’d do that, lest he risk some (even more) severe political backlash. The most likely change I can see coming is that a good chunk of the rollout, mostly for sites where there are no current contracts, will fall to Telstra. Whilst I’m a little on the fence about this (they’d be double dipping, getting paid both to build the new network and to disconnect their current customers) it’s hard to argue that Telstra isn’t a good fit for the job. I guess the fact that they won’t end up owning it in the end makes it a fair bit more palatable.

So hopefully with Hackett’s appointment to the NBNCo board we’ll have a much more technically inclined view presented at the higher levels, one that will be able to influence decisions to go down the right path. There’s still a few more board members to be appointed and hopefully more of them are in the same vein as Hackett as I’d rather not see it be fully staffed with people from Telstra.

Heroes of the Storm: It’s Not Your Traditional MOBA.

I was never particularly good at RTS games, mostly because I never dug deep into the mechanics or got involved in the higher level strategies that would have enabled me to progress my skills. However I found a lot of joy in the custom maps that many RTS games had, especially WarCraft 3. In between my bouts of Elemental Tower Defense, Footman Frenzy and X Hero Siege I inevitably came across Defense of the Ancients and, like many others, became hooked on it. Whilst I still favoured the less directly competitive maps, much preferring the spam fest that other customs offered, the original laid the foundation for my current obsession with DOTA2, a game which has claimed almost 1400 hours of my life so far.

Heroes of the Storm

However DOTA2 wasn’t my first reintroduction to the MOBA scene; that honour goes to Heroes of Newerth, which I was somewhat intrigued by whilst it was still in beta. I had a small cadre of friends who liked to play it as well but for some reason it just wasn’t enough to keep us interested and it eventually fell by the wayside. The same crew and I had tried League of Legends too, but the experience was just too far away from the DOTA we knew and after a couple of games our attention turned elsewhere. If I’m honest though we were mostly excited to hear about Blizzard’s own take on the MOBA genre, as one of the reasons WarCraft 3 DOTA was so enjoyable was that it had many of the characters we knew and loved.

It was looking like Blizzard DOTA and DOTA2 were going to launch at around the same time and indeed, once Valve officially announced DOTA2 with the original map maker IceFrog at the helm, news of the work on Blizzard DOTA went silent. Whilst this was partially due to the court battle that Blizzard and Valve became embroiled in afterwards, there was little doubt among the community that Blizzard’s original vision for their MOBA title clashed heavily with that of Valve and the work we had seen up until that date was to be scrapped. What was less clear was what they were working on instead as, whilst no one doubts the calibre of Blizzard’s work, they were going up against 3 already highly polished products, all of which had dedicated communities behind them.

Well it seems that Blizzard has done something completely out of left field, and it looks awesome.

Heroes of the Storm is the final name of Blizzard’s entrant into the MOBA genre (although they’re hesitant to use that term currently) and whilst it shares some base characteristics with other titles it departs from them in some significant ways. For starters the typical game is slated to last only 20 minutes, a downright rarity in any other MOBA title. Additionally some of the signature mechanics, like individual hero levels and items, don’t exist in the Heroes of the Storm world. It also has different maps, various mechanics for helping a team achieve victory and a talent tree system for heroes that’s unlike any other MOBA I’ve played before. The differences are so vast that I’d recommend you take a look at this post on Wowhead as it goes into the real nitty gritty of what makes it so unique.

From what I’ve seen Blizzard is aiming Heroes of the Storm primarily at people who aren’t currently MOBA players, as the barrier to entry looks quite low. Traditionally this is what has turned people off such titles: the learning curve is quite steep and, quite frankly, the communities have never been too welcoming to newer players. Heroes of the Storm, on the other hand, could be played 3 times in the space of an hour, allowing new players to get up to speed much more quickly. At the same time I think it will appeal to current MOBA players seeking a different experience, whether they’re feeling burnt out on their title of choice or just want something different every once in a while.

I’m quite keen to get my hands on it (I’ve signed up for the beta, here) as I think it’ll be quite a bit of fun, especially with my current group of friends who’ve all taken to DOTA2 with fervour. It’s great to hear that it’s going to be a standalone title rather than a map within StarCraft 2 and I think that will give Blizzard a lot of freedom in developing the idea in the future. Whether it can sustain the same longevity through a competitive scene like the MOBA titles before it remains to be seen, but I get the feeling it’ll be something of a LAN favourite for a while to come.

Cloud Enhanced Gaming is a Stupendously Bad Idea.

The advent of cloud computing, or more generally the commoditization of computer infrastructure, has provided us with capabilities that few could have accurately predicted. Indeed the explosive growth in the high tech sector can be substantially attributed to the fact that businesses no longer require heavy capital injections in order to validate their ideas, allowing many ideas which wouldn’t have been viable 5 years ago to flourish today. Of course this has also led to everyone seeking to apply the ideals of cloud computing wherever they can, hoping it can be the panacea to their ills. One such place is the world of gaming and, in all honesty, the ways in which it’s being used are at best misguided, with most solutions opening us up to a world of hurt not too far down the track.

Square Enix Project Flare

I’ve gone on record saying that I don’t believe the general idea of Cloud Gaming, whereby a service runs hardware in a central location and users connect to it with a streaming device, is viable. The problem comes from the requirements placed on that infrastructure, specifically the requirement for low latency, which means a user can’t be too far away from the equipment. For the idea to have global reach it would likely need hardware in all capital cities, a rather capital intensive exercise. At the same time the consolidation ratios for gaming level hardware aren’t particularly great at the moment, although that may change in the future with both NVIDIA and AMD working on cloud GPU solutions. Still, the fact that OnLive, a once $1 billion company, failed to make the idea feasible says a lot about it.
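To put some rough numbers on that latency constraint, here’s a back-of-envelope calculation (the distances and the fibre propagation speed are my own approximations, and this ignores routing, processing and encoding entirely):

# Propagation delay alone for a round trip over fibre.
# Light in fibre travels at roughly 200,000 km/s (about 2/3 of c).
def round_trip_ms(distance_km, fibre_speed_kms=200_000):
    return 2 * distance_km / fibre_speed_kms * 1000

# A 60 FPS game leaves a ~16.7 ms budget per frame; distance eats it fast.
for route, km in [("same city", 50),
                  ("Sydney to Melbourne", 880),
                  ("Sydney to Singapore", 6300)]:
    print(f"{route}: ~{round_trip_ms(km):.1f} ms round trip, best case")

Even before any server does a scrap of work, a trans-continental hop has already blown the frame budget, which is why the hardware has to live close to the players.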

That hasn’t stopped companies from attempting to integrate the cloud through other avenues, something I’ve come to call Cloud Enhanced gaming. This is where a game offloads its less latency-sensitive aspects to servers elsewhere, which do the calculations and send the results back down the wire. In theory this lets you make your game better as you don’t have to worry about the limitations of the platform you’re running on, freeing up local grunt for pretty graphics whilst the heavy lifting is done offsite. The latest entrant into this arena is Square Enix’s Project Flare, which they’re marketing as a technological breakthrough in cloud gaming.

On the surface it sounds like a great idea: consoles would no longer suffer from their hardware limitations and thus would remain viable for much longer than they have in the past. Indeed for a developer looking to do something outside a console’s capabilities, offloading processing into the cloud would seem to be the only way to accomplish it should they want to use that specific platform over the alternatives. However doing so binds the game to that backend infrastructure, which means the game’s life is only as long as the servers that power it. Considering the numerous recent examples of game servers and services disappearing (including the infamous Games for Windows Live), the effect of turning off an integral part of the game would be far worse and likely without an easy path for remediation.

The reason this would be such a big issue is that, compared to traditional game server infrastructure, the requirements for a cloud enhanced game are much, much greater. You can happily run dozens of virtual servers that service thousands of clients from a single dedicated box; try to run physics calculations (like in one of the Project Flare demos), however, and the number of people you can service per server drops dramatically. This means the time those servers remain fiscally viable is dramatically reduced and it’s far more likely that the service will cease to exist much sooner than other game servers would. Moore’s Law goes a little way towards remedying this but you can’t really get past the fact that the consolidation ratios achievable here are a couple of orders of magnitude lower than what developers have traditionally come to expect.
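As a purely illustrative bit of arithmetic (every figure here is assumed for the sake of the example, not drawn from any real service), a 100x drop in consolidation ratio translates directly into a 100x jump in per-player cost:

# Hypothetical figures: a box handling thousands of lightweight game
# sessions versus one doing per-client physics at a 100x lower ratio.
SERVER_COST_PER_MONTH = 500.0  # assumed cost of one dedicated box

for workload, clients_per_box in [("traditional game servers", 5000),
                                  ("cloud enhanced (physics)", 50)]:
    cost = SERVER_COST_PER_MONTH / clients_per_box
    print(f"{workload}: ${cost:.2f} per client per month")

Ten cents per player per month is easy to carry indefinitely; ten dollars is not, which is why I’d expect these services to be switched off far sooner.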

This is not to mention how the system will handle poor Internet connections or overloaded servers, something which is guaranteed to happen with more popular titles. Whilst it’s not an unsolvable problem it’s definitely something that will lead to sub-par gaming experiences, as the two most likely systems (stopping the game to wait for the calculations to arrive, or simply not simulating them at all) will be anything but seamless. I’m sure it could be improved over time, however the way this is marketed makes it sound like they want to do a lot of computation elsewhere so the console graphics can be a lot prettier, leaving not a whole lot of wiggle room when the inevitable happens.
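Those two approaches aren’t mutually exclusive either; the least-bad option is probably a deadline with a graceful local fallback. Here’s a minimal sketch of how that might look, with the service, timings and state shape all hypothetical:

import asyncio
import random

async def remote_physics_step(state):
    # Stand-in for the network call to the hypothetical offload service.
    await asyncio.sleep(random.uniform(0.01, 0.2))  # simulated round trip
    return {**state, "debris": "detailed server-side simulation"}

def local_fallback(state):
    # Cheap client-side approximation used when the deadline is missed.
    return {**state, "debris": "coarse local approximation"}

async def frame(state, deadline=0.05):
    try:
        # Give the server one frame-ish deadline to answer...
        return await asyncio.wait_for(remote_physics_step(state), timeout=deadline)
    except asyncio.TimeoutError:
        # ...then degrade gracefully rather than stalling the game.
        return local_fallback(state)

print(asyncio.run(frame({"tick": 1})))

Even then the player still sees the degraded version whenever the network hiccups, so “seamless” is doing a lot of work in the marketing.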

Whilst this idea is far more feasible than running the entire game environment on a server it’s still a long way from being a viable service. It’s commendable that Square Enix are looking for ways to make their games better, removing the restrictions of the platforms that the majority have chosen, however I can’t help but feel it’s going to come back to bite them, and by extension us, in the ass in the not too distant future. As always I’d love to be proven wrong on this, but the fact is that farming out core game calculations ties the game’s life to that service and once it’s gone there’s nothing you can do to restore it.

Lilly Looking Through: Charming, if a Tad Confused.

I remember sitting in one of my university classes (Game Programming Techniques, which I was giddy with excitement to be in) and being posed a simple yet poignant question: how many of you have tried to code a game? The room was filled with students who had spent much of the past few years coding, but out of the dozens of people there only a few raised their hands. The answer as to why was the same for all of us: we simply did not know how to go about it. Fast forward to today and, thanks to tools like GameMaker and Unity, it’s possible for anyone, even non-coders, to create a production quality title. Lilly Looking Through is a great example of how these tools enable people to create without the requisite background in flipping bits.

Lilly Looking Through Review Screenshot Wallpaper Title Screen

Lilly is just like any other ordinary kid, letting her curiosity run wild as she ventures around her own little world. One day though something strange catches her eye: a piece of cloth that appears to move with a life of its own. However she can never seem to get close to it, the devious strip of cloth always flitting away at the last possible second. Then suddenly the cloth takes a dark turn, snatching up Lilly’s brother Ro and whisking him away faster than Lilly can run. What follows is Lilly’s journey to get her brother back, taking her through all sorts of wonderful and whimsical worlds.

Lilly Looking Through has a decidedly Disney-esque feel about it, with the backgrounds all lovingly hand drawn. It reminded me of the many similar games I used to play as a kid, like The Magic School Bus and Mario is Missing, albeit with the twist of all the animation being done using 3D models. The developers behind Lilly Looking Through should be commended for blending the two elements seamlessly; traditionally it’s usually very obvious where the distinction lies, something I find quite distracting. The background music is also quite enjoyable, making a great backdrop to the sumptuous visuals.

Lilly Looking Through Review Screenshot Wallpaper A Seemingly Impossible Puzzle

At its core Lilly Looking Through is a 2.5D point and click adventure game, albeit without the usual trimmings of an inventory system and the requisite “try this item with every other item” approach to progress. This is quite typical of the indie scene, where general mechanics are left to one side in favour of other things and, in all honesty, it’s refreshing to play a game that doesn’t have a cornucopia of things to do in it. Thus the majority of your time in Lilly Looking Through will be spent solving puzzles and drinking in the scenery you find yourself in.

The one twist in Lilly Looking Through’s puzzle mechanics is the goggles you pick up early in the game. These allow you to switch between two different times in the same world, letting you accomplish things that would otherwise be impossible. It’s by no means a unique or innovative mechanic but it does its job well, making you think about how to use the two different worlds effectively. The rest of the puzzles build off this mechanic, playing on the notion of time passing and setting things up accordingly.

Lilly Looking Through Review Screenshot Wallpaper Paddling Across the Lake

For the most part the puzzles are challenging, encouraging you to look at the scenery around you and figure out how everything interacts in order to unlock the next section. Indeed my favourite puzzle of the lot (shown below) required you to initially play around to figure out what everything did, and only then could you approach it scientifically. However the puzzles that rely on understanding colour theory are, to be blunt, unintuitive and just frustrating. I have a basic understanding of how colours mix together, but there’s a major difference between mixing paint and mixing light and trying to figure it out intuitively just doesn’t work. It would be OK if this was just a single puzzle, but the last few all rely on the colour mixing mechanic.
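To illustrate the clash in intuition: in additive (light) mixing the colour channels sum, so red and green light make yellow where red and green paint make a muddy brown. A quick sketch of additive mixing (how the game’s own puzzles compute their mixes is something I’m only guessing at):

# Additive (light) mixing: RGB channels add and clip at full brightness.
def mix_light(a, b):
    return tuple(min(255, x + y) for x, y in zip(a, b))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)
print(mix_light(RED, GREEN))   # (255, 255, 0) -> yellow, not brown
print(mix_light(RED, BLUE))    # (255, 0, 255) -> magenta
print(mix_light(GREEN, BLUE))  # (0, 255, 255) -> cyan

Paint, by contrast, subtracts wavelengths, which is exactly why intuition trained on one model falls over when the puzzle uses the other.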

The story is also pretty simplistic and, whilst I’m not averse to an absence of dialogue (indeed games like Kairo show it can be a powerful experience), it did feel somewhat hollow. I think much of this stems from the fact that Lilly Looking Through is heavily focused on the visual aspect of the game and, whilst it does that well, it’s just not enough to carry the game on its own. Don’t get me wrong, I think it’s still a great little story, especially if I’m guessing right that their target demographic skews towards the younger generation, but it really is the bare minimum to keep things moving forward.

Lilly Looking Through Review Screenshot Wallpaper Favourite Puzzle

Lilly Looking Through is a gorgeous little game, one that rewards the player for being inquisitive with a visual display that is quite impressive. The early puzzle mechanics are fun and enjoyable, however the later stages that assume some knowledge of colour theory unfortunately let it down, leading to a frustrating experience that feels more like luck than anything else. Still, I think it’s a great little game, one that is probably best played by your youngest relative while you watch from the sidelines.

Rating: 7.5/10

Lilly Looking Through is available on PC and iOS right now for $9.99. Game was played on the PC with around 2 hours total play time.

So Long Itanium, You Will Not Be Missed.

I’ve worked with a lot of different hardware in my life, from the old days of tinkering with my Intel 80286 through to esoteric Linux systems running on DEC tin, until I, like everyone else in the industry, settled on x86-64 as the de facto standard. Among the various platforms I was happy to avoid (including such lovely things as Sun SPARC) was Intel’s Itanium range, as its architecture was so foreign to anything else that whatever you were trying to do, outside of building software specifically for that platform, was doomed to failure. The only time I ever came close to seeing it deployed was on the whim of a purchasing manager who needed guaranteed 100% uptime, at least until they realised the size of the cheque they’d need to sign to get it.

Oh Shiny Itanium

If Intel’s original dream was to be believed then this post would be coming to you care of their processors. You see, back when Itanium was first developed everything was still stuck in the world of 32 bit and the path forward wasn’t looking particularly bright. Itanium was meant to be the answer: with Intel’s brand name and global presence behind it we would hopefully see all applications migrate to the latest and greatest 64 bit platform. However the complete lack of backwards compatibility with any currently developed software meant adopting it was a troublesome exercise, a death knell for any kind of consumer adoption. Seeing this, AMD swooped in with their backwards compatible x86-64 architecture, which proceeded to spread to all the places that Itanium couldn’t, forcing Intel to adopt the standard in their consumer line of hardware.

Itanium refused to die, however, finding a home in the niche high end market thanks to its redundancy features and solid performance for optimized applications. Still, the number of vendors supporting the platform dwindled from their already low numbers, eventually leaving HP as the only real supplier of Itanium hardware in the form of their NonStop server line. It wasn’t a bad racket for them to keep up though: the total Itanium market was something on the order of $4 billion a year across only 55,000 servers shipped annually, which works out to over $70,000 per server on average, so you can see how much of a premium they attract. All the IT workers of the world have long wondered when Itanium would finally bite the dust and it seems that day is about to come.

HP has just announced that it will be transitioning its NonStop server range from Itanium to x86, effectively putting an end to the only sales channel Intel had left for the platform. What will replace it is still up in the air but it’s safe to assume it will be another Intel chip, likely one from their Xeon line, which shares many of Itanium’s features without the incompatible architecture. Current Itanium hardware is likely to stick around for an almost indefinite amount of time, however, due to the places it has managed to find itself in, much to the dismay of system administrators everywhere.

In terms of accomplishing its original vision Itanium was an unabashed failure, never finding the consumer adoption it so desired and never becoming the herald of 64 bit architecture. Commercially though it was something of a success thanks to the features that made it attractive to the high end market, but even then it was only a small fraction of total worldwide server sales, barely enough to make it a viable platform for anything but wholly custom solutions. The writing was on the wall when Microsoft said that Windows Server 2008 R2 would be the last version to support it and now, with HP bowing out, the death clock for Itanium has begun ticking in earnest, even if the final blow won’t come for the better part of a decade.

This Suit is From The Internet.

If I’m honest, clothes shopping isn’t one of my favourite things to do. Whilst I’m somewhat lucky in that I’m not particularly hard to fit, the whole process just seems to take too long and the recovering introvert in me doesn’t enjoy discussing my appearance with store staff. Still, I’ve had something of a passion for suits ever since I discovered the difference between the cheap polyester suits of my youth and the downright exquisiteness of the full wool suit I bought for my sister-in-law’s wedding. However I’m still a financial conservative at heart and, whilst I’d love nothing more than to drop a couple of thousand on a bespoke suit tailored to perfection, I’m quite happy with something cheaper if it’s of quality and fits relatively well.

IMG_4171-Edit

My previous work suit was a decently priced affair from yd, its only downside being that it used a cheaper blended fabric, so its age started to show rather obviously. The suit I had before that was pure wool and only went into retirement because I naively bought a single pair of pants for it and finding a matching pair (it was grey) proved notoriously difficult. So I figured it was time to invest in another wool suit, one that would last me a couple of years and, hopefully, wouldn’t break the bank. After searching around for a while I stumbled upon ASOS’ range of suits and was quite surprised at their prices.

For comparison, the best you can usually do on a full wool suit in a store is around $500-600, usually a little more if you’re going to invest in another pair of pants to go with it. Online you might be able to get away a little cheaper (SuitSupply goes down to about $450 with full tailoring) but getting below that usually means a trip to Vietnam if you want something of decent quality. ASOS though had a full wool suit for about $270, as well as everything else you’d need to kit out a new work wardrobe. In the end I figured it was worth the risk and for about $470 I was able to get a suit, 2 pairs of pants and 4 shirts delivered to my door in about 5 days.

The results, as the above picture shows, speak for themselves.

First off, the construction of their suits is top notch, at least on par with my store bought suits of years past. The fabric has a very subtle diagonal weave which makes it look a little more welcoming than a solid black fabric tends to. As for sizing, their provided guide seems to be spot on: I ordered their long versions and they fit my rather tall frame well. The pants will probably need a little tailoring to take them up but I much prefer that to the alternative. The suit jacket also feels like it will need a couple of wears to settle in properly as the collar has a rather annoying tendency to flip itself up at the moment.

Probably my one major complaint is the variability of the shirts: I bought 4 of their Smart Shirts in the same size but they all fit differently. For instance the charcoal button down collar ones fit perfectly in almost all regards (the sides might need to be taken in a bit) however the herringbone one has almost uncomfortably tight sleeves, especially when you bend your elbows. I had figured that since they were all pretty much the same design there wouldn’t be much variability, but unfortunately there is. I haven’t looked up other reviews to see if this is a widespread issue, so it might just be an isolated case.

For the price I have to say I’m quite stunned: whilst I was expecting something serviceable, I wasn’t expecting something that would exceed the quality of what I can source here locally. There are a few quirks here and there of course, however short of getting something fully tailored that’s to be expected. It doesn’t approach the quality of my good dinner suit, but the price of admission for that particular garment was almost double this one. So if you’re looking for a daily suit you really can’t go past ASOS, especially at the price.

The SR-72: And You Thought the Blackbird Was Fast.

The SR-71, commonly referred to as the Blackbird, was a pinnacle of engineering. Entering service back in 1966 it was capable of cruising at Mach 3.2 at incredible heights, all the way up to 25 km above the Earth’s surface. It was the only craft with the capability to outrun any missiles thrown at it and it’s for this reason alone that not one Blackbird was ever lost to enemy action (although a dozen did fail in a variety of other scenarios). However the advent of modern surveillance techniques, such as high resolution spy satellites and unmanned drones, made the capabilities the Blackbird offered somewhat redundant and it was finally retired from service back in 1998. Still, plane enthusiasts like myself have always wondered if there would ever be a successor craft, as nothing has come close to matching the Blackbird’s raw speed.

Lockheed Martin SR-72 Concept

The rumours of a successor started spreading over 3 decades ago when it was speculated that the USA, specifically Lockheed Martin, had the capability to build a Mach 5 version of the Blackbird. The public dubbed it Project Aurora and there have been numerous sightings attributed to the project over the years, as well as a lot of sonic boom data gathered by various agencies pointing towards a hypersonic craft flying in certain areas. However nothing concrete was ever established and it appeared that, should the USA be working on a Blackbird successor, it was keeping it under tight wraps, not wanting a single detail to escape. A recent announcement however points to the Aurora being just a rumour, with the Blackbird’s successor being a new hypersonic craft called the SR-72.

Whilst just a concept at this stage, with the first scaled prototype due in 2023, the SR-72’s capabilities are set to eclipse those of the venerable Blackbird significantly. The target cruise speed for the craft is a whopping Mach 6, nearly double that of its predecessor. The technology to support this kind of speed is still highly experimental, to the point where most of the craft built to reach those kinds of speeds in air have ended rather catastrophically. Indeed switching between traditional jet engines and high speed scramjets is still an unsolved problem (previous scramjet test craft were all rocket boosted) and is likely the reason for the SR-72’s long production schedule.

What’s particularly interesting about the SR-72 is that Lockheed Martin is actually considering building it, as the aforementioned reasons for the Blackbird’s retirement haven’t gone away. Whilst the current concept design seems to lend itself to a high speed reconnaissance drone (I can’t find any direct mention of it being manned and there are no visible windows on the craft), something which does fit into the USA’s current vision for their military capabilities, it’s still a rather expensive way of doing reconnaissance. However the SR-72 will apparently have a strike capable variant, something the Blackbird never had. I can’t myself foresee a reason for having such a high speed craft do bombing runs (isn’t that what we have missiles for?) but then again I’m not an expert on military strategy so there’s probably something I’m missing.

As a technology geek though the prospect of seeing a successor to the SR-71 makes me giddy with excitement, as the developments required to make it a reality would mean the validation of a whole bunch of tech that could provide huge benefits to the rest of the world. Whilst I’m sure the trickle down wouldn’t happen for another decade or so after the SR-72’s debut, you can rest assured that once scramjet technology has been made feasible it’ll find its way into other aircraft, meaning super fast air travel for plebs like us. Plus there will be all the demonstrations and air shows for Lockheed Martin to show off its new toy, something I’m definitely looking forward to.

Beware (and Do Not Exploit) The 10x Engineer.

There’s something of a mythology in the developer community around high performing employees who are seemingly able to output much more work than anyone else in the same amount of time. The concept isn’t strictly limited to software development either, as I’m sure anyone from any industry can regale you with a tale of someone who did the work of multiple people, whether through sheer intelligence or dedication to getting the job done. Whilst I have the utmost respect for people with this kind of capability I’ve recognised a disturbing trend among projects that contain people like this and, for the betterment of the wider world, I believe we have to stop seeking these mythical people out in order to exploit their talents for our own ends.

stressed

Long time readers might remember a post I wrote a couple of years ago about how I tanked my university project due to my lack of any project management skills. A big part of this was my failure to recognise that I had a 10x worker on my hands, someone who was able to do the vast majority of the work on the project without aid from anyone else. In the short term it was a great boon to the project: we made fast progress and we all felt like we were getting somewhere with the idea. However it didn’t take long for all the additional work that 10x person was doing to turn into a dependency, something the whole team was relying on. My failure to recognise this (and to pitch in myself) is what led to the inevitable demise of that project, but I’ve since learned that this is not an uncommon occurrence.

Typically the situation develops from the best of intentions, with a high performing employee put on a task or project that’s in need of some immediate attention. For them it’s not too much trouble to solve and the short time frame in which it’s achieved means they quickly establish themselves as someone who can get stuff done. What happens next depends on the person, as once that reputation is established the stream of requests will only intensify. From what I can tell it goes one of two ways: either the 10x in question sets a hard limit (and sticks to it) or they continue to take everything on board, right up until breaking point.

For the former it’s not too much of a problem and indeed it goes a long way towards highlighting resourcing issues with a project. I firmly believe that whilst occasional bouts of additional hours aren’t too detrimental, long, sustained periods eventually lead to burnout and loss of productivity. So setting limits on how much work you do, and staunchly refusing to take on additional tasks, shows where additional resources need to be placed. This also requires you to be comfortable with things you’re personally involved with failing on occasion, something a lot of people find hard to do.

The latter kind of 10x-er, on the other hand, can’t let things fail, especially anything they’ve had direct input into. For tasks on the critical path this can be to the project’s benefit, as you can rely on the fact that they will get completed no matter what else happens. However as more and more people start going to this 10x person for help, the breadth of things they’re involved with, and thus feel responsible for, broadens to the point where almost anything is within their purview. Thus a terrible feedback loop is established, one whereby they become critical to everything and feel compelled to continue working. This continues until they burn out or some forcible action is taken.

Whilst this is a two sided problem I do feel that we, as the regular workers of the world, can do a lot to ensure such people aren’t destroyed by the burden that we place on them. It can be so easy to fob a task off onto someone, especially when you know they’ll do it quicker and better than you could, however if you know that person is similarly being burdened by multiple other people it may be better for you to learn how to do that task yourself. Then hopefully that 10x worker can continue operating at that capacity without approaching those dangerous levels where burnout becomes all too common.