You’d be forgiven for thinking that Virgin Galactic had disappeared into a cloud of vapourware. Whilst they managed to build, fly and drop test SpaceShipTwo over two years ago there really hadn’t been much more from them since. Sure, if you were keen you could find out what they were up to, but the majority of the time it was more of the same: dozens of drop tests under their belt with no firm indication of when the next envelope push was going to happen. Indeed the last time I wrote about them was over 2 years ago and every time I’ve written a space article since I’ve checked up on them to see if anything had changed. Unfortunately nothing did, but a couple of weeks ago I heard a rumour that they might be doing their first powered test soon.
That rumour appears to have come true.
WhiteKnightTwo and SpaceShipTwo launched around 8 hours ago and performed their routine ascent up to about 14 km. Then they separated and shortly afterwards SpaceShipTwo ignited its N2O/rubber hybrid motor for 16 seconds, propelling it 2.7 km higher and seeing it reach speeds just over Mach 1. SpaceShipTwo then glided back down to earth for a successful landing, aptly demonstrating that the scaled up motor from the original Ansari X-Prize winning craft is quite capable of accomplishing its required task. It’s one thing to read the text however and another thing altogether to watch it happen:
It’s a huge step forward for Virgin Galactic as it serves as a solid verification of all the critical systems required to get the craft into space. Further testing will see the motor burn for longer and longer each time, pushing SpaceShipTwo ever closer to that goal of passing the Kármán line at 100 km above sea level. Virgin Galactic appears to be quite confident in the craft as they’re planning for a full space flight before the year is out which, if the motor is similarly built to SpaceShipOne’s, would see them ramp the burn time from the paltry 16 seconds we saw today to well over 90 seconds. Considering the rigorous testing SpaceShipTwo has undergone prior to this I can’t see much that would stand in the way of achieving this goal.
Virgin Galactic is going to be the first step in commoditizing space access. Sure right now it’s not much more than a joy ride (although even short suborbital flights can have some good science done with them) but SpaceShipTwo is the first to market in private space travel for regular people and with so many others already throwing their hats in the ring I can’t imagine it’ll stay so expensive for long. I might not be able to afford a ticket yet but I don’t think I’ll be waiting too long for my chance at it and that makes me incredibly excited.
Congratulations Virgin Galactic and godspeed.
Since my side projects (including this blog) don’t really have any kind of revenue generation potential I tend to shy away from spending a lot on them if I can avoid it. This blog is probably the most extravagant of the lot, getting its own dedicated server which, I’ll admit, is overkill, but I’d had such bad experiences with shared providers before that I’m willing to bear the cost. Cloud hosting on the other hand can get nightmarishly expensive if you don’t keep an eye on it and that was the exact reason I shied away from it for any of my side projects. That was until I got accepted into the Microsoft BizSpark program, which came with a decent amount of free usage, enough for me to consider it for my next application.
The Azure benefits for BizSpark are quite decent, with a smattering of all their offerings chucked in, easily enough to power a nascent startup’s site through the initial idea verification stage. That’s exactly what I’ve been using it for and, as longtime readers will tell you, my experiences have been fairly positive, with most of the issues arising from my misappropriation of different technologies. The limits, as I found out recently, are hard, and running up against them causes all sorts of undesirable behaviour, especially the compute and storage limits. I managed to run up against the former due to a misunderstanding of how a preview technology was billed but I hadn’t hit the latter until last week.
So the BizSpark benefits are pretty generous for SQL storage, giving you access to a couple of 5GB databases (or a larger number of smaller 1GB ones) gratis. That sounds like a lot, and indeed it should be sufficient for pretty much any burgeoning application, however mine is based around gathering data from another site and then performing some analytics on it, so the amount of data I have is actually quite large. In the beginning this wasn’t much of a problem as I had a lot of headroom, however after I made a lot of performance improvements I started gathering data at a much faster rate and the 5GB limit loomed over me. In the space of a couple of weeks I managed to fill it completely and had to shut it down lest my inbox get filled with “Database has reached its quota” errors.
Looking over the database in the Azure management studio (strangely one of the few parts of Azure that still uses Silverlight) showed that one particular table was consuming the majority of the database. Taking a quick look at the rows it was pretty obvious why this was the case: I had a couple of columns that contained lengthy URLs and, over the 6 million or so records I had, this amounted to a huge amount of space being used. No worries I thought, SQL has to have some kind of built in compression to deal with this and so off I went looking for an easy solution.
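To put some rough numbers on that (the ~200 byte average URL length here is my assumption for illustration, not a measured figure), the back-of-the-envelope arithmetic shows why a couple of text columns can dominate a 5GB quota:

```python
# Rough estimate of how much space two lengthy URL columns could
# consume across ~6 million rows. The 200-byte average per URL is
# an assumed figure, not one taken from the actual database.
rows = 6_000_000
avg_url_bytes = 200
url_columns = 2

total_bytes = rows * avg_url_bytes * url_columns
print(f"{total_bytes / 1024**3:.2f} GiB")  # roughly 2.24 GiB from URLs alone
```

Even with conservative assumptions that’s nearly half the quota gone before counting indexes or the rest of the row.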
As it turns out SQL Server does, and its implementation would’ve provided the benefits I was looking for without much work on my end. However Azure SQL doesn’t support it and the current solution is to implement row based compression inside your application. If you’re straight up dumping large XML files or giant wads of text into SQL rows then this might be of use to you, however if you’re trying to compress data at a page level then you’re out of luck, unless you want to code an extravagant solution (like creating a compression dictionary table in the same database, but that’s borderline psychotic if you ask me).
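The application-side approach amounts to compressing the bulky columns yourself before they ever reach the database, then decompressing on read. A minimal sketch of the idea (in Python for brevity, assuming the compressed payload lands in a hypothetical varbinary column):

```python
import gzip

def compress_column(value: str) -> bytes:
    """Compress a long text value (e.g. a URL) before writing it to a
    varbinary column -- a stand-in for the built-in compression that
    Azure SQL lacks."""
    return gzip.compress(value.encode("utf-8"))

def decompress_column(blob: bytes) -> str:
    """Reverse the compression when the row is read back."""
    return gzip.decompress(blob).decode("utf-8")

# Highly repetitive data, like thousands of near-identical URLs,
# is exactly what compresses well:
url = "https://example.com/listing?page=1&" + "filter=active&" * 40
blob = compress_column(url)
assert decompress_column(blob) == url
assert len(blob) < len(url.encode("utf-8"))
```

The obvious trade-off is that the compressed columns can no longer be queried or indexed server-side, which is why this really only suits those straight dumps of large text rather than anything you need to filter on.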
The solution for me was to move said problem table into its own database and, during the migration, trim out all the fat contained within the data. There were multiple columns I never ended up using, the URL fields were all very similar and the largest column, the one most likely causing me to chew through so much space, was no longer needed now that I was able to query that data properly rather than having to work around Azure Table Storage’s limitations. Page compression would’ve been an easy quick fix but it would’ve only been a matter of time before I found myself in the same situation, struggling to find space where I could get it.
For me this experience aptly demonstrated why it’s good to work within strict constraints, as left unchecked these issues would’ve hit me much harder later on. Sure it can feel like I’m spinning my wheels when hitting issues like this is a monthly occurrence, but I’m still in the learning stage of this whole thing and lessons learned now are far better than ones learned when I finally move this thing into production.
If you’ve ever played QWOP you can understand the appeal of games that are intrinsically badly designed, usually to provide challenge in an otherwise rudimentary game. I’m not sure what it is but they seem to trigger the competitive OCD part of my brain, pushing me to master them even though there’s little to be gained since none of the skills learned in these games translate to other titles. They do provide a rather weird sense of enjoyment though, usually when I find a way to beat the system through an emergent property of the game that is, again, due to the deliberately bad programming/controls/physics. Surgery Simulator 2013 is yet another title that fits in the “deliberately bad but devilishly fun” genre and I spent some time with it over the past week.
Born out of this year’s Global Game Jam, Surgery Simulator 2013 started off as a comical heart transplant simulator where you, an unnamed doctor (or are you? It’s never really made clear), must get a new heart into your patient before they run out of blood. Unlike games like Trauma Centre, which attempt to recreate the tension of performing medical procedures like this, Surgery Simulator instead puts you in control of a single hand that you must use to perform all tasks, one that’s incredibly awkward to control. Still you persevere, performing heart transplants, double kidney replacements and even a brain transplant.
For a game that was originally created in 48 hours I have to say I was very impressed with the graphics in Surgery Simulator 2013. Granted they’re nothing spectacular but the stylization, almost TF2 like in nature, adds to the overall comedic tone. The level of detail in the environments is also quite astounding, with all sorts of stuff you’d expect to see in a reception/surgery and, quite surprisingly, most of it functioning in some way. I have to say I didn’t expect any of the floppy disks to work when I put them in the drive, nor the pen to draw on the paper when I first started mucking around.
The premise of Surgery Simulator 2013 is simple: you need to get the new organs in the patient before they run out of blood. This sounds a lot easier than it is as the patient loses blood every time you hack into them and should you be… less careful with where you bash/slash/cut they’ll start to continually lose blood, putting a firm timer on how long you have to complete it. This is made all the more difficult by the controls which aren’t exactly intuitive, especially with the way they interact with the various tools and organs you’ll be working with.
Your hand is controlled by a combination of your keyboard and mouse. The A, W, E, R and Space bar keys represent your fingers, a scheme that works fairly well although I often found my hand getting out of position after a little while. Your hand’s position and rotation are controlled by the mouse, with regular mouse movement changing the overall position, depressing the left mouse button dropping your hand down and the right mouse button allowing you to rotate your arm and move your wrist. If this sounds confusing it most certainly is, and this is where the challenge comes in: mastering these wacky controls in order to perform the correct actions.
I thought that since I’d played a little bit of the original game I’d be more than capable of performing the same actions in the full version of Surgery Simulator 2013, but I couldn’t have been more wrong. The original was a little more liberal in what you could accomplish without severely injuring your patient, like being able to bust open the entire rib cage with a single, well placed hammer strike. Attempting the same thing in this version seemed to do a lot more harm than good, often resulting in ~10% of their blood disappearing and leaving them bleeding rather quickly. It seems the best way to complete most surgeries is with a light, precise touch, something I didn’t think was actually possible.
So whilst you might be able to accomplish everything by using the power tools to slice and dice your way through and knock organs flying with the hammer, should you want to go after any of the numerous achievements you’d be advised to try the light touch and use the scalpel/surgery laser more often. Indeed whilst I might not be at A++ level on any of the surgeries yet I definitely found it a lot easier once I started playing a little more carefully. There’s also the green syringe on the side which, when used on the patient, stops any bleeding completely, a godsend when you’re trying to find out where to cut and failing miserably.
If you’re not finding the regular surgeries much of a challenge then there’s the Ambulance Mode, which ratchets up the difficulty significantly. You’ll get all the same tools however you’ll be constantly bounced around, moving all your tools about and often throwing something onto/into your patient. You can also lose things out the back of the ambulance, including the organ you’re trying to replace. Whilst it’s not impossible it sure is a damn sight harder, especially when the fire extinguisher keeps landing on your patient’s head.
For a game that was built in 48 hours then polished over the next few months Surgery Simulator is a surprisingly well done game, expertly capturing the “so bad it’s good” idea with its awkward control scheme and ridiculous game premise. If you’re someone who likes to master the nigh on impossible then there’s a lot to love in Surgery Simulator 2013 and the myriad of achievements is sure to keep you coming back in the hopes of performing the perfect surgery. It’s certainly not a game for everyone, especially if you can’t stand being frustrated by bad controls, but the hilarity that ensues is most definitely worth the price of admission.
Surgery Simulator 2013 is available on PC right now for $9.99. Total game time was approximately 2 hours with 29% of the achievements unlocked.
It was a late night in March 2007 when, deep in the bowels of the Belconnen shopping mall, dozens of console gamers gathered. I sat there, my extremely patient and soon to be wife by my side, alongside them eagerly awaiting what was to come, adrenaline surging despite the hour rapidly approaching midnight. We were all there for one thing, the release of the PlayStation 3, and just under an hour later all of us would walk out of there with one tucked under our arms. I stayed up far too long setting the whole system up only to crash out before I was able to play any games on it. That same PlayStation, the one I paid a ridiculous price for in both cash and sleep, still sits next to my TV today alongside every other current console.
Well, apart from one, the Wii U.
The reason behind me regaling you with tales of my more insane gamer years is not to humblebrag my way into some kind of gamer cred, it’s more to highlight the fact that between then and now 6 years have passed. I’ve seen console games rapidly evolve from the first tentative titles, which barely stressed the hardware, to today’s AAA titles which exploit every single aspect of the system they run on. Back in their day both the PlayStation3 and Xbox360 were computational beasts that could beat most other platforms in raw power without breaking a sweat. Today however that’s no longer the case, with the PC having long retaken that crown, and people are starting to notice.
Of course console makers are keenly aware of this and whilst the time between generations is increasing they still see the need to furnish a replacement once the current generation starts getting long in the tooth. Indeed if current rumours are anything to go by we’ll likely see both the PlayStation4 and Xbox-something this year. However the rather lackluster sales of the first installment in next generation consoles (the Nintendo WiiU) has led at least one industry critic to be rather pessimistic about whether the next generation is really needed:
Whatever the case, what lessons can Sony and Microsoft take on board from how their rival has fared, as they prepare to make their moves into the next console generation? Well, there’s one immediately apparent lesson: Don’t start a new fucking console generation, because it’s a bad climate and triple-A gaming is becoming too fat and toxic to support its own weight. If you make triple-A games even more expensive and troublesome to develop – not to mention forcing them to adhere to online and hardware gimmicks that shrink and alienate the potential audience even further – then you will be driving the Titanic smack into another iceberg in the hope that it’ll somehow freeze shut the hole the first one made.
The thing is the problems affecting the WiiU don’t really translate to Sony or Microsoft. The WiiU was Nintendo’s half-hearted attempt to recapture the more “hardcore” gaming crowd which, let’s be honest here, was a small minority of their customer base. The Wii was so successful because it appealed to the largest demographic that had yet to be tapped: those who traditionally did not play video games. The WiiU, whilst being comparable to current gen consoles, doesn’t provide enough value to end users for them to fork out the cash for an upgrade. That then translates into developers not wanting to touch the platform, which starts a vicious downward spiral that’ll be incredibly hard to break out of.
However the biggest mistake Yahtzee makes is in assuming the next generation of consoles will be harder to develop for, and this is simply not the case.
Both the Xbox360 and the PlayStation3 are incredibly complicated beasts to program for with the former running on a custom variant of PowerPC and the latter running on Sony’s attempt to develop a supercomputer, the Cell. Both of these had their own quirks, nuances and tricks developers used in order to squeeze more performance out of them, none of which were translatable to any other platform. The next generation however comes to us with a very familiar architecture backing it (x86-64) which has decades, yes decades, of programming optimizations, frameworks and development behind it. Indeed all the investment that game developers have made in PC titles (which they’ve thankfully continued to do despite its diminutive market share) will directly translate to the next generation platforms from Microsoft and Sony. Any work on either platform will also directly translate to the other which is going to make cross-platform releases far cheaper, easier and of much higher quality than they have been previously.
In principle I agree with the idea, we don’t need another generation of consoles like we have in the past where developers are forced to retool and spend the next 2 years catching up to the technology. However the next generation we’re getting is nothing like the past and is shaping up to be a major boon to both developers and consumers. As far as we can tell the PlayStation4 and Durango are going to be nothing like the WiiU with many major developers already on board for both platforms and nary a crazy peripheral has been sighted for either of them. To cite the WiiU as the reason why the next generation isn’t needed is incredibly short sighted as Nintendo has shown it’s no longer in the same market as Sony and Microsoft are.
The current generation of consoles has run its course and it’s time for their replacements to take the stage. The convergence of technology between the two major platforms will only mean good things for developers and consumers alike. There are issues plaguing the wider industry, there’s no doubt about that, and whilst I won’t say that the next generation will be the panacea to their ills it’s a good step in the right direction, as there’s an incredible amount of developer time to be saved from the switch to a more common architecture. Whether that translates into better games or whatever Yahtzee is ultimately lusting after remains to be seen, but the next generation is a bright light on the horizon, not an iceberg threatening to sink the industry.
As most readers are aware I’m an incredibly amateur photographer, having dabbled in it on and off again for the past 5 years but only really started taking it seriously towards the end of last year. I’m still very much in the early stages of my understanding as, whilst I can produce some pictures that I (and others) like, my hit rate still feels incredibly low, especially when I set out to create a very specific image. A lot of that comes from my still nascent understanding of how to light subjects properly and how the direction/intensity changes the resulting image.
Now whilst the following video isn’t exactly the greatest introduction on how you should go about lighting your subject (in this case a model’s face) it does showcase just how dramatically you can change the resulting image simply by moving the light source:
Showing this to my wife, she was adamant that they were splicing together video of different models as the changes are quite dramatic. It is the same person however: if you look at the eyes you can see the light source rotating at a rather impressive clip, which is what gives rise to the dramatic changes in shadows. Pausing at different sections also makes it quite clear what the impact of the direction of light is and how it is reflected in the final image.
I wonder what the effect would be if instead of moving the light they used multiple sources then just cycled through them. Hmmmmm…….
It’s hard for me to hide my fan boy nature when it comes to private space flight. Whilst all credit must go to Scaled Composites and Virgin Galactic for getting me inspired about all things space they have unfortunately taken a second seat to my current space crush. Not-so-long time readers will know that I’m talking about SpaceX, a company that has shown time and time again that they’re capable of not only developing technology that no private entity had previously but also delivering on their patently crazy promises. However I’m not in favour of monopolies/single points of failure (stemming from my capitalistic/engineering nature respectively) and the more options we have available to us for putting things in space the better.
Today it appears we have another contender: Orbital Sciences’ Antares rocket.
Now I’ve only mentioned Orbital Sciences briefly in the past, noting that they won a contract to provide launch capabilities to NASA alongside SpaceX as part of the Commercial Orbital Transportation Services (COTS) program, but their legacy stretches back quite a long way. Founded in 1982 they’ve developed several different launch platforms in tandem with NASA and have also been involved in numerous high profile scientific missions. Most recently they developed the Dawn craft which is currently in the asteroid belt, transiting from the asteroid Vesta to the dwarf planet Ceres. Needless to say if anyone has the chops to develop their own launch system it’s Orbital Sciences, and the Antares rocket is their first such system.
On paper it looks to be somewhere between the Falcon 1 and 9 with a total payload to LEO of around 5000kg. The two first stage engines are curious little beasts, originally designed to form the basis of the Russian N-1 rocket that was bound for the moon. Considering that launch system was a dismal failure you’d have to wonder about using its engines, but the N-1’s issues were mostly process/design based rather than stemming from any one particular component. It also has a slightly wider payload fairing than the Falcon 9 at 3.9m in diameter, which could come in handy for certain mission profiles.
The first launch of the Antares (dubbed A-ONE) was scheduled to happen in the middle of last week however some minor technical issues delayed it. The rocket itself was fine, however one of the umbilical cables disconnected 12 minutes prior to launch, far earlier than it should have as that usually happens right before lift off. Thankfully this didn’t require the rocket to be stood down and they were able to reschedule for a couple of days later. Unfortunately high winds on the second launch day caused them to issue a no-go due to weather and it was rescheduled for today. Thankfully conditions improved and they were able to launch, making the Antares the second fully private rocket to make it to orbit.
Apart from that it’s still notable for many reasons. If the picture above looks a little unfamiliar to you it’s because the Antares wasn’t launched from the iconic Cape Canaveral. Instead it was launched from NASA’s Wallops Flight Facility in Virginia, a place that doesn’t usually see rockets of this size. Indeed the Antares is the largest rocket ever launched from this facility, which will likely become its de facto launch site in the future thanks to a much less crowded launch schedule. If all goes to plan this site could see another 2 Antares launches this year, which would be on par with SpaceX’s rapid turnaround times.
Today marks a great achievement for Orbital Sciences and the greater space industry as it shows that not only is the private space industry viable, it can likely support several competing players. This will only help spur innovation forward as companies look to outpace each other on every aspect. Whilst SpaceX might be the current starlet Orbital Sciences has decades of experience behind them and I can’t imagine them being in the backseat for very long. As always this means that the cost to launch will trend downwards and from there it’s only a matter of time before it reaches the commodity level.
And that, my friends, is really exciting.
It really is quite staggering to see how far games have come since I first started playing them nearly 3 decades ago. Even more surprising is how each style of game still has a place in the market today, even those that forego all modern trimmings in favour of recreating those early experiences. Last year saw a bevy of such titles cross my path and I was really quite surprised just how enjoyable revisiting that period of gaming could be. When I first read about Evoland it seemed like an intriguing idea as it would take you through the history of adventure games whilst also telling its own story.
Evoland starts out as a classic Legend of Zelda clone, all the way down to the pixely graphics and limited colour palette. However as you move around and start finding chests of loot you’re not greeted by additional items to help you on your journey. No, instead you will typically get an upgrade to your game experience like the addition of music, better colours and, my personal favourite, extra dimensions. These all build upon each other so as you progress through Evoland it becomes an ever increasingly varied game, one that aptly captures the essence of nearly all the adventure games that have come before it.
Considering that Evoland’s primary goal is to take you through the history of adventure games the art style varies wildly, from flat 2D pixel art right up to full 3D environments reminiscent of titles like The Longest Journey. The pixel art is quite good, especially after a couple of palette upgrades, but the 3D feels incredibly rudimentary by comparison. It’s somewhat in line with the rest of the game as nothing about Evoland is terribly complicated, so it all kind of fits together, at least enough to carry the overall thrust of the game forward.
In the beginning Evoland is your run of the mill, top down 2D adventure game, complete with enemies that run around randomly and you equipped with only a sword with which to dispatch them. It plays exactly like the old Zelda games too, leaving you to run around the environment looking for the next puzzle that’s blocking your progression. You can also, if you’re so inclined, explore even further to find all the collectibles scattered around the map, although there’s little reason to do so outside of wanting to complete all the achievements.
The more you play Evoland the more complex and nuanced it becomes, something you’ll be acutely aware of because it’ll tell you every time you unlock another game mechanic with an alert plastered across the bottom of the screen. Some of them have obvious and immediate impacts on the way the game plays, like the introduction of a world map which brings random turn based combat encounters à la Final Fantasy, and others are more subtle, like the “Something happened somewhere” alert that indicates you triggered an off screen event.
Initially the introduction of new elements is quite fun as it’s like a whole new game has been opened up for you. However due to the rudimentary nature of Evoland’s many different aspects they quickly start to descend into tedium. The random turn based encounters are probably the best example of this as you can’t walk for more than 10 seconds without one of them occurring. After a while these don’t take too long to resolve but the lack of variety in these encounters means that after the 3rd or 4th fight you’ve seen all the enemies Evoland has to offer and you’re essentially just grinding away XP and glis (a nod to Final Fantasy’s Gil system) which only has a limited amount of utility.
Indeed whilst Evoland is a cohesive game on the surface the actual mechanics aren’t exactly uniform across every new iteration. Most dungeons have been designed with a specific idea in mind and whilst some of the abilities will transfer across (like the upgraded combo sword attack) most of them won’t. So whilst one dungeon might give you a health orb rather than the 3 hearts system you’ll likely find that once you go anywhere else the health system du jour is back again. They also all seem to have separate internal values, as half health in the turn based combat system doesn’t seem to translate to 1.5 hearts in the dungeon system.
Realistically Evoland is more like 4 distinct games that are loosely tied together by common elements. Viewed like this I’m more inclined to overlook the faults of them not completely interacting with each other. Indeed since the overall thrust of the game is more to take you through the evolution of adventure games rather than provide an in depth experience in each successive iteration of them I’d be missing the point if I judged it on the merits of the individual section’s gameplay. I guess what I’m getting at is if you’re looking for a solid gameplay experience you’re likely to come up short with Evoland, but that’s not the reason you’d play it.
There is some semblance of a story which really only sees development during the last couple of sections. It might have been because I named my characters Dudeface, Butts and Mouman respectively but I didn’t feel any attachment to them, nor any real drive to move the story forward apart from the desire to see which game mechanic would be unlocked next. The final boss battle was pretty cool though, with the combination of music and a larger than life boss aptly capturing the essence of those same encounters in games of yore.
Evoland serves as a great history book, detailing the many transitions that adventure games have undergone over the years. As a game it’s nothing spectacular but the essence of each era of adventure games is captured within each upgrade of Evoland’s mechanics. There’s a very specific audience in mind for Evoland and it’s people like me who grew up on all the titles that inspired it. So if you find yourself pining for the golden age of gaming or you’d just like to take a trip down memory lane then Evoland is the game for you.
Evoland is available on PC right now for $9.99. Total game time was approximately 2 hours with ~83% completion and 34% of the achievements unlocked.
After a long weekend of staying up late, drinking merrily and enjoying the company of many close friends I found myself a little under the weather. This is pretty atypical for me as I’ve only ever had the flu twice and I usually pass through the cold season relatively unscathed. Whilst there are thousands of possible reasons for this I’ve always found that, should I find myself in the beginnings of an infection, a strong dose of chilli seems to make it subside, or at least take my mind off it long enough to start feeling better. I realised yesterday that whilst I might have some anecdotal evidence to support this I hadn’t really looked into the science behind it, and the stuff I uncovered in my search has been pretty intriguing.
For starters there are some strange experiments out there that have used chilli (well the chemical that gives it the burn, capsaicin) as an apparently reliable method to induce coughing in test subjects. The first one I came across was testing whether or not coughing is a voluntary action and the results seem to indicate that the coughing we get with the common cold is a mixture of both. Other experiments showed that people with an upper respiratory tract infection (which includes things like the common cold) are more prone to coughing when exposed to a capsaicin/citric acid mixture. None of these really helped me in understanding whether or not chilli aids in reducing the symptoms of the common cold or helping to cure it but a couple other studies do provide some potential paths for benefits.
Subjects with perennial rhinitis, a permanent allergic reaction to stimuli that doesn't vary by season, showed a marked decrease in nasal complaints when treated with a solution of 0.15mg of capsaicin per nostril every 2nd or 3rd day for 7 treatments. The benefits lasted up to 9 months after the treatment and, incredibly, there were no adverse effects on cellular homeostasis or overall neurogenic staining (which sounds rather impressive but is a little out of my league to explain). Whilst this doesn't directly support the idea that consuming chilli helps with the common cold, it does provide a potential mechanism for it to relieve symptoms. However how much capsaicin ends up in your sinuses while eating it isn't something I could find any data on.
Other studies have found similar effects when capsaicin solutions have been sprayed into the nasal cavity, with the improvements lasting for up to 6 months. That particular study was a little on the small side, with only 10 patients and no controls present, but the results do fall in line with the previous study, which had much more rigorous controls. The theme appears to resonate through most of the other studies that I could find: topical application in the sinuses is good, inhaling it will cause you to erupt in a coughing fit.
Anecdotally that seems to line up with the experiences I've had and it's good to see it backed up by some proper science. As for consumed chilli helping overall, however, there don't appear to be any studies that support that idea, though there are potential avenues for it to work. So like many scientists I'll have to say that the results are interesting but require a lot more research. Whether it's worth investigating is something I'll leave as an exercise to the reader, but I'm sure we'd find no shortage of spice-loving test subjects who'd be willing to participate.
If you've been here a little while you'll know that last year I won a competition to go up to Brisbane to cover TechEd Australia for LifeHacker Australia. During my time up there I wrote three posts covering everything from PowerShell to the evolution of the term "private cloud" to why Windows Server 2012 would succeed. Evidently the LifeHacker writers and readers loved what I wrote and I ended up winning the mini-competition against the 2 other guest bloggers. At the time I was told that this would lead on to another series of posts for Microsoft themselves. That never eventuated, but I did end up with a shiny new HP MicroServer that's become the mainstay of my home network.
I thought that would be the end of it but a couple months ago Angus Kidman, the man behind much of LifeHacker Australia’s tech coverage, contacted me with an offer: come with him to the USA and participate in covering TechEd North America as part of their World of Servers initiative.
Of course I said yes.
It will be much the same as it was last year: I'll be attending TechEd in New Orleans every day and writing up a post summing up the lessons I take away each day. The primary focus will still be on Server 2012, although with Microsoft's increasing focus on cloud integration you can rest assured that I'll be weaseling my way into as many Azure sessions as I possibly can. It's going to be interesting to compare and contrast the two events as I'm sure TechEd North America is going to be huge by comparison, and hopefully that means we'll get some juicy insights into some of Microsoft's upcoming products.
But this post isn't just for me to humblebrag to you guys. I'm here to tell you that LifeHacker Australia is offering this very same opportunity to 2 lucky IT professionals! To enter all you have to do is fill out this entry form and answer a few questions about your IT chops. Once you've done that you're in the running to win a fully paid trip to New Orleans to cover TechEd North America, and you'll get to hang out with me for the duration of the trip (most people would consider that a perk…most people ;)).
If you're a budding blogger hoping to get a foot in the door, or just a tech head who loves everything Microsoft, then there really isn't a better opportunity than the one LifeHacker is offering here. You've only got until May 1st to get your entries in (that's 2 weeks, people!) so I'd encourage you to get yours in sooner rather than later. I'm incredibly excited to be going along for the ride on this one. If my previous experience is anything to go by it'll be a blast, and it'd be amazing if I could bring one of my readers along too.
Hope to see you there!
If you've worked in the IT industry it's safe to assume that you're familiar with ITIL, or at least however it's managed to manifest itself within your organisation. It's probably one of the longest-lasting ideals in IT today, having been around in its current form for a good 20+ years, which is surprising for an industry that considers anything over 3 years old archaic. Indeed anyone who's been involved in implementing, maintaining or attempting to change an ITIL-based process will likely call it archaic anyway, and whilst I'm inclined to agree with them I think the problems stem more from the attitudes around these processes than from the processes themselves.
Change management is by far the best example of this. The idea behind it is solid: any major change to a system has to go through a review process that determines what impacts the change will have and demands that certain requirements be met before it can proceed. In an ideal world these are the kinds of things you would do regardless of whether an external process required you to. The nature of IT, however, tends towards many admins starting off in areas where such processes aren't required and thus, when they move on to bigger and better environments, processes like these are needed to make sure they don't unintentionally wreak havoc on larger systems. Yet change management is routinely seen as a barrier to getting actual work done, and in many cases it is.
This is where the attitude problems start to occur. ITIL-based processes (no one should be using pure ITIL, that's crazy talk) should not be a hindrance to getting work done, and the second they start becoming so is when they lose their value. Indeed the reason behind implementing an ITIL process like change management is to extract more value out of the work than is currently being derived, not to impede the work being done. Essentially it should only be an extension of work that would be undertaken in the first place, and if it isn't then you need to look at either your implementation of the change process or why your current IT practices aren't working with it.
Predominantly I think this comes from being far too strict with these kinds of processes, with the prevailing attitude in industry being that any deviation from them will somehow lead to a downward spiral of catastrophes from which there is no escape. If these ITIL processes are being routinely circumvented, or if the amount of work required to complete the process outweighs the actual work itself, then it's not the people who are to blame, it's the process. Realistically, instead of trying to mould people to the process, as I've seen done countless times over, the process should be reworked to suit the people. Whilst this is far more difficult than simply sending people on ITIL courses, the benefits will far outweigh the costs and you'll probably find that more people stick to the process rather than attempt to circumvent it. Indeed much of the process revolution that has happened in the past decade has been due to these people-focused rather than process-focused ideals.
Whilst ITIL might be getting a little long in the tooth, many of the ideals it touches on are fundamental in nature and persist beyond changes in technology. Like many ideas, however, its application has been less than ideal, with the core idea of turning IT into a repeatable, dependable process usurped by laborious processes that add no value. I believe changing the current industry view from focusing on ITIL-based processes to people-focused ones that utilise ITIL fundamentals would trigger a major shift in the way corporate IT entities do business.
A shift that I believe would be all for the better.