Posts Tagged 'update'

The Division Patch 1.8: The Game We All Wanted on Release.

The 1.0 version of The Division was a pretty great experience, although its end game content was somewhat lacking. Indeed, at the time of writing the review I was some 37 hours in, and I only racked up another 8 before calling it quits altogether. Soon afterwards the incursion patch released but, frankly, there wasn’t enough in it to bring me back. Ever since then I’ve heard rumblings of the changes they’ve made, the content that’s been added and how all of that has resulted in a very well rounded game. With a couple of my friends recommending that I come back to give it a go I figured it’d be worth a shot and, honestly, had Massive Entertainment released this back in 2016 they would’ve been staring down the barrel of several game of the year awards.

The numerous patches since then haven’t expanded the story directly per se; however, with the addition of new areas, encounters and whatnot the narrative world of The Division has expanded significantly. There’s a small amount of story explaining the background of the new additions to the game, but you’ll likely miss most of it if you’re not paying attention. Like before, a lot of the greater world building is done through the various kinds of collectibles you can find around the place, most of which just build out the backstory of the main campaign a little more. It’d be nice to see some story focused DLC as I really did enjoy the campaign back on initial release but, honestly, with the rest of the changes that have come through I can see why it was probably left on the to-do list.

The Division has retained its dedication to filling the world with incredible amounts of detail, something I had completely forgotten about in the near 2 years since I last played. Indeed that detail extends beyond just throwing random stuff everywhere, as the level design itself is incredibly complex as well. I couldn’t tell you how many times my crew and I managed to get ourselves lost (in areas that we must have been through dozens of times before, no less) when we were on the hunt for an objective or similar. I’d usually chalk this up as a negative but it’s actually helped keep those same areas feeling fresh for much longer than you’d otherwise expect. Unfortunately I haven’t upgraded my machine since I last played (that’s probably coming next year) so I couldn’t really bump up any of the settings from their previous defaults. Maybe next time.

The number of different activities that have been added, as well as the ones that have been revamped, is so great that returning players are likely to feel pretty overwhelmed. The good news is there’s really no required activity that you have to do, nor will you find yourself struggling to progress thanks to the tweaks to how enemies (and the loot they drop) scale. Essentially you have the ability to set the overall world’s difficulty as well as the challenge of the encounter itself: the first sets the level of the loot you’ll get and the latter the amount. This is great for gearing up as you can tweak the settings to get the most out of pretty much any encounter you’ll be doing. Loot drops aren’t restricted to any particular location either, meaning no matter what you end up doing you have a chance of getting the best gear. Of course the harder, higher end activities have better guaranteed loot to entice you to take on the challenge rather than just mindlessly farming.

Like all good loot treadmills, the gear which allowed me to steamroll basically any encounter was made completely redundant upon logging in. My mix of high end and purple gear was nowhere near the maximum attainable power level, and so the loot grind began again in earnest. All in all though it only took me about 10 hours to get to the 270 range and from there it’s all about finding the gear with the right rolls to fill out whatever build you may be going for. Of course everything is about the sets and their bonuses now, and whatever bonus takes your fancy will dictate the rest of your build. For now I’m still running with the best of what I have for the most part (I was lucky enough to get a Ninjabike bag which has made things easier) but am hoping to complete a full Predator’s Mark set in the not too distant future.

Thankfully not everything is left to pure RNG and there are various ways to get the gear you want or, and this is definitely something I think all RNG loot games need, a way to optimise a drop to its ultimate potential. The Division isn’t shy about lavishing you with loot, however it only does so because getting the right combination of stats and talents is vanishingly rare. The recalibration station allows you to reroll a single talent on guns and a single stat on armour, which can sometimes be enough to turn an item from usable into a must-have. The optimisation station, meanwhile, means that a piece with a perfect set of stats but bad rolls can be brought up to top tier rolls with enough farming. Sure, you don’t want to have to do this for every item, but for that one item which amps up your build significantly it’ll be worth the price of admission. Sadly I only realised too late that the Ninjabike didn’t work for classified sets, otherwise I wouldn’t have wasted my Division Tech on it.

However even with a rag tag bunch of armour pieces and weapons you’ll likely find that pretty much everything in The Division is available to you. Whilst my friend and I have been playing as a duo for the most part we only started to really hit the challenge wall past the 10 hour mark. At that point most of the higher end activities don’t appear to scale with group size and so are balanced for full teams of 4. Unfortunately matchmaking at the moment isn’t all it’s cracked up to be as we’ve often gone through whole missions with it active before someone eventually joins. Still we’ve managed to farm in other areas without too much hassle so it’s not like we’re cut off from getting those shiny teal and red items.

The Dark Zone, which used to be this weird PVE but kind of PVP area, has now found its feet with the new changes to the zone. Previously it was pretty much just a high end gear farming place, one where someone going rogue was considered rude rather than part of the game. Now rogue agents are a real threat, one you have to be cautious of if you want to plunder the sweet loot in the area. I had many great encounters in the DZ, most of which ended with me and my team dead on the floor. However nothing is sweeter than the revenge you can take on them when they try to extract out with your loot. It might not be the most efficient way to farm items, especially if you’re actively looking for trouble, but it is one of the more enjoyable ones, especially with all the stories you’ll tell afterwards.

Some things haven’t received much love in the last 2 years though, namely the UI. Whilst I still love the aesthetic and simplicity of the UI when you’re running and gunning, inventory management is something of a nightmare. Scrolling through dozens of items and trying to compare them to what you have is a real chore and the gear score really only tells half the story. If you’re min-maxing a particular build it’s easy to figure out what you need but even then you’re still likely to be carrying around a bunch of other items “just in case” you want to try a different one. There are also other parts of the inventory that aren’t well described in-game (I have 6 different types of grenades? What do I need water for?) and honestly I can’t remember if they were even explained during the campaign. This doesn’t affect the overall enjoyment of the game too much but, given the amount of polish the rest of the game received, these parts do stick out more than they otherwise would.

The Division as it stands today isn’t the game I stopped playing all those years ago. The amount of diversity in terms of items, builds and activities is an order of magnitude above the game I remember. The core game play, which I quite enjoyed, remains mostly the same with the variety coming from the numerous gear sets which change the way the game plays out dramatically. Loot is plentiful but still a pain to manage, something I had hoped would have been improved over the years. All in all though it seems the rumours surrounding The Division being a game worth playing now are well justified and if you, like me, left it long ago now is definitely the time to jump back in.

Rating: 9.25/10

The Division is available on PC, Xbox One and PlayStation 4 right now for $89.95, $99.95 and $99.95 respectively. Game was played on the PC with 60 hours of total playtime (15 in patch 1.8).

Windows 10 to Have Mandatory Updates for Home Users.

Left to their own devices many home PC users will defer installing updates for as long as humanly possible, many even turning off the auto-updating system completely in order to get rid of those annoying pop ups. Of course this means that patches, which are routinely released within days of an exploit being discovered, are often never installed. This leaves many users unnecessarily vulnerable to security breaches, something which could be avoided if they just installed the updates once in a while. With Windows 10 it now seems that most users won’t have a choice: they’ll be getting all Microsoft updates regardless of whether they want them or not.


Currently you have a multitude of options to choose from when it comes to Windows updates. The default setting is to let Windows decide when to download, install and reboot your computer as necessary. The second does all the same except it will let you choose when you want to reboot, useful if you don’t leave your computer on constantly or don’t like it rebooting at random. The third option is essentially just a notification setting that will tell you when updates are available, but it’ll be up to you to choose which ones to download and install. The last is, of course, to completely disable the service, something which not many IT professionals would recommend you do.
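For the curious, the setting behind all of this lives in the registry. Below is a minimal C# sketch (an illustration, not anything official) that reads the documented AUOptions value from the standard Windows Update policy key to see which of the above behaviours a machine is configured for; if the key is absent the machine is simply using the defaults.

```csharp
using System;
using Microsoft.Win32;

// A minimal sketch: report which Windows Update behaviour is configured via policy.
class UpdatePolicyCheck
{
    static void Main()
    {
        // AUOptions values per Microsoft's Windows Update Agent documentation:
        // 2 = notify before download, 3 = auto download / notify to install,
        // 4 = auto download and schedule install, 5 = local admin chooses.
        using (var key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"))
        {
            var option = key?.GetValue("AUOptions");
            Console.WriteLine(option == null
                ? "No update policy set; Windows is using its defaults."
                : $"AUOptions = {option}");
        }
    }
}
```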

Windows 10 narrows this down to just the first two options for Home version users, removing the option for them to not install updates if they don’t want to. This isn’t limited to a specific set of updates (like, say, security) either, as feature updates as well as things like drivers could potentially find their way into this mandatory system. Users of the Pro version of Windows 10 will have the option to defer feature updates for up to 8 months (called Current Branch for Business), however past that point they’ll be cut off from security updates, something which I’m sure none of them want. The only version of Windows 10 that will have long term deferral for feature updates will be the Enterprise version, which can elect to only receive security updates between major Windows updates.

Predictably this has drawn the ire of many IT professionals and consumers alike, mostly due to the inclusion of feature updates in the mandatory update scheme. Few would argue that mandatory security updates are a bad thing, indeed upon first hearing about this that’s what I thought it would be, however lumping in Windows feature updates alongside them makes for a much less palatable affair. Keen observers have pointed out that this is likely due to Microsoft attempting to mould Windows into an as-a-service offering alongside their current offerings like Office 365. For products like that continuous (and mandatory) updates aren’t so much of a problem since they’re vetted against a single platform, however for home users it’s a little more problematic given the numerous variables at play.

Given that Windows 10 is slated to go out to the general public in just over a week it’s unlikely that Microsoft will be drastically changing this position anytime soon. For some this might be another reason for them to avoid upgrading to the next version of Windows although I’m sure the lure of a free version will be hard to ignore. For businesses though it’s somewhat less of an issue as they still have the freedom to update how they please. Microsoft has shown however that they’re intent on listening to their consumer base and should there be enough outrage about this then there’s every chance that they’ll change their position. This won’t be stopping me from upgrading, of course, but I’m one of those people who has access to any version I may want.

Not everyone is in as fortunate a position as I am.

Windows Threshold: Burying Windows 8 for the Sake of 9.

It’s hard to deny that Windows 8 hasn’t been a great product for Microsoft. In the 2 years it’s been on the market it’s managed to secure some 12% of total market share, which sounds great on the surface until you consider that its predecessor managed to nab some 40% in a similar time frame. The reasons behind this are wide and varied, however there’s no mistaking that a large part of it was the Metro interface, which just didn’t sit well with primarily desktop users. Microsoft, to their credit, has responded to this criticism by giving consumers what they want but, like Vista, the product that Windows 8 is today is overshadowed by its rocky start. It seems clear now that Microsoft is done with Windows 8 as a platform and is looking towards its successor, codenamed Windows Threshold.

Windows Threshold

Not a whole lot is known about what Threshold will entail, but what is known points to a future where Microsoft is distancing itself from Windows 8 in the hopes of getting a fresh start. It’s still not known whether Threshold will become known as Windows 9 (or whatever name they might give it), however the current release date is slated for sometime next year, in line with Microsoft’s new dynamic release schedule. This would also put it at 3 years after the initial release of Windows 8, which ties into the larger Microsoft product cycle. Indeed most speculators are pegging Threshold to be much like the Blue release of last year, with all Microsoft products receiving an update upon release. What interests me about this release isn’t so much what it contains as what it’s going to take away from Windows 8.

Whilst Microsoft has made inroads into making Windows 8 feel more like its predecessors, the experience is still deeply tied to the Metro interface. Pressing the Windows key doesn’t bring up the start menu and Metro apps still have that rather obnoxious behaviour of taking over your entire screen. Threshold however is rumoured to do away with this, bringing back the start menu with a Metro twist that will allow you to access those kinds of applications without having to open up the full interface. Indeed for desktop systems, those that are bound to a mouse and keyboard, Metro will be completely disabled by default. Tablets and other hybrid devices will still retain the UI, with the latter switching between modes depending on what actions occur (switching to desktop when docked, Metro when in tablet form).

From memory such features were actually going to make up part of the next Windows 8 update, not the next version of Windows itself. Microsoft did add some similar features to Windows 8 in the last update (desktop users now default to the desktop on login, not Metro) but the return of the start menu and the other improvements are seemingly not for Windows 8 anymore. Considering just how poor the adoption rate of Windows 8 has been this isn’t entirely surprising, and Microsoft might be looking for a clean break away from Windows 8 in order to drive better adoption of Threshold.

It’s a strategy that has worked well for them in the past so it shouldn’t be surprising to see Microsoft doing this. Those of us who actually used Vista (after it was patched to remedy all the issues) knew that Windows 7 was Vista under the hood; it was just visually different enough to break past people’s preconceptions about it. Windows Threshold will likely be the same, different enough from its direct ancestor that people won’t recognise it but sharing the same core that powered it. Hopefully this will be enough to ensure that Windows 7 doesn’t end up being the next XP as I don’t feel that’s a mistake Microsoft can afford to keep repeating.


Microsoft’s Surface 2: A Big Hole To Fill.

There’s no question that Microsoft’s attempt at the tablet market has been lacklustre. Whilst the hardware powering their tablets was decent, the nascent Windows Store lacked the diversity of its competitors, something which made the RT version even less desirable. This has since resulted in Microsoft writing down $900 million in Surface RT and associated inventory, something which many speculated would be the end of the Surface line. However it appears that Microsoft is more committed than ever to the Surface idea and recently announced the Surface 2, an evolutionary improvement over its predecessor.

Surface 2 Pro

The new Surface 2 looks pretty much identical to its predecessor, although it’s a bit slimmer and a bit lighter. It retains the built in kick stand but it now has 2 positions instead of one, something which I’m sure will be useful to some. The specifications under the hood have been significantly revamped for both versions of the tablet, with the RT (although it’s no longer called that) version sporting an NVIDIA Tegra 4 and the Pro one of the new Haswell i5 chips. Microsoft will also now let you choose how much RAM you get in your Pro model, allowing you to cram up to 8GB in there. The Pro also gets the luxury of larger drive sizes, up to 512GB should you want it (although you’ll be forced to get the 8GB RAM model if you do). Overall I’d say this is pretty much what you’d expect from a generation 2 product and the Pro at least looks like it could be a decent laptop competitor.

Of course the issues that led Microsoft to write down nearly a billion dollars worth of inventory (after attempting to peddle as much of it as they could to TechEd attendees) still exist today and the upgrade to Windows 8.1 won’t do much to solve them. Sure, in the time between the initial Surface release and now a decent number of applications have been developed for it, but the selection still pales in comparison. I still think that the Metro interface is pretty decent on a touch screen but Microsoft will really have to do something outrageous to convince everyone that the Surface is worth buying, otherwise it’s doomed to repeat its predecessor’s mistakes.

The Pro on the other hand looks like it’d be a pretty great enterprise tablet thanks to its full x86 environment. I know I’d much rather have those in my environment than Android tablets or iPads, as those devices are much harder to integrate into all the standard management tools. A Surface 2 Pro on the other hand would behave much like any other desktop, allowing me to deliver the full experience to anyone who had one. Of course it’s then more of a replacement for a laptop than anything else, but I do know a lot of users who would prefer a tablet device to the current fleet of laptops they’re given (even the ones who get ultrabooks).

Whilst the Pro looks like a solid upgrade I can’t help but feel that the upgrade to the RT is almost unnecessary, given that most of the complaints levelled at it had nothing to do with its performance. Indeed not once have I found myself wanting for speed on my Surface RT; instead I’ve been wanting my favourite apps to come across so that I don’t have to use their web versions which, on Internet Explorer, typically aren’t great. Maybe the ecosystem is mature enough now to tempt some people across but honestly, unless they already own one, I can’t really see that happening, at least for the RT version. The Pro on the other hand could make some headway into Microsoft’s core enterprise market, but even that might not be enough for the Surface division.


Increasing Microsoft’s Agility With Windows Blue.

Microsoft’s flagship product, Windows, isn’t exactly known for its rapid release cycle. Sure, for things like patches, security updates and the like they’re probably one of the most responsive companies out there. The underlying operating system however is updated much less frequently, with the base feature set remaining largely the same for the current 3 year product life cycle. In the past that was pretty much sufficient as the massive third party application market for Windows made up for anything that might have been lacking. Customers are increasingly looking for more fully featured platforms however, and whilst Windows 8 is a step in the right direction it had the potential to start lagging behind its other, more frequently updated brethren.

Had Windows 8 stayed a pure desktop OS this wouldn’t be a problem, as the 3 year product cycle fits in perfectly with their largest customer base: the enterprise. Since Windows 8 will now form the basis of every Microsoft platform (or at least the core WinRT framework will), they’re now playing in the same realm as iOS and Android. Platform updates for those two operating systems happen far more frequently, and should Microsoft want to continue playing in this field they will have to adapt more rapidly. Up until recently I didn’t really know how Microsoft was planning to accomplish this, but it seems they’ve had something in development for a while now.

Windows Blue

Windows Blue is shaping up to be the first feature pack for Windows 8, scheduled for release sometime toward the end of this year. It’s also the umbrella term for similar updates happening across the entire Microsoft platform around the same time, including their online services like Outlook.com and SkyDrive. This will be the first release of what will become a yearly platform update that will bring new features to Windows and its surrounding ecosystem. It will not be in lieu of the traditional platform updates however, as there are still plans to deliver Windows 9 on the same 3 year cycle that we’ve seen for the past 2 Windows releases.

Whilst much of the press has been around the leaked Blue build and what that means for the Windows platform, it seems that this dedication to faster product cycles goes far deeper. Microsoft has shifted its development mentality away from its traditional iterative process to a continuous development process, no small feat for a company of this magnitude. Thus we should expect the entire Microsoft ecosystem, not just Windows, to see a similarly rapid pace of development. They had already done this with their cloud offerings (which seem to gain new features every year) and the success they saw there has been the catalyst for applying it to the rest of their product suites.

Microsoft has remained largely unchallenged in the desktop PC space for the better part of 2 decades, but the increasing power of mobile devices has begun to erode their core business. They have therefore made the smart move to start competing in that space with a unified architecture that will enable a seamless experience across all platforms. The missing piece of the puzzle was their ability to rapidly iterate on said platform like the majority of their rivals do, something which the Blue wave of products will begin to rectify. Whether it will be enough to pull up some of their worse performing platforms (Windows Phone) remains to be seen, but I’m sure we can agree that it will be beneficial, both for Microsoft and for us as consumers.


The Quirks of Qlogic’s QAUCLI Tool (and The Perils of Shutting Down DOS).

Probably the biggest part of my job, and really it should be the biggest part of any competent administrator’s job, is automation. Most often system administrators start out in smaller places, usually small businesses or their own home network, where the number of machines under their control rarely exceeds single digits. At this point it’s pretty easy to get by with completely manual processes and indeed it’s usually much more efficient to do so. However things change rapidly as you come into environments with hundreds if not thousands of end points that require some kind of configuration to be done on them, and at that point it’s just not feasible to do it manually any more. Thus most of my time is spent finding ways to automate things, and sometimes this leads me down some pretty deep rabbit holes.

Take for instance the simple task of updating firmware.

You’d probably be surprised to find out that despite all the advances in technology over the decades firmware updates are still done through good old fashioned DOS, especially if you’re running some kind of hypervisor like VMware’s ESXi. For the most part this isn’t necessarily a bad thing, DOS is so incredibly well known that nearly every problem you come across has a solid solution, but it does impose a lot of limitations on what you can do. For me the task was simple: the server needed to boot up, update the required firmware and then shut down at the end so my script would know that the firmware update had completed successfully. There were other ways of doing this, like constantly querying the firmware version until it showed the updated status, but shutting down at the end would be far quicker and much more reliable (the firmware versions returned aren’t always 100% accurate). Not a problem, I thought; the DOS CD I had must contain some kind of shutdown command that I could put in AUTOEXEC.BAT and we’d be done in under an hour.
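For context, here’s a rough C# sketch of the kind of “wait for the box to power itself off” check the surrounding automation could do; the host name, retry counts and timings are made up for illustration, and this isn’t necessarily how my actual script was written.

```csharp
using System;
using System.Net.NetworkInformation;
using System.Threading;

// Keep pinging the server running the DOS update disc and treat a sustained
// loss of response as the signal that the flash finished and the shutdown ran.
class WaitForShutdown
{
    static void Main()
    {
        const string host = "blade-m910-01";   // hypothetical target server
        int consecutiveFailures = 0;

        using (var ping = new Ping())
        {
            while (consecutiveFailures < 5)    // require a few misses in a row
            {
                PingReply reply;
                try { reply = ping.Send(host, 2000); }
                catch (PingException) { reply = null; }

                consecutiveFailures = (reply != null && reply.Status == IPStatus.Success)
                    ? 0
                    : consecutiveFailures + 1;

                Thread.Sleep(10000);           // poll every 10 seconds
            }
        }

        Console.WriteLine($"{host} has stopped responding; assuming the firmware update completed.");
    }
}
```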

I was utterly, utterly wrong.

You see DOS comes from the days when power supplies were much more physical things than they are today. When you went to turn your PC on back then you’d flip a large mechanical switch, one that was directly wired to the power supply, and it’d turn on with an audible clack. Today the button you press isn’t actually connected to the power supply directly, it’s connected to the motherboard, and when the connection is closed it sends a signal (well, it shorts 2 pins) to turn it on. What this means is that DOS really didn’t have any concept of shutting down a system since you’d just yank the power out from underneath it. This is the same reason that earlier versions of Windows gave you that “It’s now safe to turn off your computer” message: the OS simply wasn’t able to communicate with the power supply.

There are of course a whole host of third party solutions out there, like this shutdown.com application, FDAPM from the FreeDOS guys and some ingenious abuse of the DOS DEBUG command, but unfortunately they all seemed to fail when presented with Dell hardware. As far as I can tell this is because the BIOS on the Dell M910 isn’t APM aware, which means the usual way these applications talk to the power supply just won’t work (FDAPM reports as much), leaving precious few options for shutting down. Frustrated, I decided that DOS might not be the best platform for updating the firmware and turned towards WinPE.

WinPE is kind of like a cut down version of Windows (available for free, by the way) that you can boot into, usually used to deploy the operating system in large server and desktop fleets. By cut down I mean really cut down: the base ISO it creates is on the order of 140MB, meaning if you need anything in there you basically have to add it yourself. After adding in the scripting framework, drivers for the 10GbE cards and the QAUCLI tool I found in the Windows version of the firmware update, I thought it would be a quick matter of executing a command line and we’d be done.

Turns out QAUCLI is probably closer to an engineering tool still in development than a production level application. Whilst it may have some kind of debug log somewhere (I can’t for the life of me find it and the user guide doesn’t list anything) I couldn’t find any way to get it to give me meaningful information on what it was doing, whether it was encountering errors or if I had executed the command incorrectly. The interactive portion of it is quite good, in fact it’s almost a different tool when used interactively, but the scripted side of it just doesn’t seem to work as advertised.

Here’s a list of the quirks I came across (for reference the base command I was trying to use was qaucli -pr nic -svmtool mode=update fwup=p3p11047.bin):

  • Adding output=stdout as an option will make the tool fail regardless of any other option.
  • There is no validation on whether the firmware file you give it exists or not, nor if the firmware file itself is valid.
  • Upgrading/downgrading certain firmware versions will fail. I was working with some beta firmware that was supposed to fix a client issue, which could well have been the cause, but doing the same action interactively worked.
  • There is no feedback as to whether the command worked or failed beyond the execution time: if it fails to update it takes about a minute to finish, if it works it’s closer to 3~5 minutes (see the sketch after this list).
  • Windows seems to be able to talk to some QLogic cards natively (the QME2572 fibre channel cards specifically) but not the 10GbE cards. This is pretty typical as ESXi needs a driver to talk to these cards as well, so it’s not so much a quirk of QAUCLI as something to be aware of: if you want to flash the firmware on them in a WinPE environment you need to inject the drivers into the image.
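For illustration, here’s a minimal C# sketch of the run-time heuristic mentioned in the feedback quirk above, wrapping the same qaucli command line; the two minute threshold is just a rough midpoint between the failure and success times I observed, not anything documented by QLogic.

```csharp
using System;
using System.Diagnostics;

// Infer success from run time, since the tool gives no useful exit feedback.
class QaucliRunner
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "qaucli.exe",
            Arguments = "-pr nic -svmtool mode=update fwup=p3p11047.bin",
            UseShellExecute = false
        };

        var timer = Stopwatch.StartNew();
        using (var proc = Process.Start(psi))
        {
            proc.WaitForExit();
        }
        timer.Stop();

        // Failed updates return in roughly a minute; successful ones take 3~5 minutes.
        bool probablySucceeded = timer.Elapsed > TimeSpan.FromMinutes(2);
        Console.WriteLine(probablySucceeded
            ? "Update likely succeeded (long run time)."
            : "Update likely failed (returned too quickly).");
    }
}
```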

Honestly though, it could very well be my fault for tinkering with an executable that I probably shouldn’t be. Try as I might, I can’t find a legitimate standalone download for QAUCLI; the only way to get it seems to be extracting the Windows installer package and pulling it out of there. Still, it’s a valuable tool and one that I think could be a lot better than it currently is, and if you find yourself in a situation like mine hopefully these little tips will save you some frustration.

I know I would’ve appreciated them 3 days ago 😉

Sortilio Update: It’s Just Better All Over.

So, like most products a developer creates with one purpose in mind, my first iteration of Sortilio was pretty bare bones. Sure, if you had a small media collection that was named semi-coherently it worked fine (like it did for my test data) but past that it started to fall apart rather rapidly. Case in point: I let it loose on my own media collection, you know, for the purposes of eating my own dog food. It didn’t take long for it to fall flat on its face, querying The TVDB’s API so rapidly that the rate limiter kicked in almost instantaneously. There was also the issue of not being able to massage the data once it had done the automated matching portion, as even the best automated tools can still make mistakes. With that in mind I set about improving Sortilio and put the finishing touches on it yesterday.

Now the first update you’ll notice is the slightly changed main screen, with a new Options tab and two extra buttons down in the right hand corner. They all function pretty much as you’d expect: the Options tab has a few settings for you to configure (only one of them, the extensions one, works currently), Save will export the current selection to a file for use later and Load will import said file back into Sortilio. The save/load functionality is quite handy if you’d like to manually go in there and sort out the data yourself, as it’s all plain XML that I’m sure anyone with half a coding mind about them would be able to figure out. I put it in mostly for debugging purposes (re-running the identification process is rather slow, more on that in a bit) but I can see it being quite useful, especially with larger collections.
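To give you an idea of how simple that file is to work with, here’s a hedged sketch of the sort of plain-XML save/load involved; the MediaMatch shape and file handling are hypothetical and Sortilio’s real schema will differ, but anything built on XmlSerializer looks roughly like this.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Hypothetical record for a single matched file; Sortilio's actual fields may differ.
public class MediaMatch
{
    public string SourceFile { get; set; }
    public string SeriesName { get; set; }
    public int Season { get; set; }
    public int Episode { get; set; }
}

public static class SortFile
{
    static readonly XmlSerializer Serializer = new XmlSerializer(typeof(List<MediaMatch>));

    // Save exports the current selection to plain XML for later reloading.
    public static void Save(string path, List<MediaMatch> matches)
    {
        using (var stream = File.Create(path))
            Serializer.Serialize(stream, matches);
    }

    // Load imports a previously saved sort file back in.
    public static List<MediaMatch> Load(string path)
    {
        using (var stream = File.OpenRead(path))
            return (List<MediaMatch>)Serializer.Deserialize(stream);
    }
}
```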

As I mentioned earlier, whilst the automated matching does a pretty good job of getting things right there are times when it either doesn’t find anything or it’s got it completely wrong. To alleviate this I added in the ability to double click a row to bring up the following screen:

Shown in this dialog is the series drop down, which allows you to select from a list of series that Sortilio has already downloaded. The list is populated by the cache that Sortilio creates from its queries to The TVDB, so if it managed to match one file in the series correctly it will already have it cached and you can just select it and hit Update. Sortilio will then identify other files that had the same search term and ask if you’d like to update them as well (since it will probably have got them wrong too). Should the series you’re looking for not be available you can hit the Search button, which brings up this dialog:

From here you can enter whatever term you want and hit Search. This will query The TVDB and display the results in a list for you. Select the most appropriate one, hit OK and the new series will be assigned to that file.

Under the hood things have gotten quite a bit better as well. The season string matching algorithm has been improved so that it identifies seasons better than it previously did. For instance, if you had a file named something like battlestar.galactica.2003.s01e20.avi, Sortilio would (wrongly) identify it as season 20 because of the 2003 before the series/episode identifier. It now prefers the right kind of identifiers and is a little better overall at getting it right, although I still think that the way I’m going about it is slightly ass backwards. Chalk that up to still figuring out how to best do string splitting based on a regex.
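The gist of the improvement, sketched below in C# (this is the general approach rather than Sortilio’s actual code), is to look for an explicit SxxExx token first and only fall back to looser number matching when one isn’t there:

```csharp
using System;
using System.Text.RegularExpressions;

// Prefer a proper SxxExx marker over any stray numbers (like a year) in the name.
class SeasonEpisodeParser
{
    static readonly Regex Explicit = new Regex(@"[sS](\d{1,2})[eE](\d{1,2})");

    static void Main()
    {
        const string fileName = "battlestar.galactica.2003.s01e20.avi";
        var match = Explicit.Match(fileName);

        if (match.Success)
        {
            int season = int.Parse(match.Groups[1].Value);
            int episode = int.Parse(match.Groups[2].Value);
            Console.WriteLine($"Season {season}, Episode {episode}");  // Season 1, Episode 20
        }
        else
        {
            Console.WriteLine("No explicit SxxExx marker; fall back to looser heuristics.");
        }
    }
}
```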

Now on the surface, if you were to compare this version to the previous one it would appear to run quite a bit slower. There’s a good reason for this and it all comes down to the rate limit on The TVDB API. After playing around with various values I found that the sweet spot was somewhere around a 2 second delay between searches. Without any series cached this means that every request incurs a 2 second penalty, significantly increasing the amount of time required to get the initial sort done. I’ve alleviated this somewhat by having Sortilio search its local cache first before attempting to head out to the API, but it’s still noticeably slower than it was originally. I’ve reached out to the guys behind The TVDB in the hopes that I can get an excerpt of their database to include within Sortilio, which would make the process lightning fast, but I’ve yet to hear back from them.
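The caching and throttling logic boils down to something like the following sketch; the dictionary cache, the 2 second window and the QueryTvdb placeholder are stand-ins for whatever Sortilio does internally.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Cache-first lookup with a minimum 2 second gap between real API calls.
class ThrottledSeriesLookup
{
    static readonly Dictionary<string, string> Cache =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    static DateTime _lastApiCall = DateTime.MinValue;
    static readonly TimeSpan MinInterval = TimeSpan.FromSeconds(2);

    public static string Lookup(string searchTerm)
    {
        // Hit the local cache first so repeated searches don't touch the API at all.
        if (Cache.TryGetValue(searchTerm, out var cached))
            return cached;

        // Otherwise wait out the remainder of the 2 second window before calling out.
        var sinceLast = DateTime.UtcNow - _lastApiCall;
        if (sinceLast < MinInterval)
            Thread.Sleep(MinInterval - sinceLast);

        var result = QueryTvdb(searchTerm);
        _lastApiCall = DateTime.UtcNow;
        Cache[searchTerm] = result;
        return result;
    }

    // Placeholder for the real call out to The TVDB.
    static string QueryTvdb(string searchTerm) => $"series matching '{searchTerm}'";
}
```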

So as always feel free to grab it, have a play and then send me any feedback you have. I’ve already got a list of improvements to make on this version but I’d definitely call it usable and, to prove a point, I have indeed used it on my own media collection. It gets about 90% of the way there, with the last 10% needing manual intervention, either within Sortilio or outside of it, cleaning up after it has done its job. If you’ve used it and encountered problems please save the sort file and the debug log and send them to me at [email protected].

You can grab the latest version here.

[NOTE: There is no link currently because gmail barfed at the file attachment I sent myself to upload this morning. Follow me on Twitter to be notified of when it comes out!]

Silverlight May Die, But the Developers Won’t.

You’d think that since I invested so heavily in Silverlight when I was developing Lobaco that I would’ve been more outraged at the prospect of Microsoft killing off Silverlight as a product. Long time readers will know that I’m anything but worried about Silverlight going away, especially considering that the release of the WinRT framework takes all those skills I learnt during that time and transitions them into the next generation of Windows platforms. In fact I’d say investing in Silverlight was one of the best decisions at the time as not only did I learn XAML (which powers WPF and WinRT applications) but I also did extensive web programming, something I had barely touched before.

Rumours started circulating recently that Microsoft has no plans to develop another version of the Silverlight plugin past the soon to be released version 5. This hasn’t been confirmed or denied by Microsoft yet, but there are several articles citing sources familiar with the matter saying that the rumour is true and Silverlight will receive no attention past this final iteration. This has of course spurred further outrage at Microsoft for killing off technologies that developers have heavily invested in and, whilst in the past I’ve been sympathetic to them, this time around I don’t believe they have a leg to stand on.

Microsoft initially released Silverlight back in 2007 and has released updates to the platform every year or so since then. Taking that into consideration you’d figure that the latest release of Silverlight has 1 or 2 years in it before other technologies (most likely HTML5 and JavaScript) overtake it in terms of functionality. In that time Windows 8 will be released along with WinRT, the framework that will be instantly familiar to any Silverlight developer. Sure, the code might not be directly translatable to the new platform, but considering the design work is done in XAML and C# is a supported language I’d struggle to find any Silverlight developer who wouldn’t be able to blunder their way through with a couple of Google searches and a StackOverflow account.

All of Microsoft’s platforms are so heavily intertwined with each other that it’s really hard to be just a Silverlight/WPF/ASP.NET/MFC developer without a lot of crossover into other technologies. Hell apart from the rudimentary stuff I learnt whilst in university I was able to self learn all of those technologies in the space of a week or two without many hassles. Compare that with my month long struggle to learn basic Objective-C (which took me a good couple months afterwards to get proficient in) and you can see why I think that any developer whining about Silverlight going away is being incredibly short sighted or just straight up lazy.

In the greater world of IT you’re doomed to fade into irrelevance if you don’t keep pace with the latest technologies and developers are no exception to this. Whilst I can understand the frustration in losing the platform you may have patronized for the past 4 years I can’t sympathize with an unwillingness to adapt to a changing market. The Windows platform is by far one of the most developer friendly and the skills you learn in any Microsoft technology will flow onto other Microsoft products, especially if you’re proficient in any C based language. So whilst Microsoft might not see a future with Silverlight that doesn’t mean the developers are left high and dry, in fact they’re probably in the best position to innovate out of this situation. 

ProcrastinationOn: Apply Directly to the Forehead!

It was almost 9 months and 200 posts ago that I thrust my pre-alpha version of Geon into the world for everyone to see. Thanks to my innate shyness I didn’t go the whole hog and release it into the wild for the whole world to see, and I’m still glad for that as the first version was, to put it lightly, a smoking pile of crap. Had any more than about 5 users got on it at once (the record stood at 2) my server would have fallen on its face trying to deliver all the content over my poor little 1Mbps connection. The saving grace was Silverlight, which taught me that I could use my client side programming skills to do what I wanted on the web without having to completely relearn everything, and the next few versions of Geon came along that much faster.

Right now I’m comfortable enough to let every reader of this blog know that there’s a new version of Geon up (the adventurous amongst you would’ve noticed a link to the new version in a previous post) and it comes along with a UI change that I had been alluding to a while back. In essence the change was done to increase the readability of the information streams you’ve selected, as prior to this you just had the one bar that would scroll along madly if you dared to look at multiple locations at once or just so happened to add Twitter from anywhere that was mildly populated. In addition to the UI changes I have also made the switch to Silverlight 4, which added in things like native scroll wheel support (I can’t tell you how happy that made me) and a slight performance improvement over Silverlight 3. Thankfully none of the breaking changes they made in the transition affected Geon, so the upgrade was only a few clicks and a restart of Visual Studio away.

The new UI works similarly to the old one: you select your location first by clicking the location button on the left hand side and then clicking the location on the map you want to see. Then you can add in information feeds from the same bar in a similar way and they’ll automatically add themselves to the closest location circle on the map. As of right now all the feeds available work apart from Facebook (you’ll get a pop up asking you to connect with your Facebook account but no information will appear) because their geolocation is still not fully implemented and I’m not keen to do a whole lot of mangling to get results that are more than likely irrelevant anyway¹. Once you’re done adding the streams hit the button up in the left hand corner to see your streams in all their glory. Rows are locations and the columns are the feeds, all titled properly so you can tell what’s what.

Having all that done means however I’m now out of options for procrastinating. You see whilst this version included some new streams (videos and Wikipedia), a much better UI and a cleaner back end (mmmm JSON) most of the heavy lifting had already been done in previous versions. After getting the initial hard parts out of the way with the UI most of it could have been done inside of a week, although I casually programmed it over the course of a month or so. The next thing on the list is the real meat of Geon: the request system.

That pretty much means I have to start diving into something I’ve never coded before: web services. Whilst I can’t really say I’ve been avoiding this, I haven’t been actively looking to do anything about it either, apart from the casual search for tutorials on how to build user authentication systems. I know I’m just being a big baby about this and I should just suck it up and do it, but it’s just been so darn easy up until this point that I’ve been wondering why no one has done it before. As it turns out the rudimentary parts that most netizens have come to expect are the most complex and tiresome parts, which is why it hasn’t been done (and also explains why some services don’t have logins at all).

I’ve decided to suck it up and just start hammering away at it until I get the thing going. It’s much like when I first started out coding Geon and I was using RSS feeds for everything, simply because it was the first way I found to do things. After fiddling around for a while and getting some advice from a real developer mate I found that, had I just taken the time to research it, using other formats would have been so much easier. I’m sure with an afternoon of searching under my belt I’ll be ready to tackle the big bad demon that is the client/server architecture of Geon.

¹I thought I should elaborate on this a little bit. There have been rumours of a geo API from Facebook for a while now, but with their developer conference f8 over and done with I haven’t heard anything solid about its actual implementation. They’ve tweaked their privacy policy to allow the storage of geo information in Facebook, however the API as of right now remains unchanged. There are a lot of apps out there making use of geo data and Facebook, but there’s no way to extract that data out of Facebook currently. You can kind of figure stuff out by finding a user’s hometown location, reverse geo-coding it, figuring out if that’s within your bounding box and then displaying messages from them if it’s within your area, HOWEVER that’s an incredibly messy way of doing it and honestly isn’t the kind of thing I was looking for. I’ll be integrating Facebook information when they finalise their geo API but until then it won’t work.
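For what it’s worth, the bounding box part of that workaround is trivial; it’s everything around it (reverse geo-coding hometowns and hoping they’re meaningful) that’s the mess. A hypothetical C# sketch of the containment check, with made-up types and coordinates:

```csharp
using System;

// Once a hometown has been reverse geo-coded to a lat/long, just test whether
// that point falls inside the map area being watched.
struct GeoPoint
{
    public double Latitude, Longitude;
}

struct BoundingBox
{
    public double North, South, East, West;

    public bool Contains(GeoPoint p) =>
        p.Latitude  <= North && p.Latitude  >= South &&
        p.Longitude <= East  && p.Longitude >= West;
}

class Example
{
    static void Main()
    {
        var sydneyArea = new BoundingBox { North = -33.5, South = -34.2, East = 151.5, West = 150.5 };
        var hometown   = new GeoPoint { Latitude = -33.87, Longitude = 151.21 }; // roughly Sydney

        Console.WriteLine(sydneyArea.Contains(hometown)
            ? "Show this user's messages in the feed."
            : "Outside the watched area; skip.");
    }
}
```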

Geon 1.1 Update.

It’s that time again! I’ve updated my Geon application to version 1.1 and this brings along with it a UI change, a shift in focus for some things and of course new features. This time around though I thought I’d give you a walk through of what Geon can do and how you can go about using it. This will also give me a chance to explain away any problems that you’ll see, since this is still technically what I’d call a beta (because it’s far from feature complete).

Opening up the Geon page will greet you with a slightly more usable interface than the previous version. It’s now a 3 column layout with statically set widths for most of the items. It’s best viewed at 1680 x 1050 but it’s still usable at lower resolutions. The left column has a set of check boxes for choosing what information you want to see and a filter box at the bottom. The center column is a map that when clicked will change your current location to where you clicked, so you can view an information feed from another area. The right column is for part of a future release that will allow you to send requests to other Geon users in that area for pictures/video/text, and also allow you to respond to requests. For now the right column will only notify you when you change your location, but soon it will display all the recent Geon requests and responses for your area.

Ticking any of the boxes will bring up information from that source. For Twitter, News and Blogs this will appear in the left column as text and links. For Flickr… Should you wish to apply a filter, say for the rally that took place in Sydney on the weekend, you can put your filter terms in the box below. This is handy if you want to see an information stream for an event that might not be hitting the front page news, as most of it gets drowned out by the headlines. Geon will initially retrieve a maximum of 15 results for each of the services regardless of time frame and will attempt live updates after that. One thing I have noticed is that blog posts and news items will usually be at the bottom due to their publishing time; tweets, as per their nature, will usually be at the top. The pause check box shows whether or not the feed is attempting to live update and, similar to the last Geon release, there are a few reasons why you’d want to stop it (wanting to read your feed without the scroll bar snapping back up to the top is one).

Now for the juicy bit. Scroll to some location on the map that you’d like to see news for and single click. After a short delay you’ll notice in the response box a message telling you that you have moved to a new location. Your information feed will now start to update from this new area. If you had information from another location open previously it will slot the information in chronologically. If you want to clear your information view before switching location just untick all the boxes and it will clear the feed list for you. I’ve noticed that if you’re viewing a busy area then switch to a quieter one (say from Sydney to Canberra) the new information will be buried in the midst of the old, which is probably not entirely useful.

And now for the all important bugs/known issues:

  • Internet Explorer is still unsupported: For the most part everything seems to work ok except for when you click the map. There’s still a wide discrepancy between where you click on the map and where it thinks you clicked. Firefox and Chrome appear fine and this could just be an IE8 issue however I haven’t taken the time to test IE6/7, mainly because I have no idea why an ASP.NET application would be having troubles in IE and not Firefox.
  • The Pause checkbox is a little iffy: For the most part it works fine but there are times when clicking it will not change the state of the timer on the page, and it will keep trundling along as if nothing happened. I’ve just thought of a way to fix it (GARGH why didn’t I see that yesterday) so I’ll fix it up when I get home.
  • Feed updates are delayed by about 30~90 seconds: Anyone following me on Twitter will have noticed me tweeting to test my live updates. Since the design is based on reading an RSS feed from Twitter and other various sources it should show up as soon as the feed is updated (which appears to be near real time). However it sometimes takes a minute or two to update. I’ve got a feeling this is due to some caching my RSS client library does, so I’ll have to work on that one.
  • Feeds in busy places shuffle themselves around: If you’re watching a busy place like New York you might notice the feed rearranging itself. This is because I sort the feed based on the date and, if two items have the same date (which happens a lot with tweets), it arbitrarily arranges them. This makes the feed shuffle around a bit and is currently the best solution to a problem which initially was that the feed was completely random (see the sketch after this list).
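A sketch of the fix I have in mind for that last one: keep sorting by date but break ties with a second, deterministic key so identically-timestamped items always come back in the same order. FeedItem here is a stand-in for whatever Geon actually uses internally.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical feed item; only the fields needed for ordering are shown.
class FeedItem
{
    public DateTime Published { get; set; }
    public string Link { get; set; }
    public string Title { get; set; }
}

static class FeedSorter
{
    // Newest first, with the item's link as a stable tie-breaker so equal
    // timestamps no longer shuffle between refreshes.
    public static List<FeedItem> SortStable(IEnumerable<FeedItem> items) =>
        items.OrderByDescending(i => i.Published)
             .ThenBy(i => i.Link, StringComparer.Ordinal)
             .ToList();
}
```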

What’s in scope for 1.2? I’m glad you asked (even if you didn’t ;)):

  • Implement the request/respond function: This will probably take the better part of a weekend to get done. I had it planned for this release but it got dropped since I had other things to do other than just coding.
  • Make it pretty: Right now it’s dull as dishwater to look at. It needs to be made a little better looking and if anyone out there is interested in some design/development work in order to make Geon look better feel free to contact me. You will be paid for your services.
  • Add in an information timeline: Much like Google Wave’s slider bar that allows you to see how information evolved over a period of time I want to implement something similar in Geon.

So that’s about it. I’m still taking all feature requests/ideas for Geon, so if you think something would be cool or useful just leave a comment or send me an email.