Ho boy, rarely have I copped more flak for a post, both online and offline, than my piece early last year on how the general population of Instagram made me feel. In all honesty, whilst I knew it would piss a few people off (one of the reasons it sat in my drafts folder for ages), I still felt I had some valid points to make based on my observations of the Instagram user base at large. Many people took offence to this, arguing points ranging from “Why should that matter to you anyway?” to “You’re using it wrong, there’s great communities on there”. I was hoping the comments section would be the end of it all, but late last week the topic came up again and I lost an hour in the ensuing debate, so I figured it was time I made my position on this whole matter clearer.
I recognise that for every example I can dredge up of someone posting a horribly framed and filtered picture of their breakfast someone else can just as easily show me something like this. My criticism wasn’t levelled at people who use the service in this fashion but, reading back over the post and the ensuing comments, I never really made that entirely clear, so mea culpa on that one. However I don’t feel the general thrust of my argument has been invalidated, as many users agree that the vast majority of stuff on Instagram isn’t particularly great. This isn’t unique to Instagram, however, as any user-generated content site suffers from Sturgeon’s Law, and honestly the mentality of users on said sites really doesn’t vary that much; Instagram simply hit closer to home thanks to my interest in this particular area.
I’ve also had people try to bring me back into the Instagram fold in order to convince me that there’s something in the platform for me. Now whilst I wasn’t an active user for quite some time I did have the application installed on my Galaxy S2 for the better part of the year, mostly so I could view pictures linked to me on Twitter without having to use Instagram’s then rather shitty web interface. From time to time I’d look at pictures on there and see some genuinely good ones, but not often enough to convince me that it was worth investing my time in bettering my feed by subscribing to said users. The fact of the matter is I already have many other avenues for discovering photographers I like, ones that share a critical characteristic with me.
Our platform of choice.
For me the undisputed platform of choice is my DSLR. I’ve tried many other camera systems, from high-end point-and-shoots to film SLRs and, yes, multitudes of cameras in phones, but in the end I always come back to my DSLR. The reason is the amount of control and influence I have over the final image, something I struggle with on any other platform. It may sound weird if you prefer the simplicity granted to you by camera phones (something I do understand) but I find it a lot easier to take pictures on my DSLR, to the point where using anything else just frustrates me. I think that’s because I know that whilst I can do a lot of things in post should I so desire, there are some things I simply can’t unless I’m using my platform of choice.
This is somewhat at odds with the Instagram community which, as far as I’m aware, doesn’t take particularly kindly to those who take photos outside of their phone and then upload them via the service. If I were going to use Instagram again that’s the way I would use it, but I’d rather not antagonize the community further by breaking the current social norm on there. For now I really only use Facebook to distribute pictures (mostly because my recent photographic endeavours have involved friends’ weddings) but I’ve been a fan of Flickr and 500px for a long time now as they seem to be more my kind of people.
I’ve come to realise that even my beloved DSLR community isn’t immune to this kind of malarkey either, as there are far, far too many people walking around with a $1000+ camera with the shocking kit lens on it, shooting in auto and thinking that they’re the next Don McCullin. The criticisms I’ve levelled at Instagram apply to them as well, although they’ve yet to congregate onto a platform as ubiquitous as Instagram has become.
After the backlash I received I set myself a challenge: to use my camera phone to produce pictures that I’d be proud to share, and the above is probably one of the dozens I’ve taken that’s anywhere near what I wanted it to be. Six months of trying have shown me there’s definitely a lot of effort required to create good pictures, arguably the same amount as a DSLR demands, but I still feel constrained by my phone. Maybe that’s a personal thing, something I could overcome with more time and dedication, but in saying that I’d propose the same thing to all the Instagrammers out there. Borrow a friend’s DSLR and see the world from our side. Maybe you’ll come away with an appreciation for the technology that helped give birth to the platform you so love today.
It’s hard to believe that we’re still in the first year of Google+ as it feels like the service has been around for so much longer. This is probably because of the many milestones it managed to pass in such a short period of time, owing to the fact that anyone with a Google account can just breeze on into the nascent social network. I personally remained positive about it as the interface and user experience paradigms suited my geeky ways, but the lack of integration with other services, along with the failure of others to migrate onto the service, means that it barely sees any use, at least from me.
Still, I can’t generalize my experience up to a wider view of Google+, and not just because that’s bad science. Quite often I’ve found myself back on Google+, not to check my feed or post new content, but to see conversations that have been linked to by news articles or friends. Indeed Google+ seems to be quite active in these parts, with comment threads containing hundreds of users and multitudes of posts. Most often this happens when popular bloggers or celebrities start said thread, so it’s very much like Twitter in that regard, although Google+ feels a whole lot more like one big conversation rather than Twitter’s one-to-many broadcasts or endless one-to-one chat sessions. For the most part this still seems to be heavily biased towards the technology scene, but that could just be my bias stepping in again.
Outside that though my feed is still completely barren, with the time between posts from users now expanding to weeks. Even those who swore off all other social networks in favour of Google+ have had to switch back, as only a small percentage of their friends had an active presence on their new platform of choice. This seems to be something of a trend as user interactivity with the site is at an all-time low, even below that of struggling social network MySpace. Those figures don’t include mobile usage, but suffice to say they’re indicative of the larger picture.
Personally I feel one of the biggest problems Google+ has is its lack of integration with other social network services and third-party product developers. Twitter’s success is arguably due to its no-holds-barred approach to integration and platform development. Whilst Google+ was able to get away without it in the beginning, the lack of integration hurts Google’s long-term prospects significantly, as people are far less likely to use it as their primary social network. Indeed I can’t syndicate any of the content I create onto their social network (and vice versa), which means that Google+ exists as a kind of siloed platform, never getting the same level of treatment as the other social networks do.
Realistically though it’s all about turning the ghost towns that are most people’s timelines into the vibrant source of conversation that many of the other social networks are. Right now Google+ doesn’t see much usage because of the content exclusivity and the effort required to manually syndicate content to it. Taking away that barrier would go a long way to at least making Google+ look like it’s getting more usage, and realistically that’s all it would take for a lot of users to switch over to it as their main platform. Heck, I know I would.
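To make the syndication point concrete, here's a minimal sketch of what automated cross-posting looks like. The service names and publisher callables are hypothetical stand-ins rather than real APIs; notably, Google+ had no public write API at the time, which is exactly the barrier being described:

```python
# Hypothetical sketch of content syndication: publish once, push everywhere.
# The publishers below are illustrative stand-ins, not real network APIs.

def syndicate(post, publishers):
    """Send one piece of content to every connected service.

    `publishers` maps a service name to a callable that accepts the post
    and returns True on success, False otherwise.
    """
    results = {}
    for service, publish in publishers.items():
        try:
            results[service] = publish(post)
        except Exception:
            # One flaky service shouldn't block the rest of the syndication.
            results[service] = False
    return results

# Stand-in publishers; a real version would call each network's API.
publishers = {
    "twitter": lambda post: True,
    "facebook": lambda post: True,
    "googleplus": lambda post: False,  # no public write API to call
}

print(syndicate({"title": "New post", "url": "http://example.com"}, publishers))
```

The point of the sketch is the asymmetry: the loop is trivial for any network that exposes a write API, and impossible for one that doesn't.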
Like almost any industry, IT can sometimes feel like a pretty thankless job. If you’re halfway competent at what you do people won’t notice the vast amount of effort you put into making sure everything runs smoothly, and will begin to question whether they really need to keep you around. Conversely, if everything isn’t running smoothly it’s more likely everyone will recognize your hard work, but you’ll be spending all your time fighting fires and solving urgent problems, which isn’t the greatest thing if you like your work to be stress free. Plus the work doesn’t stop once you clock off in the afternoon since (if you’re one of the only computer guys in your family or circle of friends) people will bug you with their computer problems, begging you to provide a fix for them.
The latter point, though, is applicable to almost any industry. Way too often when people are socializing and the topic of work comes up, a profession seems to be an open door to solicit free advice from the first person to mention what they do for a crust. The doctors get regaled with tales of various ailments, the mechanic with car problems, and the IT guy will of course be barraged with all sorts of strange questions that realistically can’t be answered on the spot. Whilst I don’t shy away from telling people what I do for work anymore (I just tell them my going rate should they want me to fix their problems) it does make IT a bittersweet industry to work in sometimes, and I’m not the only one to think that.
What spurred this idea was this blog post on why it doesn’t pay to be the computer guy. Boyd makes some great points in there, hitting on common frustrations that nearly every IT person has encountered throughout their career. Indeed I had found myself struggling with such problems for some time, like the lack of appreciation for the work that I did and how saying “it should work this way, let me check that” turns into “he said it would do exactly this, and it didn’t, so it’s his fault” quicker than I could ever imagine. Still, whilst I’m not going to say that much has changed in the 4 years since he wrote that post, there is one thing I learnt from my time in project management that I feel could solve at least half of the problems he faced there.
That thing is expectation management.
You see, when people expect the world of you it’s in our nature not to turn them down. It’s really quite flattering to have people come and ask you for help, and the more you’re able to do for them the more they will expect of you. For us IT folk this has a habit of spiraling wildly out of control since 90% of the problems users encounter are 5 minutes on Google away from being fixed, so expectations eventually reach levels that no one will be able to live up to. Thus you end up being placed on a pedestal and users will look to you first instead of attempting to solve the problem themselves. They then expect you to be the answer to all their problems, which seems to be the root of many of Boyd’s complaints.
The best way to fight this problem is to educate users on what they can do to help themselves, empowering them. Back when I used to work as an IT technician servicing people’s computers in their homes I’d usually spend a good hour of my time there explaining what was wrong and how they could go about fixing it themselves in the future. You’d think this would be bad for business but it wasn’t, as many customers would recommend me based on my service, with a good 20% of new customers coming from referrals. Additionally, when they did hit a problem they couldn’t fix themselves they were far more appreciative of my skills when I returned, knowing the effort that went into it.
We IT people could also do with eating some humble pie once in a while. I can’t tell you how many times I’ve been asked something I know nothing about and have straight up said “I don’t know” to a user’s face. Their reaction is always one of surprise, since it’s unusual for anyone (let alone an IT know-it-all) to admit they have no idea about something they’ve just been asked. It’s not easy, I’ll admit, and your pride will take some hits from being so brutally honest about your limitations, but it will knock you off that pedestal the users have put you on and they’ll be far more likely to treat you like a human rather than some IT deity. If a workplace doesn’t value this kind of honesty then I’d recommend moving on, unless you like the position you’re currently in.
There are a few points Boyd makes, however, that can’t simply be managed away, like the constant skill devaluation and getting asked the same questions again and again, but your life as an IT worker can be a whole lot more tolerable when you start molding people’s expectations of you to more closely align with reality. It’s not easy sometimes, especially when it feels like you’re giving your boss reasons to fire you, but in the end you’ll be better for it and you’ll be far more appreciated for the work you do.
Maybe I’m just hanging around the wrong places on the Internet, but recently there seems to be a higher than average level of vitriol being launched at Microsoft. From my totally arbitrary standpoint it seems that most people don’t view Microsoft as the evil empire they used to, and instead now focus on the two new giants of the tech sector, Apple and Google. This could be easily explained by the fact that Microsoft hasn’t really done anything particularly evil recently, whilst Apple and Google have both been dealing with their ongoing controversies of platform lock-down and privacy-related matters respectively. Still, no fewer than two articles have crossed my path of late that squarely blame Microsoft for various problems, and I feel they warrant a response.
The first comes courtesy of the slowly failing MySpace, which has been bleeding users for almost 2 years straight now. Whilst there are numerous reasons why they’re failing (with Facebook being the most likely) one blog asked whether their choice of infrastructure was to blame:
1. Their bet on Microsoft technology doomed them for a variety of reasons.
2. Their bet on Los Angeles accentuated the problems with betting on Microsoft.
Let me explain.
The problem was, as Myspace started losing to Facebook, they knew they needed to make major changes. But they didn’t have the programming talent to really make huge changes and the infrastructure they bet on made it both tougher to change, because it isn’t set up to do the scale of 100 million users it needed to, and tougher to hire really great entrepreneurial programmers who could rebuild the site to do interesting stuff.
I won’t argue point 2, as the short time I spent in Los Angeles showed me that it wasn’t exactly the best place for acquiring technical talent (although I haven’t been to San Francisco to make a proper comparison, talking with friends who have seems to confirm this). However, betting on Microsoft technology is definitely not the reason why MySpace started on a long downward spiral several years ago, as several commenters on the article point out. Indeed MySpace’s lack of innovation appears to stem from the fact that they outsourced much of their core development work to Telligent, a company that provides social network platforms. Such an arrangement meant that they were wholly dependent on Telligent to provide updates to the platform they were using, rather than owning it entirely in house. Indeed, as a few other commenters pointed out, the switch to the Microsoft stack actually allowed MySpace to scale much further with less infrastructure than they did previously. If there was a problem with scaling it definitely wasn’t coming from the Microsoft technology stack.
When I first started developing what became Lobaco, scalability was always nagging at the back of my head, taunting me that my choice of platform was doomed to failure. Indeed only a few start-ups have managed to make it big using the Microsoft technology stack, so it would seem like going down this path is a sure-fire way to kill any good idea in its infancy. Still, I have a heavy investment in the Microsoft line of products so I kept plugging away with it. Problems of scale appear to be unique to each technology stack, all of them having their pros and cons. Realistically, every company with large numbers of users has its own unique way of dealing with scale, and the technology used seems to be secondary to good architecture and planning.
Still there’s a strong anti-Microsoft sentiment amongst those in Silicon Valley. Just for kicks I’ve been thumbing through the job listings for various start-ups in the area, toying with the idea of moving there to get some real-world start-up experience. Almost none of them want to hear anything about a Microsoft-based developer, instead preferring something like PHP/Rails/Node.js. Indeed some have gone as far as to say that .NET development is a black mark against you, only serving to limit your job prospects:
Programming with .NET is like cooking in a McDonalds kitchen. It is full of amazing tools that automate absolutely everything. Just press the right button and follow the beeping lights, and you can churn out flawless 1.6 oz burgers faster than anybody else on the planet.
However, if you need to make a 1.7 oz burger, you simply can’t. There’s no button for it. The patties are pre-formed in the wrong size. They start out frozen so they can’t be smushed up and reformed, and the thawing machine is so tightly integrated with the cooking machine that there’s no way to intercept it between the two. A McDonalds kitchen makes exactly what’s on the McDonalds menu — and does so in an absolutely foolproof fashion. But it can’t go off the menu, and any attempt to bend the machine to your will just breaks it such that it needs to be sent back to the factory for repairs.
I should probably point out that I don’t disagree with some of the points of his post, most notably how Microsoft makes everything quite easy for you if you’re following a particular pattern. The trouble comes when you try to work outside the box, and many programmers will simply not attempt anything that isn’t already solved by Microsoft. Heck, I encountered that very problem when I tried to wrangle their Domain Services API to send and receive JSON, a supported but wholly undocumented part of their API. I got it working in the end but I could easily see many .NET developers simply saying it couldn’t be done, at least not in the way I was going about it.
Still that doesn’t mean all .NET developers are simple button pushers, totally incapable of thinking outside the Microsoft box. Sure there will be more of that type of programmer simply because .NET is used in so many places (just not Internet start-ups, by the looks of it) but to paint everyone who uses the technology with the same brush seems pretty far-fetched. Heck, if he was right there would’ve been no way for me to get my head around Objective-C, since it’s not supported by Visual Studio. Still, I managed to get competent in 2 weeks and can now hack my way around in Xcode just fine, despite my extensive .NET heritage.
It’s always the person or company, not the technology, that limits their potential. Sure you may hit a wall with a particular language or infrastructure stack, but if your people are capable you’ll find a way around it. I might be in the minority when it comes to trying to start a company based around Microsoft technology, but the fact is that attempting to relearn another technology stack is a huge opportunity cost. If I do it right, however, the system should be flexible enough that I can replace parts of it with more appropriate technologies down the line, should the need arise. People pointing the finger at Microsoft for all their woes are simply looking for a scapegoat so they don’t have to address the larger systemic issues, or are simply looking for some juicy blog fodder.
I guess they found the latter, since I certainly did 😉
My mum isn’t the most technical person around. Sure she’s lived with my computer-savvy father for the better part of 3 decades, but that still doesn’t stop her from griping about new versions of software being confusing or stupid, much like any regular user would. Last night I found out that her work had just switched over to Windows 7 (something I’ve yet to experience at any office, sigh) and Office 2010. Having come from XP and Office 2003 she lamented the new layout of everything and how it was impossible to get tasks done. I put forth that it was a fantastic change and whilst she might fight it now she’d eventually come around.
I didn’t do too well at convincing her of that, though 😉
You see, when I first saw Vista I was appreciative of the eye candy and various other tweaks, but I was a bit miffed that things had been jumbled around for seemingly no reason. Over time though I came to appreciate the new layout and the built-in augmentations (start menu search is just plain awesome) that helped me do things that used to be quite laborious. Office 2007 was good too, as many of the functions that used to be buried in an endless stream of menu trees were now easily available, and I could create my own ribbon with my most-used functions on it. Most users didn’t see it that way, however, and the ribbon interface received heavy criticism, on par with that leveled at Vista. You’d then think that Microsoft would’ve listened to their users and made Windows 7 and Office 2010 closer to the XP experience, but they didn’t, and continued along the same lines.
Why was that?
For all the bellyaching about Vista it was actually a fantastic product underneath. Many of the issues were caused by manufacturers not providing Vista-compatible drivers, magnified by the fact that Vista was the first consumer-level operating system to support 64-bit operation on general hardware (XP64 was meant for Itaniums). Over the years drivers matured and Vista became quite a capable operating system, although by then the damage had already been done. Still, it laid the groundwork for the success that Windows 7 has enjoyed thus far and that will continue long after the next iteration of Windows is released (more on that another day ;)).
Office 2010 on the other hand was a different beast. Microsoft routinely consults with customers to find out what kind of features they might be looking for in future products. For the past decade or so 80% of the most requested features have already been in the product for a while; users just weren’t able to find them. In order to make them more visible Microsoft created the ribbon system, putting nearly all the features no more than a click or two away. Quite a lot of users found this annoying since they were used to the old way of doing things (and many old shortcuts no longer worked) but in the end it won over many of its critics, as showcased by its return in Office 2010.
What can this experience tell us about users? Whilst they’re a great source of ideas and feedback that you can use to improve your application, sometimes you have to make them sit down and take their medicine so that their problems can go away. Had Microsoft bent to the demands of some of their more vocal users we wouldn’t have products like Windows 7 and Office 2010 that rose from the ashes of their predecessors. Of course many of the changes were initially driven by user feedback, so I’m not saying their input was completely worthless; rather, sometimes improving a product means annoying some of your loyal users, even if the changes are for their benefit.