Posts Tagged 'traffic'

Website Performance (or People are Impatient).

Way back when I used to host this server myself on the end of my tenuous ADSL connection, loading up the web site always felt like something of a gamble. There were any number of things that could stop me (and the wider world) from getting to it: the connection going down, my server box overheating or even the power going out at my house (which happened more often than I realised). About a year ago I made the move onto my virtual private server and instantly all those worries evaporated; the blog has been mostly stable ever since. I no longer have to hold my breath every time I type my URL into the address bar, nor do I worry about posting media-rich articles anymore, something I avoided when my upstream was a mere 100KB/s.

What really impressed me though was the almost instant traffic boost that I got from the move. At the time I just put it down to more people reading my writing, as I had been at it for well over a year and a half at that point. At the same time I had also made a slight blunder with my DNS settings which redirected all traffic from my subdomains to the main site, so I figured that the burst in traffic was temporary and would drop off as people's DNS caches expired. The strangest thing, though, was that the traffic never went away and continued to grow steadily. Not wanting to question my new-found popularity I just kept doing what I was always doing, until I stumbled across something that showed me what was happening.

April last year saw Google mix a new metric into their ranking algorithm: page load speed. That was right around the same time I experienced the traffic boost from moving off my crappy self hosting and onto the VPS. The move had made a significant improvement in the usability of the site, mostly due to the giant pipe the VPS has, and it appeared that Google was now picking up on that and sending more people my way. The percentage of traffic coming here from search engines remained the same, but since overall traffic was growing I didn't care to investigate much further.

I started to notice some curious trends, though, when aggregating data from a couple of different sources. I use 2 different kinds of analytics here on The Refined Geek: WordPress.com Stats (just because it's real-time) and Google Analytics (for long term tracking and pretty graphs). Both of them agree with each other pretty well; however, the one thing neither can track is how many people come to my site but leave before the page is fully loaded. In fact I don't think there's any particular service that can do this (I would love to be corrected on that), but if you're using Google's Webmaster Tools you can get a rough idea of the number of people who come from their search engine but get fed up waiting for your site to load. You can do this by checking the number of clicks you get from search queries and comparing that to the number of people visiting your site from Google according to Google Analytics. This will give you a good impression of how many people abandon your site because it's running too slow.
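If you want to run the comparison yourself, the maths is nothing more than a subtraction across two exports. Here's a minimal sketch in Python, assuming you've exported your search query clicks from Webmaster Tools and your Google search visits from Analytics as CSV files; the file names and column headers below are just placeholders for whatever your exports actually contain.

```python
# A rough sketch of the clicks-vs-visits comparison described above.
# The CSV file names and column headers are placeholders; substitute
# whatever your Webmaster Tools and Analytics exports actually use.
import csv

def total_column(path, column):
    """Sum an integer column from a CSV export."""
    with open(path, newline="") as f:
        return sum(int(row[column]) for row in csv.DictReader(f))

clicks = total_column("webmaster_tools_queries.csv", "Clicks")   # search result clicks
visits = total_column("analytics_google_visits.csv", "Visits")   # visits Analytics saw

abandoned = clicks - visits
print("Clicks from Google search:", clicks)
print("Visits recorded by Analytics:", visits)
print("Abandoned before the page loaded: {} ({:.1%})".format(abandoned, abandoned / clicks))
```

The gap between the two numbers is your rough abandonment figure; it won't be exact (ad blockers and the like will skew it) but it's good enough to spot a trend.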

For this site the results are quite surprising. On average I lose about 20% of my visitors between them clicking on the link in Google and actually loading a page¹. I shudder to think how many I was losing back in the days when a page would take 10+ seconds to load, but I'd hazard a guess it was roughly double that if I take into account the traffic boost I got after moving to a dedicated provider. Getting your site running fast, then, is probably one of the most important things you can do if you're looking to get anywhere on the Internets; at least that's what my data is telling me.

Since realising this I've been on a bit of a performance binge, trying anything and everything to get the site running better. I'm still in the process of doing so, however, and many of the tricks that people talk about for WordPress don't translate well into the Windows world, so I'm basically hacking my way through it. I've dedicated part of my weekend to this and I'll hopefully write up the results next week so that you other crazy Windows-based WordPressers can benefit from my tinkering.

¹If people are interested in finding out this kind of data from their Google Analytics/Webmaster Tools accounts let me know and I might run up a script to do the comparison for you.


Unexpected Joy.

Ah finally, a day where waking up was as natural as walking. I rolled over and checked the time: 8:00am, a new record for this wearied traveller. I put my arm over my loved one and found her awake too, which set off our normal morning ritual of getting ready for the day ahead. We had informally set aside this day to make our respective returns to Disneyland but hadn't looked into it much past that. Figuring we could work out the details over breakfast we set about finding somewhere to get something to eat. Since this was our first full day here we assumed that finding something would be as easy as it was in NYC. Unfortunately it wasn't, and we wandered for a good 15 minutes before finding anything; our future plans now include finding somewhere good to go and making that our regular joint.

Just like NYC, southern California has its own version of CityPass, which I picked up on when looking for cheap Disneyland tickets. Since it had been such good value before, we decided to get them again even though we didn't think that we'd need 3 days at Disneyland (although we welcomed it). After battling through the dregs of the morning rush traffic we found ourselves outside those hallowed gates, now fronted by a multi-story parking lot capable of holding thousands of cars. We parked and then caught the tram into Disneyland proper and began our revisiting of childhood and teenage memories.

I’ll be honest, I was mostly doing this for Rebecca and was happy to go along with it as long as she enjoyed herself. That all changed almost the second I saw the massive TRON Legacy stage that was smack bang in the middle of the walkway. I had forgotten that Disney was behind the whole TRON thing, not to mention a whole lot of other cool things like their Tomorrowland Exhibits which house the Honda ASIMO robot. We spent a good 4 hours touring the various shows and riding the California Screamer rollercoaster. Towards the end of the day Rebecca wanted to ride on of those rapids rides which, while I’m not adversed to usually, didn’t particularly want to ride on since I’d be wearing wet jeans until we could get back to the hotel. Still I thought I’d go along with it and you can pretty much guess the outcome, I got completely soaked. Since we had booked tickets for Medieval Times I didn’t want to spend the next 5 hours in wet pants so we headed back to the hotel, and that’s when things took a turn for the worse.

Now I had been in some pretty heavy traffic back when I was driving around in Miami, but it was all pretty comparable to, say, Sydney or Melbourne during their busier times. Still we figured that leaving at 4:30PM we'd have enough time to get to the hotel and back to Medieval Times before it started at 7:00PM. It took us about 2 hours to get to the hotel, which was only 28 miles from Disneyland. Thankfully the trip back was a little better, but it still took us almost an hour to drive back, making us late for our tournament. Still we were assured we'd get in as long as we had our tickets, so we persevered.

Once we got in we grabbed our seats, right up in the front row. I had found a good set of coupons that gave us the best package you could get for less than the cost of the regular tickets, and boy was it worth it. Even on a Thursday night the place was packed to the rafters and the crowd was roaring as we walked in. The entire show is really something to behold, with skilled equestrians, falconers and enough choreographed fights to give me a thrill I wouldn't soon forget. I'd had reservations when I first heard about it, but after experiencing it for myself I was totally hooked, cheering for our champion and booing all others. If anyone has a chance to catch one of these shows I highly recommend you do; it's 2 hours that you'll be sure to enjoy, especially if you're a fantasy nut like myself. Thankfully the drive home was much smoother than the previous trips, getting us back to the hotel in record time.

Tomorrow we’ve got some errands to take care of (like getting our clothes washed) but we’re hoping to hit up Universal Studios in Hollywood. After today’s experience at Disneyland I must say I’m now looking forward to visiting the other theme parks as well as spending another 2 days at Disneyland, as the tickets we bought allow us to. Hopefully too my server will be up soon so I can post these for you guys because I know how desperate you are to hear about my last week over here 😉

Welcome to the Smog.

I was awake long before the alarm went off, an annoying trait I picked up many years ago when I was tinkering with my body clock. It always seems that whenever I set an alarm I'll be awake at least 10-20 minutes beforehand, but should I try to wake without an alarm I'll more than likely oversleep. Still I wasn't as tired as I thought I would be and 30 minutes later I was ready to go, spending the last few moments in our hotel room watching the morning news whilst Rebecca got ready. We left the hotel 30 minutes later and caught a cab to JFK airport, where we spent the next hour or so tracking down some breakfast before boarding our flight.

JFK to LA is one of the longest domestic flights in the States and it was set to take about 5 hours to get to our destination. Thankfully this flight had WiFi, and in a fit of forethought I had bought a monthly pass to the Delta in-flight system for only $7 more than the daily pass, granting me the ability to catch up on my tweets and Facebook posts. I didn't use it much more than that though; I was far too engrossed in the last 150 pages of Pandora's Star to care about much else, practically gulping down those last chapters without coming up for air. Knowing that I had 5 hours to kill and more than enough time to finish that book, I had brought the sequel, Judas Unchained, along with me and started tucking into that immediately. I got about 100 pages into it before the call came for us to prepare for landing in LA. Thanks to the 3 hour time difference our 5 hour flight had only cost us 2 hours of clock time, putting the local time smack bang on midday.

After grabbing our luggage we made our way to the airport shuttle area to catch a bus to the hire car place. I had spent a lazy hour or so looking over the cars available the night before and settled on a Dodge Charger. Realistically it would be far too big for us, more suited to a small family, but I wanted something that would provide a good deal of comfort over some of the longer drives we were planning and the econoboxes weren't really going to cut it. Arriving there we were told they were out of Chargers, and the only comparable car they had resembled a battered old Ford Falcon. A "mid-sized" SUV was available for a similar price so we went ahead with that and got to pick our car. I picked a Jeep Grand Cherokee and got us on our way to the hotel with gusto.

After the initial excitement of seeing the city I realised that what I thought were clouds on the horizon was actually a thick blanket of smog. As we got closer it only got worse, the haze giving most of the buildings a particularly eerie glow. Even in NYC the pollution wasn't this bad: when we looked back at Manhattan from Liberty Island we could clearly see everything, whereas from a similar distance in LA we were struggling to make out the more distant buildings. Needless to say, I didn't feel much like looking around town.

The hotel itself is in downtown LA, a fact I had failed to notice when booking it. Usually this wouldn't be much of a concern but I knew it meant they'd be charging like a wounded bull for their parking. Indeed they were, to the tune of $35 a night, as much as the car had cost to hire per day. Still the valets were nice and it meant we didn't have to worry about parking, but after the amount we paid for the hotel getting hit for extras feels a bit rich. I had the same feeling when I hooked up my laptop in the room only to be shown a sign-in page asking $13 a day for Internet access. I'd stayed in a $70/night hotel that gave me the fastest Internet connection I've had in the entire US for free, so this was just the icing on the shit sandwich. Undeterred I started poking around to see if they were doing ARP poisoning like the DoubleTree was in NYC and found that they weren't. 10 minutes of trying various MAC addresses later I was up and running with an Internet connection, without having to shell out for something that I don't believe should be a paid extra in a 4 star hotel.
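For the curious, checking for ARP poisoning doesn't take much: if lots of IP addresses in your ARP cache suddenly resolve to the same MAC address, something on the network (usually the captive portal's gateway) is answering ARP on everyone's behalf. Here's a rough sketch of that check in Python; it shells out to arp -a, and the parsing is an assumption based on the Windows output format, so treat it as illustrative rather than gospel.

```python
# A rough ARP poisoning check: many IPs resolving to one MAC address is
# a strong hint that a single box is impersonating the rest of the network.
# Parsing assumes the Windows "arp -a" output format.
import subprocess
from collections import defaultdict

def arp_table():
    """Return a {mac: [ips]} mapping parsed from 'arp -a'."""
    macs = defaultdict(list)
    output = subprocess.check_output(["arp", "-a"], text=True)
    for line in output.splitlines():
        parts = line.split()
        # entry rows look like: "  192.168.1.1    00-11-22-33-44-55   dynamic"
        if len(parts) >= 2 and parts[0].count(".") == 3 and "-" in parts[1]:
            macs[parts[1].lower()].append(parts[0])
    return macs

for mac, ips in arp_table().items():
    if len(ips) > 1:
        print("{} claims {} addresses: {}".format(mac, len(ips), ", ".join(ips)))
```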

We wanted to pick up some hotel room supplies so I tracked down the closest Walmart and punched in the co-ordinates. About 10 minutes later we were passing through a pretty run down section of downtown LA. We passed the supposed location twice and noticed that what was supposed to be a Walmart looked like a run down strip mall. Figuring that it was either a planned site or (more likely) one that had closed down, I found another and promptly got stuck in pre-rush hour traffic, a 10 mile journey taking almost 45 minutes. We got there though and secured our wants before heading back to the hotel for a workout before dinner. We then hit a local pizza place that served amazing pizzas, leaving us both too full to finish and taking a box home to polish off later.

We’re hoping to hit up Disney World tomorrow as it’s something that Rebecca has really been looking forward to. After visiting Disney Land in Tokyo almost a decade ago and hating it (but then again I was an angsty little bastard) I’m keen to see if I’ll have a similar reaction the second time around. Plus it will hopefully get us away from the dreadful smog that seems to be unrelenting around here, making staying in the city very undesirable.

Resistance is Futile, Integration is Inevitable.

Enabling your users to interact with your application through open APIs has been a staple of the open web since its inception over a decade ago. Even before that, the notion of letting people modify your product helped to create vast communities of people dedicated to either improving the user experience or creating features that the original creators overlooked. I can remember my first experience with this vividly, creating vast levels in the Duke Nukem 3D level creator and showing them off to my friends. Some of these community-developed products can even become the killer feature of the original application, and whilst this is a boon for the application it poses some issues for the developer.

Probably the earliest example of this that I can think of would have to be World of Warcraft. The client has a pretty comprehensive API available that enables people to create modifications that do all sorts of wonderful things, from the more mundane inventory managers to boss timer mods that help keep a raid coordinated. After a while many mods became must-haves for any regular player, and for anyone who wanted to join in the 40 person raids they became critical to achieving success. Over the years many of these staple mods were replaced by Blizzard's very own implementations of them, ensuring that anyone who was able to play the game was guaranteed to have them. Whilst most of the creators weren't enthused that all their hard work was being usurped by their corporate overlords, many took it as a challenge to create even more interesting and useful mods, ensuring their user base stayed loyal.

More recently this issue has come to light with Twitter, who are arguably popular due to the countless hours of work done by third parties. Their incredibly open API has meant that anything they were able to do others could do too, even to the point of doing it better than Twitter themselves. In fact it's at the point where only a quarter of their traffic is actually on their main site; the other three quarters comes through their API. This shows that whilst they've built an incredibly useful and desirable service they're far from the best providers of it, with their large ecosystem of applications filling in the areas where it falls down. More recently however Twitter has begun incorporating features into its product that used to be provided by third parties, and the developer community hasn't been too happy about it.

The two most recent bits of technology that Twitter has integrated are the new Tweet button (previously provided by TweetMeme) and their new link shortening service t.co, a job previously handled by dozens of others. The latter wasn't unique to Twitter at all, and whilst many of the newcomers to the link shortening space made their name on Twitter's platform, many of them report that it's no longer their primary source of traffic. The t.co shortener is then really about Twitter taking control of the platform that they developed, and possibly using the extra data they can gather from it as leverage in brokering advertising and partnership deals. The Tweet button however is a little bit more interesting.
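There's no magic in the shortening itself, which is why dozens of services could offer it: keep a table of long URLs, hand out a short base-62 code for each, and log every resolution on the way through, the click log being the data Twitter is presumably after. Here's a toy sketch of that mechanism in Python; the class and the use of the t.co domain are purely illustrative, not anything Twitter has published.

```python
# A toy t.co-style shortener: store the long URL, hand back a short
# base-62 code, and count every resolution before redirecting.
import string

ALPHABET = string.digits + string.ascii_letters  # 62 characters

class Shortener:
    def __init__(self):
        self.urls = []     # id -> long URL
        self.clicks = {}   # code -> resolution count

    def shorten(self, long_url):
        """Store the URL and return its short form."""
        self.urls.append(long_url)
        n, code = len(self.urls) - 1, ""
        while True:                      # encode the id in base 62
            code = ALPHABET[n % 62] + code
            n //= 62
            if n == 0:
                break
        self.clicks[code] = 0
        return "http://t.co/" + code

    def resolve(self, code):
        """Look up the long URL, logging the click on the way through."""
        self.clicks[code] += 1           # every click becomes a data point
        n = 0
        for ch in code:                  # decode base 62 back to the id
            n = n * 62 + ALPHABET.index(ch)
        return self.urls[n]
```

The resolve step is the interesting part: because every shortened link passes through the service, whoever runs it sees every click, which is exactly the kind of data that's valuable when brokering advertising deals.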

Way back when, news aggregator sites were all the rage. From Digg to Del.icio.us to Reddit there were all manner of different sites designed around the central idea of sharing online content with others. Whilst the methods of story aggregation differed from service to service, most of them ended up implementing some kind of "Add this story to X" button that could be put on your website. This served two purposes: first it helped readers show a little love to the article by giving it some attention on another site, and second it gave content to the other site to link to, with little involvement from the user. The TweetMeme button then represented a way to drive Twitter adoption further and at the same time gather even more data on users than before. Twitter, for what it's worth, said they licensed some of the technology from TweetMeme for their button; however, they have still in essence killed off one of their ecosystem's popular services and that's begun to draw the ire of some developers.

The issue many developers take with Twitter building these services into their main product is that it puts a chilling effect on products built on Twitter's ecosystem. Previously, if you had built something that augmented their service, chances were you could build yourself quite the web property. Unlike other companies, which would acquire these innovators' companies in order to integrate their technology, Twitter has instead taken to developing the same products themselves, in direct competition with those innovators. The reason behind this is simple: Twitter doesn't have the cash available to do acquisitions like the big guys do. They're kind of stuck between a rock and a hard place, as whilst they need to encourage innovation on their platform they can't let it go on forever, lest they become irrelevant past delivering an underlying service. Realistically the best option for them is to start generating some cash in order to acquire innovators' technology rather than out-competing them, but they're still too cash poor for this to be viable.

In the end, if you build your product around someone else's service you're really putting yourself at their mercy. The chill that Twitter is putting on their developers probably won't hurt them in the long run, provided they don't keep copying others' solutions to their problems; however, their fledgling advertising-based business model is at odds with all the value-add developers. Twitter is quite capable of doing some impressive innovation on their own (see #newtwitter) but their in-house development is nothing compared to the hordes of third parties who've been doing their part to improve the ecosystem. I'm interested to see what direction they go with this, especially since I'm working on what could be classed as a competing service.

Although I’m hoping people don’t see it that way 😛