Posts Tagged ‘wordpress’

WordPress Randomly Dying? Check MaxClients.

Long-time readers will know how much this blog has struggled with its various incarnations over the past 4 years. Initially I ran it from home on a server I was using for development purposes, so it ran inside a virtual machine that contained not one but two database engines (MS-SQL for development and MySQL for the blog), all behind a tenuous 1.5Mbit upstream connection. This held up OK until I wanted to do anything fancy like put pictures on there (anything over 50KB would kill the connection), and it was relatively unstable, going down for days at a time since I couldn’t get a reliable remote connection to it. Since then I’ve churned my way through different virtual private servers (and all the issues they have) before landing on my current Burst.NET Ubuntu box, which has been the best of the bunch so far.

Well, on the surface at least.

[Screenshot: the Apache MaxClients error as it appears in the Ubuntu error log]

Since my blog has attained a steady amount of traffic it usually doesn’t take long for someone to pipe up when it goes down, especially if it happens during the daytime in Australia. Since I now have remote access to the server I’m only one command away from rebooting it should anything happen, and I’ve done so multiple times when an outage has come to my attention. However there’s a good 12 or so hours during the day when I’m not really paying attention to the blog, due to being at home and/or asleep, and downtime during this period usually goes unnoticed until I try to log in during the morning. Since a good chunk of my audience is in the USA this can mean an awful lot of missed traffic, which isn’t the greatest way to start the day.

Now when I first set up the blog on this host there were a couple of teething issues (mostly due to my rusty Linux skills), but for probably two months afterwards everything ran without the slightest indication of a problem. Then every so often the blog would simply stop responding: the server would be up and everything else on it was running fine, but try as I might I couldn’t get it to serve a PHP page. Wanting to get it back up as quickly as I could I recycled the Apache service, it came back instantly, and I figured it was just some transient error and went back to my everyday blogging routine. However it kept happening, usually at the most inopportune times, so last weekend I sat down to find the root cause of the issue.

Turns out it’s WordPress itself.

The above screenshot shows the error pretty quickly: essentially Apache has reached the maximum number of clients it can serve and will start to reject users after that point. Whilst the causes of this are wide and varied, the culprit can usually be traced back to some WordPress plugin or script that’s opening up connections and then not closing them properly. The best way to take care of this is to fix the script in question, but since I have little interest in diving into the mess that is PHP I’ve simply upped the MaxClients setting, reduced the timeout period and scheduled a regular Apache restart to clear out anything that gets stuck open. All of these combined seem to be an effective solution in the meantime, and once I feel up to the task of delving through all the code to find the offending script I’ll nip it in the bud for good.
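For reference, the changes amount to something like the following. On Ubuntu the prefork settings live in /etc/apache2/apache2.conf; treat all the numbers here as illustrative rather than gospel, since the right values depend on how much RAM your box has and how much traffic you see:

```apache
# /etc/apache2/apache2.conf (Ubuntu, Apache 2.2 prefork MPM) -- values are illustrative
Timeout 45                      # down from the default 300 so hung connections get dropped sooner

<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          256     # up from the stock setting; size this to the RAM on your VPS
    MaxRequestsPerChild 1000    # recycle workers periodically so leaked connections get cleaned up
</IfModule>
```

The scheduled restart is just a root cron job along these lines, timed for the small hours:

```
# crontab -e (as root) -- restart Apache nightly to clear out anything left hanging
0 4 * * * /usr/sbin/service apache2 restart
```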

Apart from that little quirk, though, this iteration of the blog’s underlying infrastructure has been pretty fantastic, with all the plugins functioning the way I expect them to without me having to fiddle with web.config settings for hours on end. It’s also significantly faster, cutting page load times in half for dynamic pages and becoming near instant when a page is served from cache. You could attribute this to the fact that the new box is a lot beefier than its predecessor, but neither of them showed significant load for any extended period of time. I guess where I’m going with this is that if you’re going to host your own WordPress blog it’s just plain better on Linux, especially if you’ve got better things to be doing (like, you know, blogging).

Website Performance (or People are Impatient).

Way back when I used to host this site myself on the end of my tenuous ADSL connection, loading up the website always felt like something of a gamble. There were any number of things that could stop me (and the wider world) from getting to it: the connection going down, my server box overheating or even the power going out at my house (which happened more often than I realised). About a year ago I made the move onto my virtual private server and instantly all those worries evaporated; the blog has been mostly stable ever since. I no longer have to hold my breath every time I type my URL into the address bar, nor do I worry about posting media-rich articles anymore, something I avoided when my upstream was a mere 100KB/s.

What really impressed me though was the almost instant traffic boost that I got from the move. At the time I just put it down to more people reading my writing, as I had been at it for well over a year and a half at that point. I had also made a slight blunder with my DNS settings which redirected all traffic from my subdomains to the main site, so I figured the burst in traffic was temporary and would drop off as people’s DNS caches expired. The strange thing was that the traffic never went away and continued to grow steadily. Not wanting to question my newfound popularity I just kept doing what I was always doing, until I stumbled across something that showed me what was happening.

April last year saw Google mix a new metric into their ranking algorithm, page load speed, right around the same time that I experienced the traffic boost from moving off my crappy self-hosting and onto the VPS. The move had made a significant improvement in the usability of the site, mostly due to the giant pipe the VPS has, and it appeared that Google was now picking up on that and sending more people my way. Curiously the percentage of traffic coming here from search engines remained the same, but since overall traffic was growing I didn’t care to investigate much further.

I started to notice some curious trends though when aggregating data from a couple of different sources. I use two different analytics services here on The Refined Geek: WordPress.com Stats (just because it’s real-time) and Google Analytics for long-term tracking and pretty graphs. Both of them agree with each other pretty well; however, the one thing neither can track is how many people come to my site but leave before the page is fully loaded. In fact I don’t think there’s any particular service that can do this (I would love to be corrected on this), but if you’re using Google’s Webmaster Tools you can get a rough idea of the number of people who come from their search engine but get fed up waiting for your site to load. You can do this by checking the number of clicks you get from search queries and comparing that to the number of people visiting your site from Google in Google Analytics. This will give you a good impression of how many people abandon your site because it’s running too slow.
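To put some made-up numbers on it: if Webmaster Tools reports 1,000 clicks from search results in a month but Google Analytics only records 800 visits arriving from Google over the same period, then roughly 200 people, or 20% of those who clicked, gave up before the page finished loading and the tracking script ever ran.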

For this site the results are quite surprising. On average I lose about 20% of my visitors between them clicking on the link in Google and actually loading a page¹. I shudder to think how many I was losing back in the days where a page would take 10+ seconds to load but I’d hazard a guess it was roughly double that if I take into account the traffic boost I got after moving to a dedicated provider. Getting your site running fast then is probably one of the most important things you can do if you’re looking to get anywhere on the Internets, at least that’s what my data is telling me.

Since realising this I’ve been on a bit of a performance binge, trying anything and everything to get the site running better. I’m still in the process of doing so, however, and many of the tricks that people talk about for WordPress don’t translate well into the Windows world, so I’m basically hacking my way through it. I’ve dedicated part of my weekend to this and I’ll hopefully write up the results next week so that you other crazy Windows-based WordPressers can benefit from my tinkering.

¹If people are interested in finding out this kind of data from their Google Analytics/Webmaster Tools accounts let me know and I might run up a script to do the comparison for you.


IIS 7.5, WordPress and WinCache: A Match Made in Hell.

This blog has had a variety of homes over the past few years, although you wouldn’t know it by looking at it. Initially it was hosted on a Windows 2008 server I built myself, sitting behind the tenuous link of my ADSL connection. Don’t get me wrong, this is a great way to get started if you’ve got admin roots like me, but inevitably my ADSL connection would go down or people would just plain give up waiting for the site to load, what with my upstream only able to handle 100KB/s. Still, for most of its life the blog remained in that configuration as I couldn’t find a hosting provider I was happy with.

Of course the day came when WordPress decided to stop playing nice with IIS and started returning internal server error 500s. Thankfully it would usually right itself after a reboot, but it was always a countdown to the time when it would start erroring out again and, being the busy man that I am, I never had the time to troubleshoot it. Eventually I caved and set up an Ubuntu box to host it, figuring that all my woes would be solved by switching to the platform that everyone expects WordPress to run on. I’ll be honest, it was a good change as I could finally use all the caching plugins, and traffic took an upward trend thanks to the faster loading times.

Unfortunately that didn’t last particularly long either, as whilst the blog was particularly zippy the Linux VM would sometimes stop responding to requests and would only start behaving itself after a reboot. The cause of this I’m still not sure of, as the VM was still up but it just refused to keep serving web pages, including all the funky admin tools like phpMyAdmin and Webmin. It was around this time I found myself in possession of a shiny new VPS that was only hosting my fledgling app Lobaco, so I figured a small-time WordPress blog wouldn’t be too much for it to handle. Indeed it wasn’t, and the blog has been steaming along on it ever since.

However the unfortunate internal server errors eventually returned and, whilst I was able to get around them with the trusty old reboot a couple of times, they became more persistent until I eventually couldn’t get rid of them at all. After digging around in the event logs for a while I stumbled across references to php_wincache.dll, which upon googling led me to posts like these, showing I wasn’t alone in this internal server error hell. Disabling the extension fixed the problem and all was well with the world. Of course, many months later I found myself trying to optimize my blog again and I started looking at the things I had removed in order to keep this thing up and running.
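For anyone hitting the same wall, disabling it was nothing more exotic than commenting the extension out of php.ini and recycling the application pool, assuming WinCache was loaded the usual way:

```ini
; php.ini -- comment out the WinCache extension to take it out of the picture
;extension=php_wincache.dll
```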

The first was the caching plugins, which are unequivocally the best thing for performance on a dynamic PHP site. The vast majority of WordPress caching plugins don’t play nice with Windows as they assume they’re on Linux and attempt to write files in all sorts of wacky locations that simply don’t exist. WP-SuperCache, although still suffering from some Linux-based assumptions, can be wrangled into working properly with IIS and has been doing so for the past couple of months. I also found that WinCache had been updated since I had unceremoniously removed it from my php.ini file, so I decided to give it another try. Again everything was rosy for a time, that is until last weekend.
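(For anyone attempting the same wrangling: for me the bulk of it was just making sure the constants WP-SuperCache relies on are defined with paths that actually exist on a Windows box. A rough sketch of the relevant wp-config.php lines, with a hypothetical IIS layout that you’d adjust to your own install:)

```php
<?php
// Near the top of wp-config.php -- the path below is a hypothetical IIS layout, adjust to taste
define('WP_CACHE', true);  // tells WordPress to load the caching drop-in early
define('WPCACHEHOME', 'C:/inetpub/wwwroot/wp-content/plugins/wp-super-cache/');
```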

I fired up my blog on Saturday to find the home page coming up fine, but I was logged out for some reason. This happens from time to time so I wasn’t worried, but trying to log in left me with the dreaded internal server error 500. Poking around, it looked like any non-cached page was failing, meaning the majority of my site was unavailable. The event logs showed the dreaded WinCache dll failing again and disabling it brought my website back around. It seems, at least for now, that I’ll have to give WinCache a miss; the last update to it was almost three months ago and its past performance has led me to believe that it’s not entirely stable.

So if you’re crazy like me, trying to run WordPress on IIS and all, and your WordPress blog seems to take a dive more often than not, make sure to get rid of WinCache, at least until they get their act together. I haven’t delved into my previous VMs to see if it was the culprit back then, but my most recent set of problems can be traced directly back to WinCache wreaking havoc by attempting to cache PHP objects, and if this post can save one person the headache of trying to track it down I’ll consider it a huge success.

Respect for the Content Creators.

I’ve been at this whole blog thing for a while now. Not as long as many of the big names mind you but long enough to get into the culture and social conventions that fellow bloggers adhere to. As with anything on the Internet the rules are fast and loose and the worst thing that will happen to you for breaking them will usually be an angry email from someone you didn’t even know you could offend. For the most part though I’ve avoided incurring the wrath of any of my fellow netizens, apart from the good old fashioned trolls who make an appearance anywhere on the web.

One of these unspoken rules is that if you’re going to use someone’s content, maybe a quote from an article or a picture off their website, you provide a link back to their site. The reasoning behind this is that the biggest gateway to the Internet, Google, uses the number of sites linking in as a sort of popularity count to judge how relevant a site is to a particular search. The more links you have coming in the more popular you are and the higher up in the search results your site will appear. There are of course many other factors taken into consideration, but nothing beats a good old-fashioned link from someone else’s site to yours, especially if it comes from what Google considers to be a highly ranked page itself.

Personally I have no problem with giving out links to those who’ve created content that I have purloined for my site. Usually I’m taking a quote from an article that’s inspired me to write a post on something, and they deserve to have their work recognised. More often than not I’m not even using the content directly, instead giving them a link to support a view I’m putting forth myself. This healthy little eco-system of tit-for-tat means that the original content creators get the credit they deserve and the information gets freely distributed across the web.

More recently however it’s become apparent that some people are more interested in just taking the content without giving credit where it’s due. I’ve come across a couple of sites that have blatantly copied my articles verbatim and posted them as their own. You’d think I wouldn’t be able to find most of them, but since quite a few of my articles contain links to my other writings on this site these content thieves unwittingly send links my way. When their site is eventually crawled by Google they show up on the report that lists all the links coming back into my site. For the most part though they’re a minority, and I’ve happily ignored most of them (in fact most seem to disappear rather quickly, leading me to believe they’re probably scam/malware sites).

What does get me going though is webmasters who don’t trust people to do the right thing on the Internet. If you were one of the lucky few yesterday who read my charming piece on China vs Google you may have noticed an intrusive ad right in the middle of a quoted article. Now I make no secret that the original article came from The Register, and I did my civic duty in providing a link back to their site. Unbeknownst to me, however, the quote I took from that article, by way of copy and paste from the site, had been injected with a tangled mass of HTML and JavaScript that wasn’t visible in the WordPress editor. Additionally the code has been designed to only trigger when copied into an editor capable of rendering HTML, as it’s nowhere to be seen when the same text is copied into Notepad.

This isn’t the first time I’ve come across this kind of chicanery either. Many sites have been using a small bit of JavaScript to inject additional lines into content when it’s copied from their site, usually containing a link or two back to the source. I didn’t have a problem with most of these as they showed up in the editor and I never forget to give a link where it’s due. This new trick from The Register however was something far more sinister, as it not only hid itself from my view but also put a rather obnoxious ad for videos right in the middle of my post. Heaven forbid their ad server ever gets compromised and starts serving malware, which would in turn make me look like the perpetrator of such nefarious deeds.
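For the curious, the general shape of these copy-injection scripts is pretty simple: hook the copy event and quietly rewrite what lands on the clipboard. Here’s a minimal sketch of the technique in a modern browser (illustrative only, not The Register’s actual code, and the URL is a placeholder):

```javascript
// Illustrative sketch of a copy-injection script -- not any particular site's actual code
document.addEventListener('copy', function (event) {
  var selection = window.getSelection().toString();
  var injected = ' Read more at: <a href="http://example.com/original-article">example.com</a>' +
                 '<div><!-- promotional markup could be slipped in here --></div>';

  // HTML-aware editors (like the WordPress visual editor) paste the injected markup,
  // while the plain-text version stays clean -- which is why Notepad shows nothing.
  event.clipboardData.setData('text/html', selection + injected);
  event.clipboardData.setData('text/plain', selection);
  event.preventDefault();  // stop the browser overwriting the doctored clipboard data
});
```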

I’ve got no issue giving credit where credit is due to those people who work hard to get stories out and give us lowly bloggers a bit of fodder to toy with. However I take offence when the trust between the creator and their wider audience is broken and they resort to such spineless tactics as mangling the clipboard data with code that attempts to hide itself from plain view. If you have a problem with people copying your content don’t inject obnoxious advertisements; handle the situation properly instead. If you have terms of use for quoted content then copy those in. Sure, people will still get around it, but mangling the copied content with crappy HTML and JavaScript will only offend those who are more than likely trying their best to promote your content. The content pirates won’t care either way.

Sure it was a small thing and it took me all of 10 seconds to go into the HTML editor and remove it, but I can’t help feeling that the implicit trust that had been there for so long has been cast aside by those who think we’re all out to profit off their hard work. Nothing could be further from the truth; I want people to read the original articles, which is why I link to them. But there are a few organisations out there who just have to be unnecessarily rude by doing these things, and they’re not going to win any friends by doing so.

Don’t make me write a plugin to scrub your cruft from WordPress blogs automatically. Hell hath no fury like a blogger/programmer scorned.