You don’t have to look far on this blog to know that I’m a Sony fan, although my recent choice in products might tell you otherwise. I genuinely appreciate them as a company: whilst they’ve made a whole bunch of mistakes they’ve also delivered some amazing products over the years, typically in industries where they’re far from being leaders. My relationship with them began many years ago when I first laid my hands on the original PlayStation console and has continued on since then.
Today they announced the next generation of their home entertainment systems: the PlayStation 4.
Whilst the event is still unfolding as I write this, a lot of rumours have already been confirmed, surprises unveiled and, of course, a whole bunch of marketing blather delivered that no one is interested in hearing. Among the confirmed rumours are that it’s an x86 platform under the hood, that the controller has a touchpad on it (among several other features, including a Kinect-esque motion tracking system) and that it uses a customized PC GPU. Of course the really interesting things are the features that managed to remain secret throughout the various leaks and speculative sprees of the past couple of months.
For starters it appears that the PS4 will come equipped with a whopping 8GB of GDDR5 rather than the 4GB that was previously advertised. This is interesting because the Durango apparently faced issues trying to integrate that amount of memory due to the bandwidth requirements and thus opted for DDR3 plus a speedy 32MB cache to counteract that. Sony has either made a last minute change to the design to get specification parity (although 4GB of GDDR5 is arguably much better than 8GB of DDR3) or had this planned for quite a long time, meaning they overcame the engineering challenge that Durango couldn’t (or wouldn’t, for various reasons).
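To give a rough sense of why that bandwidth gap matters: peak memory bandwidth is just the per-pin data rate multiplied by the bus width. A quick sketch using commonly cited (but unconfirmed, so treat them as my assumptions) figures of a 256-bit bus on both consoles, GDDR5 at an effective 5.5Gbps per pin versus DDR3-2133:

```python
def bandwidth_gb_per_s(effective_rate_gbps, bus_width_bits):
    # peak bandwidth = per-pin data rate x bus width, converted bits -> bytes
    return effective_rate_gbps * bus_width_bits / 8

gddr5 = bandwidth_gb_per_s(5.5, 256)    # 176.0 GB/s
ddr3 = bandwidth_gb_per_s(2.133, 256)   # ~68.3 GB/s
print(gddr5 / ddr3)                     # GDDR5 delivers roughly 2.6x the throughput
```

Which is exactly the sort of gap a 32MB cache is there to paper over.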
One of the much speculated features was the integration of streaming services, allowing users to share screenshots, game clips and all manner of things. Part of the leaked specifications for both Durango and Orbis hinted at an external processing unit that would enable this without the main GPU or CPU taking a hit. This has come to fruition and it appears that Ustream will be the platform of choice. Whilst I know a lot of people aren’t particularly thrilled with this (it seems a lot of us gamers didn’t get out of the anti-social gaming box we cocooned ourselves in during our formative years) for someone like me who reviews games it’s an absolute godsend, as it means my convoluted recording rig won’t be required just to grab a few in-game screenshots. Realistically this is just an organic progression from features that have appeared in some games to having them available natively in the platform, something I’m sure developers are thankful for.
There’s also a swath of remote play stuff which looks like a natural progression of the stuff that’s already in the PS3/PSP combo. Some of the pictures shown during the stream indicate that it might extend further than just the Vita and that’d definitely be something as not everyone (not even me, shocking I know) wants to invest in a Vita in order to get that kind of functionality. With their acquisition of Gaikai, which was ostensibly for the streaming backwards compatibility that’ll come for PS1/2/3 games, they do have the opportunity to take that same streaming and let you play your games anywhere with your PS4 providing the underlying grunt. There’s no mention of that specifically but all the key parts are there and that’d certainly give them a leg up on Microsoft when it comes to delivering a ubiquitous platform.
Fanboyism aside the PS4 does genuinely look like a great piece of hardware and the services that are being built on top of it are going to be really competitive. Sony has been lagging behind Microsoft for a long time in the services space and it looks like for the first time they’ll at least be at parity with them. We’ll have to wait for the Durango announcement first before we can make true comparisons between the two but if the leaks are anything to go by it’s going to be a good time for us gamers, whatever our chosen platform is.
Now if only they gave us a release date. That one delicious piece of information is curiously absent.
There’s an expectation upon purchasing a console that it will remain current for a decent length of time: long enough that you feel you got your money’s worth, whilst not so long that the hardware starts to look dated in comparison to everything else that’s available. Violating either of these two constraints usually leads to some form of consumer backlash, like it did when the Xbox 360 debuted rather shortly after the original Xbox. With the next generation bearing down on us, how long this generation of consoles will last, and more importantly stay relevant, is at the forefront of many people’s minds.
Certainly from a purely specifications perspective the next generation of high performance consoles aren’t going to be among the fastest systems available for long. Both of them sport current gen CPUs and GPUs, yet it’s quite likely their hardware will be superseded before they ever hit the retail shelves. AMD is currently gearing up to release their 8000 series GPUs sometime in the second quarter of this year. The CPUs are both based on AMD’s Jaguar micro-architecture and should be current for at least a year or so after their initial release, at least in terms of the AMD line, although Intel’s Haswell, scheduled for release around the middle of this year, means even the CPUs will be somewhat outdated upon release. This is par for the course for any kind of IT hardware however, so it shouldn’t come as much of a surprise that more powerful options will be available even before launch.
Indeed consoles have always had a hard time keeping up with PCs in terms of raw computing power, although the consistent, highly optimizable platform they provide is what keeps them in the game long after their hardware has become ancient. There does come a time however when the optimizations just aren’t sufficient and the games start to stagnate, which is what led to the more noticeable forms of consolization making their way into PC games. It’s interesting to note this as, whilst the current generation of consoles has been wildly popular since inception, the problem of consolization wasn’t really apparent until many years afterwards, ostensibly when PC power started to heavily outstrip the consoles’ abilities.
Crytek head honcho Cevat Yerli has gone on record saying that even the next gen consoles won’t be able to keep up with PCs when it comes to raw power. Now this isn’t a particularly novel observation in itself, any PC gamer would be able to tell you this, but bringing in the notion of price is an intriguing one. As far as we can tell the next generation of consoles will come out at around $600, maybe $800 if Sony/Microsoft don’t want to use them as loss leaders any more. Whilst they’re going to be theoretically outmatched by $2000 gaming beasts from day 1 it gets a lot more interesting if we start making comparisons to a similarly priced PC and the capabilities it will have. In that regard consoles actually offer quite a good value proposition for quite a while to come.
So out of curiosity I specced up a PC that was comparable to the next gen consoles and came out at around $950. At this end of the spectrum prices aren’t affected as much by Moore’s Law since the parts are so cheap already, and the only component likely to see major depreciation would be the graphics card, which came in at about $300. Still, taking the optimizations that can be made on consoles into account, the next gen consoles represent pretty good value for the performance they will deliver on release and will continue to do so for at least 2~3 years (1~2 iterations of Moore’s Law) afterwards thanks to their low price point. Past that the then-current generation of CPUs and GPUs will perform well enough at the same price point to beat them on performance per dollar.
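The rule-of-thumb arithmetic behind that claim can be sketched out. Treating Moore’s Law very loosely as equivalent performance halving in price every ~18 months (a big simplification, and the $950 figure is just my spec-up above, not a real quote):

```python
def equivalent_price(initial_price, years, doubling_period_years=1.5):
    # crude Moore's Law rule of thumb: hardware of equivalent performance
    # roughly halves in price every ~18 months
    return initial_price / 2 ** (years / doubling_period_years)

print(equivalent_price(950, 3))  # 237.5: what comparable PC grunt might cost 3 years on
```

By that back-of-the-envelope logic, a console launching at $600 or so only loses its value edge once a few of those halvings have elapsed.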
In all honesty I hadn’t really thought of making a direct comparison at the same price point before and the results were quite surprising. The comparison is even more apt now thanks to the next generation coming with an x86 architecture underneath, which essentially makes them cheap PCs. Sure they may never match up to the latest and greatest but they sure do provide some pretty good value. Whilst I didn’t think they’d have trouble selling these things this kind of comparison will make the decision to buy one of them that much easier, at least for people like me who are all about extracting the maximum value for their dollars spent.
Well another year has gone by since my last post on the iPad so that must mean it’s time for Apple to release another one. The tech media has been all abuzz about what Apple had in store for us today (like there was any doubt) ever since Apple sent out invites to the event that, as of writing, is still taking place. Speculation has been running rampant as to what will be included in the next version and what will be left by the wayside. Not wanting to disappoint their fans Apple has announced the next version of the iPad (strangely bereft of any nomenclature denoting its version) and it’s pretty much met expectations.
Usually I’d chuck a photo of the device up here for good measure but the new iPad is basically identical to the last one as far as looks go, being only slightly thicker and heavier than its predecessor. Honestly there’s little room for innovation in looks as far as tablets go, just look at any other tablet for comparison, so it’s no surprise that Apple has decided to continue with the same original design. Of course this might dismay some of the Apple fans out there, but there’s at least one defining feature that will visually set the new iPad apart from its predecessors.
That feature is the screen.
If you cast your mind back a year (or just read the first linked post) you’ll notice that rumours of a retina level screen for the iPad have been circulating for quite some time. At the time many commented that such a resolution would be quite ludicrous, like near the resolution of Apple’s 30″ cinema displays kind of ludicrous. Sure enough the now current generation of iPad sports a 2048 by 1536 resolution display, which gives it a PPI of 264, double that of the iPad 2. Whilst everyone is calling this a “retina” level display it’s actually some way off it, as the screen in the iPhone 4S sports 326 PPI, around 23% greater pixel density. The display will still look quite incredible, hell even monitors with a lower resolution and an order of magnitude more size manage to look great, but calling it a retina display is at best disingenuous.
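For the curious, pixel density is just the diagonal resolution in pixels divided by the diagonal screen size in inches, which is where the 264 and 326 figures come from (the screen sizes below are the advertised diagonals):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # pixel density = pixel count along the diagonal / diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2048, 1536, 9.7)))  # 264 for the new iPad
print(round(ppi(960, 640, 3.5)))    # ~330 for the iPhone 4S (Apple quotes 326)
```

The 2048×1536 panel has exactly four times the pixels of the iPad 2’s 1024×768, which is why the PPI exactly doubles on the same 9.7″ diagonal.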
Of course to power that monster of a screen Apple has had to upgrade the processor. The new chip is dubbed the A5X and sports a dual core general CPU and a quad core graphics chip. As always Apple is keeping the gritty details a closely guarded secret but it’s safe to assume that it sports a faster clock rate and has more integrated RAM than its predecessor. I wouldn’t be surprised if it was something along the lines of 1.2GHz with 1024MB of RAM as that would put it on par with many other devices currently on the market. We’ll have to wait for the tear downs to know for sure though.
Apart from that there’s little more that’s changed with the new iPad. The camera is slightly better being able to take 5MP stills and film 1080p video. Whilst you won’t find Siri on this yet you will now be given the option of doing speech-to-text on the iPad. That’s pretty much it for what’s new with the iPad and whilst I wouldn’t think that’d be a compelling reason to upgrade from the 2 I’m sure there will be many who do exactly that.
I’ll be honest with you, I’ve been eyeing off an iPad for quite some time now. I had had my eye on many an Android tablet for a while but the fact remains that the iPad has the greatest tablet ecosystem and for the use cases I have in mind (read: mostly gaming) there’s really no competition. The new iPad then, whilst not being worth the upgrade in my opinion, has reached a feature level where it represents good value for those looking to enter into the tablet market. If you’re just looking for a general tablet however there are many other options which would provide far more value, bar the insanely high resolution screen.
Apple’s yearly release schedule seems to be doing wonders for them and the new iPad will not likely be an exception to that. Past the screen and new processor there’s really nothing new about the now current generation iPad but I can see many people justifying their purchase based on those two things alone. The really interesting thing to watch from now will be how Apple goes about developing their ecosystem as whilst the iPad can boast the best tablet experience Google’s not too far behind, just waiting for the chance at the crown.
7 months down the line and I’m still a big fan of my Samsung Galaxy S2. It’s been a great phone, combining a large screen with a slim, lightweight shell that I sometimes have to check my pocket for to remind myself that it’s still there. It’s surprisingly resilient as well, having survived more than a couple of drops from pretty decent heights and coming out the other end with only minor scuffs and nary a scratch on the screen. Sadly I can’t say as much for the battery life, as it seems the more apps I pile on there the worse it gets, but I can’t really blame the phone for my app hoarding ways.
However I always knew this relationship would be temporary, I mean how could it not? It started with geek wanderlust, and as with all relationships that start that way it was inevitable that my eyes would begin to wander, and so they have with this announcement:
…Ladies and gentlemen, here is the Samsung Galaxy S III:
- 1.5GHz quad-core Samsung Exynos processor
- 4.8-inch “full HD” 1080p resolution with 16:9 aspect ratio display
- A 2-megapixel front-facing camera and an 8-megapixel rear camera
- Ceramic case
- 4G LTE
- Android 4.0
I’ll spare you the photoshopped Galaxy S2 images that are doing the rounds but suffice to say those specs are pretty darn amazing. They’re also fairly plausible given Samsung’s research into the component technologies and current trends for both carriers and the Android platform. The detail that caught my eye however was the ceramic case, as that’s not a material you’d usually expect to see on a mobile phone, plastic and glass being the only 2 real choices. There could be reasoning behind it though and, if my suspicions are correct, it’s due to the crazy amount of tech they’ve stuffed under the hood.
Traditionally ceramics are pretty poor heat conductors which is why they make for good mugs and insulation materials. However there are quite a few advanced ceramics that are very capable of moving heat just as efficiently as most metals are, some even better. Now anyone who has a dual core smart phone knows how hot the buggers get when you’re using them for an extended period and since most phones are plastic that heat tends to stick around rather than dissipate. The ceramic case could then be an attempt to mitigate the heat problems that will come with the quad core processor and larger screen. This also has the potential to make the phones somewhat more brittle however (ceramics don’t flex, they shatter) so it will be interesting to see how Samsung compensates for that.
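To put some rough numbers on that, steady-state heat flow through a case wall follows Fourier’s law, q = kAΔT/L. The material values below are ballpark figures from memory (ABS plastic around 0.2 W/m·K, an advanced ceramic like alumina around 30 W/m·K) and the geometry is made up, purely to show the order-of-magnitude difference rather than model any real phone:

```python
def conducted_watts(k_w_per_mk, area_m2, thickness_m, delta_t_k):
    # Fourier's law for steady-state conduction: q = k * A * dT / L
    return k_w_per_mk * area_m2 * delta_t_k / thickness_m

# a 50 cm^2 patch of 1mm-thick casing, 10 K hotter inside than out
print(conducted_watts(0.2, 0.005, 0.001, 10))  # 10.0 W through plastic
print(conducted_watts(30, 0.005, 0.001, 10))   # 1500.0 W through ceramic
```

In practice the case is never the only thermal path, but a two-orders-of-magnitude difference in conductivity is exactly the sort of headroom a quad core SoC would appreciate.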
With just those few details though I’m already excited for Samsung’s next instalment in their flagship line of smart phones. The last 2 iterations of the Galaxy S line have gone from strength to strength, firmly cementing Samsung’s place as the number one Android handset manufacturer. The Galaxy S3 looks to continue this trend with specifications that are sure to tempt even the most recent purchasers of the S2. I know I’ll find it hard to resist and I’m thankful that it probably won’t be out for a little while longer.
I don’t think my wallet would appreciate buying 2 phones within 7 months of each other 😉
Like any technology geek, real world performance is the most important aspect for me when I’m looking to purchase new hardware. Everyone knows manufacturers can’t be trusted with ratings, especially when they come up with their own systems that produce big numbers meaning absolutely nothing, so I primarily base my purchasing decisions on aggregating reviews from various sources around the Internet in order to get a clear picture of which brand/revision I should get. After that point I usually go for the best performance per dollar, as whilst it’s always nice to have the best components the price differential is usually not worth the leap, mostly because you won’t notice the incremental increase. There are of course notable exceptions to this hard and fast rule, and realistically my decision in the end wasn’t driven by rational thought so much as it was pure geeky lust after the highest theoretical performance.
Solid State Drives present quite an interesting value proposition for us consumers. They are leaps and bounds faster than their magnetic predecessors thanks to their ability to access data instantaneously and their extremely high throughput rates. Indeed with the hard drive being the bottleneck of performance for nearly every computer in the world the most effective upgrade you can get is that of a SSD. Of course nothing can beat magnetic hard drives for their cost, durability and capacity so it’s very unlikely that we’ll be seeing the end of them anytime soon. Still the enormous gap that separates SSDs from any other storage medium brings about some interesting issues of its own: benchmarks, especially synthetic ones, are almost meaningless for end users.
I’ll admit I was struck by geek lust when I saw the performance specs for the OCZ Vertex 3, they were just simply amazing. Indeed the drive has matched up to my sky high expectations with me being able to boot, login and open up all my applications in the time it took my previous PC just to get to the login screen. Since then I’ve been recommending the Vertex 3 to anyone who was looking to get a new drive but just recently OCZ announced their new budget line of SSDs, the Agility 3. Being almost $100 cheaper and sporting very similar performance specs to that of the Vertex it’s a hard thing to argue against especially when you consider just how fast these SSDs are in the first place.
Looking at the raw figures it would seem like the Agility series are around 10% slower than their Vertex counterparts on average, which isn’t bad for a budget line. However when you consider that the 10% performance gap is the difference between your windows loading in 6.3 seconds rather than 7 and your applications launching in 0.9 seconds instead of 1 then the gap doesn’t seem all that big. Indeed I’d challenge anyone to be able to spot the differences between two identical systems configured with different SSDs as these kinds of performance differences will only matter to benchmarkers and people building high traffic systems.
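The point being that a relative gap shrinks to nothing in absolute terms when the baseline is already tiny. The 7 second and 1 second baselines are just the illustrative figures above, not benchmark results:

```python
def absolute_gap_s(baseline_s, relative_gap):
    # a fixed percentage gap matters less the shorter the operation is
    return baseline_s * relative_gap

print(absolute_gap_s(7.0, 0.10))  # ~0.7s difference on a boot
print(absolute_gap_s(1.0, 0.10))  # ~0.1s difference on an app launch
```

Below a few tenths of a second the difference is under human perception anyway, which is why the budget line is such an easy recommendation.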
Indeed one of my mates had been running a SSD for well over a year and a half before I got mine and from what he tells me the performance of units back then was enough for him to not notice any slow down after not formatting for that entire time. Likely then if you’re considering getting a SSD but are turned off by the high price of current models you’ll be quite happy with the previous generation as the perceived performance will be identical. Although with the Agility 3 120GB version going for a mere $250 the price difference between generations isn’t really that much anymore.
Realistically SSDs are just the most prominent example of why synthetic benchmarks aren’t a good indicator of real world performance. There’s almost always an option that will provide similar performance for a drastically reduced price and for the end user the difference will likely be unnoticeable. SSDs are just so far away from their predecessors that the differentials between the low and high end are usually not worth mentioning, especially if you’re upgrading from good old spinning rust. Of course there will always be geeks like me whose lust will overcome their sensibility and reach for the ultimate in performance, which is why those high end products still exist today.
My main PC at home is starting to get a little long in the tooth, having been ordered back in the middle of 2008 and only receiving upgrades of a graphics card and a hard drive since then. Like all PCs I’ve had it suffered a myriad of problems that I usually just put up with until I stumble across a workaround, but I think the vast majority of them can be traced to a faulty motherboard (it won’t POST with more than 4GB of RAM in it) and a batch of faulty hard drives (which would randomly park their heads, causing the system to freeze). At the time I had the wonderful idea of buying the absolute latest so I could upgrade cheaply for the next few years, but thanks to the consolization of games I found that wasn’t really necessary.
To be honest it’s not even really necessary now either, with all the latest games still running at full resolution and most at high settings to boot. I am starting to lag on the technology front however with my graphics card not supporting DirectX 11 and everything but the RAM being 2 generations behind (yes, I have a Core 2 Duo). So I took it upon myself to build a rig that combined the best performance available of the day rather than trying to focus on future compatibility. Luckily for me it looks like those two are coinciding.
Because, like any good geek, I love talking shop when it comes to building new PCs, here are the specs of the potential beast in the making:
The first couple choices I made for this rig were easy. Hands down the best performance out there is with the new Sandy Bridge i7 chips with the 2600K being the top of the lot thanks to its unlocked multiplier and hyperthreading, which chips below the 2600 lack. The choice of graphics cards was a little harder as whilst the Radeon comes out leagues ahead on a price to performance ratio the NVIDIA cards still had a slight performance lead overall, but hardly enough to justify the price. Knowing that I wanted to take advantage of the new SATA 6Gbps range of drives that were coming out my motherboard choice was almost made for me as the Asrock P67 seems to be one of the few that has more than 4 of the ports available (it has 6, in fact).
The choice of SSD however, whilst extremely easy at the time, became more complicated recently.
You see back in the initial pre-production review round the OCZ Vertex 3 came out shooting, blasting away all the competition in a seemingly unfair comparison to its predecessors. I was instantly sold especially considering the price was looking to be quite reasonable, around the $300 mark for a 120GB drive. Sure I could opt for the bigger drive and dump my most frequently played games on it but in reality a RAID10 array of SATA 6Gbps drives should be close enough without having to overspend on the SSD. Like any pre-production reviews I made sure to keep my ear to the ground just in case something changed once they started churning them out.
Of course, something did.
The first production review that grabbed my attention was from AnandTech, renowned for their deep understanding of SSDs and for producing honest and accurate reviews. The results for my drive size of choice, the 120GB, were decidedly mixed on a few levels, with it falling down in several places where the 240GB version didn’t suffer any such problems. Another review confirmed the figures were in the right ballpark, although unfortunately it lacked a comparison to the 240GB version. The reasons behind the performance discrepancies are simple: whilst the drives are functionally the same, the differences come down to the number of NAND chips used to build them. The 240GB version has double the number of the 120GB version, which allows for higher throughput and additionally grants the drive a larger scratch space it can use to optimize its performance¹.
So of course I started to rethink my position. The main reason for getting a real SSD over something like the PCIe bound RevoDrive was that I could use it down the line as a jumbo flash drive if I wanted to and I wouldn’t have to sacrifice one of my PCIe lanes to use it. The obvious competitor to the OCZ Vertex 3 would be something like the Intel 510 SSD but the reviews haven’t been very kind to this device, putting it barely in competition with previous generation devices.
After considering all my options I think I’ll still end up going with the OCZ Vertex 3 at the 120GB size. Whilst it might not top the charts in every category it does provide tremendous value when compared to a lot of other SSDs, and it will be in another league compared to my current spinning rust hard drive. Once I get around to putting this new rig together you can rest assured I’ll put the whole thing through its paces, if at the very least to see how the OCZ Vertex 3 stacks up against the numbers that have already been presented.
¹Ever wondered why some SSDs are odd sizes? They are in fact good old fashioned binary sizes (128GB and 256GB respectively) however the drive reserves a portion of that (8GB and 16GB) to use as scratch space to write and optimize data before committing it. Some drives also use it as a buffer for when flash cells become unwritable (flash cells don’t usually die, you just can’t write to them anymore) so that the drive’s capacity doesn’t degrade.
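That reserved portion is usually called over-provisioning, and the odd advertised sizes fall straight out of the arithmetic:

```python
def overprovisioning_pct(raw_gb, usable_gb):
    # spare NAND the controller keeps back for write staging,
    # wear levelling and retiring cells that can no longer be written
    return (raw_gb - usable_gb) / raw_gb * 100

print(overprovisioning_pct(128, 120))  # 6.25% reserved on the 120GB drive
print(overprovisioning_pct(256, 240))  # 6.25% on the 240GB drive
```

The 240GB drive reserves twice the absolute space (16GB versus 8GB) at the same percentage, which is part of why it has more room to shuffle writes around than its smaller sibling.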