Technology


When Will Buying Clothing Online be as Good as Offline?

I’m not exactly what you’d call a fashionista; the ebbs and flows of what’s current often pass me by, but I do have my own style which I usually refresh on a yearly basis. More recently this has tended towards my work attire, mostly because I spend a great deal more time in it than I did previously. However the act of shopping for clothes is one I like to avoid as I find it tiresome, especially when trying to find the right sizes to fit my not-so-normal dimensions. Thus I’ve recently turned towards custom services and tailoring in order to get what I want in the sizes that fit me but, if I’m honest, the online world still seems to be light years behind what I can get from the more traditional fashion outlets.

Tailoring Stuff

For instance, one of the most frustrating pieces of clothing for me to buy is a business shirt. They usually fall short in one of my three key categories (length, sleeve length and fit in the midsection) so I figured that getting some custom made would be a great way to go. I decided I’d lash out on a couple of shirts from two online retailers, Original Stitch and Shirts My Way, to see if I could get something that would tick all three categories. I was also going to review them against each other to see which retailer provided the better fit and would thus become my de facto supplier of shirts for the foreseeable future. However upon receiving both shirts I was greeted with the unfortunate reality: they both sucked.

They seemed to get some things right, like the neck size and overall shirt length, however they both seemed to be made to fit someone who weighed about 40kg more than I do, with the midsection fitting like a tent. Both of them also had ridiculously billowy sleeves, making my arms appear twice as wide as they should be. I kind of expected something like this from Original Stitch, since their measurements aren’t exactly comprehensive, but Shirts My Way suffered from the same issues even though I followed their guidelines exactly. Compared to the things I’ve had fitted or tailored in the past I was extremely disappointed, as I was expecting service that was as good or better.

The problem could be partially solved by technology, as 3D scanning could provide extremely accurate sizing that online stores could then incorporate in order to ensure you got the right fit the first time around. In fact I’d argue that there should be some kind of open standard for this, allowing the various companies to develop their own scanning solutions whose measurements would be interoperable between different clothing retailers. That is something of a pipe dream, I know, but I can’t be the only person who has had this kind of frustration trying to get the right fit from online retailers.

I guess for now I should just stick with the tried and true methods for getting the clothing that I want as the online experience, whilst infinitely more convenient, ultimately delivers a lacklustre product. I’m hopeful that change is coming although it’s going to take time for it to become widespread and I’m sure that there won’t be any standards across the industry for a long time after that. Maybe one day I’ll be able to order the right fits from the comfort of my own home but, unfortunately, that day is not today.


The BBC Thinks All VPN Users Are Pirates.

If you want Netflix in Australia there’s really only one way to do it: get yourself a VPN with an endpoint in the States. That’s not an entirely difficult process; indeed many of my less tech savvy friends have managed to accomplish it without any panicked phone calls to me. The legality of doing that is something I’m not qualified to get into but, since there hasn’t been a massive arrest spree of nefarious VPN users, I can’t imagine it’s far outside the bounds of the law. Indeed you couldn’t really crack down on it without also cracking down on the more legitimate users of VPN services, like businesses and those with regulatory commitments around protecting customer data. However if you ask the BBC, users of VPNs are nothing but dirty pirates and it’s our ISPs’ job to snoop on them.

BBC Derp

In a submission to the Australian Government, presumably under the larger anti-piracy campaign that Brandis is heading, the BBC makes a whole list of suggestions as to how they should go about combating Australia’s voracious appetite for purloined content. Among the numerous points is the notion that a lot of pirates now use a VPN to hide their nefarious activities. In the BBC’s world ISPs would take this as a kind of black flag, signalling that any heavy VPN user was likely also engaging in copyright infringement. They’d then be subject to the woeful idea of having their Internet slowed down or cut off, presumably unless they could somehow prove their usage was legitimate. Even though the submission goes on to talk about false positives, the ideas it discusses are fucking atrocious and I hope they never see the light of day.

I have the rather fortunate (or unfortunate, depending on how you look at it) ability to do my work from almost anywhere I choose, including my home. This does mean that I have to VPN back into the mothership in order to get access to my email, chat and all the other corporate resources which can’t be made available over the regular Internet. Since I do a lot of this at home, under the BBC’s suggestion I’d probably be flagged as a potential pirate and be subject to measures to curb my behaviour. Needless to say I don’t think I’m particularly unique in this either, so there’s vast potential for numerous false positives to spring up under this system.

Worse still, all of those proposed measures fall on the ISPs’ shoulders to design, implement and enforce. Not only would this put an undue burden on them, which they’d instantly pass onto us in the form of increased prices, it would also make them culpable when an infringing user figured out how to defeat their monitoring system. Everyone knows that it doesn’t take long for people to circumvent these systems which, again, increases pressure on the ISPs to implement even more invasive and draconian systems. It’s a slippery slope that we really shouldn’t be going down.

Instead of constantly looking towards the stick as the solution to Australia’s piracy woes it’s time for companies, and the Australian government, to start looking at the carrot: incentives for rights holders to license content in Australia, or mandates that we get the same content at the same time for the same price as everyone else. The number of Netflix users in Australia shows there’s demand for such a service; we just need it to meet the same criteria that customers overseas expect. Once we get that I’m sure you’ll see a massive reduction in the amount of piracy in Australia, coupled with the increase in sales that the rights holders seem so desperate to protect.


Now We Can Stop Talking About the iWatch.

I honestly couldn’t tell you how long I’ve been hearing people talk about Apple getting into the smartwatch business. It seemed every time WWDC or any other Apple event rolled around there’d be another flurry of speculation as to what their wearable would be. As with most rumours the details were scant and so the Internet, as always, circlejerked itself into a frenzy about a product that might not have even been in development. In the absence of a real product competitors stepped up to the plate and, to their credit, their devices have started to look more compelling. Well, today Apple finally announced their Watch and it’s decidedly mediocre.

Apple Watch Space Black

For starters it makes the same mistake that many smartwatches do: it looks like nearly every other smartwatch on the market. Partly this is due to LCD screens being rectangular, which limits what you can do with them, however for a company like Apple you’d expect them to buck the trend a bit. Instead you’ve got what looks like an Apple-ized version of the Pebble Steel, not entirely unpleasing but at the same time incredibly bland. I guess if you’re a fan of having a shrunken iPhone on your wrist then the style will appeal to you, but honestly smartwatches which look like smartwatches are a definite turn-off for me and I know I’m not alone in thinking this.

Details as to what’s actually under the hood of this thing are scarce, probably because, unlike most devices Apple announces, you won’t be able to get your hands on this one right away. Instead you’ll be waiting until after March next year, with the starting price somewhere on the order of $350. That’s towards the premium end of the smartwatch spectrum, something which shouldn’t be entirely unexpected, and could be indicative of the overall quality of the device. Indeed what few details they’ve let slip do seem to indicate it’s got some decent materials science behind it (both in the sapphire screen and the case metals) which should hopefully make it a more durable device.

Feature-wise it’s pretty much as you’d expect, sporting the usual array of notifications pushed from your phone alongside a typical array of sensors. Apple did finally make its way into the world of NFC today, both with the Apple Watch and the new iPhone, so you’ll be able to load up your credit card details into it and use the watch to make payments. Honestly that’s pretty cool, and definitely something I’d like to see other smartwatch manufacturers emulate, although I’m not entirely hopeful that it’ll work anywhere bar the USA. Apple also touts an interface that’s been designed around the smaller screen but without an actual sample to look over I really couldn’t tell you how good or bad it is.

So all that blather and bluster that preceded this announcement was, surprise, completely overblown and the resulting product really does nothing to stand out in the sea of computerized hand adornments. I’m sure there’s going to be a built in market from current Apple fans but outside that I really can’t see the appeal of the Apple Watch over the numerous other devices. Apple does have a good 6 months or so to tweak the product before release so there’s potential for it to become something before they drop it on the public.


DDR4 Appears on the Market; I Realise I’ve Been Under a Rock.

Whilst I don’t spend as much time as I used to keeping current with all things PC hardware related I still maintain a pretty good working knowledge of where the field is going. That’s partly due to my career being in the field (although I’m technically a services guy) but mostly it’s because I love new tech. You’d think then that DDR4, the next generation in PC memory, making its commercial debut wouldn’t be much of a surprise to me but I had absolutely no idea it was in the pipeline. Indeed had I not been building out a new gaming rig for a friend of mine I wouldn’t have known it was coming, nor that I could buy it today if I was so inclined.

Professional Memory Holder

Double Data Rate Generation 4 (DDR4) memory is the direct successor to the current standard, DDR3, which has been in widespread use since 2007. Both standards (indeed pretty much all memory standards) were developed by the Joint Electron Device Engineering Council (JEDEC), which has been working on DDR4 since about 2005. The reasoning behind the long lead times on new standards like this is complicated but it comes down to getting everyone to agree on the standard, manufacturers developing products around said standard and then, finally, those products making their way into the hands of consumers. Thus whilst new memory modules come and go with the regular tech cycle, the standards driving them typically remain in place for the better part of a decade or more, which is probably why this writer neglected to keep current on it.

In terms of actual improvements DDR4 seems like an evolutionary step forward rather than a revolutionary one. That being said, the improvements introduced with the new specification are nothing to sneeze at, with one of the biggest being a reduction in the voltage (and thus power) that the specification requires. Typical DDR4 modules will now use 1.2V compared to DDR3’s 1.5V, and the low voltage variant, typically seen in low power systems like smartphones and the like, goes all the way down to 1.05V. To end consumers this won’t mean too much but for large scale deployments the savings from running this new memory add up very quickly.
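As a rough illustration of where those savings come from, dynamic power in CMOS scales roughly with the square of the supply voltage, so the voltage drop alone accounts for a sizeable chunk. The figures below are back-of-envelope estimates, not vendor numbers.

```python
# Back-of-envelope only: dynamic power scales roughly with V^2 (P ~ C * V^2 * f),
# ignoring frequency and architectural differences between the standards.
for name, volts in [("DDR3 @ 1.5V", 1.5),
                    ("DDR4 @ 1.2V", 1.2),
                    ("Low-voltage DDR4 @ 1.05V", 1.05)]:
    relative = (volts / 1.5) ** 2
    print(f"{name}: ~{relative:.0%} of DDR3 dynamic power")
# DDR4 at 1.2V works out to roughly 64% of DDR3's dynamic power,
# i.e. about a third less before any other improvements are counted.
```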

As you’d expect there’s also been a bump up in the operating speed of DDR4 modules, ranging from 2133MHz all the way up to 4266MHz. Essentially the lowest tier of DDR4 memory will match the top performers of DDR3 and the amount of headroom for future development is quite significant. This will have a direct impact on the performance of systems that are powered by DDR4 memory and, whilst most consumers won’t notice the difference, it’s definitely going to be a defining feature of enthusiast PCs for the next couple of years. I know that I updated my dream PC specs to include it even though the first generation of products is only just hitting the market.
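For a sense of what those numbers translate to, here’s a quick sketch of theoretical peak bandwidth per standard 64-bit memory channel (the marketing “MHz” figures are really transfers per second, and real-world throughput will be lower).

```python
# Theoretical peak bandwidth of a standard 64-bit (8-byte wide) memory channel:
# transfer rate in MT/s multiplied by 8 bytes per transfer.
BUS_WIDTH_BYTES = 8  # standard 64-bit memory channel

for name, rate_mts in [("DDR3-1600", 1600), ("DDR4-2133", 2133), ("DDR4-4266", 4266)]:
    gb_per_s = rate_mts * 1_000_000 * BUS_WIDTH_BYTES / 1e9
    print(f"{name}: ~{gb_per_s:.1f} GB/s theoretical peak per channel")
# DDR3-1600 ~12.8 GB/s, DDR4-2133 ~17.1 GB/s, DDR4-4266 ~34.1 GB/s
```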

DDR4 chips are also meant to be a lot denser than their DDR3 predecessors, especially considering that the specification also accommodates 3D layering technologies like Samsung’s V-NAND. Many are saying that this will lead to DDR4 being cheaper than DDR3 for a comparable amount of memory, however right now you’ll be paying about a 40% premium on pretty much everything if you want to build a system around the new style of memory. This is to be expected though and, whilst I can eventually see DDR4 eclipsing DDR3 on a price per gigabyte basis, that won’t be for several years to come. DDR3 has 7 years’ worth of economies of scale built up and it won’t become irrelevant for a very long time.

So whilst I might be a little shocked that I was so out of the loop I didn’t know a new memory standard had made its way into reality, I’m glad it has. The improvements might be incremental rather than a bold leap forward but progress in this sphere is so slow that anything is worth celebrating. The fact that you can build systems with it today is just another bonus, one that I’m sure is making dents in geeks’ budgets the world over.


You Won’t See Blu Ray Archiving Anytime Soon.

Ask your IT administrator what medium they back up all your data to and the answer is likely some form of magnetic tape storage. For many people that’d be somewhat surprising as the last time they saw a tape was probably a couple decades ago and it wasn’t used to store much more than a blurry movie or maybe a couple songs. However in the world of IT archiving and backup there’s really no other medium that can beat tapes for capacity, durability or cost. Many have tried to unseat tapes from their storage crown but they’re simply too good at what they do and Facebook’s latest experiment, using Blu Ray disc caddies as an archiving solution, isn’t likely to take over from them anytime soon.

Facebook Bluray Archiving

The idea Facebook has come up with is, to their credit, pretty novel. Essentially they’ve created small Blu Ray caddies, each of which contains 12 discs. These are all housed in a robotic enclosure which is about the size of a standard server rack. Each of these racks is capable of storing up to 10,000 discs, which apparently gives rise to a total of 1PB worth of storage in a single rack. Primarily it seems to be a response to their current HDD based backup solutions which, whilst providing better turnaround for access, are typically far more costly than other archiving solutions. What interests me though is why Facebook would be pursuing something like this when there are other archiving systems already available, ones with a much better return on investment.

The storage figures quoted peg the individual disc sizes at 100GB, something which is covered under the BD-R XL specification. These discs aren’t exactly cheap and, whilst I’m sure you could get a decent discount when buying 10,000 of them, the street price is currently on the order of $60 per disc. Even if they’re able to get a 50% discount on those discs you’re still on the hook for about $300K just for the media. If you wanted to get a similar amount of storage on tapes (say using 1.5TB HP LTO-5 tapes, which can be had for $40 each) you’re only paying about $27K, a tenth of the cost. You could even halve that again if you were able to use compression on the tapes, although honestly you don’t really need to at that price point.
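To make the comparison concrete, here’s the rough arithmetic behind those figures. The prices are the street price assumptions quoted above, not quotes from any vendor.

```python
# Rough media cost for 1PB of archive capacity on each medium,
# using the street prices assumed above (no compression applied).
TARGET_TB = 1000  # 1PB

# BD-R XL: 100GB per disc at ~$60 street, assuming a 50% bulk discount
discs = TARGET_TB * 1000 / 100            # 10,000 discs
bluray_cost = discs * 60 * 0.5            # ~$300K

# LTO-5: 1.5TB native per tape at ~$40 each
tapes = TARGET_TB / 1.5                   # ~667 tapes
tape_cost = tapes * 40                    # ~$27K

print(f"Blu Ray: {discs:,.0f} discs, ~${bluray_cost:,.0f} in media")
print(f"LTO-5:   {tapes:,.0f} tapes, ~${tape_cost:,.0f} in media")
print(f"Tape is roughly {bluray_cost / tape_cost:.0f}x cheaper before compression")
```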

Indeed pretty much every advantage that Facebook is purporting this Blu Ray storage system to have is the same benefit you get with tape. Tapes are low power, as their storage requires no active current draw, are readily portable (indeed there are entire companies dedicated to doing this for you) and have many of the same durability qualities that optical discs do. When you combine this with the fact that they’re an already proven technology with dozens of competitive offerings on the table it really does make you wonder why Facebook is investigating this idea at all.

I’d hazard a guess it’s just another cool engineering project, something that they’ll trial for a little while before mothballing completely once they look at the costs of actually bringing something like that into production. I mean, I like the idea, it’s always good to see companies challenging the status quo, however sometimes the best solutions are the ones that have stood the test of time. Tapes, whether you love them or hate them, outclass this system in almost every way possible and that won’t change until you can get Blu Ray discs at the same dollars per gigabyte as tape. Even then Facebook is going to have to try hard to find some advantage that Blu Rays have that tapes don’t, as right now I don’t think anyone can come up with one.

Can you?

 


Why We Need the Full FTTP NBN.

The unfortunate truth about telecommunications within Australia is that everyone is under the iron rule of a single company: Telstra. Whilst the situation has improved somewhat in the last decade, mostly under threat of legal action from the Australian government, Australia still remains something of an Internet backwater. This can almost wholly be traced back to the lack of investment on Telstra’s part in new infrastructure, with their most advanced technology being their aging HFC networks that were only deployed in limited areas. This is why the NBN was such a great idea as it would radically modernize our telecommunications network whilst also ensuring that we were no longer under the control of a company that had long since given up on innovating.

Australia's Shitty Internet

 

To us Australians my opening statements aren’t anything surprising; this is the reality we’ve been living with for some time now. However when outsiders look in, like the free CDN/DDoS protection service Cloudflare (who I’ve recently started using again), and find that bandwidth from Telstra is about 20 times more expensive than their cheapest providers, it does give you some perspective on the situation. Whilst you would expect some variability between different locations (given the number of dark fiber connections and other infrastructure) a 20x difference does appear wildly out of proportion. The original NBN would be the solution to this as it would upend Telstra’s grip on the backbone connections that drive these prices, however the Liberals’ new MTM solution will do none of this.

Right now much of the debate over the NBN has been framed around the speeds that will be delivered to customers, however that’s really only half of the story. In order to support the massive speed increases that customers would see with the FTTP NBN, the back end infrastructure would need to be upgraded as well, including the interconnects that drive the peering prices Cloudflare sees. Such infrastructure would also form the backbone of the wide area networks that businesses and organisations use to connect their offices together, not to mention all the other services that rely on backhaul bandwidth. The MTM NBN simply doesn’t have the same requirements, nor the future expandability, to necessitate investment in this kind of back end infrastructure and, worse still, the last mile connections will still be under the control of Telstra.

That last point is one I feel doesn’t get enough attention in the mainstream media. The Liberals have released several videos that harp on about making the right amount of investment in the NBN, citing a cut off point beyond which extra bandwidth doesn’t enable people to do anything more. The problem with that thinking, though, is that with the MTM NBN you cannot guarantee that everyone will have access to those kinds of speeds. Indeed the MTM NBN can only guarantee 50Mbps to people who are 200m or less away from a node which, unfortunately, the vast majority of Australians aren’t. Comparatively FTTP can deliver the same speeds regardless of distance and also has the ability to provide higher speeds well into the future.

In all honesty though, the NBN has been transformed from a long term, highly valuable infrastructure project into a political football, one that the Liberal party is intent on kicking around as long as it suits their agenda. Australia had such potential to become a leader in Internet services with an expansive fiber network that would have rivalled all others worldwide. Instead we have a hodgepodge solution that does nothing to address the issues at hand, and the high broadband costs, for both consumers and businesses alike, will continue as long as Telstra controls the vast majority of the critical infrastructure. Maybe one day we’ll get the NBN we need but it seems to slip further and further away with each passing day.


The NBN Cost Benefit Analysis is a Steaming Pile of Horseshit.

There seems to be this small section of my brain that’s completely disconnected from reality. At every turn with the Liberals and the NBN it’s been the part of my head that’s said “Don’t worry, I’m sure Turnbull and co will be honest this time around” and every single time it has turned out to be wrong. At every turn these “independent” reports have been stacked with personnel who all have vested interests in seeing Turnbull’s views validated, no matter how hard they have to bend the facts in order to do so. Thus all the reports that have come out slamming Labor’s solution are not to be trusted and the latest one, the vaunted cost benefit analysis that the Liberals have always harped on about, is just another drop on the gigantic turd pile that is the Liberals’ NBN.

HERNRER ERGERS

The problems with this cost-benefit analysis started long before the actual report was released. Late last year Turnbull appointed Henry Ergas as the head of the panel of experts that would be conducting the analysis. The problem with this appointment is that Ergas had already published many pieces on the NBN which were not only critical of it but were also riddled with factual inaccuracies. His opinion of the NBN was well known prior to starting this engagement and thus he was never capable of providing a truly independent analysis, regardless of how he might want to present it. However, in the interests of fairness (even though Turnbull won’t be extending the same courtesy), let’s judge the report on its merits so we can see just how big this pile of horseshit is.

The report hinges primarily on a metric called “Willingness to Pay” (WTP), which is what Australians would be willing to pay for higher broadband speeds. The metric is primarily based on data gathered by the Institute for Choice, which surveyed around 3,000 Australians about their current broadband usage and then showed them some alternative plans that would be available under the NBN. The problem is that the plans presented were not representative of all the plans available, nor did the survey factor in things like speeds not being guaranteed on FTTN versus being guaranteed on FTTP. So essentially all this was judging was people’s willingness to change to another kind of plan, and it honestly was not reflective of whether or not they’d pay more for higher broadband speeds.

This is reflected even further in the blended probabilities used to estimate the benefits, with a 50% weighting applied to the Institute for Choice data and only 25% each to the other two inputs (take-up demand and technical bandwidth demand) which, funnily enough, find in favour of the FTTP solution. Indeed Table I makes it pretty clear that whenever there were multiple points of data the panel of experts decided to swing the percentages in ways that were favourable to them rather than providing an honest view of the data they had. If the appointment of a long-time anti-NBN campaigner wasn’t enough to convince you this report was a total farce then this should do the trick.
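To see how much the weighting matters, here’s a toy illustration. The benefit figures are entirely made up; only the 50/25/25 split mirrors the report’s treatment of its three inputs.

```python
# Toy illustration of how a blended weighting can drag a result towards
# one input. All benefit figures below are hypothetical; only the
# 50/25/25 weighting mirrors the report.
estimates = {
    "Institute for Choice (WTP survey)": 2.0,  # hypothetical, least favourable to FTTP
    "Take-up demand":                    6.0,  # hypothetical, favours FTTP
    "Technical bandwidth demand":        6.0,  # hypothetical, favours FTTP
}

blended = (0.50 * estimates["Institute for Choice (WTP survey)"]
           + 0.25 * estimates["Take-up demand"]
           + 0.25 * estimates["Technical bandwidth demand"])
equal = sum(estimates.values()) / len(estimates)

print(f"Blended 50/25/25 estimate: {blended:.2f}")  # 4.00, pulled towards the survey
print(f"Equal weighting estimate:  {equal:.2f}")    # 4.67
```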

However what really got me about this report was summed up perfectly in this quote:

The panel would not disclose the costs of upgrading [to] FTTP compared with other options, which were redacted from the report, citing “commercial confidentiality associated with NBN Co’s proprietary data”.

What the actual fuck.

So a completely government owned company is citing commercial in confidence as the reason for not disclosing data in a report that was commissioned by the government? Seriously, what the fuck are you guys playing at here? It’s obvious that if you included the cost of upgrading a FTTN network to FTTP, which has been estimated to cost at least $21 billion extra, then the cost-benefit would swing wildly in the direction of FTTP. Honestly I shouldn’t be surprised at this point as the report has already taken every step possible to avoid admitting that a FTTP solution is the better option. Hiding the upgrade cost, which other reports commissioned by the Liberals suggest will be required less than 5 years after completion of their FTTN NBN, is just another fact they want to keep buried.

Seriously, fuck everything about Turnbull and his bullshit. They’ve commissioned report after report, done by people who have vested interests or are in Turnbull’s favour, that have done nothing to reflect the reality of what the NBN was and what it should be. This is just the latest heaping on the pile, showcasing that the Liberals have no intention of being honest nor implementing a solution that’s to the benefit of all Australians. Instead they’re still focused on winning last year’s election and we’re all going to suffer because of it.


Super Tablets? You’re Not Simple, Are You?

Smartphones and tablets have always been a pain in the side of any enterprise admin. They almost always find their way into the IT environment via a decidedly non-IT driven process, usually when an executive gets a new toy that he’d like his corporate email on. However the tools to support these devices have improved drastically, allowing IT to provide the basic services (it’s almost always just email) and then be done with it. For the most part people see the delineation pretty clearly: smartphones and tablets are for mobile working and your desktop or laptop is for when you need to do actual work. I’ve honestly never seen a need for a device that crosses the boundary between these two worlds, although after reading this piece of dribble it seems that some C-level execs think there’s demand for such a device.

I don’t think he could be more wrong if he tried.

Panasonic Super Tablet

The article starts off with some good points about why tablet sales are down (the market has been saturated, much like netbooks were) and why PC sales are up (XP’s end of life, although that’s only part of it) and then posits the idea of creating “super tablets” in order to reignite the market. Such a device would sit somewhere in between an iPad and a laptop, sporting a bigger screen, functional keyboard and upgraded internals but keeping the same standardized operating system. According to the author such a device would bridge the productivity gap that currently divides tablets from other PCs, giving users the best of both worlds. The rest of the article mentions a whole bunch of things that I’ll get into debunking later but the main thrust of it is that some kind of souped up tablet is the perfect device for corporate IT.

For starters the notion that PCs are hard to manage in comparison to tablets or smartphones is nothing short of total horseshit. The author points to ServiceNow, which provides some incident management software, being worth $8 billion as some kind of proof that PCs break often and are hard to manage. What that fails to grasp is that ServiceNow is actually an IT Service Management company that also has Software/Platform as a Service offerings and is thus more akin to a company like Salesforce than a simple incident management vendor. This then leads onto the idea that the mobile fleet is somehow cheaper to run than its PC counterpart, which is not the case in many circumstances. He also asserts that desktop virtualization is expensive when in most cases it makes heavy use of investments that IT has already made in both server and desktop infrastructure.

In fact the whole article smacks of someone who is cheerfully ignorant of the fact that the product he’s peddling is pretty much an ultrabook with a couple of minor differences. One of the prime reasons people like tablets is their portability, and the second you increase the screen size and whack a “proper” keyboard on it you’ve essentially given them a laptop. His argument then is that you need the specifications of a laptop with Android or iOS on it, but I fail to see how extra power is going to make those platforms any more useful than they are today. Indeed if all you’re doing is word processing and Internet browsing then the current crop of Android laptops does the job just fine.

Sometimes when there’s an apparent gap in the market there’s a reason for it, and in the case of “super tablets” it’s because when you combine what’s good about the two platforms it bridges you end up with a device that has none of the benefits of either. This idea probably arises from the incorrect notion that PCs are incredibly unreliable and hard to manage when, in actuality, that’s so far from reality it’s almost comical. Instead the delineations between tablets and laptops are based on well defined usability guidelines that both consumers and enterprise IT staff have come to appreciate. It’s like looking at a nail and a screw and thinking that combining them into a super nail will somehow give you the benefits of both when realistically they’re for different purposes, and the sooner you realise that the better you’ll be at hammering and screwing.

 


Intel Keeps Moore’s Law Alive With 14nm Fabrication.

The popular interpretation of Moore’s Law is that computing power, namely that of the CPU, doubles every two years or so. This is then extended to pretty much all aspects of computing such as storage, network transfer speeds and so on. Whilst this interpretation has held up reasonably well in the 40+ years since the law was coined it’s not completely accurate, as Moore was actually referring to the number of components that could be integrated into a single package for a minimum cost. Thus the real driver behind Moore’s Law isn’t performance, per se, it’s the cost at which we can provide said integrated package. Keeping on track with this law hasn’t been easy but innovations like Intel’s new 14nm process are what have kept us there.
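For reference, the popular “doubling every two years” reading is just a compounding function; the sketch below uses a hypothetical starting transistor count purely for illustration.

```python
# The popular reading of Moore's Law as simple compounding:
# the transistor count doubles roughly every two years.
def projected_transistors(initial_count, years, doubling_period=2.0):
    return initial_count * 2 ** (years / doubling_period)

# A hypothetical 1-billion-transistor chip projected forward:
for years in (2, 6, 10):
    count = projected_transistors(1e9, years)
    print(f"After {years:>2} years: ~{count / 1e9:.0f} billion transistors")
# 2 years -> ~2 billion, 6 years -> ~8 billion, 10 years -> ~32 billion
```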

14nm Costs

CPUs are created through a process called photolithography, whereby a substrate, typically a silicon wafer, has the transistors etched onto it through a process not unlike developing a photo. The defining characteristic of this process is the minimum size of a feature that it can etch on the wafer, usually expressed in nanometers. It was long thought that 22nm would be the limit for semiconductor manufacturing as the process was approaching the physical limitations of the substrates used. However Intel, and many other semiconductor manufacturers, have been developing processes that push past this and today Intel released in-depth information regarding their new 14nm process.

The improvements in the process are pretty much what you’d come to expect from a node shrink of this nature. A reduction in node size typically means that a CPU can be made with more transistors, performs better and uses less power than a similar CPU built on a larger node. This is most certainly the case with Intel’s new 14nm fabrication process and, interestingly enough, they appear to be slightly ahead of the curve, with the improvements in this process coming in ahead of the trend line. However the most important factor, at least with respect to Moore’s Law, is that they’ve managed to keep reducing the cost per transistor.

One of the biggest cost drivers for CPUs is what’s called the yield of the wafer. Each of these wafers costs a certain amount of money and, depending on how big and complex your CPU is, you can only fit a certain number of them on there. However not all of those CPUs will turn out to be viable and the percentage of usable CPUs is what’s known as the wafer yield. Moving to a new node size typically means that your yield takes a dive, which drives up the cost of each CPU significantly. The recently released documents from Intel reveal, however, that the yield for the 14nm process is rapidly approaching that of the 22nm process, which is considered to be Intel’s best yielding process to date. This, plus the increased transistor density that’s possible with the new manufacturing process, is what has led to the price per transistor dropping, giving Moore’s Law a little more breathing room for the next couple of years.
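A simplified sketch of why yield matters so much to cost per chip: the wafer costs the same whether 50% or 90% of the dies on it work, so every extra working die pulls the per-chip price down. All numbers below are illustrative assumptions, not Intel’s actual wafer costs or die counts.

```python
# Simplified cost-per-good-die model. The wafer cost is fixed, so the
# cost of each usable chip falls directly as yield improves.
# Wafer cost and die count here are illustrative assumptions only.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    good_dies = dies_per_wafer * yield_fraction
    return wafer_cost / good_dies

WAFER_COST = 5000      # hypothetical cost of a processed wafer, in dollars
DIES_PER_WAFER = 400   # hypothetical die count for a mid-sized CPU

for label, yld in [("new node, early yield", 0.50),
                   ("new node, maturing",    0.75),
                   ("mature node",           0.90)]:
    print(f"{label}: ${cost_per_good_die(WAFER_COST, DIES_PER_WAFER, yld):.2f} per good die")
# 50% yield -> $25.00, 75% -> $16.67, 90% -> $13.89 per good die
```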

This 14nm process is what will be powering Intel’s new Broadwell set of chips, the first of which is due out later this year. Migrating to this new manufacturing process hasn’t been without its difficulties which is what has led to Intel releasing only a subset of the Broadwell chips later this year, with the rest to come in 2015. Until we get our hands on some of the actual chips there’s no telling just how much of an improvement these will be over their Haswell predecessors but the die shrink alone should see some significant improvements. With the yields fast approaching those of its predecessors they’ll hopefully be quite reasonably priced too, for a new technology at least.

It just goes to show that Moore’s law is proving to be far more robust than anyone could have predicted. Exponential growth functions like that are notoriously unsustainable however it seems every time we come up against another wall that threatens to kill the law off another innovative way to deal with it comes around. Intel has long been at the forefront of keeping Moore’s law alive and it seems like they’ll continue to be its patron saint for a long time to come.


Winding Down Google+ is the Right Move, But Might Be Too Late.

When Google+ was first announced I counted myself among its fans. Primarily this was due to the interface which, unlike every other social media platform at the time, was clean and there was the possibility I could integrate all my social media in the one spot. However as time went on it became apparent that this wasn’t happening any time soon and the dearth of people actively using it meant that it just fell by the wayside. As other products got rolled into it I wasn’t particularly fussed, I wasn’t a big user of most of them in the first place, however I was keenly aware of the consternation from the wider user base. It seems that Google might have caught onto this and is looking to wind down the Google+ service.

Google Plus

Back in April the head of Google+, Vic Gundotra, announced that he was leaving the company. Whilst Google maintained that this would not impact their strategy, many sources reported that Google was abandoning its much loathed approach of integrating Google+ into everything and that the decrease in focus likely meant a decrease in resources. Considering that no one else can come up with a good reason why Gundotra, a 7 year veteran of Google, would leave the company it does seem highly plausible that something is happening to Google+ and it isn’t good for the service’s future. The question in my mind then is whether or not winding down the service will restore some of the goodwill lost in Google’s aggressive integration spree.

Rumours have it that Google+ Photos will be the first service to be set free from the iron grip of its parent social network. Considering that the Photos section of Google+ started out as the web storage part of their Picasa product it makes sense that this would be the first service to be spun out. How it will compete with other, already established offerings is somewhat up in the air, although it does have the benefit of already being tightly integrated with the Android ecosystem. If they’re unwinding that application then it makes you wonder if they’ll continue that trend with other services, like YouTube.

For the uninitiated the integration of YouTube and Google+ was met with huge amounts of resistance with numerous large channels openly protesting it. Whilst some aspects of the integration have been relaxed (like allowing you to use a pseudonym that isn’t your real name) the vast majority of features that many YouTubers relied on are simply gone, replaced with seemingly inferior Google+ alternatives. If Google+ is walking off into the sunset then they’d do well to bring back the older interface although I’m sure the stalwart opponents won’t be thanking Google if they do.

Honestly whilst I liked Google+ originally, and even made efforts to actively use the platform, it simply hasn’t had the required amount of buy in to justify Google throwing all of its eggs into that basket. Whilst I like some of the integration between the various Google+ services I completely understand why others don’t, especially if you’re a content creator on one of their platforms. Winding down the service might see a few cheers here or there but honestly the damage was already done and it’s up to Google to figure out how to win the users back in a post Google+ world.