One thing that always fascinates me is how much (or indeed how little) technology can change some processes. Technology almost always makes things better, faster and cheaper, but you’d think there are a few areas where technology simply couldn’t put a dent in good old fashioned human processes. I don’t know why, but when I saw the following video I thought there would be no way that modern processes could be better suited to the task than simply handing it over to a stone mason. By the end of the video, however, I was stunned at just how fast, and how accurately, we could mill out a giant block of sandstone.
Honestly I probably should have expected it, as I’ve seen numerous demonstrations of similar technology producing wildly intricate show pieces in all sorts of materials. However I figured something like this, a craft that many would have thought was now the domain of only a handful of dedicated practitioners, would be better suited to human hands. I have to say, though, that I doubt anyone today could carve out something like that in the space of 10 hours, even if you counted all the preparation done beforehand. Unfortunately it’s surprisingly hard to find out just how long it took to carve your average stone gargoyle, so I’m not sure how this compares to the times when stone carving as a profession was more common.
Realistically though that’s all a flimsy premise for me to post yet another large engineering demonstration video. I can’t help it though, they tickle me in all the right ways.
In a world where Siri can book you a restaurant and Google Now can tell you when you should head for the gate at the airport it can feel like the AI future that many sci-fi fantasies envisioned is already here. Indeed to some extent it is, many aspects of our lives are now farmed out to clouds of servers that make decisions for us, but those machines still lack a fundamental understanding of, well, anything. They’re what are called expert systems, algorithms trained on data to make decisions in a narrow problem space. The AI future that we’re heading towards is going to be far more than that, one where those systems actually understand data and can make far better decisions based on it. One of the first steps towards this is IBM’s Watson, and its creators have done something amazing with it.
Whilst currently only open to partner developers, IBM has created an API for Watson, allowing you to pose it a question and receive an answer. There’s not a lot of information around what data sets it currently understands (the example is in the form of a Jeopardy! question) but their solution documents reference a Watson Content Store which, presumably, has several pre-canned training sets to get companies started with developing solutions. Indeed some of the applications that IBM’s partner agencies have already developed suggest that Watson is quite capable of digesting large swathes of information and providing valuable insights in a relatively short timeframe.
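If you’re curious what posing Watson a question might look like, here’s a rough Python sketch of building the request body. The field names mirror the Jeopardy!-style example floating around IBM’s early documentation, but treat the exact schema as an assumption rather than gospel:

```python
import json

def build_watson_question(question_text, evidence_items=3):
    """Build a question payload for a Watson-style Q&A API.

    The field names here are assumptions based on IBM's published
    example, not a documented, stable schema.
    """
    return json.dumps({
        "question": {
            "questionText": question_text,
            # ask for supporting evidence alongside the answer
            "evidenceRequest": {"items": evidence_items},
        }
    })

payload = build_watson_question(
    "This 'Father of Genetics' experimented with pea plants.")
print(payload)
```

The evidence request is the interesting bit: unlike a plain search engine, the answers come back with the supporting passages that justify them.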
I’m sure many of my IT savvy readers are seeing the parallels between Watson and a lot of the marketing material that surrounds anything with the buzzword “Big Data”. Indeed many of the concepts of operation are similar: take big chunks of data, throw them into a system and then hope that something comes out the other end. However Watson’s API suggests something that’s far more accessible, dealing in native human language and providing evidence to back up the answers it gives you. Compare this to Big Data tools, which often require you to either learn a particular query language or create convoluted reports, and I think Watson has the ability to find widespread use while Big Data keeps its buzzword status.
For me the big applications for something like this are in places where curating domain specific knowledge is a long, time consuming task. Medicine and law both spring to mind, as there are reams of information available to power a Watson based system and both fields could most certainly benefit from easier access to those vast treasure troves. It’s pretty easy to imagine a lawyer looking for all precedents set against a certain law, or a doctor asking for all diseases matching a list of symptoms, both queries answered with all the evidence to boot.
Of course it remains to be seen if Watson is up to the task as, whilst its prowess on Jeopardy! was nothing short of amazing, I’ve yet to see any of its other applications in use. The partner applications do look very interesting, and should hopefully be the proving grounds that Watson needs, but until it starts seeing widespread use all we really have to go on is the result of a single API call. Still I think it has great potential and hopefully it won’t be too long before the wider public can get access to some of Watson’s computing genius.
I’m not exactly what you’d call a fashionista, the ebbs and flows of what’s current often pass me by, but I do have my own style which I usually refresh on a yearly basis. More recently this has tended towards my work attire, mostly because I spend a great deal more time in it than I did previously. However the act of shopping for clothes is one I like to avoid as I find it tiresome, especially when trying to find the right sizes to fit my not-so-normal dimensions. Thus I’ve recently turned towards custom services and tailoring in order to get what I want in the sizes that fit me but, if I’m honest, the online world still seems to be light years behind what I can get from the more traditional fashion outlets.
For instance one of the most frustrating pieces of clothing for me to buy is business shirts. Usually they fall short in one of my three key criteria (length, sleeve length and fit in the mid section), so I figured that getting some custom made would be a great way to go. So I decided I’d shell out for a couple of shirts from two online retailers, Original Stitch and Shirts My Way, to see if I could get something that would tick all three boxes. I was also going to review them against each other to see which retailer provided the better fit and would thus become my de facto supplier of shirts for the foreseeable future. However upon receiving both shirts I was greeted with the unfortunate reality: they both sucked.
They seemed to get some of the things right, like the neck size and overall shirt length, however they both seemed to be made to fit someone who weighed about 40kg more than I do with the mid section being like a tent. Both of them also had ridiculously billowy sleeves, making my arms appear to be twice as wide as they should be. I kind of expected something like this to happen with Original Stitch, since their measurements aren’t exactly comprehensive, but Shirts My Way also suffered from the same issues even though I followed their guidelines exactly. Comparing this to the things I’ve had fitted or tailored in the past I was extremely disappointed as I was expecting as good or better service.
The problem could be partially solved by technology as 3D scanning could provide extremely accurate sizing that online stores could then incorporate in order to ensure you got the right fit the first time around. In fact I’d argue that there should be some kind of open standard for this, allowing all the various companies to develop their brand of solutions for it that would be interoperable between different clothing companies. That is something of a pipe dream, I know, but I can’t be the only person who has had this kind of frustration trying to get the right fits from online retailers.
I guess for now I should just stick with the tried and true methods for getting the clothing that I want as the online experience, whilst infinitely more convenient, ultimately delivers a lacklustre product. I’m hopeful that change is coming although it’s going to take time for it to become widespread and I’m sure that there won’t be any standards across the industry for a long time after that. Maybe one day I’ll be able to order the right fits from the comfort of my own home but, unfortunately, that day is not today.
If you want Netflix in Australia there’s really only one way to do it: get yourself a VPN with an endpoint in the states. That’s not an entirely difficult process, indeed many of my less tech savvy friends have managed to accomplish it without any panicked phone calls to me. The legality of doing that is something I’m not qualified to get into but, since there hasn’t been a massive arrest spree of nefarious VPN users, I can’t imagine it’s far outside the bounds of the law. Indeed you couldn’t really crack down on it unless you also cracked down on the more legitimate users of VPN services, like businesses and those with regulatory commitments around protecting customer data. However if you ask the BBC, users of VPNs are nothing but dirty pirates and it’s our ISPs’ job to snoop on them.
In a submission to the Australian Government, presumably under the larger anti-piracy campaign that Brandis is heading, the BBC makes a whole list of suggestions as to how they should go about combating Australia’s voracious appetite for purloined content. Among the numerous points is the notion that a lot of pirates now use a VPN to hide their nefarious activities. In the BBC’s world ISPs would take this as a kind of black flag, signalling that any heavy VPN user was likely also engaging in copyright infringement. They’d then be subject to the woeful idea of having their Internet slowed down or cut off, presumably if they couldn’t somehow prove that it was legitimate. Even though they go on to talk about false positives the ideas they discuss in their submission are fucking atrocious and I hope they never see the light of day.
I have the rather fortunate (or unfortunate, depending on how you look at it) ability of being able to do my work from almost anywhere I choose, including my home. This does mean that I have to VPN back into the mothership in order to get access to my email, chat and all other corporate resources which can’t be made available over the regular Internet. Since I do a lot of this at home under the BBC’s suggestion I’d probably be flagged as a potential pirate and be subject to measures to curb my behaviour. Needless to say I don’t think I’m particularly unique in this either so there’s vast potential for numerous false positives to spring up under this system.
Worse still, all of those proposed measures fall on the ISPs’ shoulders to design, implement and enforce. Not only would this put an undue burden on them, which they’d instantly pass onto us in the form of increased prices, it would also make them culpable when an infringing user figured out how to defeat their monitoring system. Everyone knows that it doesn’t take long for people to circumvent these systems which, again, increases pressure on the ISPs to implement even more invasive and draconian measures. It’s a slippery slope that we really shouldn’t be going down.
Instead of constantly looking to the stick as the solution to Australia’s piracy woes it’s time for companies, and the Australian government, to start looking at the carrot. Start looking at incentives for rights holders to license content in Australia, or mandating that we get the same content at the same time for the same price as everywhere else. The number of Netflix users in Australia shows there’s demand for such a service, we just need it to match the same criteria that customers overseas expect. Once we get that I’m sure you’ll see a massive reduction in the amount of piracy in Australia, coupled with the increase in sales that the rights holders seem so desperate to protect.
I honestly couldn’t tell you how long I’ve been hearing people talk about Apple getting into the smartwatch business. It seemed every time that WWDC or any other Apple event rolled around there’d be another flurry of speculation as to what their wearable would be. Like most rumours details on it were scant and so the Internet, as always, circlejerked itself into a frenzy about a product that might not have even been in development. In the absence of a real product competitors stepped up to the plate and, to their credit, the devices have started to look more compelling. Well today Apple finally announced their Watch and it’s decidedly mediocre.
For starters it makes the same mistake that many smartwatches do: it follows the design trend of nearly every other smartwatch on the market. Partly this is due to LCD screens being rectangular, limiting what you can do with them, however for a company like Apple you’d expect them to buck the trend a bit. Instead you’ve got what looks like an Apple-ized version of the Pebble Steel, not entirely unpleasing but at the same time incredibly bland. I guess if you’re a fan of having a shrunken iPhone on your wrist then the style will appeal to you but honestly smartwatches which look like smartwatches are a definite turn off for me, and I know I’m not alone in thinking this.
Details as to what’s actually under the hood of this thing are scarce, probably because unlike most devices Apple announces you won’t be able to get your hands on this one right away. Instead you’ll be waiting until after March next year to get your hands on one and the starting price is somewhere on the order of $350. That’s towards the premium end of the smartwatch spectrum, something which shouldn’t be entirely unexpected, and could be indicative of the overall quality of the device. Indeed what little details they’ve let slip do seem to indicate it’s got some decent materials science behind it (both in the sapphire screen and the case metals) which should hopefully make it a more durable device.
Feature wise it’s pretty much as you’d expect, sporting the usual array of notifications pushed from your phone alongside a typical array of sensors. Apple did finally make its way into the world of NFC today, both with the Apple Watch and the new iPhone, so you’ll be able to load up your credit card details into it and use the watch to make payments. Honestly that’s pretty cool, and definitely something I’d like to see other smartwatch manufacturers emulate, although I’m not entirely hopeful that it’ll work anywhere bar the USA. Apple also touts an interface that’s been designed around the smaller screen but, without an actual sample to look over, I really couldn’t tell you how good or bad it is.
So all that blather and bluster that preceded this announcement was, surprise, completely overblown and the resulting product really does nothing to stand out in the sea of computerized hand adornments. I’m sure there’s going to be a built in market from current Apple fans but outside that I really can’t see the appeal of the Apple Watch over the numerous other devices. Apple does have a good 6 months or so to tweak the product before release so there’s potential for it to become something before they drop it on the public.
Whilst I don’t spend as much time as I used to keeping current with all things PC hardware related I still maintain a pretty good working knowledge of where the field is going. That’s partly due to my career being in the field (although I’m technically a services guy) but mostly it’s because I love new tech. You’d think then that DDR4, the next generation in PC memory, making its commercial debut wouldn’t be much of a surprise to me but I had absolutely no idea it was in the pipeline. Indeed had I not been building out a new gaming rig for a friend of mine I wouldn’t have known it was coming, nor that I could buy it today if I was so inclined.
Double Data Rate Generation 4 (DDR4) memory is the direct successor to the current standard, DDR3, which has been in widespread use since 2007. Both standards (indeed pretty much all memory standards) were developed by the Joint Electron Device Engineering Council (JEDEC), which has been working on DDR4 since about 2005. The reasoning behind the long lead times on new standards like this is complicated, but it comes down to a function of getting everyone to agree to the standard, manufacturers developing products around said standard and then, finally, those products making their way into the hands of consumers. Thus whilst new memory modules come and go with the regular tech cycle, the standards driving them typically remain in place for the better part of a decade or more, which is probably why this writer neglected to keep current on it.
In terms of actual improvements DDR4 seems like an evolutionary step forward rather than a revolutionary one. That being said, the improvements introduced with the new specification are nothing to sneeze at, one of the biggest being a reduction in the voltage (and thus power) that the specification requires. Typical DDR4 modules will now run at 1.2V compared to DDR3’s 1.5V, and the low voltage variant, typically seen in low power systems like smartphones, goes all the way down to 1.05V. To end consumers this won’t mean much, but for large scale deployments the savings from running this new memory add up very quickly.
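As a rough back-of-the-envelope illustration (and it is only that): dynamic power in CMOS circuits scales roughly with the square of the supply voltage, so the voltage drop alone implies a meaningful saving before any other efficiency gains are counted:

```python
# Back-of-the-envelope only: dynamic CMOS power scales roughly with V^2,
# so DDR4's drop from 1.5V to 1.2V is worth a surprising amount on its own.
ddr3_v, ddr4_v = 1.5, 1.2
saving = 1 - (ddr4_v / ddr3_v) ** 2
print(f"~{saving:.0%} lower dynamic power from the voltage drop alone")
```

That works out to roughly a third less dynamic power per module, which is pocket change on a desktop but serious money across a data centre full of DIMMs.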
As you’d expect there’s also been a bump in the operating speed of DDR4 modules, ranging from 2133MHz all the way up to 4266MHz. Essentially the lowest tier of DDR4 memory will match the top performers of DDR3, and the headroom for future development is quite significant. This will have a direct impact on the performance of systems powered by DDR4 memory and, whilst most consumers won’t notice the difference, it’s definitely going to be a defining feature of enthusiast PCs for the next couple of years. I know that I updated my dream PC specs to include it even though the first generation of products is only just hitting the market.
DDR4 chips are also meant to be a lot more dense than their DDR3 predecessors, especially considering that the specification has also accommodated 3D layering technologies like Samsung’s V-NAND. Many are saying that this will lead to DDR4 being cheaper for a comparable amount of memory vs DDR3 however right now you’ll be paying about a 40% premium on pretty much everything if you want to build a system around the new style of memory. This is to be expected though and whilst I can eventually see DDR4 eclipsing DDR3 on a price per gigabyte basis that won’t be for several years to come. DDR3 has 7 years worth of economies of scale built up and they won’t become irrelevant for a very long time.
So whilst I might be a little shocked that I was so out of the loop I didn’t know a new memory standard had made its way into reality, I’m glad it has. The improvements might be incremental rather than a bold leap forward but progress in this sphere is so slow that anything is worth celebrating. The fact that you can build systems with it today is just another bonus, one that I’m sure is making dents in geeks’ budgets the world over.
Ask your IT administrator what medium they back up all your data to and the answer is likely some form of magnetic tape storage. For many people that’d be somewhat surprising as the last time they saw a tape was probably a couple decades ago and it wasn’t used to store much more than a blurry movie or maybe a couple songs. However in the world of IT archiving and backup there’s really no other medium that can beat tapes for capacity, durability or cost. Many have tried to unseat tapes from their storage crown but they’re simply too good at what they do and Facebook’s latest experiment, using Blu Ray disc caddies as an archiving solution, isn’t likely to take over from them anytime soon.
The idea Facebook has come up with is, to their credit, pretty novel. Essentially they’ve created small Blu Ray caddies, each of which contains 12 discs. These are all housed in a robotic enclosure about the size of a standard server rack. Each of these racks is capable of storing up to 10,000 discs, which apparently gives a total of 1PB worth of storage in a single rack. Primarily it seems to be a response to their current HDD based backup solutions which, whilst providing better turn around for access, are typically far more costly than other archiving solutions. What interests me though is why Facebook would be pursuing something like this when there are other archiving systems already available, ones with a much better ROI.
The storage figures quoted peg the individual disc sizes at 100GB, something which is covered off under the BD-R XL specification. These discs aren’t exactly cheap and, whilst I’m sure you could get a decent discount when buying 10,000, the street price for them is currently on the order of $60 each. Even if they’re able to get a 50% discount on these discs you’re still on the hook for about $300K just for the media. If you wanted a similar amount of storage on tapes (say using the 1.5TB HP LTO-5, which can be had for $40) you’re only paying about $27K, a tenth of the cost. You could even halve that again if you used compression on the tapes, although honestly you don’t really need to at that price point.
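The arithmetic above is easy enough to sanity-check yourself; the prices are the post’s own figures (with my assumed 50% bulk discount), not current market data:

```python
# Sanity-checking the media costs quoted above. Prices are the post's
# figures, and the 50% Blu Ray discount is an assumption.
discs = 10_000
disc_price = 60 * 0.5                 # $60 BD-R XL with an assumed 50% discount
bluray_media_cost = discs * disc_price

petabyte_tb = 1_000
tape_capacity_tb = 1.5                # HP LTO-5, uncompressed
tapes_needed = -(-petabyte_tb // tape_capacity_tb)  # ceiling division
tape_media_cost = tapes_needed * 40   # $40 per tape

print(f"Blu Ray: ${bluray_media_cost:,.0f}  Tape: ${tape_media_cost:,.0f}")
```

Roughly $300K for the discs against under $27K for the tapes, before compression is even considered.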
Indeed pretty much every single advantage that Facebook is purporting this Blu Ray storage system to have is the same benefit you get with a tape drive. Tapes are low power, as their storage requires no active current draw, are readily portable (and indeed there are entire companies already dedicated to doing this for you) and have many of the same durability qualities that DVDs do. When you combine this with the fact that they’re an already proven technology with dozens of competitive offers on the table it really does make you wonder why Facebook is investigating this idea at all.
I’d hazard a guess it’s just another cool engineering product, something that they’ll trial for a little while before mothballing completely once they look at the costs of actually bringing something like that into production. I mean I like the idea, it’s always good to see companies challenging the status quo, however sometimes the best solutions are the ones that have stood the test of time. Tapes, whether you love them or hate them, by far outclass this system in almost all ways possible and that won’t change until you can get Blu Ray discs at the same dollars per gigabyte that you can get tapes. Even then Facebook is going to have to try hard to find some advantage that Blu Rays have that tapes don’t as right now I don’t think anyone can come up with one.
The unfortunate truth about telecommunications within Australia is that everyone is under the iron rule of a single company: Telstra. Whilst the situation has improved somewhat in the last decade, mostly under threat of legal action from the Australian government, Australia still remains something of an Internet backwater. This can almost wholly be traced back to the lack of investment on Telstra’s behalf in new infrastructure with their most advanced technology being their aging HFC networks that were only deployed in limited areas. This is why the NBN was such a great idea as it would radically modernize our telecommunications network whilst also ensuring that we were no longer under the control of a company that had long since given up on innovating.
To us Australians my opening statements aren’t anything surprising, this is the reality that we’ve been living with for some time now. However when outsiders look in, like say the free CDN/DDoS protection service Cloudflare (who I’ve recently started using again), and find that bandwidth from Telstra is about 20 times more expensive than their cheapest providers, it does give you some perspective on the situation. Whilst you would expect some variability between different locations (given the number of dark fiber connections and other infrastructure) a 20x increase does appear wildly out of proportion. The original NBN would be the solution to this as it would upend Telstra’s grip on the backbone connections that drive these prices, however the Liberals’ new MTM solution will do none of this.
Right now much of the debate of the NBN has been framed around the speeds that will be delivered to customers however that’s really only half of the story. In order to support the massive speed increases that customers would be seeing with the FTTP NBN the back end infrastructure would need to be upgraded as well and this would include the interconnects that drive the peering prices that Cloudflare sees. Such infrastructure would also form the backbone of wide area networks that businesses and organisations use to connect their offices together, not to mention all the other services that rely on backhaul bandwidth. The MTM NBN simply doesn’t have the same requirements, nor the future expandability, to necessitate the investment in this kind of back end infrastructure and, worse still, the last mile connections will still be under the control of Telstra.
That last point is one I feel that doesn’t get enough attention in the mainstream media. The Liberals have released several videos that harp on about the point of making the right amount of investment in the NBN, citing that there’s a cut off point where extra bandwidth doesn’t enable people to do anything more. The problem with that thinking is though that, with the MTM NBN, you cannot guarantee that everyone will have access to those kinds of speeds. Indeed the MTM NBN can only guarantee 50Mbps to people who are 200m or less away from an exchange which, unfortunately, the vast majority of Australians aren’t. Comparatively FTTP can deliver the same speeds regardless of distances and also has the ability to provide higher speeds well into the future.
In all honesty though the NBN has been transformed from a long term, highly valuable infrastructure project into a political football, one that the Liberal party is intent on kicking around as long as it suits their agenda. Australia had such potential to become a leader in Internet services, with an expansive fiber network that would have rivalled all others worldwide. Instead we have a hodgepodge solution that does nothing to address the issues at hand, and the high broadband costs, for consumers and businesses alike, will continue as long as Telstra controls the vast majority of the critical infrastructure. Maybe one day we’ll get the NBN we need, but that day seems to slip further away all the time.
There seems to be this small section of my brain that’s completely disconnected from reality. At every turn with the Liberals and the NBN it’s been the part of my head that’s said “Don’t worry, I’m sure Turnbull and co will be honest this time around”, and every single time it has turned out to be wrong. At every turn these “independent” reports have been stacked with personnel who all have vested interests in seeing Turnbull’s views come to light, no matter how hard they have to bend the facts in order to do so. Thus all the reports that have come out slamming Labor’s solution are not to be trusted, and the latest report, the vaunted cost benefit analysis that the Liberals have always harped on about, is another drop on the gigantic turd pile that is the Liberals’ NBN.
The problems with this cost-benefit analysis started long before the actual report was released. Late last year Turnbull appointed Henry Ergas as the head of the panel of experts that would be conducting it. The problem with this appointment is that Ergas had already published many pieces on the NBN which were not only critical of it but were also riddled with factual inaccuracies. His opinion of the NBN was well known prior to starting this engagement, and thus he was not capable of providing a truly independent analysis, regardless of how he might want to present it. However in the interests of fairness (even though Turnbull won’t be extending the same courtesy) let’s judge the report on its merits so we can see just how big this pile of horseshit is.
The report hinges primarily on a metric called “Willingness to Pay” (WTP), which is what Australians would be willing to pay for higher broadband speeds. The metric is primarily based on data gathered by the Institute for Choice, which surveyed around 3,000 Australians about their current broadband usage and then showed them some alternative plans that would be available under the NBN. Problem is, the way these were presented was not representative of all the plans available, nor did it factor in things like speed not being guaranteed on FTTN versus being guaranteed on FTTP. So essentially all this was judging was people’s willingness to change to another kind of plan, and it honestly was not reflective of whether or not they’d pay more for higher broadband speeds.
Indeed this is further reflected in the blended rate of probabilities used to estimate the benefits, with a 50% weighting applied to the Institute for Choice data and a 25% weighting to each of the other two data sources (take-up demand and technical bandwidth demand) which, funnily enough, find in favour of the FTTP NBN solution. Indeed Table I makes it pretty clear that whenever there were multiple points of data the panel of experts decided to swing the percentages in ways that were favourable to them rather than providing an honest view of the data they had. If the appointment of a long time anti-NBN campaigner wasn’t enough to convince you this report was a total farce then this should do the trick.
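To see why that weighting matters so much, here’s a toy illustration. The 50/25/25 weights are the report’s; the dollar figures are entirely made up, purely to show how a 50% weighting on one low-ball estimate drags the blended benefit down:

```python
# Toy illustration of blended weighting. Only the 50/25/25 split comes
# from the report; the benefit estimates below are invented.
weights = {
    "institute_for_choice": 0.50,       # survey-based WTP data
    "take_up_demand": 0.25,
    "technical_bandwidth_demand": 0.25,
}
estimates_bn = {                        # hypothetical benefit estimates, $bn
    "institute_for_choice": 1.0,        # the pessimistic source
    "take_up_demand": 4.0,              # the sources favouring FTTP
    "technical_bandwidth_demand": 4.0,
}
blended = sum(weights[k] * estimates_bn[k] for k in weights)
print(blended)
```

Even though two of the three sources agree on the higher figure, the half-weighted pessimistic source pulls the blended result well below their estimate.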
However what really got me about this report was summed up perfectly in this quote:
The panel would not disclose the costs of upgrading [to] FTTP compared with other options, which were redacted from the report, citing “commercial confidentiality associated with NBN Co’s proprietary data”.
What the actual fuck.
So a completely government owned company is citing commercial in confidence as the reason for not disclosing data in a report that was commissioned by the government? Seriously, what the fuck are you guys playing at here? It’s obvious that if you included the cost of upgrading a FTTN network to FTTP, which has been estimated at $21 billion extra at least, then the cost-benefit would swing wildly in the direction of FTTP. Honestly I shouldn’t be surprised at this point as the report has already taken every step possible to avoid admitting that a FTTP solution is the better option. Hiding the upgrade cost, which other reports commissioned by the Liberals suggest will be required less than 5 years after completion of their FTTN NBN, is just another fact they want to keep buried.
Seriously, fuck everything about Turnbull and his bullshit. They’ve commissioned report after report, done by people who have vested interests or are in Turnbull’s favour, that have done nothing to reflect the reality of what the NBN was and what it should be. This is just the latest heaping on the pile, showcasing that the Liberals have no intention of being honest nor implementing a solution that’s to the benefit of all Australians. Instead they’re still focused on winning last year’s election and we’re all going to suffer because of it.
Smartphones and laptops have always been a pain in the side of any enterprise admin. They almost always find their way into the IT environment via a decidedly non-IT driven process, usually when an executive gets a new toy that he’d like his corporate email on. However the tools to support these devices have improved drastically, allowing IT to provide the basic services (it’s almost always only email) and then be done with it. For the most part people see the delineation pretty clearly: smartphones and tablets are for mobile working and your desktop or laptop is for when you need to do actual work. I’ve honestly never seen a need for a device that crosses the boundary between these two worlds, although after reading this piece of drivel it seems that some C-level execs think there’s demand for such a device.
I don’t think he could be more wrong if he tried.
The article starts off with some good points about why tablet sales are down (the market has been saturated, much like netbooks were) and why PC sales are up (XP’s end of life, although that’s only part of it) and then posits the idea of creating “super tablets” in order to reignite the market. Such a device would be somewhere in between an iPad and a laptop, sporting a bigger screen, functional keyboard and upgraded internals but keeping the same standardized operating system. According to the author such a device would bridge the productivity gap that currently divides tablets from other PCs giving users the best of both worlds. The rest of the article makes mention of a whole bunch of things that I’ll get into debunking later but the main thrust of it is that some kind of souped up tablet is the perfect device for corporate IT.
For starters the notion that PCs are hard to manage in comparison to tablets or smartphones is nothing short of total horseshit. The author points to ServiceNow, which provides some incident management software, being worth $8 billion as some kind of proof that PCs break often and are hard to manage. What that fails to grasp is that ServiceNow is actually an IT Service Management company that also has Software/Platform as a Service offerings, and is thus more akin to a company like Salesforce than simply an incident management vendor. This then leads onto the idea that the mobile fleet is somehow cheaper to run than its PC counterpart, which is not the case in many circumstances. He also makes the assertion that desktop virtualization is expensive when in most cases it makes heavy use of investments that IT has already made in both server and desktop infrastructure.
In fact the whole article smacks of someone who seems cheerfully ignorant of the fact that the product he’s peddling is pretty much an ultrabook with a couple of minor differences. One of the prime reasons people like tablets is their portability, and the second you increase the screen size and whack a “proper” keyboard on one you’ve essentially given them a laptop. His argument is then that you need the specifications of a laptop with Android or iOS on it, but I fail to see how extra power is going to make those platforms any more useful than they are today. Indeed if all you’re doing is word processing and Internet browsing then the current iteration of Android laptops does the job just fine.
Sometimes when there’s an apparent gap in the market there’s a reason for it and in the case of “super tablets” it’s because when you take what’s good about the two platforms it bridges you end up with a device that has none of the benefits of either. This idea probably arises from the incorrect notion that PCs are incredibly unreliable and hard to manage when, in actuality, that’s so far from reality it’s almost comical. Instead the delineations between tablets and laptops are based on well defined usability guidelines that both consumers and enterprise IT staff have come to appreciate. It’s like looking at a nail and a screw and thinking that combining them into a super nail will somehow give you the benefits of both when realistically they’re for different purposes and the sooner you realise that the better you’ll be at hammering and screwing.