When the National Broadband Network was announced it was fairly obvious it was going to be a big stab at Telstra. Their monopoly on Australia’s copper line infrastructure was beneficial in the early years, when the company was wholly government owned. However, in true Liberal style, John Howard thought it best to privatise the company, and that’s when things started turning bad. There’s only one thing worse than a monopoly controlled by a government, and that’s a monopoly controlled by shareholders.
It seems however that, in order to avoid repeating the mistakes of the past, the Labor government (I highly doubt Conroy was the driving force behind this, but I’m willing to be proven otherwise) is in essence bringing Telstra back under government control, albeit in a rather weird fashion:
Communications minister Stephen Conroy has put the Government on a collision course with Telstra, warning the telco giant to split its wholesale and retail arms, or face business restrictions.
The major reforms to the telco giant’s structure and operation were announced today as the Government geared up to roll out its $43 billion National Broadband Network and move Telstra towards becoming part of it.
The Government is moving to restructure the sector to pave the way for the rollout of the network, which it plans to build in the next eight years.
Now don’t get me wrong, I like this idea. Telstra has been using its monopoly on the copper lines as a bargaining tool for many years and it’s done nothing to improve the level of service that Australians receive. In fact around 8 years ago they were caught selling broadband plans to consumers for less than they charged their competitor ISPs at wholesale (I can’t find a direct article, but here’s the results). Putting their retail section at arm’s length from their wholesale division is a good move that will hopefully keep them a little more honest, but I still have a weird feeling about how they’re going about it.
Really the issue stems from the previous government selling off our assets in order to fund their surplus (although the historical position on the matter seems to be that the government couldn’t keep pace with technology, go figure). Bar actually buying Telstra back from its shareholders, there’s not a lot the government can do apart from legislating against them. Although, with the government hammering them with legislation that will, let’s be honest here, damage Telstra’s business, they could get themselves a bargain: the share price dropped 6% on this announcement. I guess it’s probably the best solution we can hope for, as the government’s hands are tied from any other course of action.
On the surface this would appear to be a hastening of Telstra’s defeat at the hands of the National Broadband Network, but in reality it’s the opposite. The initial plans announced by the Labor government made no mention of Telstra’s involvement at all, and in fact their submission to a previous Fibre-to-the-Node (FTTN) request for tender was seen as a joke rather than an actual bid. It seems however that this separation might form part of some deal that will get Telstra in on the ground floor of the National Broadband Network, with the public announcement just the government making sure Telstra stays honest.
Overall this appears to be a benefit for Australians at large. It should mean that the Internet service we get is more uniform and at a lower cost than what we currently have. It will also stop Telstra from standing atop its monopoly hill in order to keep itself going, hopefully leading to some actual innovation from the company. It’s going to be a long process to get this whole thing worked out, and I’m sure we’ll be hearing much more about it over the coming months as negotiations take place.
I’ll be watching.
Way back in the days when the Internet was only a trickle into Australia I remember the information available being sparse and unreliable. Many teachers would not accept any information from a website as part of research for a school assignment, and rightly so: there was little if any way to verify that information. The exercise was then left to us to read through countless books in order to back up any statement or opinion we might put forward. Today however the Internet is bristling with information, and authoritative sources are popping up all over the place. The interesting thing about this is that, due to the sheer volume of information available, you can be almost guaranteed to find some article or news piece that agrees with what you say, which has led me down a very confounding train of thought.
I’ll take something that I know well as an example: the economy. Now I’ve made my stance known about this in the past, and the data seems to be on my side. For the most part Australia is narrowly avoiding a recession due to our banks being well capitalized and a government not afraid of going into debt to spur the economy on. However I could easily argue the opposite, and in fact a lot of people are. Just to show you how crazy the situation is, take for instance these two articles. Both were written at the same time, yet they declare completely different viewpoints. These aren’t the only examples either, and it is quite easy to make your point using what appear to be authoritative sources. This then begs the question: is there really a correct answer here?
The truth often lies in the middle of two dissenting viewpoints, especially for issues that can’t have a definitive answer, such as the economy. However, due to the volume of information, it becomes easy for one side to write off the other, since each appears to have so much support for its side of the argument. This unfortunately leads to a phenomenon best described as wikiality, or truth by majority vote. In the end it probably won’t matter that you have the majority of data on your side, because if you’re in the minority, when debating using the information available on the Internet, you will eventually be “proven” wrong. It is an unfortunate consequence of this information overload.
There has been a lot of work done over the past decade to create authoritative information sources, however with the advent of easy access to the Internet and its publishing capabilities they are soon lost in the noise. I often try to link to articles from these sources in order to promote them, but I can’t say that I’m innocent in this regard either. All too often I link to Wikipedia hoping that people will scroll to the bottom to read the actual articles from proper sources, but I know that’s not usually the case. Overall Wikipedia is a good source, however the mentality of wikiality makes some articles unusable, and it can be hard to tell them apart.
To be honest though, we’re better off having too much information than not enough. There is enough out there for anyone to make up their own mind on pretty much any issue that comes up. It is regrettable that the noise is so high, but that is the price we pay for the ultimate freedom of allowing anyone instant access to both read and publish limitless information.
Arrrrrgggghhhh the cognitive dissonance! 😉
As I promised over a week ago, the Labor Force results for August have been released by the Australian Bureau of Statistics. On the surface things appear to be great, as the unemployment rate has remained steady at 5.8%. Digging through the figures though reveals another story, with a lot of them not painting such a rosy picture. Whilst they are a sign that things are stabilizing and won’t be as bad as some predictions, they’re still far from the kind of green shoots many would like to see before breaking out the champagne. In fact the results were something of a turning point for me, as a metric that I had believed was valueless for measuring an economy’s performance outlines the problem perfectly.
The first thing that crops up when looking at these recent figures is that Australia’s working population actually decreased by 0.2%. This was offset by a decline of the same amount in those looking for work, which is what kept the headline figure steady. Other metrics remained steady as well, although this is likely because all the people who can work are still working. The metric that did it for me, though, was underemployment.
I’ll be the first to admit that initially I felt the use of this metric was an attempt at shifting the goalposts by the doom-and-gloomers in their arguments against me. Every time a sign of economic recovery comes about I hear of some new metric that, if you watch the trend over the past few years, shows that we’re all doomed and there’s nothing we can do about it. So I was fairly sceptical when they started spouting underemployment, and I wrote it off for a good while. Then I came across the statistic in the Labor Force results and decided to have a look into the ABS’ methods and what it could mean for the workforce at large. In essence underemployment refers to people who have the ability and desire to work more hours but simply can’t, because the work is unavailable. The current statistics for this quarter peg this rate at 13.9%. That figure in itself doesn’t mean a whole lot, but the trend says quite a bit:
You can probably guess where I’m going with this. Since 2001 the underemployment metric had been trending down quite nicely. This is what you would expect in economic good times, when there is plenty of work available for anyone who wants it. However around 2008, when the GFC started to rear its ugly head, it started trending back up, and did so at quite a high rate. This was a result of employers’ reactions to the GFC: many responded by cutting workers’ hours rather than cutting jobs outright, and it is those reduced-hours workers that the metric captures.
The one good thing we can take away from these figures is that, while underemployment might be on the rise, people who are still employed are much better placed for when things begin to turn around. Many of those who lost their jobs will find it hard to break back into their industry whilst the GFC continues to unfold, but those who still have some employment can easily have their hours ramped back up when times come good. This bodes well for our economy, as heavy job losses mean a much slower recovery once the crisis has abated. Nothing slows economic development more than the workforce trying to re-establish itself.
The figures made for some good reading for me and I’d urge you to have a look around so you can see what they mean for you. Working in Canberra means I’m often insulated from the employment troubles of the country (thank you Australian Public Service), but these figures really brought it home. I can only hope that next quarter’s figures show underemployment holding steady, but only time will tell.
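For anyone wanting to poke at the numbers themselves, the relationship between these headline rates is simple arithmetic: each is just a count divided by the total labour force. Here’s a quick Python sketch using counts I’ve made up purely for illustration (substitute the actual ABS figures if you want the real thing):

```python
# Toy illustration of how ABS-style headline labour rates are derived.
# Every count below is invented for the example, not actual ABS data.
labour_force = 11_000_000     # employed plus unemployed persons
unemployed = 638_000          # want work, have none
underemployed = 891_000       # employed, but want and are able to work more hours

unemployment_rate = unemployed / labour_force * 100
underemployment_rate = underemployed / labour_force * 100
# The ABS also combines the two into a broader "labour force underutilisation" rate:
underutilisation_rate = unemployment_rate + underemployment_rate

print(f"unemployment:     {unemployment_rate:.1f}%")
print(f"underemployment:  {underemployment_rate:.1f}%")
print(f"underutilisation: {underutilisation_rate:.1f}%")
```

Note that the underemployed are a subset of the employed, so adding the two rates together never double-counts anyone.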
A few months ago I blogged about an amazing shuttle mission that set off to perform maintenance on one of the most important pieces of scientific equipment for humanity, the Hubble Space Telescope. Up until now the Hubble team has been performing calibration tests and making sure that everything is working properly after its last trip into space. Well now the time has come for them to release the images that they’ve acquired over the past few months, and I must say they are stunning.
Before you gallivant off and ogle the 56 pictures NASA has released to us I want to show you something that really drove home just how important Hubble is. First off, let’s have a look at a ground-based observation of Stephan’s Quintet, a cluster of 5 galaxies that are very close together and have been very well studied over the past decade:
That picture was taken from the Kitt Peak National Observatory using a 2.1 meter telescope back in 1998. Whilst it’s not the most amazing picture it does give away some detail about the galaxies and their relationships with each other. For instance, the one in the bottom left hand corner (NGC7320) looks very blue in relation to the others. The others appear redder because of a process called redshift: as a distant object recedes from us its light is stretched towards the lower energy (red) end of the spectrum, and the faster it recedes the stronger the effect. NGC7320’s relative lack of redshift would lead us to believe it is probably closer to us than its neighbours, although you can’t say that definitively from this picture.
Let’s step into the future 2 years from when this picture was taken and have a look at this beauty:
Well hello there gorgeous! The picture is basically a zoomed in version of the last one, but boy look at the detail! This really demonstrates the power of putting a telescope in space, as the primary mirror on Hubble is only 2.4m across, a mere 30cm more than the ground-based telescope behind the previous picture. We can now quite clearly see the reddening of 3 of the galaxies visible here, with NGC7320 hiding off in the corner. This was even before Hubble’s 2002 camera upgrade with the Advanced Camera for Surveys. 7 years later NASA gave Hubble its last upgrade, and it seems that it was money very well spent.
Oh dear, could you all give me 5 minutes alone with this picture? 😉
But seriously just….look…at….that! The detail is phenomenal and even with these galaxies being so far away (most are above 200 million light years away) we can still pick out individual stars. The Hubble team has also kindly added some information to this picture and you can probably guess what my earlier ramblings have been leading up to. The numbers on the respective galaxies are the amount of redshift the light has undergone before it has reached us. Looking at NGC7320 we can see it is significantly lower than the rest, which means it is actually a lot closer to us at about 39 million light years. There’s also another clue as to why NGC7320 is close to us, can you guess what it is?
Have a gander at NGC7318A/B and NGC7319, aren’t they a little unusual? For starters NGC7318A/B are two galaxies in the late stages of merging. You can see some of the stars and other stellar material being thrown off towards the right side of the picture. NGC7319’s lower spiral arm is significantly distorted towards NGC7318A/B, showing that their combined mass is pulling it in. But what of NGC7320? It looks like your normal galaxy, and that’s because it’s so far away from the other three that their gravity has little effect on it. There’s so much that this one picture shows us!
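As an aside, those redshift numbers can be turned into very rough distances using Hubble’s law (recession velocity v = cz, and v = H0 × d). Here’s a small Python sketch with illustrative z values in the ballpark of those marked on the picture, and an assumed Hubble constant of 70 km/s/Mpc. Bear in mind that for a galaxy as nearby as NGC7320 its own motion through space swamps the cosmological redshift, which is why its accepted distance comes from other indicators:

```python
# Back-of-the-envelope distances from redshift via Hubble's law (v = cz = H0 * d).
# The z values used below are illustrative, not precise measurements.
C_KM_S = 299_792.458      # speed of light in km/s
H0 = 70.0                 # assumed Hubble constant, km/s per megaparsec
MLY_PER_MPC = 3.262       # million light years per megaparsec

def distance_mly(z: float) -> float:
    """Approximate distance in millions of light years, valid only for small z."""
    velocity_km_s = C_KM_S * z          # recession velocity
    return velocity_km_s / H0 * MLY_PER_MPC

print(f"z = 0.0027 -> ~{distance_mly(0.0027):.0f} million light years")  # NGC7320-ish
print(f"z = 0.0220 -> ~{distance_mly(0.0220):.0f} million light years")  # roughly the other four
```

Even this crude estimate puts NGC7320 roughly an order of magnitude closer than its neighbours, which matches what the picture tells us.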
It’s things like this that really inspire me. In just a little over 10 years we’ve gone from a fuzzy picture of distant galaxies we could only make guesses about to something that shows amazingly distant objects in spectacular detail. We still have another 5 years before the next space telescope takes off, and it looks like Hubble will be doing a fantastic job until it comes online.
Now I just need to convince NASA to bring Hubble back to earth, as was their original intention.
As many people know I’ve been a long time opponent of the Internet filter. In fact if you wind back the clock to when I created this blog you’ll see that it was originally created as a place to collate my thoughts and actions on the issue. Whilst the majority of the opposition to the filter has been clear and reasonable it would seem that the time has finally come when the vigilantes come out of the woodwork and start wrecking all the solid work we have been doing:
The Federal Government is investigating reports a computer hacker managed to temporarily shut down the Prime Minister’s website.
Kevin Rudd’s site, www.pm.gov.au, was brought down for a short time last night due to what is described as a denial of service attack.
The hacker, apparently known by the nickname Anonymous, posted warnings that government websites would be targeted in protest against its plans to filter the Internet.
The Government is considering ways to block websites carrying material it believes is offensive.
The move has attracted widespread criticism, largely because of fears the filtering system will slow Internet speeds.
The first bit of stupid I’d like to point out here is that whilst the “hacker” was identified as operating under the name Anonymous, the media failed to properly recognise that he/she was probably acting as part of the online group of the same name. Although they do quote people who allude to them being a group later on, most news outlets have just been repeating the first few lines. The group has voiced its disapproval of the Internet filter before, and due to its spontaneous-order-like affiliation it is unpredictable in the actions it takes. It would then seem that one member identifying with its principles decided to take matters into their own hands and try to make a point about the issue, albeit with completely the wrong methods.
Whilst I can appreciate the passion and dedication that the hacker/s must have felt in order to attempt something of this magnitude, I cannot condone their methods. The unfortunate truth is that their actions have done nothing to further the cause of having the filter abandoned, and have only served to bring a small amount of news to the front pages saying that the Prime Minister’s website was attacked. Judging by the attack itself I’d hazard a guess that the attacker is either from outside Australia or not current with news on the filter, as it is essentially dying on the vine. We still need to be vigilant to make sure the government does not try to resurrect the policy under a different name, however the filter as it was proposed is being swept away in the hope it can die without taking any politicians with it. Unfortunate, as I would’ve liked the sacrificial lamb to be Conroy for so fervently supporting this legislation.
Acts like this do nothing to serve the cause and only help to strengthen the opposition’s resolve. The outpouring of support from other countries, like the UK naming Conroy the Internet Villain of the Year, does far more to help than what amounts to petty vandalism of a government site. If they want to put their 1337 |-|a©Kz0r skills into practice maybe they should look to more persuasive avenues, like google bombing Conroy. But that would be too much effort now wouldn’t it? 😛
It was fun to see the stupid explosion when they collided though 🙂
Our crater-faced neighbour in the darkness of space is none other than the Moon. The only other celestial body to be visited by humans, it has been something of a curiosity to us for countless millennia, and it is only recently that we’ve come to realise a few things about this ball of dust and rock that don’t quite seem to add up. Today I’d like to introduce a couple of things that are not-so-common knowledge about our celestial cousin and give you a rundown on what they mean for us back here on earth.
Firstly it’s massive, and not just because it weighs a lot. Current estimates peg its mass at around 7.3477 × 10²² kg, or around 1.23% of Earth’s, and that’s the kicker right there. If you look at any other planet with an orbiting body, the ratio of moon mass to planet mass is nowhere near that high. The most comparable planet would be Mars with its moons Phobos and Deimos, which weigh in at a measly 0.0000016% and 0.00000023% of their host respectively. It’s a similar story for the moons of other celestial bodies, especially when you consider moons like Io or Europa, which are about the same size as ours but orbit the gas giant Jupiter. Our Moon is then somewhat of an enigma, and its presence has caused many interesting phenomena here on Earth. This then begs the question: how the heck did something that huge manage to get trapped with us?
There are a lot of theories about its creation. If you were to look at other planets and extrapolate a hypothesis from them, your first conclusion would be that we captured another celestial object. Again the mass of the Moon says otherwise, as the Earth isn’t large enough to capture something of that size without some other forces acting which we can’t seem to account for. Another possibility is that the Moon and Earth formed at the same time, however the difference in composition between the two is significant enough to throw this theory into question. Additionally, all the previous theories fail to account for the amount of spin in the Earth/Moon system, which leaves the current best hypothesis: something hit us. The idea is that another body on a similar orbit around our Sun eventually came too close, and of course this led to a massive collision. This theory still has its problems, but for now it’s the best idea we have.
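If you want to sanity check the mass ratios I quoted earlier, it’s one division per body. Here’s a quick Python sketch using approximate textbook masses (the exact values vary slightly between sources):

```python
# Approximate masses in kilograms; all values are rounded textbook figures.
EARTH  = 5.972e24
MOON   = 7.3477e22
MARS   = 6.417e23
PHOBOS = 1.07e16
DEIMOS = 1.48e15

def pct(satellite: float, host: float) -> float:
    """Satellite mass as a percentage of its host's mass."""
    return satellite / host * 100

print(f"Moon/Earth:  {pct(MOON, EARTH):.2f}%")      # ~1.23%
print(f"Phobos/Mars: {pct(PHOBOS, MARS):.7f}%")     # ~0.0000017%
print(f"Deimos/Mars: {pct(DEIMOS, MARS):.8f}%")     # ~0.00000023%
```

The Moon’s ratio comes out nearly a million times larger than that of either Martian moon, which is exactly the oddity I’m getting at.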
Another fun fact about the Moon is that it’s covered in a fine powder referred to as regolith. Due to the lack of geological activity and the absence of an atmosphere, the surface of the Moon is for the most part stagnant. Any reshaping of the surface occurs in the form of asteroid impacts. These have a tendency to smash whatever they hit into a lot of small pieces, and over the several billion years of its existence the Moon has taken quite a few hits. This has led to the entire surface being covered in around 4 meters of fine dust best described as crushed glass. Regolith is one of the main issues facing any established lunar base, as it’s highly abrasive and loves to stick to everything. Plus it’s not the best thing in the world to get in your lungs either.
The Moon is a wonder for anyone on Earth and I love the fact that so much of it is still a complete mystery to us. I can’t wait for the day when we make a permanent presence on the Moon as the things we could accomplish there would be amazing. For now I’ll just keep gazing upwards for a look at the Moon whenever it floats by.
The last few decades haven’t been very kind to NASA. Ever since their heyday back in the 60’s and 70’s they have been the target of budget cuts, over-budget and under-delivering programs, and constant congressional involvement that has made innovation on their part extremely hard. Whilst I believe that their budget of 0.5% of GDP (as it was back in the Apollo program) is a small price to pay for phenomenally inspirational activities, it has become apparent that it is easy to write off the benefits of space travel when there are many other things requiring attention back here on earth. You can then imagine my surprise when NASA announced that, in essence, they were going to attempt to do Apollo again, albeit with modern technology and decades of experience in low earth orbit. They called this the Constellation Program, and you’d be forgiven for thinking that they were taking their inspiration from the past.
Constellation was born out of the former president’s vision for space exploration, which at the time seemed like a boon for NASA and its cohorts. Realistically it was a political ploy to win votes from the scientific community, since if he was not reelected how could we guarantee that the next president would share his vision? I can’t comment on how much of the vote swung his way because of this, but he did manage to get reelected. However the additional funding that would be required to ensure NASA’s continued presence in space, as well as to develop a completely new set of space vehicles, never materialized. This then led to the current situation whereby NASA has a large gap in its ability to keep a manned presence in space, currently relying on private industry and Russia to support them.
The vehicles themselves are a pretty big step up in terms of delivering payloads into space. The Ares I is a straight-up replacement for the shuttle, with a slightly larger payload capability and the added bonus of better safety features like a launch abort system. The Ares V is where the real changes occur, as it can deliver a phenomenal 188 tons into low earth orbit, and its 71 tons into lunar orbit comfortably exceeds what the Saturn V could manage. The lander vehicles and crew capsules follow the same route, basically being bigger brothers of their Apollo counterparts. Whilst they are a significant step up in NASA’s payload capability (and really nothing comes close to the Ares V) they are still many years away from being flight ready.
And here is where we get to the crux of the matter: should NASA really be creating a new space fleet? With companies like SpaceX and Bigelow Aerospace stepping up their presence and showing that they can provide many of these technologies at a fraction of the cost NASA is incurring, it doesn’t seem beneficial for NASA to be in the business of building new spacecraft. Realistically they could get so much more done by utilizing the services these new private space companies provide, as those companies are footing the research and development costs. This would allow NASA to shift its focus away from routine activities like maintaining the International Space Station and onto revolutionary things like lunar bases and a Mars shot.
It’s also entirely possible that, with these private companies doing so well, they will eventually overtake NASA in the ability to deliver those kinds of awe-inspiring moments. Once some mega-billionaire gets a taste for the idea of being the first person to land on another planet, you can be assured that the private space companies would be more than happy to step up and provide a means to achieve that dream. Whilst it would be a significant blow to NASA, it would allow them to refocus onto pure science-based missions, something which is not politically palatable right now.
Constellation is one of those projects that I’m sure will bring many positive benefits to humanity. It’s just unfortunate that I can’t see what they are right now. With the barrier to space dropping at an increasing rate I’m sure that the industry will hit a critical point where a combination of private and government activities will lead NASA and its cohorts to inspire humanity once again.
Whilst I’d love to waffle on about the spectacular weekend I had, which began with a kidnapping and ended with me covered in bruises, welts and wounds, I am lacking the mental capacity to do much of anything. Which is why I will sum up this short post with the following picture:
Hopefully I’ll recover from my bout of dumb by tomorrow and keep spamming the Internet with the tangled mess of thoughts that is my brain. Or at least the pain will subside enough for me to be able to type for more than 5 minutes at a time 😉
I’m not what you would call a garden variety sceptic. For the most part I let most things slide, as there are enough people fighting the fight for me already. However if in conversation someone says something that is obviously incorrect or is based on hearsay and anecdotes, I’ll usually point it out so that they have to prove their point. I often say to them that “you can’t bullshit a bullshitter, and I’m the best that’s around”, since I’ve been known to use argumentative devices and sometimes questionable logic to get people to believe what I say. Once you develop critical thinking patterns it becomes quite easy to pick up on when someone is talking with authority on a topic they have no idea about. I guess that puts me in the category of the casual sceptic: concerned with ensuring that everyone has the right information and makes their own decisions, rather than accepting what they hear without question.
It was then last night, over my usual Thursday night drinks with friends, that the topic of scepticism came up. For the most part my group of friends would fall into the casual sceptic category, as we’re interested in facts and won’t blindly believe something until we’ve done our research. My fellow blogger then pulled out a copy of Richard Dawkins’ The Greatest Show on Earth, a book which he said felt like a transition away from the bible bashing Dawkins is famous for and towards something more rhetorical, a quality that is sorely lacking in the sceptic movement’s arguments. This then begged the question: has the sceptic movement forgotten the art of rhetoric?
I read quite a lot of sceptical material because it appeals to me. This is probably in part due to my slight anti-mass-media bias, as many of the online sceptic resources target major news outlets that are reporting misinformation. Additionally there’s often something I haven’t really thought about myself, and the sceptical pieces trigger that all-important research reflex in me that sends me off on a couple-hour streak through online journals. However they’re preaching to the choir here, as I’m already on their side. To the other side their argument isn’t persuasive at all, and comes off as an attack which only serves to strengthen their resolve.
It would then seem that most sceptics make the mistake of thinking that the people on the other side of an argument will be converted using the same techniques that would convince a fellow sceptic. I came to realise recently that hammering away at someone’s beliefs does nothing to improve their view of you or your argument. Rather, you have to convince them that your side of the argument is the more sensible option, and this is usually done through the use of rhetorical devices and soft power techniques. Sceptics put forth a (usually) scientifically sound argument that would convince a like-minded individual of their position, but seem to give up in frustration when that argument falls on deaf ears.
Personally it feels like human nature to just assume that everyone thinks along the same lines as you do. It makes life considerably easier, as you don’t have to spend half your time considering every aspect of someone in order to communicate with them. However it’s this assumption that seems to act as a catalyst for the raging debate between sceptics and their targets. I’m sure one day we’ll see the rise of persuasive arguments from the sceptic movement, and if they do it right they’ll get exactly what they’ve been fighting for.
That, or they could convert Oprah into a sceptic. 😉
If you’ve ever done any formal project management training (or spoken to someone who has) you’ll probably be familiar with the saying that the camel was a horse designed by a committee. There’s also this lovely picture that aptly describes what unfortunately happens with many projects:
You would then think that after many years of maturing the idea of project management, these jokes would slide by the wayside, serving only as quaint reminders of the project management of years past. Unfortunately this is nowhere near the case, and one of humanity’s greatest projects was doomed by the stereotypical problems that all project managers are trained to avoid.
The Space Shuttle was America’s grand idea to change the way space was accessed by being the first reusable craft. Up until its development, all vehicles that reached space and returned could not be used again, as many of the components that made them up (like the ablative heat shields on Apollo) were beyond repair or replacement. Technically this isn’t a bad thing, as it makes the craft cheaper and in most cases lighter, allowing for more payload to be delivered to orbit. It does however mean that turning around for another launch means producing a completely new vehicle, along with all the testing that incurs. So, determined to make access to space cheaper and faster, America set out to design the first reusable launch vehicle.
Initial design of the shuttle went through several different revisions. At first the craft was designed for smaller missions, delivering a modest payload of around 9 tonnes. The reasoning behind this design was that although initial costs were high (in fact exceeding those of comparable non-reusable designs), the high launch rate that could be attained by a reusable craft would, in the long run, make up for it. Studies into the feasibility of reusing the craft showed that the number of launches required for the payoff was far too high; in fact the combined launch requirements of NASA and the Air Force were still not enough for the reusable system to pay for itself. It was therefore decided that all US launches (military, scientific or otherwise) would use this system, and this is when things started to get a little messy.
With so many different agencies now being told that this new reusable system was to be used for their space programs, the capabilities of the shuttle had to change dramatically. No longer was the shuttle just a ferry craft; it also had to become a space transport vehicle. This led to an increase in payload to around 25 tons, which could accommodate the largest military and commercial satellites. What this amounts to is that any satellite being launched also had a 65 ton orbiting vehicle tagging along with it, in essence eliminating that mass as usable payload. This wasn’t the only issue with the design, as some estimates required over 50 launches per year for it to be feasible. Unfortunately this was simply not possible: due to the high payload capacity, and hence large fuel requirement, the non-reusable external tanks had a production limit of 24 per year. The writing really was on the wall early in the design process.
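To put that payload penalty in perspective, here’s some back-of-the-envelope Python using the rough figures above (25 tons of cargo riding alongside a roughly 65 ton orbiter):

```python
# Rough illustration of the shuttle's payload penalty.
# Figures are the approximate tonnages discussed above, not exact specs.
payload_t = 25.0      # cargo delivered to low earth orbit
orbiter_t = 65.0      # the orbiter itself, which must also reach orbit

total_to_orbit_t = payload_t + orbiter_t
useful_fraction = payload_t / total_to_orbit_t * 100

print(f"mass boosted to orbit: {total_to_orbit_t:.0f} t")
print(f"of which useful cargo: {useful_fraction:.0f}%")
```

Under these assumptions just over a quarter of the mass boosted to orbit is actually cargo, with the rest being the vehicle tagging along for the ride.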
Looking in from the outside, the problems that plagued the shuttle seem obvious. The craft was initially planned as a ferry to the space station with a small payload capacity that would probably be used for supplies. Upon adding the additional requirement of being able to launch satellites, the craft swelled to almost 3 times its original size. In essence they were trying to attain the lifting capacity of some of the larger rockets (like the Atlas V) whilst strapping on an additional 65 tons. The result was a jack-of-all-trades-but-master-of-none design that has arguably led to the massive cost over-runs the shuttle has been burdened with. Had the orbiter retained its smaller design, with other launch systems used in its place for heavy payloads, we might have attained the high launch numbers required to make the reusable craft dream attainable. It is unfortunate that we will never know.
For all the problems that plague the shuttle, it has had over 120 successful launches and has served as an icon for space travel. Whilst I lament the costs and the design-by-committee process that burdened the shuttle with more than it was capable of handling, I still get chills down my spine watching it launch. The shuttle might be a technical failure, but it is hard to deny the image it has left in the minds of all humanity. It will be a very long time before that iconic image of a shuttle lifting off is replaced.
And so I await with bated breath, SpaceShipThree. Hopefully the next inspirational space icon.