I'd rather spend the money replacing our fragile, short-lived meat sacks first. Building massive space ships to sling us across the universe is a problem that only exists while we still have meat sacks.
Voyager 1 and 2 have done well with ancient technology and no meat sacks.
I remember a great comment from a talk by John Searle about the scariest way that this could happen. Imagine that you get just such an implant, and suddenly, you just feel woozy or tired, as if something is really going royally wrong, as if your consciousness is no longer really "in control."
The Doctor starts the checkup and says "how do you feel?" and you internally think, "lousy! I need you to remove it, get it out, now!" -- but that's not what you say. Your lips articulate instead, "Doc, it feels great, I'm almost ready to check out and get back to work."
You live the rest of your life a prisoner in your own brain, which has been co-opted and taken over by the artificially intelligent electronics.
I tend to find Searle's discussions on AI to be overly simplistic, and I find that scenario highly unlikely. Regardless of how you choose to interpret the research on the neuroscience of where decisions to act originate, it is clear that the conscious mind is extremely good at taking full credit for events in which it played no part, or only a partial one.
Such an "AI parasite" would have plenty of machinery built into the brain to exploit, letting it take full control without causing dissonance in the host. Not only is that likely the path of least resistance; continued dissonance might destabilize the host's mind enough to disrupt some emergent balance in the brain. That would be a net negative for the parasitic AI, motivating it not to make the extra effort required to have the human suffer such a disconnect.
The question then would be whether thinking of the AI as "other" is valid at all, or whether we should expand the notion of personal agency to accept such symbiotes.
Society already does a good job of this without the need for technology or artificial intelligence.
I'd genuinely rather spend my day chopping wood, picking fruit or fishing, but no, I'm chained to a desk in front of Visual Studio 2010 pumping out code to prop up the insurance industry.
I still don't know why I do it. Something has taken me over and it's not artificially intelligent electronics.
You realize that building new meat sacks is a much more difficult problem than slinging meat sacks into space? It's like wanting to rewrite an entire operating system from scratch instead of fixing a difficult driver bug (terrible analogy, I know).
Personally, I think that slinging meat sacks into space without breaking them or letting them rot is harder than building a new operating environment.
The meat bit isn't all that useful so we need to get rid of it. In fact it's a pain in the arse.
The thing superimposed on top of the meat is the important bit. Like CDs: the disc doesn't really matter, but the data (the music) does.
Imagine if you had 1000 hands that worked from -50 °C to 300 °C, another couple of TB of persistent state, and a rewire that let you do sequential calculations efficiently (rather than lots of useless parallel ones). Oh, and you never had to poop.
Agreed. The future of spaceflight belongs to engineered beings, be they artificial machines (androids), genetically modified humans, or some combination of the two.