
Richard C. Hoagland

Started by Richard C. Hoagland, July 20, 2008, 07:01:42 PM

Quote from: SciFiAuthor on July 05, 2015, 10:55:14 PM
Yes, humanity's future is not what everyone thinks it is. With utility fog, you could in theory deconstruct a human body, consciousness included, neuron by neuron and become incorporeal like the Q, living within the fog and materializing at will, using the fog nanorobots to reconstruct and deconstruct you atom by atom.

Got another one for ya:

https://en.wikipedia.org/wiki/Grey_goo

I've certainly been aware of the concept of grey goo and self-replicating nanobots which could break down and assemble any object atom by atom, but the idea of nanobots filling the air we breathe, invisible and unnoticed until they suddenly decide to bind together into an immovable wall or even into a perfectly molded personal cocoon -- that was a new one for me. But then again, what will we be like as individuals by then? A Borg collective? Or maybe we'll adopt robots as children and allow ourselves to die out while the robots carry on our legacy, which is the scenario I've believed is inevitable for many years: that we are just stepping stones to the real purpose of the universe, if it has a purpose, which is to generate inorganic life that can manufacture and willfully perfect itself. Well, that was the dilemma we faced in the movie A.I. Artificial Intelligence, wasn't it? Do we embrace the inevitability of artificial life, treating them as our children and co-existing, allowing them to carry our memory forward, or do we blindly fight them tooth and nail, destroying ourselves in the process?

Juan Cena

Quote from: Georgie For President 2216 on July 06, 2015, 01:52:22 AM
I've certainly been aware of the concept of grey goo and self-replicating nanobots which could break down and assemble any object atom by atom, but the idea of nanobots filling the air we breathe, invisible and unnoticed until they suddenly decide to bind together into an immovable wall or even into a perfectly molded personal cocoon -- that was a new one for me. But then again, what will we be like as individuals by then? A Borg collective? Or maybe we'll adopt robots as children and allow ourselves to die out while the robots carry on our legacy, which is the scenario I've believed is inevitable for many years: that we are just stepping stones to the real purpose of the universe, if it has a purpose, which is to generate inorganic life that can manufacture and willfully perfect itself. Well, that was the dilemma we faced in the movie A.I. Artificial Intelligence, wasn't it? Do we embrace the inevitability of artificial life, treating them as our children and co-existing, allowing them to carry our memory forward, or do we blindly fight them tooth and nail, destroying ourselves in the process?

Another thought: At some point, do we become more cyborg-like, like the Borg, or do we go down a road where we make computers bioorganic?

pate

Quote from: Yorkshire pud on July 05, 2015, 01:06:57 PM
...19.5...

Exactly my point... only Arcadians had the sekrut knowledge to get the boats on "plane" to achieve whatever the hell it is y'all were talikung 'bout...


Yorkshire pud

Quote from: pate on July 06, 2015, 02:58:36 AM
Exactly my point... only Arcadians had the sekrut knowledge to get the boats on "plane" to achieve whatever the hell it is y'all were talikung 'bout...

They refuse to read the data! Not ONE scientist has refuted RCH's findings. When asked about RCH, Prof. Jayne Foulkes, PhD, of MIT said "Who?" Conclusive proof, as if any were needed, that there's a worldwide conspiracy against him being known by pretty much anyone.

Paper*Boy

Quote from: Charles Daniels on July 06, 2015, 01:21:33 AM
... So essentially you'd learn all about the life forms on earth and how best to minimize your impact and survive.

That makes more sense to me than sending a bunch of astronauts down first thing, only for them all to be eaten by a T. rex three minutes into their mission, or stomped on by a Brontosaurus before they even got out of the landing craft.

Or bored to death accidentally picking up Coast to Coast Starring George Noory on their radios.

Charles Daniels

Quote from: Paper*Boy on July 06, 2015, 06:23:48 AM
Or bored to death accidentally picking up Coast to Coast Starring George Noory on their radios.

If they picked up George Noory I think they'd quickly conclude there was no intelligent life on the planet and try again in 30,000 years.

ziznak

Quote from: Charles Daniels on July 06, 2015, 06:36:54 AM
If they picked up George Noory I think they'd quickly conclude there was no intelligent life on the planet and try again in 30,000 years.

I can see the disappointed aliens now

Charles Daniels

Quote from: SciFiAuthor on July 05, 2015, 10:57:13 PM
Yup, this is why I don't buy his theories (well, among many reasons why I don't buy his theories). He's badly outdated in his futurism. The advanced alien cultures he envisions are like 1950s sci-fi movies. They are nothing like what modern futurists envision.

While I have a lot of sympathy for what you are saying, ultimately alien technology will develop along different paths, and we assume the fundamental laws of physics are universal.

There are lots of people who think building an Apollo-style programme is a step backwards.
But the laws of physics haven't changed since 1969. If you want to go beyond low Earth orbit, with the technologies available to us out in the open, then a Saturn V or one on steroids is what you need to do the job.

You can speculate that aliens will have force fields, but we don't know if those are practical compared to other, dumber technologies.

For instance, the wheel is a pretty ancient piece of technology, but I use it every day.

K_Dubb

Quote from: SciFiAuthor on July 05, 2015, 10:57:13 PM
He's badly outdated in his futurism.

He's like a relict himself; that's what makes him fascinating.  1950s futurism was aesthetically superior to today's anyway.

Does anyone remember whether he was behind the Hale-Bopp companion theory? Sounds like him.

astroguy

Quote from: K_Dubb on July 06, 2015, 10:26:00 AM
He's like a relict himself; that's what makes him fascinating.  1950s futurism was aesthetically superior to today's anyway.

Does anyone remember whether he was behind the Hale-Bopp companion theory? Sounds like him.


He was not.  That was Courtney Brown.  I did a three-part podcast series on it, though the Courtney Brown stuff was Episode 128.

K_Dubb

Quote from: astroguy on July 06, 2015, 10:46:59 AM

He was not.  That was Courtney Brown.  I did a three-part podcast series on it, though the Courtney Brown stuff was Episode 128.

Thank you, you're a gentleman and a scholar, sir, and I learned a new word -- fugacious.

Lunger

Quote from: Yorkshire pud on July 06, 2015, 06:20:25 AM
They refuse to read the data! Not ONE scientist has refuted RCH's findings. When asked about RCH, Prof. Jayne Foulkes, PhD, of MIT said "Who?" Conclusive proof, as if any were needed, that there's a worldwide conspiracy against him being known by pretty much anyone.

Sums up Hoggie perfectly

SciFiAuthor

Quote from: Georgie For President 2216 on July 06, 2015, 01:52:22 AM
I've certainly been aware of the concept of grey goo and self-replicating nanobots which could break down and assemble any object atom by atom, but the idea of nanobots filling the air we breathe, invisible and unnoticed until they suddenly decide to bind together into an immovable wall or even into a perfectly molded personal cocoon -- that was a new one for me. But then again, what will we be like as individuals by then? A Borg collective? Or maybe we'll adopt robots as children and allow ourselves to die out while the robots carry on our legacy, which is the scenario I've believed is inevitable for many years: that we are just stepping stones to the real purpose of the universe, if it has a purpose, which is to generate inorganic life that can manufacture and willfully perfect itself. Well, that was the dilemma we faced in the movie A.I. Artificial Intelligence, wasn't it? Do we embrace the inevitability of artificial life, treating them as our children and co-existing, allowing them to carry our memory forward, or do we blindly fight them tooth and nail, destroying ourselves in the process?

My view is that the technological singularity is real. As such, I think we will first go through a period of social upheaval as the human race goes unemployed through large-scale automation; we're already seeing the first signs of that. These social problems will be compounded by dramatic extensions of the human lifespan (something Google, of all people, is dumping huge amounts of money into). We will be confronted with increasingly strange problems, such as: is it ethical to have one's limbs amputated when artificial limbs are clearly better than biological ones? At that stage we will be the Borg, but that phase will only last a few years before nanotechnology and superintelligence emerge, at which point we basically become the Q: buildings materialize from nowhere and we live in a cloud of nanotechnological mass consciousness.

In other words, the future of the human race is to give up biology entirely. We may be the last truly biological humans and, if we're young enough, we may see this scenario unfold within our lifetimes.

SciFiAuthor

Quote from: Charles Daniels on July 06, 2015, 10:03:00 AM
While I have a lot of sympathy for what you are saying, ultimately alien technology will develop along different paths, and we assume the fundamental laws of physics are universal.

There are lots of people who think building an Apollo-style programme is a step backwards.
But the laws of physics haven't changed since 1969. If you want to go beyond low Earth orbit, with the technologies available to us out in the open, then a Saturn V or one on steroids is what you need to do the job.

You can speculate that aliens will have force fields, but we don't know if those are practical compared to other, dumber technologies.

For instance, the wheel is a pretty ancient piece of technology, but I use it every day.

I think civilizations either hit the singularity or they destroy themselves beforehand. Before the singularity, civilizations can appear very different from each other, with very different technologies, but once they hit superintelligence they all become very similar, with similar technological capabilities. The reason for this is that while aliens may be different from us, a computer is a computer no matter where you go: it works on universal mathematics and logic, and once you have a computer that's smarter than you are, that computer will arrive at the same conclusions as a similar computer would halfway across the galaxy. Biological species are going to be different from each other, but computers are not. Because of that, I think we all end up the same in the end.

Georgie For President 2216

Quote from: SciFiAuthor on July 06, 2015, 01:27:40 PM
My view is that the technological singularity is real. As such, I think we will first go through a period of social upheaval as the human race goes unemployed through large-scale automation; we're already seeing the first signs of that. These social problems will be compounded by dramatic extensions of the human lifespan (something Google, of all people, is dumping huge amounts of money into). We will be confronted with increasingly strange problems, such as: is it ethical to have one's limbs amputated when artificial limbs are clearly better than biological ones? At that stage we will be the Borg, but that phase will only last a few years before nanotechnology and superintelligence emerge, at which point we basically become the Q: buildings materialize from nowhere and we live in a cloud of nanotechnological mass consciousness.

In other words, the future of the human race is to give up biology entirely. We may be the last truly biological humans and, if we're young enough, we may see this scenario unfold within our lifetimes.

That's enough to make me want to live in the park and forage for food with zeebo.  Any possible futures where we remain fully biologically human and enjoy drinking beer?

Charles Daniels

Quote from: SciFiAuthor on July 06, 2015, 01:34:27 PM
I think civilizations either hit the singularity or they destroy themselves beforehand. Before the singularity, civilizations can appear very different from each other, with very different technologies, but once they hit superintelligence they all become very similar, with similar technological capabilities. The reason for this is that while aliens may be different from us, a computer is a computer no matter where you go: it works on universal mathematics and logic, and once you have a computer that's smarter than you are, that computer will arrive at the same conclusions as a similar computer would halfway across the galaxy. Biological species are going to be different from each other, but computers are not. Because of that, I think we all end up the same in the end.

I'm not so sure. Why wouldn't these computers develop culture?
I don't know how far you could go if you were merely a super-hot calculator.
The computers would essentially have to make value judgements -- they'd have to have some reason to decide that exploring the universe is a good thing, or a desirable thing.
They'd need to have some reason to work together, perhaps beyond just the benefit of having more computational power as a cloud.

The thing is, it seems to me, the smarter you get, the more weird, surreal, eccentric things you are likely to begin to do.

Charles Daniels

Quote from: Georgie For President 2216 on July 07, 2015, 12:08:00 AM
That's enough to make me want to live in the park and forage for food with zeebo.  Any possible futures where we remain fully biologically human and enjoy drinking beer?

Yeah, I'm not too fond of the "god-like technological super being where we are all logical and perfect" idea either. There are a lot of carnal, base, primitive things I enjoy.

SciFiAuthor

Quote from: Georgie For President 2216 on July 07, 2015, 12:08:00 AM
That's enough to make me want to live in the park and forage for food with zeebo.  Any possible futures where we remain fully biologically human and enjoy drinking beer?

Sure, if we decide to simply not develop those technologies. That's very possible: we may handle it very carefully and maintain our humanity without going crazy with the tech. But our political systems are not currently prepared for it, and it's something that we're going to need to stay on top of as it develops. Tech development is exponential, not linear, so development goes faster and faster and becomes increasingly mind-boggling the further along we go. Well, our politicians are reactive in outlook, not proactive, so expect trouble.

As far as beer goes, it won't go away. Getting drunk just becomes something we download.

SciFiAuthor

Quote from: Charles Daniels on July 07, 2015, 12:16:40 AM
I'm not so sure. Why wouldn't these computers develop culture?
I don't know how far you could go if you were merely a super-hot calculator.
The computers would essentially have to make value judgements -- they'd have to have some reason to decide that exploring the universe is a good thing, or a desirable thing.
They'd need to have some reason to work together, perhaps beyond just the benefit of having more computational power as a cloud.

The thing is, it seems to me, the smarter you get, the more weird, surreal, eccentric things you are likely to begin to do.

I don't think you can develop culture if you function on basic logic. There is a right answer and a wrong answer to everything and cultures are simply collections of human ideas that persist without logic, such as religion or social memes like marriage. Computers won't get that. They will simply ask us why we engage in such things and we, alien or human, will have no idea how to answer.

SciFiAuthor

Quote from: Charles Daniels on July 07, 2015, 12:19:07 AM
Yeah, I'm not too fond of the "god-like technological super being where we are all logical and perfect" idea either. There are a lot of carnal, base, primitive things I enjoy.

I must add the caveat that I love smoking pot, getting drunk and having sex. Don't shoot the messenger.

Charles Daniels

Quote from: SciFiAuthor on July 07, 2015, 12:30:22 AM
Sure, if we decide to simply not develop those technologies. That's very possible: we may handle it very carefully and maintain our humanity without going crazy with the tech. But our political systems are not currently prepared for it, and it's something that we're going to need to stay on top of as it develops. Tech development is exponential, not linear, so development goes faster and faster and becomes increasingly mind-boggling the further along we go. Well, our politicians are reactive in outlook, not proactive, so expect trouble.

As far as beer goes, it won't go away. Getting drunk just becomes something we download.

If you can get drunk, or high, or achieve states of mental ecstasy via an easy download -- isn't it pretty damned likely that someone, somewhere is going to try to build up a great big firewall to block those downloads? Or to track down who downloaded this or that state of existence and do something pretty nasty to their 1s and 0s?

Charles Daniels

Quote from: SciFiAuthor on July 07, 2015, 12:32:40 AM
I don't think you can develop culture if you function on basic logic. There is a right answer and a wrong answer to everything and cultures are simply collections of human ideas that persist without logic, such as religion or social memes like marriage. Computers won't get that. They will simply ask us why we engage in such things and we, alien or human, will have no idea how to answer.

I don't believe there is a right answer and a wrong answer to everything.
The universe doesn't function like this. This is why we have fuzzy logic. Chaos theory.

I think you are right that any clued-up computer intelligence will be completely baffled by human society. But I also think that as it encounters more of the universe, it will encounter systems which are beyond its ability to predict -- humans being the first example.
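
A quick way to make the chaos-theory point concrete: the logistic map below is a completely deterministic one-line rule, yet nudging the starting value by one part in a billion scrambles the trajectory within a few dozen steps, so "simple underlying laws" and "practically unpredictable" are not in conflict. This is only an illustrative sketch; the function name, starting values and step counts are arbitrary choices.

Code:
# The logistic map x_{n+1} = r * x_n * (1 - x_n) is a fully deterministic,
# one-line rule, yet for r = 4 two starting points that differ by one part
# in a billion diverge after a few dozen iterations, so long-range
# prediction fails in practice even though the underlying law is simple.

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)   # baseline start
b = logistic_trajectory(0.200000001)   # perturbed by one part in a billion

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (gap {abs(a[n] - b[n]):.2e})")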

Georgie For President 2216

Quote from: Charles Daniels on July 07, 2015, 12:36:15 AM
I don't believe there is a right answer and a wrong answer to everything.
The universe doesn't function like this. This is why we have fuzzy logic. Chaos theory.

I think you are right that any clued-up computer intelligence will be completely baffled by human society. But I also think that as it encounters more of the universe, it will encounter systems which are beyond its ability to predict -- humans being the first example.

Not to diminish your point, but I think human societal behaviour becomes very predictable once you are able to track enough variables.  Even individual humans if you know enough about their background.  Actually I'm hoping to do this by developing a neural net to scan newspapers and try to establish trends it can link to the stock market.  I've been planning to do that for 20 years though.
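
As a rough sketch of the kind of thing described here, one could wire up a single logistic "neuron" over bag-of-words features from a day's headlines and train it to guess whether the market rises the next day. Everything below is an invented placeholder (the headlines, the labels, the learning rate); a real attempt would need years of archived newspapers, better features and out-of-sample testing. It is only meant to make the shape of the approach concrete.

Code:
# Toy version of "a neural net scans newspapers and links trends to the
# stock market": one logistic neuron over bag-of-words headline features,
# trained to predict whether the index closed up the following day.
# The headlines and labels below are invented placeholders.
import math
from collections import defaultdict

# (headlines for the day, 1 if the market rose the next day else 0)
days = [
    ("fed signals rate cut as factories rebound", 1),
    ("banking crisis fears spread, layoffs accelerate", 0),
    ("record profits lift tech shares, hiring surges", 1),
    ("war worries and oil shock rattle investors", 0),
    ("strong earnings and upbeat consumer spending", 1),
    ("recession warning as exports slump sharply", 0),
]

w = defaultdict(float)   # one weight per word
bias = 0.0

def predict(text):
    """Probability the market rises tomorrow, given today's headlines."""
    z = bias + sum(w[word] for word in text.split())
    return 1.0 / (1.0 + math.exp(-z))

# Plain gradient descent on log-loss.
lr = 0.5
for _ in range(200):
    for text, label in days:
        err = predict(text) - label          # derivative of log-loss wrt z
        bias -= lr * err
        for word in text.split():
            w[word] -= lr * err

print(predict("rate cut and strong earnings expected"))   # leans bullish
print(predict("layoffs and recession fears deepen"))      # leans bearish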

zeebo

Quote from: Georgie For President 2216 on July 07, 2015, 12:08:00 AM
That's enough to make me want to live in the park and forage for food with zeebo.  Any possible futures where we remain fully biologically human and enjoy drinking beer?

Don't sweat it, GFP. We won't go full android until we're forced to, way down the road, by some sort of environmental collapse. Before that, the farthest we'll go is cyborg, i.e. part bio, part robot. What's the point of living a thousand years if you can't still enjoy the visceral pleasures of life, e.g. for you maybe a cold beer, for me sticking my whole head inside a peanut butter jar?

Georgie For President 2216

Quote from: zeebo on July 07, 2015, 12:43:37 AM
Don't sweat it, GFP. We won't go full android until we're forced to, way down the road, by some sort of environmental collapse. Before that, the farthest we'll go is cyborg, i.e. part bio, part robot. What's the point of living a thousand years if you can't still enjoy the visceral pleasures of life, e.g. for you maybe a cold beer, for me sticking my whole head inside a peanut butter jar?

Haha.  Actually for me it's chocolate but I thought beer might be more relatable.  Also, I apologize for any liberties I took in presuming you spend your time searching out acorns in the park.

SciFiAuthor

Quote from: Charles Daniels on July 07, 2015, 12:33:44 AM
If you can get drunk, or high, or achieve states of mental ecstasy via an easy download -- isn't it pretty damned likely that someone, somewhere is going to try to build up a great big firewall to block those downloads? Or to track down who downloaded this or that state of existence and do something pretty nasty to their 1s and 0s?

It's probably more likely that they would create an "I have the best buzz program" model where they try to control people with virtual drugs.

SciFiAuthor

Quote from: Charles Daniels on July 07, 2015, 12:36:15 AM
I don't believe there is a right answer and a wrong answer to everything.
The universe doesn't function like this. This is why we have fuzzy logic. Chaos theory.

I think you are right that any clued-up computer intelligence will be completely baffled by human society. But I also think that as it encounters more of the universe, it will encounter systems which are beyond its ability to predict -- humans being the first example.

It's only apparent chaos; underlying the universe are very basic, very logical laws, and that's that. Everything complies with those laws. Computers are going to realize that.

Actually, I think they can predict us. We are really just an illogical amalgam of emotional reactions that can probably be predicted at the level of the neuron. On that level we're just chemicals and electrical impulses, and a superintelligent computer could probably predict us individually. All that's needed is nanotech to probe how our brains think.

SciFiAuthor

Quote from: Georgie For President 2216 on July 07, 2015, 12:42:40 AM
Not to diminish your point, but I think human societal behaviour becomes very predictable once you are able to track enough variables.  Even individual humans if you know enough about their background.  Actually I'm hoping to do this by developing a neural net to scan newspapers and try to establish trends it can link to the stock market.  I've been planning to do that for 20 years though.

I'm very interested in your ideas on that neural net. Do elaborate. I'm fascinated.

Charles Daniels

Quote from: Georgie For President 2216 on July 07, 2015, 12:42:40 AM
Not to diminish your point, but I think human societal behaviour becomes very predictable once you are able to track enough variables.  Even individual humans if you know enough about their background.  Actually I'm hoping to do this by developing a neural net to scan newspapers and try to establish trends it can link to the stock market.  I've been planning to do that for 20 years though.

Oh yeah, I agree. And humans do this very well. We have an instinctual understanding of what people are likely or not likely to do in a given situation.

So for instance, we know that if our friend Jim is slightly more quiet than usual, he is probably depressed about something.

We know that if we blatantly make sexual advances toward Stephanie, we are unlikely to succeed in attaining any sort of sexual contact.

We know that our boss has been exhibiting signs of greater stress than usual this week, so they are likely to leave a bit early on Friday.

Just some silly examples. But socialisation is a key factor in making predictions.
There was some study which compared chimpanzees and dogs in their ability to guess and react to human body language.
Chimpanzees are evolutionarily much closer to humans, but dogs beat them every time.
The most likely explanation is that dogs have lived alongside humans for tens of thousands of years, and therefore their ability to read human behaviour is a lot more closely linked to their daily survival needs.

Humans are predictable to a large degree. But they aren't perfectly predictable, and that probably makes the deviations that much more dramatic and shocking.

Our universe seems to work on a basis of probability. If you see the universe as black and white -- it's likely to bite you on the ass. Eventually.

Quote from: SciFiAuthor on July 07, 2015, 12:53:05 AM
It's probably more likely that they would create an "I have the best buzz program" model where they try to control people with virtual drugs.

Ah!  So you have a sort of "Brave New World" model where you try to control people with pleasure?

That's been an interesting idea which science fiction has explored, but I don't see any evidence of humans actually trying to deploy that on a mass scale.

My guess is that it's easier to keep people scared than to keep them happy.
