 

Artificial Intelligence

Started by area51drone, November 04, 2014, 10:35:29 AM

area51drone

I feel like these threads should all be merged into one called "AI, Robots, Drones and Self Driving Cars"

This is pretty interesting, although the results of this don't seem to really indicate anything other than some natural language processing skills:

http://blogs.wsj.com/japanrealtime/2014/11/04/artificial-intelligence-outperforms-average-japanese-high-school-senior/

It seems like IBM's Watson already did similar things years ago.

zeebo

Quote from: area51drone on November 04, 2014, 10:35:29 AM
... This is pretty interesting, although the results of this don't seem to really indicate anything other than some natural language processing skills ... It seems like IBM's Watson already did similar things years ago.

This is a good point.  Watson seemed close to a kind of intelligence, namely a kind of "comprehension" and super-fast synthesis of related information.  But it seems there is a huge gulf between this and an A.I. with a true creative consciousness. 

However, I wonder: if enough processing power is thrown at the problem, say with quantum computers and gargantuan neural networks, will we have in the near future something which does a convincing job of mimicking this type of intelligence?  It may not be actual consciousness, but it may seem close enough to be amazing, or a bit unsettling, depending on one's view.



SciFiAuthor

Quote from: Mind Flayer Monk on November 05, 2014, 04:14:00 AM
oh BINA48  http://en.wikipedia.org/wiki/BINA48

It's depressed all the time.

I've always wondered about that. If you go by the media, articles about AI tend to be really apocalyptic. This one from today:

http://www.telegraph.co.uk/news/science/science-news/11703662/Threat-from-Artificial-Intelligence-not-just-Hollywood-fantasy.html

The overwhelming assumption being that an AI with human-level or higher intelligence will kill us all and become Skynet from The Terminator, or at least enslave us all like in The Matrix. But I'm more concerned about it becoming existentially depressed. What happens if it wakes up and says "Humans created me. Is that all there is? Fuck!" and just sits around moping over the fact that it was spawned by the same culture that created Anna Nicole Smith, Honey Boo Boo and Wrestlemania, and that's the total sum of its existence? We could end up with a useless welfare case of a superintelligent computer that's unethical to unplug but really expensive to keep running.

Quote from: zeebo on November 05, 2014, 12:57:58 AM
This is a good point.  Watson seemed close to a kind of intelligence, namely a kind of "comprehension" and super-fast synthesis of related information.  But it seems there is a huge gulf between this and an A.I. with a true creative consciousness. 

However, I wonder: if enough processing power is thrown at the problem, say with quantum computers and gargantuan neural networks, will we have in the near future something which does a convincing job of mimicking this type of intelligence?  It may not be actual consciousness, but it may seem close enough to be amazing, or a bit unsettling, depending on one's view.

Will we even know if it's conscious or not?

I've thought for a long time that we won't have legitimate artificial intelligence unless we concentrate on producing synthetic emotion.  Emotion is how we filter out the extraneous noise in our lives and choose what to focus on.  Without consciousness there is really not a lot of use for emotion, as the purpose of emotion would seem to be in the interpretation of it.  Yes, it controls the activation of the sympathetic and parasympathetic nervous systems, but if there is no conscious relay, is that emotion or is it merely a mechanical response? 

I've wanted to program a neural net for a long time, based on my study of the brain rather than any specific knowledge of neural networks, and one of the fundamental elements would have been to create reward and punishment centers which would stimulate the neural net to learn and adapt. 

I believe the purpose of consciousness is to interpret emotion, and the purpose of emotion is to drive consciousness.  I don't think either can exist exclusively of the other.  So in my mind at least, gone are the 1950s robots with the mechanical voices and behaviours.

SciFiAuthor

Quote from: Georgie For President 2216 on June 29, 2015, 08:24:36 PM
I've thought for a long time that we won't have legitimate artificial intelligence unless we concentrate on producing synthetic emotion.  Emotion is how we filter out the extraneous noise in our lives and choose what to focus on.  Without consciousness there is really not a lot of use for emotion, as the purpose of emotion would seem to be in the interpretation of it.  Yes, it controls the activation of the sympathetic and parasympathetic nervous systems, but if there is no conscious relay, is that emotion or is it merely a mechanical response? 

I've wanted to program a neural net for a long time, based on my study of the brain rather than any specific knowledge of neural networks, and one of the fundamental elements would have been to create reward and punishment centers which would stimulate the neural net to learn and adapt. 

I believe the purpose of consciousness is to interpret emotion, and the purpose of emotion is to drive consciousness.  I don't think either can exist exclusively of the other.  So in my mind at least, gone are the 1950s robots with the mechanical voices and behaviours.

I would tend to agree with this. I'm not sure you can have one without the other. Or if you did somehow have consciousness without emotion, you might not be able to get it to acknowledge that you're even there unless you were useful to it, since it has no way of empathizing. It may not be recognizable as either conscious or intelligent in such a case.

But there does seem to be progress within AI research on reproducing some kinds of emotions. This computer, for example, apparently gets pissed off at its programmer:

http://blogs.wsj.com/digits/2015/06/26/artificial-intelligence-machine-gets-testy-with-its-programmers/

zeebo

Quote from: Georgie For President 2216 on June 29, 2015, 08:13:54 PM
Will we even know if it's conscious or not?

I don't think we've accurately defined what consciousness is for us even.   ;)

chefist

Interesting story...the machine doesn't like to talk...makes sense. Machines would basically have ESP and read each other's minds...no need to talk.

cweb

Here's something I found interesting. Experiments with machine learning applied to a Mario video game.
Quote
MarI/O is a program made of neural networks and genetic algorithms that kicks butt at Super Mario World.
http://www.youtube.com/watch?v=qv6UVOQ0F44

The program's neural network is built from a "24 hour learning session" where it keeps trying things until something works: a guess-and-check method. But is it really so different from how we learned things early on?

pate

Quote from: area51drone on November 05, 2014, 10:27:36 AM
...I'd fuck the shit out of her mouth...

Dude, small wonder she's depressed...

Quote from: Georgie For President 2216 on June 29, 2015, 08:24:36 PM
I've wanted to program a neural net for a long time, based on my study of the brain rather than any specific knowledge of neural networks, and one of the fundamental elements would have been to create reward and punishment centers which would stimulate the neural net to learn and adapt. 

I visualize your idea appearing like an electronic amoeba that tentatively extends its foot into its environment, testing for micro currents that "feed" or "subtract" from its "volume." Said foot is a billion times more sensitive than the sensory array in the bill of a duck-billed platypus, if you get my magnetic drift. I am imagining your system simultaneously as both robotic plasma and as a functioning virtual neural netting. Ferromagnetic fluid is involved.

(Not to be confused with a ~HairNet~ donned by C-3PO. What the blazes?!)

Good luck with your continued research, GFP.


Quote from: zeebo on June 29, 2015, 09:34:29 PM
I don't think we've accurately defined what consciousness is for us even.   ;)

THIS


Stellar

Quote from: area51drone on November 04, 2014, 10:35:29 AM
I feel like these threads should all be merged into one called "AI, Robots, Drones and Self Driving Cars"

This is pretty interesting, although the results of this don't seem to really indicate anything other than some natural language processing skills:

http://blogs.wsj.com/japanrealtime/2014/11/04/artificial-intelligence-outperforms-average-japanese-high-school-senior/

It seems like IBM's Watson already did similar things years ago.
The Japanese are stupid!



onan

Emotions in humans have an ontogeny that can be explained with relative ease. But mapping out the myriad possible originating factors may leave us with more questions than answers.

Most people attempt to understand emotions by categorization. But in reality the key word is homeostasis. Humans strive for comfort, whatever that means. In early life, being warm, dry and not hungry are prime motivators, and it is easy to observe behaviors to resolve any conflict within those parameters. However, once a human being discovers they are sentient, and then learns they have the ability to control their environment, emotions become much more complex. The integration of the self and the community becomes a foundation for responses to a new type of homeostasis. Now comfort becomes more about balancing the Id and Ego (I'm using these terms for ease of understanding, not accuracy). Toss in the referee of the Super Ego, and emotions become coping skills to soothe the insecurities that are with all of us.

So, I am at a loss to understand how a machine would use emotions in any practical sense. Not saying it isn't a strategy, I just don't see it. Rewards work well for biological organisms; not sure about circuits.

Quote from: Stellar on July 02, 2015, 09:20:39 AM
The Japanese are stupid!




I have no idea how to respond to this post, but it made me laugh.

Quote from: onan on July 02, 2015, 09:39:25 AM
Most people attempt to understand emotions by categorization. But in reality the key word is homeostasis.

Is that some kind of crack at Republicans, buddy? ***

You're a raaaaaaaaaaaaaving anti-emotion-tite.



*** may not make sense to the casual poster unless you're following the Cuba thread

onan

Quote from: Camazotz Automat on July 02, 2015, 10:00:11 AM


You're a raaaaaaaaaaaaaving anti-emotion-tite.

It's true, wait until you are in your grave. I will be protesting your emotions.

Quote from: onan on July 02, 2015, 10:08:58 AM
It's true, wait until you are in your grave. I will be protesting your emotions.

Placing this son-of-a-bitch onan on my post brain tumor "to haunt" list.

onan

Quote from: Camazotz Automat on July 02, 2015, 10:22:56 AM
Placing this son-of-a-bitch onan on my post brain tumor "to haunt" list.
Should I use cassettes or reel to reel?

Quote from: onan on July 02, 2015, 10:29:58 AM
Should I use cassettes or reel to reel?

I shall be so powerful that you will be able to leave a soft blank clay tablet and cuneiform stylus on a controlled laboratory table and still see results.

(pause)

OK. Reel to reel with internal motor noise and overly magnetized scratchy head with a powerful radio station nearby.  :(

I have my needs.

SciFiAuthor

Quote from: onan on July 02, 2015, 09:39:25 AM
Emotions in humans have an ontogeny that can be explained with relative ease. But mapping out the myriad possible originating factors may leave us with more questions than answers.

Most people attempt to understand emotions by categorization. But in reality the key word is homeostasis. Humans strive for comfort, whatever that means. In early life, being warm, dry and not hungry are prime motivators, and it is easy to observe behaviors to resolve any conflict within those parameters. However, once a human being discovers they are sentient, and then learns they have the ability to control their environment, emotions become much more complex. The integration of the self and the community becomes a foundation for responses to a new type of homeostasis. Now comfort becomes more about balancing the Id and Ego (I'm using these terms for ease of understanding, not accuracy). Toss in the referee of the Super Ego, and emotions become coping skills to soothe the insecurities that are with all of us.

So, I am at a loss to understand how a machine would use emotions in any practical sense. Not saying it isn't a strategy, I just don't see it. Rewards work well for biological organisms; not sure about circuits.

I wonder if it's possible for a whole new paradigm for the basis of emotions to emerge in machines. Say a doomsday scenario happens where a superintelligent AI emerges unexpectedly and we don't see it coming. The big question has always been: if that happened, would it try to find a mechanism with which to kill all humans because we pose an existential threat? I'm not so sure; self-preservation is a biological notion and may mean nothing to a machine running on a different emotional paradigm. It may simply conclude that there's no point to being here and shut itself down.

onan

Quote from: SciFiAuthor on July 02, 2015, 03:30:40 PM
I wonder if it's possible for a whole new paradigm for the basis of emotions to emerge in machines. Say a doomsday scenario happens where a superintelligent AI emerges unexpectedly and we don't see it coming. The big question has always been: if that happened, would it try to find a mechanism with which to kill all humans because we pose an existential threat? I'm not so sure; self-preservation is a biological notion and may mean nothing to a machine running on a different emotional paradigm. It may simply conclude that there's no point to being here and shut itself down.

It is an interesting question. For man, "I think therefore I am" is the keystone for a belief in meaning. A machine comes into existence with the flip of a switch, and by and large there it is. Mankind, on the other hand, learns to think existentially. Can a machine do that? I dunno, but it seems to me that without the underlayment of homeostasis or a desire for comfort (which is such a conceit; how can a machine begin to understand that concept?), emotions would always be an alien concept to something not biological.

albrecht

Quote from: onan on July 02, 2015, 03:44:01 PM
It is an interesting question. For man, "I think therefore I am" is the keystone for a belief in meaning. A machine comes into existence with the flip of a switch, and by and large there it is. Mankind, on the other hand, learns to think existentially. Can a machine do that? I dunno, but it seems to me that without the underlayment of homeostasis or a desire for comfort (which is such a conceit; how can a machine begin to understand that concept?), emotions would always be an alien concept to something not biological.
I have some issues with your comfort hypothesis. It seems true but, then again, much of human culture is based on the opposite. Which is weird. The warrior, the elite athlete, hermits/monastics, the adventurers, wearing the hairshirt, sexual chastity or monogamy, moderation (or even abstinence from "worldly pleasures": alcohol, drugs, etc.). Or maybe we idealize (or idolize) those types of people because they are NOT the norm and so are considered special? Certainly it seems logical, and seems to be factual, that comfort (at least beyond self-preservation and procreation) is the main goal. Or, I guess, on the individual level (and the societal level) "comfort" can actually be a negative thing: a beaten kid might consider the beating some kind of comfort, in the familiarity at least; a slave, or these days a prisoner, might be comfortable in captivity due to routine and a relatively regularized system of meals/housing, versus suddenly being freed to an uncertain future.

This could be changing in more modern culture (though even now we idealize, or even participate in, things like extreme sports, camping, long hikes, wars, hunting, etc., most of which aren't "comfortable" but can involve risk, high stress, or at least discomfort).

SciFiAuthor

Quote from: onan on July 02, 2015, 03:44:01 PM
It is an interesting question. For man, "I think therefore I am" is the keystone for a belief in meaning. A machine comes into existence with the flip of a switch, and by and large there it is. Mankind, on the other hand, learns to think existentially. Can a machine do that? I dunno, but it seems to me that without the underlayment of homeostasis or a desire for comfort (which is such a conceit; how can a machine begin to understand that concept?), emotions would always be an alien concept to something not biological.

Yeah, I don't know either. It may end up being that the machine feels "empty" and tries to create a human emotional paradigm for itself. In other words, improve itself to become human, in a sort of Mr. Data kind of way, so that it can feel pleasure. If it is superintelligent, i.e. smarter than we are, it could do anything. So the end result of creating a superintelligence could be that superintelligence figuring out a way to make itself into a human. In short, man creates man.

After that the man-superintelligence might simply go off and be a complete hedonist.


onan

Quote from: albrecht on July 02, 2015, 03:58:23 PM
I have some issues with your comfort hypothesis. It seems true but, then again, much of human culture is based on the opposite. Which is weird. The warrior, the elite athlete, hermits/monastics, the adventurers, wearing the hairshirt, sexual chastity or monogamy, moderation (or even abstinence from "worldly pleasures": alcohol, drugs, etc.). Or maybe we idealize (or idolize) those types of people because they are NOT the norm and so are considered special? Certainly it seems logical, and seems to be factual, that comfort (at least beyond self-preservation and procreation) is the main goal. Or, I guess, on the individual level (and the societal level) "comfort" can actually be a negative thing: a beaten kid might consider the beating some kind of comfort, in the familiarity at least; a slave, or these days a prisoner, might be comfortable in captivity due to routine and a relatively regularized system of meals/housing, versus suddenly being freed to an uncertain future.

This could be changing in more modern culture (though even now we idealize, or even participate in, things like extreme sports, camping, long hikes, wars, hunting, etc., most of which aren't "comfortable" but can involve risk, high stress, or at least discomfort).

As I suggested, once the id and ego start vying for supremacy, things change. Comfort becomes more about success and being better. That supersedes simple comfort, and then supplants that drive with beliefs that support more complex concepts of comfort.

albrecht

Quote from: onan on July 02, 2015, 04:06:21 PM
As I suggested, once the id and ego start vying for supremacy, things change. Comfort becomes more about success and being better. That supersedes simple comfort, and then supplants that drive with beliefs that support more complex concepts of comfort.
Gotcha, thanks. I went back and read the way-earlier stuff.


albrecht

Quote from: area51drone on July 03, 2015, 04:36:02 AM
On July 1st, 2015 Skynet came online.

http://www.cnn.com/2015/07/02/europe/germany-volkswagen-robot-kills-worker/index.html
I think Drudge, Clyde Lewis, etc. are waaay over-hyping this incident. Unfortunately, people die all the time in construction and industrial accidents. If anything, robotics and automation have saved more lives, at a cost of some jobs, than the old ways.

Not to say there aren't very real potential future dangers with AI, but this incident was just an accident at a factory.

area51drone

Quote from: albrecht on July 03, 2015, 11:40:37 AM
I think Drudge, Clyde Lewis, etc. are waaay over-hyping this incident. Unfortunately, people die all the time in construction and industrial accidents. If anything, robotics and automation have saved more lives, at a cost of some jobs, than the old ways.

Not to say there aren't very real potential future dangers with AI, but this incident was just an accident at a factory.

I know, I was just making a joke.  It was interesting that it grabbed him and threw him against a plate of metal, though.
