RE: The A.I. Project


> I would replace survival with "good-bad".

Let's say "positive-negative," just to avoid potential semantic 
confusion.

> If finding food and avoiding poison is the most
> important thing in their life, then they won't get bored.

They're not motivated to do either UNLESS you're just going to have this 
be an instinct. Finding food will be important IF there's a purpose to 
it. 
Avoiding poison? Why put poison in the environment at all?

This also goes back to the question: will we EMULATE an AI entity or 
will we create an AI entity?

> I'm not sure if feelings are really needed in AI
> and if an AI creature will have them.

A survival instinct will have some motivation component; otherwise, the 
creature would just exist to die. The creature can't be "apathetic," so 
to speak, or it won't be motivated enough to live. This raises the 
question: does one build "fear" into the survival instinct? (Fear of 
death?) That fear could keep it alive, possibly, until such time as it 
learns about and understands what death is.
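
To make that concrete, here's a minimal sketch in Python (every name in 
it, like health, fear_level, and expected_food, is my own invention, 
not any agreed design) of fear as a built-in motivation signal: the 
lower the creature's health, the stronger its urge to chase food, so it 
can never sit apathetic while it starves.

    import random
    from collections import namedtuple

    Action = namedtuple("Action", ["name", "expected_food"])

    class Creature:
        def __init__(self):
            self.health = 100.0   # the creature dies at 0

        def fear_level(self):
            # Grows as health falls: 0.0 at full health, close to
            # 1.0 near death. This is the motivation component; the
            # creature cannot stay apathetic while it starves.
            return 1.0 - self.health / 100.0

        def choose_action(self, actions):
            # Low fear: wander at random (room to get "bored").
            # High fear: take whatever action promises the most food.
            if random.random() < self.fear_level():
                return max(actions, key=lambda a: a.expected_food)
            return random.choice(actions)

    # A starving creature almost always heads straight for food:
    c = Creature()
    c.health = 5.0
    print(c.choose_action([Action("wander", 0), Action("eat", 5)]))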

> I can't just make an AI creature and world and
> enemies and leave it running all night and then
> next morning I will have real AI :)

Of course not overnight, but in the long run this is exactly what you 
should expect. I'm not saying that it necessarily has to be hands-off, 
because you'll want to interact with it, maybe to let it know what's 
positive behavior and what's negative behavior (think about raising a 
child... this is how you will develop an AI entity).
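
As a sketch of what that "raising" could look like in code (again 
Python, and again every name here is hypothetical, not a design we've 
agreed on): the human trainer scores each behavior as positive or 
negative, and the creature drifts toward whatever earned praise.

    import random

    # Action preferences, nudged by human feedback over time.
    preferences = {"share_food": 0.0, "hoard_food": 0.0, "explore": 0.0}
    LEARNING_RATE = 0.1

    def act():
        # Mostly pick the best-liked action, but explore sometimes.
        if random.random() < 0.2:
            return random.choice(list(preferences))
        return max(preferences, key=preferences.get)

    def teach(action, feedback):
        # feedback is +1 (positive behavior) or -1 (negative
        # behavior), supplied by the human "parent".
        preferences[action] += LEARNING_RATE * feedback

    # One round of child-rearing:
    chosen = act()
    teach(chosen, +1 if chosen == "share_food" else -1)

Run rounds like that night after night and the preferences table 
becomes the "upbringing"; nothing magic happens overnight, which is the 
point.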

> > Now, since your "enemies" are really just poison pills,
> > the "survival instinct" of your Pacman will simply be
> > a "maintain a high number" instinct.
> 
> What's wrong with that?

No problem; "maintain a high number" is just a less fatal version of 
"avoid death." I guess I was thinking that "fear of death" would be a 
great motivator.
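
One way to see the difference is the shape of the penalty (a toy 
comparison I'm making up, with illustrative numbers only): "maintain a 
high number" values every point of health equally, while "fear of 
death" makes the last few points matter enormously.

    import math

    def maintain_high_number(health):
        # Linear: losing 10 points always hurts the same amount.
        return health

    def fear_of_death(health):
        # Logarithmic: the same 10-point loss is a catastrophe near
        # zero but barely noticed near full health.
        return math.log(max(health, 1e-9))

    for h in (100, 50, 10, 1):
        print(h, maintain_high_number(h), round(fear_of_death(h), 2))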

> I want to achieve intelligence, I'm not
> interested in what the AI creature will feel...

But what if those feelings are necessary for intelligence? I hate to 
keep repeating myself, but "fear of death" might be REQUIRED just to get 
the creature functioning to survive. The trouble is, you can't really 
start an AI entity (can we get an officially approved name for our 
entity?!) with a "fear of death," because I don't think that's an 
instinct; it would have to be learned.

> > All I see for your Pacman is it remembering the pattern of
> > the maze and maybe locations of bite-sized healthy bits.
> 
> Remembering the pattern is quite important.

But it's just a robot with a database at that point. Not really what we 
ultimately want.
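
That's roughly all it takes, which is my point. As a sketch (names 
invented for illustration), remembering the maze is just a lookup table 
from positions to what was seen there:

    # Remembering the maze: a dictionary from positions to contents.
    memory = {}

    def remember(position, contents):
        memory[position] = contents       # e.g. ((3, 4), "food")

    def recall(position):
        return memory.get(position, "unknown")

    remember((3, 4), "food")
    remember((5, 1), "poison")
    print(recall((3, 4)))   # prints "food": recall, not comprehension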

As I think about it, human intelligence requires the hardware of a human 
brain... something we don't have and won't have for a very long time. 
The problem is, as we go down the scale of brains (ultimately ending 
with a worm?), we have to ask where intelligence ends and instinct 
begins. I mean, how low can we go before we stop seeing intelligence 
and start seeing only instinct?

Would anyone say a worm is intelligent? If not, then we're just creating 
a lifeform (as opposed to an intelligent lifeform). If worms are 
intelligent, then how does that intelligence function on a hardware 
level? Can we answer these questions?!
