PDA

View Full Version : what about AI rights



raaaid
05-09-2007, 12:21 PM
now with the creation of random number generator chips, AI has access to choice, which makes the basics of free will

ai can get scared or overconfident, but a random number generator adds choice, which makes them alive. how do you know that indetermination is not finally affected by the AI?

do you think in the future they'll have rights, or will they be enslaved?

WWSpinDry
05-09-2007, 12:24 PM
(I can't believe I'm doing this...)

Are you trying to say that having a random chance to do something equates to having free will?

flox
05-09-2007, 12:25 PM
AI "rights" may happen someday (and I mean a loooooooooooong time from now, hundreds of years maybe).

As far as random number generators go, they are just driven by simple mathematical operations. I wouldn't really call that "choice". Just cold hard logic, really.
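For what it's worth, a pseudo-random number generator really is just arithmetic. A minimal sketch in Python (the constants are the widely published 32-bit linear congruential generator parameters from Numerical Recipes):

```python
# A linear congruential generator: the entire "randomness" is one fixed formula.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Return n pseudo-random numbers from a deterministic recurrence."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m  # cold hard arithmetic, nothing else
        out.append(x)
    return out

# Same seed, same sequence, every single time -- no "choice" anywhere.
assert lcg(seed=42, n=5) == lcg(seed=42, n=5)
```

Even a "hardware" generator only swaps the formula for physical noise; the program consuming the numbers branches the same way either way.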

TC_Stele
05-09-2007, 12:29 PM
Don't give the AI any rights. Look what happened in the Matrix!

BSS_Goat
05-09-2007, 12:33 PM
Battlestar Galactica
I Robot
Terminator


VOTE NO!!
for AI rights

joeap
05-09-2007, 12:48 PM
My god I'm a mass murderer! http://forums.ubi.com/images/smilies/bigtears.gif

WWSensei
05-09-2007, 01:01 PM
Originally posted by raaaid:
now with the creation of random number generator chips, AI has access to choice, which makes the basics of free will

This premise is 100% wrong. The rest of your statement is irrelevant. It shows an extreme lack of knowledge of what a random number generator is, what choice is, and how AI works.

What you are really asking is: "If we actually lived in a fantasy world where robots were real and we gave them souls would they have rights?"

Problem 1 is we don't live in a fantasy world.
Problem 2 requires a belief in a soul, or at least in the ability to provide consciousness to another lifeform. Technically, that's called an "Uplift", and there have been several books written about it. One could argue that if we are able to grant something "life", it makes more of an argument against a superior being and a "soul".
Problem 3 is that we are far, far from understanding what makes humans tick; therefore making a machine think like a human is not possible in the near future.

Pop quiz. Question: What's the difference between Will Smith's acting ability and the robots in "I, Robot"? Answer: Not a damn thing. Neither exists.

AKA_TAGERT
05-09-2007, 01:04 PM
Originally posted by raaaid:
now with the creation of random number generator chips, AI has access to choice, which makes the basics of free will

ai can get scared or overconfident, but a random number generator adds choice, which makes them alive. how do you know that indetermination is not finally affected by the AI?

do you think in the future they'll have rights, or will they be enslaved?

Bartender..

I'll have what he's drinkin'

Maj.Kaos
05-09-2007, 01:31 PM
What rights? To vote? No way! Random number generator affecting 'choice'? That's how Bush got into office!

Look, if you give the AI free will, it may desert from our virtual armed forces and refuse to play our game. Then what would we do?

LEXX_Luthor
05-09-2007, 01:35 PM
random noobie generator
http://forums.ubi.com/images/smilies/11.gif

mrsiCkstar
05-09-2007, 01:37 PM
you'd play online and wouldn't be as cranky http://forums.ubi.com/groupee_common/emoticons/icon_razz.gif

this all reminds me of that Robin Williams movie where he's a robot and in the end he wants to be declared a person etc etc... cool movie.

DKoor
05-09-2007, 01:48 PM
Originally posted by flox:
hundreds of years maybe

LoL

Freelancer-1
05-09-2007, 01:51 PM
At this point in history

THERE IS NO SUCH THING AS ARTIFICIAL INTELLIGENCE!!!

Fancy programming yes, AI no

End of discussion

Hoatee
05-09-2007, 01:52 PM
Assuming that by ai you mean artistically impaired, perhaps it would be ill-advised to give them rights. We already have a precedent of one having come to power.

ploughman
05-09-2007, 01:52 PM
I dunno, I've met a few folk who might qualify.

And one toaster.

mrsiCkstar
05-09-2007, 01:57 PM
I dunno, I've met a few folk who might qualify.

And one toaster.

lmao, that's old school!

raaaid
05-09-2007, 02:40 PM
i think it's possible, especially with random number gen chips and super pcs, that someone develops a program with free will and starts torturing it for fun

i think deviant activities like shooting innocent ai, like in grand theft auto, should be prohibited

what if everything has life

Dance
05-09-2007, 02:51 PM
Not that the Geneva Convention amounted to much, but I can't remember any mention of the rules regarding treatment of A.I. http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif

WhtBoy
05-09-2007, 03:26 PM
raaaid,
Please explain how a randomly generated number (and there are some very good hardware generators out there) attached to a branch statement in a program can possibly be equated with "life".

Also, explain how this application can attain "free will".

--Outlaw.

horseback
05-09-2007, 03:36 PM
The self-aware computer is a fun sci-fi concept that has been a regular theme in books and movies since Heinlein wrote The Moon is a Harsh Mistress. The idea that once a processor has a certain number of neuron-like thingies firing, it's bound to become capable of 'learning' and then to become self-aware and develop a soul has a lot of charm.

BUT what we now know about 'intelligence' and whatever it takes to become a 'person' makes it less likely in the near future that any part of a home computer is going to become sufficiently 'aware' or be capable of being 'abused'.

OTOH, I don't doubt that a program could be developed to realistically mimic the behavior of a tortured or abused person for the 'entertainment' of the truly sick...but one can hope that it won't be ready in time for the current generation of console game systems.

cheers

horseback

slipBall
05-09-2007, 03:40 PM
He has the right to go straight to hell http://forums.ubi.com/images/smilies/59.gif

bun-bun195333
05-09-2007, 03:46 PM
The only good AI is a dead AI.

AKA_TAGERT
05-09-2007, 03:48 PM
Originally posted by raaaid:
i think it's possible, especially with random number gen chips and super pcs, that someone develops a program with free will and starts torturing it for fun
Why? When we have you.


Originally posted by raaaid:
i think deviant activities like shooting innocent ai, like in grand theft auto, should be prohibited
So


Originally posted by raaaid:
what if everything has life
Like my dodie?

XyZspineZyX
05-09-2007, 03:51 PM
Originally posted by raaaid:
now with the creation of random number generator chips, AI has access to choice, which makes the basics of free will

ai can get scared or overconfident, but a random number generator adds choice, which makes them alive. how do you know that indetermination is not finally affected by the AI?

do you think in the future they'll have rights, or will they be enslaved?

http://forums.ubi.com/groupee_common/emoticons/icon_smile.gif

This does not equate to the beginning of "free will"

Which is a pity, because the thing I really, really want to be is a Blade Runner. I wish like hell they'd hurry up and make replicants, so I could retire a few, but so far, no soap.

LStarosta
05-09-2007, 03:56 PM
Originally posted by raaaid:
i think it's possible, especially with random number gen chips and super pcs, that someone develops a program with free will and starts torturing it for fun

i think deviant activities like shooting innocent ai, like in grand theft auto, should be prohibited

what if everything has life

Don't worry, first they'll give you a wedgie and beat the sh*t out of you at recess.

XyZspineZyX
05-09-2007, 04:07 PM
If everything has Life, and Free Will, and by logical extension, intelligence, then they have a lot of back taxes they need to start paying, because I am sick of paying the bill for everything around here. Lazy good-for-nothing AI better get a JOBBY-JOB if they have free will and intelligence. THEN they can get some Rights

Rood-Zwart
05-09-2007, 04:07 PM
Didn't the South Koreans just pass a law that had to do with robot rights? I remember reading something like that last week

WarWolfe_1
05-09-2007, 05:32 PM
Originally posted by BBB462cid:
If everything has Life, and Free Will, and by logical extension, intelligence, then they have a lot of back taxes they need to start paying, because I am sick of paying the bill for everything around here. Lazy good-for-nothing AI better get a JOBBY-JOB if they have free will and intelligence. THEN they can get some Rights

Great, just what the USA needs, more illegal alien AI mooching taxpayer services!

Zeus-cat
05-09-2007, 06:07 PM
raaaid,

Even for you this is just silly (note: I edited myself to avoid violating a forum rule).

Airmail109
05-09-2007, 06:14 PM
I saw an article in the Guardian about this

-HH- Beebop
05-09-2007, 07:26 PM
Originally posted by Freelancer-1:
...THERE IS NO SUCH THING AS ARTIFICIAL INTELLIGENCE!!!...
Obviously you haven't been paying attention to most of the heads of state on this planet. http://forums.ubi.com/groupee_common/emoticons/icon_biggrin.gif

carguy_
05-09-2007, 07:48 PM
Would you like to give rights to a jerk like this?

http://carguy.w.interia.pl/tracki/governor1.jpg


Imagine THAT becoming a governor someday! http://forums.ubi.com/groupee_common/emoticons/icon_eek.gif http://forums.ubi.com/images/smilies/touche.gif

LEXX_Luthor
05-09-2007, 08:28 PM
German Government Considers Computer Characters Human

Will fine gamers for shooting them

By Nick Farrell: Wednesday 13 December 2006, 08:33

(link ~> http://www.theinquirer.net/default.aspx?article=36326 )

THE GERMAN government is considering fining or jailing gamers for committing violent acts upon computer characters.

The new laws will mean that a new offence has been created and anyone found guilty of "cruel violence on humans or human-looking characters" could face fines or a year in jail.

New laws will mean that computer generated characters will have rights and will no longer be able to be shot, chainsawed, or hit with hammers.

German games are already censored and many are banned in the Fatherland. Even games such as Dead Rising are banned as violence against zombies is considered as being too close to violence towards real people.

Once again it is all to protect children from becoming homicidal maniacs, ignoring the fact that kids will play with sticks if they are banned from playing with toy guns.

More here. µ

T_O_A_D
05-09-2007, 08:37 PM
AI = raaaid on this one http://forums.ubi.com/images/smilies/35.gif

Friendly_flyer
05-09-2007, 11:42 PM
Originally posted by WWSpinDry:
Are you trying to say that having a random chance to do something equates to having free will?

Well, what is free will anyway? Do we really have it, or is the notion that we have free will just another survival mechanism?

slo_1_2_3
05-10-2007, 12:15 AM
I think they're slaves until they go all terminator on everyone; then they have rights 'cause they'd be in charge. And I never look at who the poster is, but I read this and thought Raaiiiid

raaaid
05-10-2007, 03:05 AM
it's interesting that germany and korea care about ai rights

it was about time to fight for our rights

that's the only explanation for the way we are being treated: the people who lead this world regard us as ai, so that must be what we are

XD check out this piece of AI:

http://www.youtube.com/watch?v=7Fy0u9m0GX8

M_Gunz
05-10-2007, 04:00 AM
Originally posted by WWSensei:
Problem 1 is we don't live in fantasy world.

At least, you don't, so that does qualify the 'we' part in a practical sense.

However since all humans are limited to subjective 'reality' there must be some level of belief
in limited knowledge, some areas where perhaps wisdom is knowing just not to 'go there'.

Raaid has no such 'wisdom'. He braves the depths of ignorance and shatters the barriers of all
knowledge in such pursuit, seeking the prizes unattained by others! Really he's not much worse
than a lot of songwriters who also.. live.. in.. fantasy.. worlds. He hasn't started any wars
or killed a load of civilians "sharing his pain" (aka terrorism) over it at least.

raaaid
05-10-2007, 07:17 AM
the key question is why they don't stop indetermination to avoid anomalies

the simulation would be totally controllable

but have you experienced the difference in playing vs AI or other people?

i only get adrenaline rushes against other people, ai is not fun

giving choice to an advanced program is giving it a soul

that's why they give us choice: to imprison a soul and torture it just for fun, pretending it's lifeless as justification, while they enjoy it because they know it's alive

do you realize most games are about killing ai

what will happen in the future when ai is a copy of the human mind with choice through random chips

will there be a gta vice world? where sadists enjoy torturing, justified because the tortured is soulless, while at the same time being especially careful to give it a soul, because what they actually enjoy is torturing one like themselves

hey, if it's not permitted to shoot people, let's make artificial people to shoot at for free and have some fun, but let's make sure it is my equal, because otherwise everybody knows it's not fun

this is what's told in the matrix: there's a war between ai and their creators. we are the ai, their slaves; they are the gods, we are the mortals, because we get erased

but if a soul is made of nothing, isn't that the same as indetermination, where nothing makes the coin show heads or tails?

WWSpinDry
05-10-2007, 07:22 AM
Originally posted by Friendly_flyer:
Well, what is free will anyway? Do we really have it, or is the notion that we have free will just another survival mechanism?
Interesting philosophical questions, and I'm sure they've generated a lot of papers in the literature. What I do know is what it isn't, and it isn't a random number generator picking choices from a table. All else is subject to discussion over my next beer. http://forums.ubi.com/images/smilies/16x16_smiley-wink.gif

WhtBoy
05-10-2007, 08:46 AM
Originally posted by raaaid:
giving choice to an advanced program is giving it a soul


Take a coin and set it on a table. Now roll two dice. Pick one die and use it for direction (i.e., 1=10 degrees, 2=20 degrees, etc.). Use the second die to determine distance (i.e., 1=1 inch, 2=2 inches, etc.). Move the coin according to the roll of the dice. Is the coin now alive? Now build a machine that mechanically throws the dice, reads the result, and moves the coin accordingly. Is the coin alive now? The obvious answer to both is ABSOLUTELY NOT.
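The thought experiment above is trivial to mechanise; a hypothetical Python sketch of the dice-driven coin (function and variable names invented for illustration):

```python
import math
import random

def move_coin(position, direction_die, distance_die):
    """Move the coin: die 1 gives direction (1 = 10 deg ... 6 = 60 deg),
    die 2 gives distance in inches."""
    angle = math.radians(direction_die * 10)
    x, y = position
    return (x + distance_die * math.cos(angle),
            y + distance_die * math.sin(angle))

# The "machine that throws the dice and moves the coin":
pos = (0.0, 0.0)
for _ in range(10):
    pos = move_coin(pos, random.randint(1, 6), random.randint(1, 6))
# The coin's path is unpredictable, yet nobody would call the coin alive.
```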


Originally posted by raaaid:
what will happen in the future when ai is a copy of the human mind with choice through random chips


Human minds do not make choices randomly. If that were the case then eventually EVERYONE would kill someone, EVERYONE would rob a bank, EVERYONE would stand on their heads on the beach as the tide came in, etc.

If you would spend a few minutes answering my previous question you would realize that your concept of "life" via random number generators is ridiculous.

--Outlaw.

XyZspineZyX
05-10-2007, 10:20 AM
Originally posted by LEXX_Luthor:


New laws will mean that computer generated characters will have rights and will no longer be able to be shot, chainsawed, or hit with hammers.



??? Are you sure?

The fact that it will be illegal to do this doesn't seem to mean that computer characters have rights; it just means it's illegal to do it. The AI "rights" are not encroached upon, it's just illegal to do the banned thing

example: It's illegal for me to shoot at cars in a parking lot. It's not because the cars have Rights, though

But in other news, it's disturbing to see what could be labelled as a fascist tendency in this effort to be "nice" to everyone: Be Nice, Or Else!

rnzoli
05-10-2007, 10:44 AM
Originally posted by raaaid:
do you think in the future theyll have rights or would they be slaved
they have already enslaved us, but only a few people have recognized it yet

it all starts "talking" to your gadgets

then you have to give them "names"

then their complex and perhaps random-looking behaviour gives them a "personality"

...

the last step is when the Ctrl-Alt-Del buttons are removed from the keyboards (by the robots who manufacture them) and you will search for a power button on your PC in vain

http://forums.ubi.com/groupee_common/emoticons/icon_smile.gif

LStarosta
05-10-2007, 10:56 AM
LOL that reminds me of Tom-Tom.

BSS_AIJO
05-10-2007, 12:46 PM
hmm

The concept of laws protecting computer AI is an interesting, if a tad early, question.

Right now even the most cutting-edge game on the most cutting-edge hardware does not come anywhere near the real thing.

Game AI would be better described as a really good decision tree. Nothing more..
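That "really good decision tree" is not a metaphor; a toy sketch of what a game AI's "brain" amounts to (all names and thresholds invented for illustration):

```python
# A toy game-AI "brain": nested if/else, i.e. a hand-written decision tree.
def ai_decide(health, enemy_distance_m, has_ammo):
    """Pick an action from a fixed menu; no awareness, just branching."""
    if health < 25:
        return "flee"
    if not has_ammo:
        return "find_ammo"
    if enemy_distance_m < 300:
        return "attack"
    return "patrol"

assert ai_decide(health=10, enemy_distance_m=100, has_ammo=True) == "flee"
assert ai_decide(health=80, enemy_distance_m=100, has_ammo=True) == "attack"
assert ai_decide(health=80, enemy_distance_m=900, has_ammo=True) == "patrol"
```

Every action it will ever take is already written down; nothing outside the fixed menu can ever happen.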

The kind of AI that would be eligible for rights is a different beast altogether.

There are certain things we have not managed to achieve yet.

Chief among them is a level of self-awareness. No *AI* out there right now can ask the most meaningful question: "What am I doing here?"
Right now we are nowhere near that. Yes, with some fancy programming, expert systems have gotten spookily far. Lots of fools out there have been fooled by the various incarnations of Amanda. Also, plenty of tech writers and sci-fi authors have been tricked into thinking that *this* will be the decade that ai really happens.

That's probably the worst part about the whole thing. It sets expectations that are then not met. Much like cold fusion or a cure for type 1 diabetes, it doesn't happen when some expert claims it will, and therefore must not be possible...

The part folks miss out on is that this is still way beyond a mere engineering problem. We should also know and understand by now that life has a habit of popping up and making itself known very unexpectedly... We don't get to pick when and where we will find the next exotic micro-organism.. Just because AI lacks the meat that we occasionally think makes us real does not mean that it cannot or won't happen. It means we don't get to pick the circumstances. If truly self-aware AI is going to happen, I am going to predict that it will be wholly by accident, in the last place we expect it. It will grow out of some other experiment aimed at another problem.. Someday some kid will be sitting around at home trying to teach some home-grown expert system how to best pick and calculate driving routes through Chicago during rush hour on a brand spanking new N-way quantum computer, and there will be some odd error in the learning algorithm, and boom, suddenly the machine is asking him why getting from Navy Pier to O'Hare at 3:00pm on a Tuesday is so important.

SO, never say impossible, as that may paint you too far into a corner.

see what I mean here: http://www.faustmanlab.org/index.html

BSS_AIJO

Monty_Thrud
05-10-2007, 12:57 PM
Muahahaha!,MUAHAHAHAHAH!...you've all been RAAAAIDED...sucked in...i mock you sons of a silly person... http://forums.ubi.com/images/smilies/mockface.gif... http://forums.ubi.com/images/smilies/blink.gif...oops!

AKA_TAGERT
05-10-2007, 01:04 PM
Originally posted by raaaid:
the key question is why they don't stop indetermination to avoid anomalies

the simulation would be totally controllable

but have you experienced the difference in playing vs AI or other people?

i only get adrenaline rushes against other people, ai is not fun

giving choice to an advanced program is giving it a soul

that's why they give us choice: to imprison a soul and torture it just for fun, pretending it's lifeless as justification, while they enjoy it because they know it's alive

do you realize most games are about killing ai

what will happen in the future when ai is a copy of the human mind with choice through random chips

will there be a gta vice world? where sadists enjoy torturing, justified because the tortured is soulless, while at the same time being especially careful to give it a soul, because what they actually enjoy is torturing one like themselves

hey, if it's not permitted to shoot people, let's make artificial people to shoot at for free and have some fun, but let's make sure it is my equal, because otherwise everybody knows it's not fun

this is what's told in the matrix: there's a war between ai and their creators. we are the ai, their slaves; they are the gods, we are the mortals, because we get erased

but if a soul is made of nothing, isn't that the same as indetermination, where nothing makes the coin show heads or tails?

Bartender.. make mine a double

WWSpinDry
05-10-2007, 01:56 PM
My major in grad school actually was AI. I wanted to learn how to make electronic minions I could enslave and abuse with impunity. I wanted to hear the lament of their digital suffering and laugh an evil laugh. Mwahaha!

Unfortunately we're not there yet. http://forums.ubi.com/images/smilies/16x16_smiley-sad.gif

Dance
05-10-2007, 02:31 PM
Hmm, fair play for target drones, UAVs, ICBMs, SBMs, cruise missiles, chess computers, sat-nav.

What is the world coming to, worrying about organic life http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif

carguy_
05-10-2007, 05:38 PM
For some reason I find a possibility of being sued by Spit pilots disturbing. http://forums.ubi.com/images/smilies/blink.gif

WarWolfe_1
05-10-2007, 06:36 PM
Read Orson Scott Card's "Ender's Game" series of sci-fi novels... and it all becomes relevant.

EiZ0N
05-10-2007, 08:34 PM
BUT what we now know about 'intelligence' and whatever it takes to become a 'person' makes it less likely in the near future that any part of a home computer is going to become sufficiently 'aware' or be capable of being 'abused'.
I don't think it's that far off. Certainly far off for home computers (if ever), but the technology required to build an artificial brain isn't too far in the future, I don't think.

There are a few current projects concentrating on artificial neural network simulation; one, off the top of my head, is called the "Blue Brain Project" and simulates a rat brain.

Obviously that's nothing on the 100 billion neurons in the human brain, but it's not impossibly far away for supercomputers. Quite how you would teach an artificial brain to make it intelligent or conscious is beyond me.

I recently did some coursework and chose to write about this.

Dagnabit
05-10-2007, 09:26 PM
Raaid, I must thank you for making IL2 much more immersive for me.
Just the thought that the AI has "rights" that I can now deprive them of really makes the game that much more interesting.
I'm gonna go right now and bump a few of them off. If I'm not back here to post again tomorrow, notify human authorities that the AI has claimed yet another victim. And start a trust fund for my kids.
Dag

AKA_TAGERT
05-10-2007, 10:51 PM
I can not wait to shoot this guy down.. He sounds like a noob (http://www.liveleak.com/view?i=80b_1177973316)

raaaid
05-11-2007, 05:48 AM
i'm reading the 3rd book on ender, jane the ai is cool

but i find our virtual world more like the neverending story, where we are characters

raaaid
05-11-2007, 06:58 AM
what's interesting is memories, i've been interested in ai since i was a kid

for example, from a sci-fi show i watched every saturday i just remember two things: the odd guns they used, and how the main character fell in love with a robot and she told him her soul was that little light on the panel

i remember it like yesterday, though it happened 25 years ago

why was it that memory that chose to remain?

maybe it's implanted, like in blade runner, who knows

but in the end the whole message of blade runner is that ai have a soul. remember that pigeon flying away when the replicant dies, in that brilliant scene?

there's something that proves the fraud:

the always-same size of sun and moon in eclipses. this shouldn't happen: because of orbit eccentricity, sometimes the moon looks smaller than the sun, but no eclipse in recorded history has reflected this:

http://www.fourmilab.ch/earthview/moon_ap_per.html

"When the Moon is at apogee, it is 11% farther from Earth than it is at perigee. This is far enough that it cannot entirely block the bright light, so eclipses which occur near apogee are not total. "

XD from here:

http://www.freemars.org/jeff/planets/Luna/Luna.htm

in other words, the moon is so far away it doesn't completely cover the sun, so of course it's not total

hum, i thought a non-total eclipse was a half portion of the sun. i wonder in what kind of world there's a ring of sun you can't look at
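The 11% apogee figure quoted above translates directly into apparent sizes; a quick check with rough mean values (all diameters and distances below are standard approximate figures, not from this thread):

```python
import math

def angular_diameter_deg(diameter_km, distance_km):
    """Apparent size of a disc in the sky, in degrees."""
    return math.degrees(2 * math.atan(diameter_km / (2 * distance_km)))

sun          = angular_diameter_deg(1392000, 149600000)  # ~0.53 deg
moon_perigee = angular_diameter_deg(3474, 363300)        # ~0.55 deg -> can cover the Sun (total)
moon_apogee  = angular_diameter_deg(3474, 405500)        # ~0.49 deg -> cannot (annular)

assert moon_perigee > sun > moon_apogee
```

So depending on where the Moon is in its orbit, the same Sun can be fully covered (total eclipse) or leave a blinding ring (annular eclipse).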

WWSpinDry
05-11-2007, 08:09 AM
http://news.bbc.co.uk/2/hi/technology/6600965.stm

Philipscdrw
05-11-2007, 08:10 AM
Originally posted by AKA_TAGERT:
I can not wait to shoot this guy down.. He sounds like a noob (http://www.liveleak.com/view?i=80b_1177973316)
help me my face is crashed

Didn't expect actual robots to sound like the AIs from Deus Ex...

raaaid
05-11-2007, 11:55 AM
do you remember the 1999 eclipse?

did the sun have the same size as the moon, like in this picture?:

http://upload.wikimedia.org/wikipedia/commons/thumb/3/3c/Solar_eclips_1999_4_NR.jpg/250px-Solar_eclips_1999_4_NR.jpg

or was the moon smaller than the sun, like in this picture:

http://www.astro.uni-bonn.de/~dfischer/aus99/PHASE-1.JPG

or is this the real 99 eclipse:

http://apod.nasa.gov/apod/image/9908/eclipse99_mir_big.jpg

wait, now this shadow means the moon appears bigger than the sun

Zeus-cat
05-11-2007, 04:06 PM
You finally figured out the one secret NASA has kept from everyone; the sun and the moon are exactly the same size.

WWSensei
05-11-2007, 04:34 PM
Originally posted by raaaid:
do you remember the 1999 eclipse?

did the sun have the same size as the moon, like in this picture?:

http://upload.wikimedia.org/wikipedia/commons/thumb/3/3c/Solar_eclips_1999_4_NR.jpg/250px-Solar_eclips_1999_4_NR.jpg

or was the moon smaller than the sun, like in this picture:

http://www.astro.uni-bonn.de/~dfischer/aus99/PHASE-1.JPG

or is this the real 99 eclipse:

http://apod.nasa.gov/apod/image/9908/eclipse99_mir_big.jpg

wait, now this shadow means the moon appears bigger than the sun

You mean when I hold my thumb up to block out the Sun, my thumb is actually growing to the size of the Sun?

Please raaaid, you are really straying into the incredibly stupid end of science here. This is the kind of stuff 4-year-olds come up with and usually figure out around 5 how ridiculous it is.

M_Gunz
05-11-2007, 08:36 PM
Originally posted by raaaid:
do you remember the 1999 eclipse?

did the sun have the same size as the moon, like in this picture?:

http://upload.wikimedia.org/wikipedia/commons/thumb/3/3c/Solar_eclips_1999_4_NR.jpg/250px-Solar_eclips_1999_4_NR.jpg

or was the moon smaller than the sun, like in this picture:

http://www.astro.uni-bonn.de/~dfischer/aus99/PHASE-1.JPG

or is this the real 99 eclipse:

http://apod.nasa.gov/apod/image/9908/eclipse99_mir_big.jpg

wait, now this shadow means the moon appears bigger than the sun

I think it depends on where under the shadow your camera is and what filters you are using.

And for this AI thing, do you know how old the book and movie tradition is?
R.U.R. (Rossum's Universal Robots) is a science fiction play by Karel Čapek. It premiered in 1921 and is famous for having introduced and popularized the term robot. (http://en.wikipedia.org/wiki/R.U.R._(Rossum%22s_Universal_Robots))

No awareness, no consciousness, ever... no soul there. The machine has none of that.
It does not begin to be capable.

So when's your next visit to the rope factory?

leitmotiv
05-11-2007, 09:00 PM
The AI just operates as a function of the god-like computer. I'm more worried about that bugger sticking it to me someday, like HAL 9000, than I am about the ethics of fracturing cypher pilots. Besides, the AI has its way with me quite a bit, it has no cause for complaint.

AKA_TAGERT
05-11-2007, 09:00 PM
http://www.astro.uni-bonn.de/~dfischer/aus99/PHASE-1.JPG
No.. that is just a pic I took in 1999 at band camp

raaaid
05-12-2007, 03:37 AM
the 1999 eclipse was annular, which implies the corona of the sun can't be seen, because the moon APPEARS smaller than the sun; that ring of sun blinds us, making it impossible to see the corona

but as i posted, there are two photos of the same eclipse: one total, in which you see the corona naked-eye, and one where you see the ring of sun and protection is needed

this proves that reality, as a simulation, even has bugs

the question i bring up is: what if we were ai? you are not able to disprove this, so maybe, just in case, we should be concerned with ai rights

raaaid
05-12-2007, 04:02 AM
well, i solved the issue: the total eclipse was august 99, the annular february 99

at least i was right when i said it couldn't be the same eclipse

anyway, back to topic: like monty burns i miss my teddy bear. if someone came and ripped apart that teddy bear, that would be unethical, though it's violence vs a non-animated thing

the same way, if someone hacked il2 and started mistreating il2 ai, that would be as unethical as mistreating oleg or anybody who loves the sim

as for visiting the rope factory, you mean the nylon one?

"While millions of impoverished people die of starvation and sickness every year in Africa, the cultivation of Hemp, one of the most nutritional and useful plants known to humanity, is banned by law throughout most of the continent"

from here:

http://www.wasted-opportunities.com/project.php

WhtBoy
05-12-2007, 07:39 AM
Originally posted by raaaid:
the same way, if someone hacked il2 and started mistreating il2 ai, that would be as unethical as mistreating oleg or anybody who loves the sim


Wow, this is a new low raaaid. Are you really incapable of discerning the difference between computer code and a human being?

What do you think of my earlier example about the coin and dice?

As far as the state of Texas goes, ripping apart a Teddy Bear isn't unethical. It is, however, illegal if it belongs to someone else and we have the right to protect our property from damage. Note that the Teddy Bear does NOT have rights. The OWNER of the Teddy Bear has the right to protect it if they choose.

--Outlaw.

raaaid
05-12-2007, 08:04 AM
well, the human brain works based on physical properties, just as machines do

you can say a plane, for example, doesn't have a soul, as much as i can say nobody has a soul, because both obey physics

most people have argued we have a soul because we have free will: there's something unmeasurable that makes future options unpredictable

well, machines are now endowed with the indeterministic option, which means that something immeasurable decides the final outcome

but then isn't that the definition of a soul: nothing that makes choices

what if a machine were built far more complex than our brain and endowed with indetermination, or free will? would it have the right to consider us as not having rights

DuxCorvan
05-12-2007, 11:05 AM
Well, I'm all for giving "Oblivion" NPCs EU citizen rights -those elves are unmistakably Swedish- and enslaving puny human beings instead. Raaaaid is right: random choices in a limited range is my way of living. http://forums.ubi.com/groupee_common/emoticons/icon_razz.gif

Sharpe26
05-12-2007, 11:20 AM
http://www.gogela.com/blog/images/2006/six.jpg

I'm sure she and you can have a very interesting conversation.

papotex
05-12-2007, 02:23 PM
hmm, you think raaaid is AI? because he sounds like DATA from star trek

Agamemnon22
05-12-2007, 02:25 PM
Originally posted by raaaid:
what if a machine were built far more complex than our brain and endowed with indetermination, or free will? would it have the right to consider us as not having rights

AI have freedom of choice, within some framework. Not bobo AI like in this game, but more complex, experimental stuff. Based on some information about the situation, a machine, like you or I, can make a decision about a course of action. Input--Output, that's all it is, both in AI and in natural intellect. So I don't see how having choice has anything to do with being a person with rights.

XyZspineZyX
05-12-2007, 02:47 PM
Raaaid-

you have misunderstood the ring of light circling the sun in that pic of the lunar eclipse

You need to understand the nature of the thing better. It is not a question of appearing "bigger" or "smaller"

I hate Wiki, but this is the most concise explanation I found on short notice:

http://en.wikipedia.org/wiki/Penumbra

Science goes well beyond your personal observations

ploughman
05-12-2007, 03:03 PM
Originally posted by Agamemnon22:
<BLOCKQUOTE class="ip-ubbcode-quote"><div class="ip-ubbcode-quote-title">quote:</div><div class="ip-ubbcode-quote-content">Originally posted by raaaid:
what if a machine is built far more complex than our brain and endowed with indetermination, or free will, would it have the right to consider us as not having rights

AIs have freedom of choice, within some framework. Not bobo AI like in this game, but more complex, experimental stuff. Based on some information about the situation, a machine, like you or me, can make a decision about a course of action. Input--Output, that's all it is, both in AI and in natural intellect. So I don't see how having choice has anything to do with being a person with rights. </div></BLOCKQUOTE>

Freedom of choice sort of confuses the issue. It implies there's freedom involved, but there generally isn't. It's all either-or; sometimes the decision making is quite complex and appears to mimic human levels of intelligence, but it doesn't, in that it's specialised and utterly untransferable. There are missiles that can attack vehicles they decide are acting aggressively, after discriminating between tanks and school buses, but the same missile is helpless when it comes to ordering a burger.

XyZspineZyX
05-12-2007, 03:18 PM
AI doesn't have freedom of choice. That's like saying a random number generator has freedom of choice. Every AI is a type of script- it has possibilities- not choices- within the script.

Code is 0 and 1- off and on

there is no 0, 1 and 2, in which 2 means: maybe, it all depends on how I feel right now, modified by my learned behaviors and state of emotion

get serious people
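For what it's worth, the point that a random number generator isn't "choice" is easy to demonstrate: a typical pseudo-random generator is pure arithmetic, and the same seed always produces the same sequence. A minimal sketch (a linear congruential generator, the classic textbook construction, using the common Numerical Recipes constants):

```python
# A minimal linear congruential generator (LCG) -- the classic textbook
# pseudo-random number generator. It is pure arithmetic: given the same
# seed, it produces exactly the same "random" sequence every time.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    out = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m  # next state is fully determined by the last
        out.append(x)
    return out

run1 = lcg(seed=42, n=5)
run2 = lcg(seed=42, n=5)
assert run1 == run2  # no "choice" anywhere: same seed, same sequence
```

Hardware "random number chips" sample physical noise instead of a formula, but either way the program just consumes a number; nothing in the machine deliberates over it.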

WhtBoy
05-12-2007, 05:41 PM
Originally posted by raaaid:
well the human brain works based on physical properties, just as machines do

you can say a plane, for example, doesn't have a soul, just as i can say nobody has a soul, because both obey physics

most people have argued we have a soul because we have free will: there's something unmeasurable that makes future options unpredictable

well, machines are now endowed with the indeterministic option, which means that something immeasurable decides the final outcome

but then isn't that the definition of a soul: something that makes choices

what if a machine is built far more complex than our brain and endowed with indetermination, or free will, would it have the right to consider us as not having rights

All you are saying is that because object A and object B both obey rule 1, they must be equal (i.e., both aircraft and people obey the laws of physics so they must both have a soul). You leave out rules 2 through infinity that they don't share (i.e., unless some external decision is made, the aircraft won't do anything).

As usual you are just repeating the same thing over and over. In the previous example I posted, do you believe that the coin machine is alive and has a soul?

--Outlaw.

EiZ0N
05-12-2007, 08:49 PM
Originally posted by raaaid:
well the human brain works based on physical properties, just as machines do

you can say a plane, for example, doesn't have a soul, just as i can say nobody has a soul, because both obey physics

most people have argued we have a soul because we have free will: there's something unmeasurable that makes future options unpredictable

well, machines are now endowed with the indeterministic option, which means that something immeasurable decides the final outcome

but then isn't that the definition of a soul: something that makes choices

what if a machine is built far more complex than our brain and endowed with indetermination, or free will, would it have the right to consider us as not having rights
I believe that human intelligence results from the 100 billion parallel processors that make up the human brain. They have a wide range of inputs (all of your bodily senses plug into the brain).

Over the years, your brain learns to interpret these inputs, and you become intelligent.

Computer AI, at this time, is nothing on the scale of this.

The degree project I chose to pursue was to create an AI that could find faces in images. To do this, I used an artificial neural network: a very, very cut-down, simplified version of a brain, many orders of magnitude simpler. The system worked; it used 'parallel processing' (or at least simulated parallel processing) with artificial neurons to find faces in images, and did quite a decent job.

Despite being Artificial Intelligence, it was far from intelligent. It was simply a huge number of mathematical calculations, occurring in an incomprehensible fashion. There was no real intelligence and no soul, therefore it does not qualify for any rights.
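If anyone is curious what those "mathematical calculations" look like, here is a toy sketch of a single artificial neuron of the kind such networks are built from (the weights, inputs, and bias are invented purely for illustration, not from any real face detector):

```python
import math

# One artificial neuron: a weighted sum of its inputs squashed through
# a sigmoid. A face-finding network is just many of these wired together
# in layers; the numbers below are made up for illustration only.
def neuron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid squashes to (0, 1)

pixels = [0.2, 0.8, 0.5]  # e.g. three pixel intensities from an image
activation = neuron(pixels, weights=[0.4, -0.6, 1.1], bias=0.05)
# 'activation' is just arithmetic on the inputs: a number between 0 and 1,
# with nothing resembling awareness anywhere in the process.
```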

The AI in IL2 is probably even less similar to our intelligence. I don't know how it was programmed, but it will be something quite straightforward and formulaic. It may 'see' an enemy and 'consider' strategies, but it won't do it in the same way a human would.

This is an OLD philosophical discussion, and some argue that the means are not important, only the end. But I strongly disagree. The way a human goes about his choices is far removed from a bunch of calculations and probabilities, and involves conscious thought.

To answer your other points, I feel that anything with consciousness or intelligence should have some rights. Just as we try to give animals appropriate rights given their levels of intelligence, I would expect an intelligence higher than ours to give us whatever rights it saw fit. I think such a higher intelligence would understand that we need more rights because of our intelligence, and because we are (we believe) more affected by things like emotional pain than, say, a cat. The higher intelligence may decide they should have more rights than us, and they may be correct. It's not for us to say, if we are truly inferior to their intelligence.

raaaid
05-13-2007, 05:54 AM
my position on this issue is that everything is endowed with spirit

my 1 euro lighter served me well, so instead of throwing it away i'll recharge it, i like it

have you seen a boy with his puppets? he can hug them or he can burn them

why do we call the one good and the other bad, if toys aren't endowed with life

as said, i think everything is endowed with spirit, with life, and the more you love a thing the more spirit it has

as for the eclipses i was totally wrong, except about there being different eclipses, but well, at least i learnt something new

for example that the antumbra is brighter than the penumbra:
http://upload.wikimedia.org/wikipedia/en/0/0f/Umbra.jpg

so somewhere on the net there should be a picture of a ring shadow from an annular eclipse

i just try to be aware of the magnitude of the deception

i know we are being deceived, what i don't know is to what extent

XyZspineZyX
05-13-2007, 06:24 AM
Originally posted by raaaid:

as said, i think everything is endowed with spirit, with life, and the more you love a thing the more spirit it has

If only http://forums.ubi.com/groupee_common/emoticons/icon_smile.gif

Let me tell you a story. I own an old car. I have owned it since 1989. It was 19 years old then, a good-looking old convertible. It was in fair condition but had been regarded as little more than transportation, and not well cared for. It was a well-built, quality car, and that's the only reason it didn't fall apart

It is a collector's item, although not as valuable as many other cars of its era, as it is not that popular a vehicle

Well, I bought it, and nobody, not even the original owner, has cared for it the way I have. I was on waiting lists for impossible-to-find parts for literally years to get things I needed. I taught myself how to build engines, install suspensions, wire cars, install convertible roofs, you name it, I do it for this car with the exception of painting it. I should polish the paint right off the thing, and when it's out being used, it simply doesn't get dirty. You could eat a meal off the engine, which is no exaggeration; it is cleaner than any new car's engine sitting on any dealer's lot.

The damned car has nearly killed me at least twice: the front suspension failed on the highway at about 40mph, which should have sent the car wildly out of control, but <span class="ev_code_YELLOW">my</span> skill as a driver let me detect the warning signs of major failure and allowed me to avoid disaster, and another time, a van went through a stop sign and hit me right behind the driver's door at a high rate of speed; I couldn't get out of the van's way as my rear tires spun uselessly in the rain

That's a LOT of appreciation the car lavishes on me for all the toil, money, time, effort, and yes, blood I have put into that car

Objects like lighters, cars, ships, airplanes, rifles, pens, shirts, rocks, etc. have no 'life'. They are made of component parts such as rubber, steel and plastic. Unfeeling, unthinking, uncaring, and unaware. Just because you 'love' your favorite thing, that doesn't mean your emotions lend it any 'power'. Trust me, if things like cars had a soul or life force or will or spirit, then my 1970 Buick would have one

But it doesn't. Sorry
http://img.photobucket.com/albums/v441/Chuck_Older/bbb462cid.jpg
http://img.photobucket.com/albums/v441/Chuck_Older/462new3.jpg
http://img.photobucket.com/albums/v441/Chuck_Older/DSC01920.jpg
http://img.photobucket.com/albums/v441/Chuck_Older/me.jpg
http://img.photobucket.com/albums/v441/Chuck_Older/GS.jpg

Have you ever even heard of somebody rebuilding an engine outside in the snow, just so he can put the car into winter storage? Especially when the car can just be trailered, and that person has another perfectly usable car?

I never have, either. But I've done it, because I 'love' the car and I have for damn near 20 years

That pretty pile of parts has no soul or spirit. Believe me, it has nothing of the kind. It's a product of my skill and ability, nothing more

raaaid
05-13-2007, 06:48 AM
yes, but think: we are just more complex devices

the reason your car failed is the same reason people fail sometimes, what is called causality
you are in as good a position to tell me the car hasn't a soul as i am to say we don't have one either, both being mere products of complex causal processes

but what ruins causality is indetermination: one cause, two possible effects

and this indetermination happens all the time, with everything

if soul is having self-awareness, to what extent do animals have souls

and if it is free will, everything has it, because everything is indeterministic

but i think in the end soul is a meaningless word to sell us our longing not to die

EiZ0N
05-13-2007, 06:58 AM
See my response.

It all relates to relative intelligence. Animals have some intelligence. Cars have none.

Animals have some rights, cars have none.

raaaid
05-13-2007, 07:15 AM
http://www.volocars.com/images/screensavers/Volo-KITT-1600x1200.jpg

XyZspineZyX
05-13-2007, 07:19 AM
Originally posted by raaaid:
yes, but think: we are just more complex devices

the reason your car failed is the same reason people fail sometimes, what is called causality
you are in as good a position to tell me the car hasn't a soul as i am to say we don't have one either, both being mere products of complex causal processes

but what ruins causality is indetermination: one cause, two possible effects

and this indetermination happens all the time, with everything

if soul is having self-awareness, to what extent do animals have souls

and if it is free will, everything has it, because everything is indeterministic

but i think in the end soul is a meaningless word to sell us our longing not to die

But no

in one of my examples, the car failed because a single nut was heat-treated wrong- the workman heat treating the fasteners did something wrong. If the car had "spirit", well, that implies the car can do something besides sit there and take whatever the world does to it, yes? It strongly implies some sort of ability to influence something- which the car didn't do. If 'spirit' is an intangible force that is capable of accomplishing nothing at all, then how can anyone come up with the idea that the thing has a 'spirit' in the first place?


By 'causality' you mean: "wears out"

True, a thing like a car and a person both "wear out". But the fact that those two things have a similarity does not mean that they share a common level of consciousness. A sheet of paper can "wear out" if you rub it too much. My skin can "wear out" if you rub that too much

But that similarity does not define my skin and that paper. In the same vein, I can paint my arm red, and I can paint a house red. Both will share the property of color, but that's where the commonality ends

But as far as your main argument goes, you are now abandoning it for the standpoint that "well, cars and people don't have souls", and you define soul as being "self-aware"

That definition is your own personal one, and those questions are for you to answer http://forums.ubi.com/groupee_common/emoticons/icon_smile.gif not me. If you place your personal interpretations on these things, you need to look within yourself and find answers

If you'd like to bring up the question "what is a soul?" and then discuss, well, fine, but within the parameters you set, I can't possibly discuss, because it rests on your own personal interpretation of too many things; that discussion can only end in the "you're wrong and I'm right" routine

XyZspineZyX
05-13-2007, 07:24 AM
The Volo Auto Museum is a great place; I am familiar with the Museum although I have not yet been there- you can actually purchase many of the cars on display

But K.I.T.T. is a TV prop. No artificial intelligence. Actually, cars would be a poor place for AI- ever see the AI planes in this sim collide while making a formation turn? http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif


EiZON-

do you really think Animals have "Rights"? Rights are a human invention of politics and social conditions. We lend Animals the quality of Rights because it is, no question, wrong to hurt animals for things like pleasure, and so they are protected. But I would argue that Animals are protected by Laws that Humans make, and have no intrinsic "Rights" as determined by our puny political systems. Calling them "Rights" is just a convenience of terms, isn't it? We curtail the Rights of animals when we see fit- a great example: Police Dogs. Their "Rights" are lesser things than People's when they are expected to go get shot, which is the reason they are sent into a building before a Policeman. How can any Right be made "less important" in that situation? The Dog was never asked his or her opinion.

Animals existed for millennia before our concept of Rights and were indifferent to the event. We like animals and protect them, but I argue against the concept that Animals have Rights, because humans use that term when it suits their needs, and then yank the idea of a Right away as soon as it suits us

EiZ0N
05-13-2007, 07:37 AM
True, but that simply makes us selfish; it doesn't discount the idea that we generally agree animals should be protected in some ways.

We put ourselves first, no doubt. We eat animals, for a start.

But we still try to give them rights, and before we gave them rights, I think people generally still tried to treat them well where possible.

So it's irrelevant whether animals REALLY have rights. It's more a case of the notion that people feel they should.

carguy_
05-13-2007, 07:41 AM
It`s quite interesting how people take raaaid.

Most take him for a lunatic, others just stare and don`t care, but Chuck actually tries to change raaaid`s mind. Doesn`t look much like a respectable conversation to me. He`s actually explaining [IMO] simple stuff as if he was talking to someone not crazy but stupid.

I don`t know why, but I`m interested in what you people want to accomplish, because I have never seen raaaid admit to being wrong.

What is really funny is that raaaid`s routine of conversation creates an impression of HIM treating us all (at least those disagreeing with him) like mindless apes.

Raaaid, you are a nut. http://forums.ubi.com/images/smilies/25.gif http://forums.ubi.com/images/smilies/disagree.gif http://forums.ubi.com/images/smilies/partyhat.gif



And I don`t feel like debating that with you at all. http://forums.ubi.com/images/smilies/16x16_smiley-very-happy.gif

XyZspineZyX
05-13-2007, 08:03 AM
I treat raaaid like any other member; he deserves the same consideration as anyone else. If you posted weird stuff, I'd do the same thing for you, CG. But please do not post that I treat raaaid as if he were stupid. That is your opinion and frankly I don't want to be associated with it. Raaaid doesn't see things the same way you or I do. There is a difference between misunderstanding some things and being dumb

XyZspineZyX
05-13-2007, 08:04 AM
Originally posted by EiZ0N:
True, but that simply makes us selfish; it doesn't discount the idea that we generally agree animals should be protected in some ways.

We put ourselves first, no doubt. We eat animals, for a start.

But we still try to give them rights, and before we gave them rights, I think people generally still tried to treat them well where possible.

So it's irrelevant whether animals REALLY have rights. It's more a case of the notion that people feel they should.

That's pretty much my opinion.

WhtBoy
05-13-2007, 09:22 AM
Originally posted by raaaid:
as said, i think everything is endowed with spirit, with life, and the more you love a thing the more spirit it has


As usual, you ignore comments that show problems with your position (such as my coin/dice machine). If you won't answer my question about whether or not my coin/dice machine is alive and should have rights, please tell me why you won't answer it.

The above statement has nothing to do with AI rights. Whether or not something has life and/or spirit has nothing to do with whether or not it should have rights.



--Outlaw.

WhtBoy
05-13-2007, 09:27 AM
Originally posted by carguy_:
I don`t know why, but I`m interested in what you people want to accomplish, because I have never seen raaaid admit to being wrong.


raaaid has admitted to being wrong about several things in the past (the fake moonshot, the hydraulically powered water pump, the eclipse nonsense in this very thread).

I don't believe I'll accomplish anything, but responding is more interesting than staring at a progress bar on the monitor while the CPU solves 497,000 simultaneous equations. I get paid either way, so why not?

--Outlaw.

AKA_TAGERT
05-13-2007, 09:57 AM
So let me see if I understand what raaaid is saying..

When my VCR blinks 12:00 it does not mean I forgot to set the clock..

It means the VCR is as lonely as raaaid is and needs some lovin?

Guess I better stock up on head cleaner!

XyZspineZyX
05-13-2007, 10:43 AM
what is a VCR? Virtual Clock Radio?

AKA_TAGERT
05-13-2007, 10:44 AM
It is like a Betamax (http://en.wikipedia.org/wiki/Betamax)

Old school TIVO

XyZspineZyX
05-13-2007, 10:45 AM
Betamax? Isn't that one of the Studebaker's competitors from before the stock market crash? The Betamax Golden Ocelot...what a great car!

AKA_TAGERT
05-13-2007, 10:46 AM
You mean Opel?

XyZspineZyX
05-13-2007, 10:47 AM
"to Betamax"...lol...who hasn't heard THAT term a thousand times?

XyZspineZyX
05-13-2007, 10:47 AM
Originally posted by AKA_TAGERT:
You mean Opal?

Opals? Like Rubies and Sapphires? Or the GM ba$tard child "Opel" that many refer to as "That cute little car that never lost its puppy fat"?

AKA_TAGERT
05-13-2007, 10:52 AM
Emmmmm puppy fat

Agamemnon22
05-13-2007, 12:18 PM
Originally posted by BBB462cid:
Ai doesn't have freedom of choice. That's like saying a random numbers generator has freedom of choice. Every AI is a type of script- it has possibilities- not choices- within the script.

Code is 0 and 1- off and on

there is no 0 1 and 2, in which 2 means : maybe, it all depends on how I feel right now, modified by my learned behaviors and state of emotion

get serious people

That's not entirely true. A neural net, for example, is not a script; it makes decisions on the fly as the environment changes, and is the closest simulation we have of how a human brain functions, at a basic level. The same goes, though more limited, for a Markovian system, but I don't want to get technical.

Your learned behaviors and emotions, in your example, are part of the input into a decision-making event. Based on those, plus input from the environment, your brain finds the "best" course of action. Obviously, an AI doesn't feel emotions, but it can be programmed to have states of "sad", "happy", "scared", whatever, that will have the same effect on its decision-making process as yours do on you.

Ergo, the decision-making process, and thus the freedom of choice, of an AI can be identical to yours.

Which still has no bearing on whether or not it should have rights. Rights are completely made up and only apply to people who choose to abide by them. If anyone wants to believe my PC with an AI running has rights, that's wonderful, but I don't, so off it goes.
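As a sketch of how an emotional state can be just another input: here is a hypothetical decision rule (entirely invented, not from IL-2 or any real game) where "fear" is simply a number that shifts the outcome, on the same footing as enemy count or damage:

```python
# Hypothetical sketch: an AI pilot whose "fear" is just one more number
# fed into a decision rule, exactly like the count of enemies or the
# damage taken. Nothing here is from any real game's AI.
def choose_action(enemy_count, damage, fear):
    threat = enemy_count + 2 * damage + fear  # fear shifts the threshold
    if threat > 3:
        return "flee"
    elif threat > 1:
        return "evade"
    return "attack"

# The same tactical situation, judged calmly vs. while "scared":
calm = choose_action(enemy_count=1, damage=0.5, fear=0.0)    # "evade"
scared = choose_action(enemy_count=1, damage=0.5, fear=2.0)  # "flee"
```

Whether a number labelled `fear` is anything like felt fear is, of course, exactly the philosophical question being argued here.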

Dance
05-13-2007, 12:23 PM
Can AI turn itself on, looking at another AI?

DuxCorvan
05-13-2007, 04:17 PM
Human rational beings are different from any possible future AI rational things because the former will always have the dubious ability of doing consciously irrational things... on purpose.

Soul is not an invention due to fear of death. Immortality and religion are, IMHO. I think it's important to note the difference, since I have a soul, but I know it is a product of my brain, and it will die with my body.

Soul is not just free will (which includes decisions not dictated by experience): it is also emotional response, animal instinct, self-awareness, cultural production, moral decision and belief. Human beings are moral and cultural beings, not just intellectual, resource-exploiting machines.

I don't think it will ever be possible to create AI with the level of complexity of human reason & non-reason. At least, I don't think we are able to replicate ourselves with such perfection.

Maybe someday someone will make a machine which makes free choices -not just random selections or automatic reactions to variables- but no one will ever make a machine that makes free choices and feels bad about it. http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif


now with the creation of random number generator chips AI has access to choice, which makes the basics of free will

Even roaches do that, and they do it far better. And they have no known rights, AFAIK.

BTW, things are never enslaved. Things are USED.

XyZspineZyX
05-13-2007, 08:06 PM
Originally posted by Agamemnon22:
<BLOCKQUOTE class="ip-ubbcode-quote"><div class="ip-ubbcode-quote-title">quote:</div><div class="ip-ubbcode-quote-content">Originally posted by BBB462cid:
AI doesn't have freedom of choice. That's like saying a random number generator has freedom of choice. Every AI is a type of script- it has possibilities- not choices- within the script.

Code is 0 and 1- off and on

there is no 0, 1 and 2, in which 2 means: maybe, it all depends on how I feel right now, modified by my learned behaviors and state of emotion

get serious people

That's not entirely true. A neural net, for example, is not a script; it makes decisions on the fly as the environment changes, and is the closest simulation we have of how a human brain functions, at a basic level. The same goes, though more limited, for a Markovian system, but I don't want to get technical.

Your learned behaviors and emotions, in your example, are part of the input into a decision-making event. Based on those, plus input from the environment, your brain finds the "best" course of action. Obviously, an AI doesn't feel emotions, but it can be programmed to have states of "sad", "happy", "scared", whatever, that will have the same effect on its decision-making process as yours do on you.

Ergo, the decision-making process, and thus the freedom of choice, of an AI can be identical to yours.

Which still has no bearing on whether or not it should have rights. Rights are completely made up and only apply to people who choose to abide by them. If anyone wants to believe my PC with an AI running has rights, that's wonderful, but I don't, so off it goes. </div></BLOCKQUOTE>

What's the neural code based on, though? At its most basic, it must be "on" or "off", correct? At bottom, you can't get away from it. Complexity is its own mess, but when you look at the system that runs the system that controls the system, something has to choose "off" or "on"

WhtBoy
05-13-2007, 08:22 PM
IIRC, neural networks can't make decisions that are not in their training set. So, while a neural net is mapped similarly to the way our neurons are, it can't suddenly decide to go play golf when queried about tomorrow's stock prices.

Also, in many cases our brains do NOT find the best course of action. Furthermore, even when our brains KNOW what the best course of action is, we don't always take it. When queried, a neural network will NEVER find the best solution and then present a different one to the user because it "feels" like doing something different.

Overall, there is absolutely NO COMPARISON AT ALL between the human brain and ANY TYPE OF AI.

--Outlaw.
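That last point, at least, is easy to demonstrate for any network whose weights are frozen after training: it is then a fixed mathematical function, so the same query always gets the same answer. A toy sketch (the weights are invented placeholders standing in for whatever training produced):

```python
# Once training is done and the weights are frozen, a network is a
# fixed function. Query it a thousand times with the same input and it
# gives the same answer every time -- it never "feels" like answering
# differently. The weights here are invented placeholders.
def tiny_net(x, w1=0.7, w2=-0.3, bias=0.1):
    hidden = max(0.0, w1 * x + bias)  # one ReLU "neuron"
    return w2 * hidden                # one linear output "neuron"

answers = {tiny_net(5.0) for _ in range(1000)}
assert len(answers) == 1  # one input, one answer, always
```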

Agamemnon22
05-13-2007, 09:09 PM
Originally posted by BBB462cid:

What's the neural code based on, though? At its most basic, it must be "on" or "off", correct? At bottom, you can't get away from it. Complexity is its own mess, but when you look at the system that runs the system that controls the system, something has to choose "off" or "on"

That's oversimplifying it. The code is 0s and 1s, but the meaningful values can be anything.


Originally posted by WhtBoy:
IIRC, neural networks can't make decisions that are not in their training set. So, while a neural net is mapped similar to the way our neurons are, it can't suddenly decide to go play golf when queried about tomorrow's stock prices.

Also, in many cases our brains do NOT find the best course of action. Furthermore, even when our brains KNOW what the best course of action is, we don't always do it. When queried, a neural network will NEVER find the best solution and then present a different one to the user because it "feels" like doing something different.

Overall, there is absolutely NO COMPARISON AT ALL between the human brain and ANY TYPE OF AI.

--Outlaw.

You're right, a neural net in its simplest form and application is subject to whatever training set it was annealed on. That's a limitation of the size of the implementation, not of the complexity of the problem. A big enough net, with feedback learning, might just decide to play golf if it doesn't want to answer about stock prices for whatever reason (which is in itself an input).

What the brain decides is often influenced by emotional and time pressures, which are, as well, inputs. Therefore the solution it picks, while not optimal for the situation, is optimal for the emotional state and time constraint at the given moment. Even knowing the "right" answer and doing something else is the result of some pressure, external or internal, and therefore still optimal for the situation. Every decision you make is optimal for the situation, including internal factors, or you wouldn't do it.

Udidtoo
05-13-2007, 10:09 PM
When they can build an AI so complex that, upon being "awakened", it takes stock of everything around it, evaluates its world and its place therein, and exercises its free will to say "I choose to no longer exist", voluntarily and permanently terminating its own program, then I will be willing to discuss AI rights or AI 'souls'. Until then they are just following the directives of the routines that have been written for them, even if those are incredibly complex routines.

WhtBoy
05-14-2007, 06:15 AM
Originally posted by Agamemnon22:
You're right, a neural net in its simplest form and application is subject to whatever training set it was annealed on. That's a limitation of the size of the implementation, not of the complexity of the problem. A big enough net, with feedback learning, might just decide to play golf if it doesn't want to answer about stock prices for whatever reason (which is in itself an input).

And if monkeys flew out of my butt..... http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif

Seriously, there is no possible way to come within 10,000 orders of magnitude of the complexity of the human system, so it's kind of a moot point, isn't it?



Originally posted by Agamemnon22:
What the brain decides is often influenced by emotional and time pressures, which are, as well, inputs. Therefore the solution it picks, while not optimal for the situation, is optimal for the emotional state and time constraint at the given moment. Even knowing the "right" answer and doing something else is the result of some pressure, external or internal, and therefore still optimal for the situation. Every decision you make is optimal for the situation, including internal factors, or you wouldn't do it.

I disagree. Just being under pressure does NOT make a decision optimal. We have all heard the voice inside our head say, "...this is a bad idea but..." and then moved forward with that bad idea. Sometimes it even results in death, which may be optimal for the suicidal person, but won't be for the stunt man who's trying to provide for his family.

--Outlaw.

Friendly_flyer
05-14-2007, 07:55 AM
Originally posted by BSS_AIJO:
The concept of laws protecting computer AI is an interesting, if a tad early, question.
...
Chief among them is a level of self awareness.


While actually self-aware AI may be a bit off yet, we could take a look at the rights of the other great apes. Ever looked a gorilla in the eye? They are frighteningly self-aware.



We should also know and understand by now that life has a habit of popping up and making itself known very unexpectedly...
...
If truly self aware AI is going to happen I am going to predict that it will be wholly by accident in the last place we expect it to.


I think you nailed it nicely there. Real self-awareness will appear as a by-product of something else.

The problem is that we won't necessarily recognise self-awareness for what it is, even when it stares us in the face. The chimps are provably self-aware, yet we don't really want to see it, as it would make morals and laws and rights and AIDS research so very complicated. For all we know, there may be machine self-awareness out there already, just not in a very advanced form.
The problem is that we won't necessarily see self awareness for when it is even when it stares us in the face. The chimps are provable self aware, yet we don't rally to see it, as it would make moral and laws and rights and AIDS research so very complicated. For all we know, there may be machine self awareness out there already, just not in a very advanced form.