
Re: Recycling/AI and super human computers

Again to Kelly,

>We will do genetic experimentation, and A.I. because both are very
>important/valuable.  In both cases we will try to be careful and will make
>mistakes.  That's just life.  Trying to put up some wall so that you can only
>make stupid A.I.s (assuming we had some way to control that) would be clumsy
>and impractical.  Probably as dangerous as a hostile high-I.Q. A.I..

Yes, I was only saying we should be careful, namely by constantly taking
small steps rather than big leaps without a clue about what would be
possible. This may sound logical, but once enough computing power is
available it makes little difference whether you use 1E6 or 1E12 neurons.

>Humans are a very adaptable species.  We will probably be worth trading with.
> If not we have no obvious point of friction, so an intelligent A.I. species
>that didn't like us, would just leave.  (Though a stupid one might try world

What would there be to trade? What would a smart computer need?
Indeed it would be smart of them to leave us, because we are only a problem
for them. But that holds only as long as we outnumber them. What if they
gained control over vital human needs like electricity or computing power?
Those would be real weak points for humans.

>Are you this paranoid about aliens?  How would you deal with a hyper evolved
>E.T.?  If you don't think we'd be able to deal with domestic aliens of our
>own creation, how can we deal with ultra-E.T.s?

I'm never paranoid; I was just sharing a few thoughts to see what you
(or others) thought about them.
If these ETs really are more evolved and stronger or smarter than we are,
then the outcome would depend more on them than on us. (Unless there are
only a few of them.)

>> I think there is another possibility to control the AI: we let it "live" in
>> a virtual reality which is created by us. So every action of the AI
>> will not be a real one (so no harm to us), and since we can control its
>> input it may never know that it is not real.
>You forget.  It isn't a physical creature, it is a data construct.  A
>'virtual reality' would be very alien to it.  It would probably bypass it to
>the more natural binary data space.  Our physical world of time and space
>could seem very alien.  Other concepts, like death, have caused some
>confusion for advanced A.I. prototypes.

I think it would very soon get bored with the binary space. But does it
matter what its VR looks like? The main idea is that we control its input
and can redirect its output to render it harmless.
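To make the idea concrete, here is a minimal sketch of that input/output
control: the agent only ever sees observations we construct, and everything
it tries to do is intercepted and logged rather than carried out. All the
names here (Sandbox, toy_agent) are hypothetical, purely for illustration.

```python
def toy_agent(observation):
    """Stand-in for the AI: maps an observation to an intended action."""
    return "ACT:" + observation.upper()

class Sandbox:
    """Mediates all traffic between the agent and the outside world."""

    def __init__(self, agent):
        self.agent = agent
        self.action_log = []  # redirected output; never actually executed

    def step(self, simulated_input):
        # The agent sees only the simulated world we feed it...
        action = self.agent(simulated_input)
        # ...and its action is recorded instead of being carried out,
        # so nothing it does can reach the real world.
        self.action_log.append(action)
        return action

box = Sandbox(toy_agent)
box.step("door opens")
print(box.action_log)  # the only place the agent's actions end up
```

The point of the sketch is the mediation layer: whether the agent's
simulated world is convincing matters less than the fact that every input
and output passes through code we control.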