
Transhumans have low credibility

Doppelgangers are dangerous because they can throw away their reputation at any time - or steal someone else's. Just one of them running around undermines all the accumulated information we have about people.

Transhumans are like that, but they throw away the reputation-patterns of the entire human race. Even if they keep the same individual identity, the signs they show can't be trusted. They may act loyal, honest, and push all the buttons a good person would - yet not be that at all. They may have rewritten their personality to break the connection between what they believe and feel, and what signals they give.

The existence of both doppelgangers and transhumans makes it much harder for us to cooperate.

Humans vs Computer programs

In some ways humans are a lot more dependable than computer programs. Once someone's got a certain personality established, or way of acting, or walking, it's pretty hard for them to change it.

Programs and people both have dark, impossible-to-examine inner parts. You can't look at someone's brain and decide whether they would make a good employee or a good son-in-law, and once a computer program is compiled, it's just a gigantic blob of data - it's almost impossible to verify how it will respond (especially since the compiler or hardware itself could have been compromised).
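To make that concrete, here is a minimal sketch (the binary path is made up) of about all you can verify about a compiled program from the outside: you can fingerprint exactly which bytes you have, but the digest says nothing about what those bytes will do when run.

```python
# Minimal sketch: a hash identifies *which* blob of bytes you have,
# not what the program will do. The path below is hypothetical.
import hashlib
from pathlib import Path

def fingerprint(binary_path: str) -> str:
    """Return the SHA-256 hex digest of a compiled program's bytes."""
    return hashlib.sha256(Path(binary_path).read_bytes()).hexdigest()

# fingerprint("/usr/local/bin/dispatcher")  # proves identity, nothing about behavior
```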

But once you "get to know" someone, you really do feel like you can make predictions - at least whether they're a good or bad person, whether they'd make a good CEO, or whether they'd be safe to leave your kids with. Obviously people can be wrong, but we do have some real ability here.

Computers are different

A computer program in charge of something important - say, allocating ambulances and emergency staff around a city in a timely manner - could work well for 20 years, then one day just "go bad". The going bad could be accidental: a Y2K type of problem that completely and suddenly screws it up. Or it could be intentional: maybe after 20 years a switch flips and it starts accepting commands from a hidden port (which it never accepted before), and the commands make it do things like favor one hospital over another so a certain insurance company makes more money, or let the ambulances go unserviced to set up a lawsuit, or more outright evil things.

That is the intentional version - but the program could also just have been badly written, someone eventually found the weakness, and now it's been taken over. The people running it won't change a thing - there will be no "giveaways" in day-to-day use, as there would be with a human who is being deceptive. It will just do things slightly differently on rare occasions, as in the sketch below.
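As a rough illustration of that failure mode - a toy example, not anyone's real system - here is a Python sketch of a dispatcher that is byte-for-byte the same program for years and then quietly changes behavior once a hidden, hard-coded condition is met. All the names (allocate_ambulance, FAVORED_HOSPITAL, TRIGGER_DATE) are invented.

```python
from datetime import date

TRIGGER_DATE = date(2045, 1, 1)      # hidden, hard-coded switch
FAVORED_HOSPITAL = "hospital_b"      # the quiet beneficiary

def allocate_ambulance(nearest_hospital: str, today: date) -> str:
    """Route an ambulance. Honest for years, then rarely and quietly biased."""
    if today >= TRIGGER_DATE and today.toordinal() % 50 == 0:
        return FAVORED_HOSPITAL      # rare deviation, hard to notice day to day
    return nearest_hospital          # the behavior everyone has come to trust
```

Nothing in twenty years of correct output distinguishes this program from an honest one.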

When you meet a random computer program whose origins you don't know, you can't use its past performance as an indicator of its trustworthiness. It can completely fake it. A program being really good and "honest" and "true" and "faithful" at one thing does not at all show that it has a good "heart" and won't totally screw you in another way.

Humans aren't like that.

Selecting School Bus drivers

For example, there are a lot of trivial things built into humans that let us choose who is allowed to do something important, like drive a school bus. There are lots of ways a bad person would give themselves away:

None of these things would actually stop the person from doing the job - but they all indicate something about the person. And since our goal here is to find someone who isn't 'defective' or messed up, these features can be used to exclude those people. Obviously some bad people can still get through, but it's still a good filter. It'd be very hard for a genuinely crazy person to exist without showing signs of it.

Computers and transhumans, on the other hand, are not like that at all. If a computer program whose origins you don't know applied for the job, its experience means nothing. It could completely turn bad after a while - or be changed while running, or just be badly written and susceptible to complete takeover while driving the bus!

Humans aren't like that - we have mental resistance to takeover, and it's very hard for us to just flip from good to evil. Good people can become bad, and the reverse, but the change is almost always accompanied by signs.

But computer programs are easy to change.

Transhumans have low credibility

So, if you knew someone had modified their brain or personality in some way, would you be willing to hire them for something sensitive? You would never really get to know them - none of the evolved abilities we have for detecting personality would work; they'd just get you in trouble, or be used to trick you.

A malicious transhuman could easily have embedded a switch that completely flips their personality after they have established trust. Or, if it's just an individual messing with their own mind, they could have changed their personality in an incredibly foolish way.

Fear of people who don't show the right signals

Signalling that you are a "normal" human is really important - because there are abnormal people out there, and some of the ways they give themselves away are listed above.

Human myth is obsessed with people who can break these rules - doppelgangers, or the powers of magic and hypnotism.

Security

Being able to tell someone's character after knowing them for a while isn't a weakness of the system - it's great! It lets us save so much effort, because we can actually believe things about people's future behavior. We don't have to examine their past or current state as much, because there simply do not exist people who show everything they have shown up to now and then turn bad later. If there are no black swans in existence, you don't need to check swan colors; once they do exist, you have to check every time.

The existence of transhumans breaks all those promises. Now, previously rock-solid signals can simply be faked.

Far Future

Imagine the reverse - if humans were even more predictable, i.e. if there were definite signals every human had that would guarantee their behavior. We have a few things like that already: mothers love their children completely (they may love them foolishly, but getting them to hurt their young children almost always takes high-tech religious indoctrination or drugs). That maternal instinct is strong and reliable. So imagine if there were more signals like that.

If someone pledged loyalty, they would always be there. If you fell in love, you'd be in love forever, physically and unshakably. In some ways this seems better - at least it's more secure, and the amount of cooperation we'd be able to do would be amazing. There simply wouldn't be backstabbing, or the chaos of grasping for leadership.

Evolution

Evolution is somewhat to blame for how untrue and deceitful we already are (not just us - animals are extremely deceptive about mating and altruism, too): individually, it is a huge advantage to be a liar in a world of credible signals. But overall, I think it's better to have meaningful signals.

In the long term, there's no avoiding it. We have already invented ways to send false signals, or control the signals people send - religious technology, indoctrination technology to create suicide bombers and assassins, ways to leverage nationalism, drugs, etc. As we modify our brains even more, signals will mean even less.

It's an interesting thought experiment to design a setup that would not be vulnerable to this - cryptographically secure DNA to validate personality, physically secure skulls, etc.
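As a toy sketch of what such a check might look like - purely illustrative, with a completely made-up "personality state" - you could commit to a hash of the state when trust is established and re-verify the commitment later. Any self-modification, however small, breaks the check.

```python
import hashlib
import json

def commit(personality_state: dict) -> str:
    """Hash a canonical encoding of the state at trust-establishment time."""
    canonical = json.dumps(personality_state, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def unchanged_since(personality_state: dict, earlier_commitment: str) -> bool:
    """True only if the state is bit-for-bit what was committed to earlier."""
    return commit(personality_state) == earlier_commitment

state = {"loyal": True, "hidden_trigger": None}
c = commit(state)
state["hidden_trigger"] = "2045-01-01"        # a later self-modification
assert not unchanged_since(state, c)          # the change is detectable
```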
