I don’t mean that humans are machines that happen to feel emotions. I mean that humans are designed to be machines whose output is the feeling of emotions—“emotion-feeling” is the thing of value that we produce.
Humanity has wondered what the purpose of life is for so long that it’s one of history’s oldest running jokes. And while everyone is fairly concerned with the question, transhumanist singularitarians are particularly worried about it, because an incorrect answer could lead to a universe forever devoid of value, when a superhuman AI tries to make things better by maximizing that (less than perfect) answer. I’m not here to do anything as lofty as proposing a definition of the purpose of life that would be safe to give to a superhuman AI. I expect any such attempt by me would end in tears, and screaming, and oh god there’s so much blood, why is there so much blood? But up until very recently I couldn’t even figure out why I should be alive.
“To be happy” is obviously right out, because then wireheading would be the ultimate good, rather than the go-to utopia-horror example. Everything else one can do seems like no more than a means to an end. Producing things, propagating life, even thinking. They all seem useful, but a life spent maximizing any of them would suck. And the implication is that if we could create a machine that does those things better than we can, it would be good to replace ourselves with that machine and set it to reproduce itself infinitely. Imagining such a future, I disagree.
I recently saw a statement to the effect of “Art exists to produce feelings in us that we want, but do not get enough of in the course of normal life.” That’s what makes art valuable – supplementing emotional malnutrition. Such a thing exists because “to feel emotions” is the core function of humanity, and not fulfilling that function hurts like not eating does.
The point is not to feel one stupid emotion intensely, forever. It is to feel a large variety of emotions, changing over time, in a wide variety of intensities. This is why wireheading is bad. This is why (for many people) the optimal level of psychosis is non-zero. This is why intelligence is important – a greater level of intelligence allows a species to experience far more complex and nuanced emotional states. And the ability to experience more varieties of emotions is why it’s better to become more complex rather than simply dialing up happiness. It’s why disorders that prevent us from experiencing certain emotions are so awful (with the worst obviously being the ones that prevent us from feeling the “best” emotions).
It’s why we like funny things, and tragic things, and scary things. Who wants to feel the way they feel after watching all of Evangelion? Turns out – everyone, at some point, for at least a little bit of time!
It is why all human life has value. You do not matter based on what you can produce, or how smart you are, or how useful you are to others. You matter because you are a human who feels things.
My utility function is to feel a certain elastic web of emotions, and it differs from other people’s utility functions in which emotions are desired, and in what amounts. My personality determines which actions produce which emotions.
And a machine that could feel things even better than humans can could be a wonderful thing. Greg Egan’s Diaspora features an entire society of uploaded humans, living rich, complex lives of substance. Loving, striving, crying, etc. That society can support far more humans than is physically possible in meat-bodies, running far faster than is possible in realspace. Since all these humans are running on computer chips, one could describe it not as “a society of uploaded humans” but as “a machine that feels human emotions better than meat-humans do.” And it’s a glorious thing. I would be happy to live in such a society.