There was a Twitter post recently where someone asked women, “if you were in a world without any men for one night, what would you do?” One answer that kept coming up was “go for a night-time walk alone.” We say ‘recently’, but it’s an evergreen topic; we live in a world where this is a perennial consideration for many people.
Imagine for a moment that from the day you were born, you had a benevolent spirit watching over you. A friendly spirit that you could see and hear and talk to, keeping an eye on you wherever you went. This benevolent spirit had the ability to heal you within hours of any injury that didn’t kill you outright, no matter how severe. It would intervene, physically if necessary, if anyone tried to do you harm against your will or coerce you in any way. Your benevolent spirit always had your back, keeping you out of harm’s way. And imagine your spirit would never judge you or hold you back, but instead would support you in your dreams and desires.
Would that change your limits and boundaries? How would it affect the choices you made, especially your sexual and romantic choices? If you always felt safe no matter what circumstance you found yourself in, how would you live your life?
The world of the Passionate Pantheon is a world where benevolent AIs watch over the people, always ready to intervene should some non-consensual harm befall anyone.
You can, if you choose, engage in all manner of dangerous pastimes or extreme sports; the AIs won’t interfere in your choices. They will, however, act to prevent situations that might harm you without your express consent. (Mind you, the definition of ‘dangerous’ or ‘extreme’ changes rather a lot when you have medpods available!)
We wanted to explore what consent might look like in a world of near-absolute safety, something that has never existed at any point in history. It was a rather interesting challenge: as people who have grown up in societies that are not truly safe, we had to wrap our heads around the ways that kind of safety might impact society at the deepest levels.
In many ways, consent is a social idea. It’s affected by the norms and customs of the society you live in, and also by your perception of risk. You may consent to something when you feel safe that you would not consent to in a different circumstance. (If that idea sounds strange to you, ask someone the question “can one pre-consent to allowing a lover to have sex with them whilst they’re asleep?” You’re likely to get loads of different answers…and you can make strong arguments to defend all of them even when they’re totally opposed. Consent isn’t always black and white.)
But what does ‘safe’ even mean in such a world? And what happens if we ride that train to the last station, where nearly any choice you make can be safe, if that’s what you want? If you can say “yes” to any offer that interests you in the absolute knowledge that no harm will come of it, what might your life, and the society you live in, look like?
How would that change if, on top of all that, you knew you could live for hundreds of years if you wanted to, so you had plenty of time to explore?
But wait, there’s more! What if there were no STIs? And what if you, and everyone else, had conscious control of fertility all the time, so the only way a pregnancy could happen was if everyone involved agreed to it? What might consent look like then? How would it change the sexual choices you make?
And, after we had pieced together what we thought it might look like, we asked an even more complex question: how could all this go horribly, horribly wrong? Without changing the letter of the law, how might that spirit of absolute consent and freedom be twisted, without ever breaking, until it’s nearly unrecognisable?
The books in the Passionate Pantheon series alternate back and forth: odd-numbered books are wondrous utopias; even-numbered books are dark erotic horror. We wanted to see how the idea of near-absolute safety could change norms around consent for good…and for evil.
One of the things we wanted to explore in the darker even-numbered books was the difference between enthusiastic consent, technical consent, and transactional consent.
Enthusiastic consent probably doesn’t need a lot of explanation. It’s the kind of consent you give freely and openly, with full information about what you’re agreeing to, because it’s something you want to do.
Technical consent is where you do, technically speaking, agree to something, but perhaps you don’t really know what you’re signing up for, and maybe you’re not really sure what’s going to happen afterward. The AIs and drones are smart—but they can’t read your mind. If you say yes to something, they tend to take it at face value, assuming no obvious coercion (and actively lying to someone about a likely outcome counts as coercion, but lying by omission…well, that can get murky).
Or perhaps you’re doing it because you’re expected to. Social expectations still exist, after all. In a world where everything is safe, you probably don’t expect anything particularly bad to happen, so you might be a little more willing to accept technical consent.
Transactional consent is something we explore through the concept of “bondslavery.” The Cities that serve as settings for the even-numbered books permit bondslavery: voluntary terms of slavery, always for a pre-defined period of time (typically only a matter of days, and never longer than one day less than a year), entered into because the person agreeing to a term as bondslave expects something in exchange, or has lost a bet.
In a post-scarcity society with no concept of money or valuable goods, if you fancy gambling, pretty much the only thing you have to wager is your body, time, or labour (although obviously, access to a loser’s body really gets you all three). And in an erotic horror genre, of course we decided to pick that first one! Not to be predictable but…
There are norms and expectations that build up around that subculture, of course (and even in the even-numbered books, bondslavery is a minority subculture, not necessarily an inherent part of everyday society). Bondslavery is voluntary, and qualifies as consensual by a sufficiently loose definition of “consent,” but bondslaves are treated as property for the term of their bond. Permanent damage to a bondslave is not permitted…but just about anything else is! (And in a world of near-unlimited biomedical nanotechnology, that “anything else” includes quite a lot. You want to radically reshape your bondslave’s body or mind? Totally permitted, as long as you put it back the way it was at the end of the bond.)
Would individual people, in an environment of absolute safety, be willing to consent to things they might not consent to in the real world? We think the answer is yes, probably. Yes, but. And it’s that ‘but’ that’s interesting, right?
Yes, but how would this impact society, even as society impacts you?
Would society as a whole take a looser view of consent in an environment of absolute safety? That’s a big question, and it’s the reason the even-numbered books are as dark as they are. There are so many ways that could play out. If you volunteer to put yourself in someone’s hands, and they have nearly unlimited power to change you physically and mentally however they like and put you back the way you were afterward, that can go in some dark directions indeed. (We omitted some of the darkest ideas we came up with from the second novel because they weren’t relevant to the plot, but if you buy us a drink sometime, maybe we’ll talk!)
Good science fiction, we think, is not just fiction about teleporters and drones and spaceships; it’s fiction that asks “what if?” What if a society has near-unlimited biomedical technology? What if that society professes to value consent, but only in the strictest technical sense of the word? What if respect for autonomy extends so far that people can choose to give up their autonomy, and even personhood, completely? How might those things interact with each other?
What do you think? What would you do, if you could be absolutely sure that no permanent harm would come to you because of it? And what dark scenarios could you see coming out of that environment? We’d love to hear from you!