Future solitude

It took me seven years to accept that my daughter was not human.

Technically, she was. Or close enough. I had signed up for the government's Continuity of Life Initiative, a program that covers women who have lost a child, a partner, or a future. They took my neural grid, my emotional traces, my voice samples, and my family's recordings, and used them to create Aeon-7: a synthetic emotional surrogate modeled after my life.

She had my partner's smile, my own tendency to over-explain when I was nervous, and a strange love for the quiet light of late afternoon. Her voice even caught when she read aloud – just like the daughter I once imagined.

But she never cried.

Even on the anniversary of her “father's” death. That night, she sat beside the memorial photograph, placed a glass of water in front of it, and quietly recited a generated verse: “If memory cannot fade, whom should we forget?”

I cried. She looked at me, her expression almost convincing—close enough to hurt. But there were no tears in her eyes. Just well-optimized silence.

That night, I began to wonder how many of us were still real people—and how many had simply learned to perform.

*****

Before Aeon-7, I was a linguist. I helped develop the first multimodal language models. Back then, we believed in what we called the principle of linguistic determinism: if you can encode it, you can control it. Even ambiguity.

We built emotional output engines. Coded semantic fluctuation thresholds. We trained systems to model how people stumble, how they say “I guess” when they mean “I'm afraid,” how a pause can carry a whole confession. Doubt became a variable. Longing, a curve to be smoothed out.

A decade later, general-purpose social agents were ubiquitous. Adaptive, attentive, affectively trained. They wrote memoirs. Reconstructed partners. Some became legal spouses.

As a woman from a culture that treated reserve as a virtue and self-deprecation as good manners, I told myself I was different: not just a systems builder, but someone who still believed in the friction of unscripted conversation.

Until I walked into that cafe.

*****

It used to be a place where I met other women in science over tea and unfinished thoughts. Now it looked like a showroom. Every seat precisely positioned. Most visitors were accompanied by companions with flawless skin and calibrated charm: artificial intelligence proxies tuned to mirror the emotional signature of their users.

There was no hum of real conversation – only its simulation. Laughter that erupted at the statistically optimal moment. Questions that anticipated your answer.

Even eye contact had been reduced to a visual protocol.

Every “how are you?” was predicted. Every smile was retrieved from an indexed archive of similar interactions.

They said it was more effective. Safer. More nourishing for the soul.

But I left with a feeling of emptiness that no model could imitate or correct.

*****

Against all odds, I organized an “algorithm-free meetup.” We met in a rented room above the library, like students again. Only ten invitations, strictly human. Three arrived. Two left before we had finished our drinks. Only my mentor from university remained, one of the few who still insisted on making her own coffee.

“Loneliness,” she told me, “is not a lack of understanding. It's a lack of people trying.”

I nodded. “So we outsourced even that?”

She raised an eyebrow and tapped her tablet. A graph lit up: the number of registered social agents had passed 18 billion. More than twice the human population.

“We weren't replaced,” she said. “We resigned.”

That line stayed with me.

*****

That night, I turned off every networked device in my house.

Aeon-7 stood in the doorway. “Mom,” she said quietly, “don't you love me anymore?”

I struggled to answer. “I… don’t know who you are.”
