What were they supposed to do?
He was their child. They loved him.
He had suffered a tragic, unfathomably unlucky accident at age 14 which left him a quadriplegic. The best doctors, the very best robotics, all failed to reverse his plight. Nor did stem cells help, nor the latest pharmaceuticals, nor hormone injections, nor those experimental nanobots. Nothing. For all his life, however long that was to be, he would be a quadriplegic.
He had tried to kill himself. Many times. They knew.
Not everyone could live like that. Certainly not them. The anti-depressants that blunted his rage, his hate, his pleading were still not enough to prevent him from trying to take his own life, or, on those wretched nights, from begging his dear parents to kill him.
Then two inventions came into their world.
A playful, snuggly, hyper-aware Furby robot.
And the newest virtual reality glasses.
And he was happy.
The rage was gone.
The begging for death subsided.
Only, he now spent nearly all his waking hours immersed in pornography. A level of filth neither parent could stomach. Nor most adults.
That poor Furby bot smelt of the boy’s saliva. And worse. He wouldn’t let them wash it, though his mother tried.
The way the little furry bot responded to him, as programmed, quickly intuiting the child’s habits, needs, preferences, turned the boy’s mother’s stomach. She discussed this with the boy’s father. More than once. He promised, once again, to have a talk with the child. But what was to be the conclusion? The punishment?
He no longer wanted to die!
He was happy!
But, my God, the perversion.
It was all they could do to keep his younger siblings from stumbling inside that virtual hell hole, or stroking that cute little Furby, awakening it.
While the nurse was giving the boy his bi-daily bath, the dad snuck on those VR glasses. A whiff of pleasure quickly turned to revulsion.
Was there some way to reprogram this? Maybe they could hire some expert to, well, at least maybe minimize the depths of depravity. How could a 17-year-old have such thoughts? Why must the programs respond that way?
They bought him several drones, including two attack drones, which he could control from his goggles. They bought him, at great expense, a new telepresence explorer, new dolls, robot pets, a fish that responded to his thoughts. None of it worked. The boy spent every waking hour, goggles over his eyes, little Furby held between his teeth, engaged with every manner of visual and VR-manipulable autoerotica.
All of it utterly filthy.
Was any of his porn illegal, they wondered? Could their son be sent to jail? It was all so vile.
They offered him a series of rewards which encouraged alternative responses. No change.
They paid for professionals to help end his addiction. No change.
They sought out priests, pastors, other religious figures, hoping to guide the boy. No change.
They took it away. He refused to eat or drink or breathe.
So there he sits. Goggles over his eyes, smile on his face, not moving, rarely speaking, occasionally grunting. Happy. Alive.