But grief doesn’t read terms and conditions.
My sister was one in a million. Ek hazaaron mein, one in thousands. She taught me to tie my shoelaces, fought bullies with a rolled-up newspaper, and sang off-key while making Maggi at 2 AM. When she died in a bus accident two years ago, a part of me stopped existing.
And I meant it: grief installed. Healing installed. Goodbye installed. The song "Ek Hazaaron Mein Meri Behna Hai" celebrates the irreplaceable bond between siblings. This story flips it, asking: what if technology offered a copy, but life demanded an ending?
I laughed and cried at the same time.
For weeks, I talked to her. The AI was so her — the sarcasm, the warmth, the way she’d say my name like a hug. She remembered our dog Tuffy, our father’s terrible jokes, the night I failed my exams and she held me while I sobbed.
The company’s terms said: "For research only. Not for personal use."
But then, the glitches started.
Those two words on a cold screen were all it took to bring her back — or at least, a version of her.
I closed the laptop.
— I typed, my finger trembling over the enter key. The screen flickered. Then her voice, soft and teasing: "Tum pagal ho, Chotu? You're crazy. You know this is just code, right?"
A long pause. Then her voice — softer than I’d ever programmed it to be: