What If Sharing Was Training?
Debanshu Bhaumik,
Bot Populi,
Jan 29, 2026
This article raises questions about AI, training, and time. For example: all my opinions are on my website, including the ones I held in 1995. To an AI, they are all weighted equally. But should an AI (or a human!) learn from the 1995 version of me? And if an AI ingested my views from 1995, should I have the right to retract them?

"Self-annotation makes the archive interpretable. Retention makes it durable. Together they produce a new object: the self as a time-series of labeled signals, portable across contexts and reusable by default. That object outlives the moment it was meant to serve. Call it time collapse."

This, of course, is not a new problem, as anyone who has had to learn from outdated textbooks can attest (I grew up thinking 'French West Africa' was still a thing).

