Yes, androids do dream of electric sheep
If you stare into static long enough, you will eventually see shapes. That's how your brain works: as a neural network, it functions through a process of recognition, picking familiar (i.e., previously perceived) images out of background noise. Not all of our experiences are of static, of course, so our recognition of most images is more predictable and more stable, and they settle in our conception as material things, as objects. This article in the Guardian, based on a report from Google, dramatically illustrates how that process works. Google's image-recognition algorithms were fed static and background images and asked to extract dogs, animals, buildings, and so on. This is what the images look like when they do that. And it is, the article suggests, an insightful glimpse into robot consciousness. And, I would submit, into human consciousness as well.
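The core trick can be sketched in a few lines. This is only a toy stand-in, not Google's actual method (which uses deep convolutional networks); here a single fixed "feature detector" plays the role of the network's learned pattern, and gradient ascent nudges a noise image until the detector responds strongly, i.e., the familiar shape "emerges" from static. The detector and step size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature detector: a fixed pattern the "network" already knows.
pattern = rng.standard_normal(64)
pattern /= np.linalg.norm(pattern)

def dream(image, steps=100, lr=0.1):
    """Gradient ascent on the detector's response w.x (the gradient is just w)."""
    for _ in range(steps):
        image = image + lr * pattern   # d(w.x)/dx = w
        image = image / np.linalg.norm(image)  # keep overall "brightness" bounded
    return image

# Start from pure static (random noise).
static = rng.standard_normal(64)
static /= np.linalg.norm(static)

before = float(pattern @ static)          # near zero: noise barely matches
after = float(pattern @ dream(static))    # near one: the pattern has surfaced
print(before, after)
```

After enough steps the image aligns almost perfectly with the stored pattern, which is the whole point: the system sees what it already knows how to see.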
This newsletter is sent only at the request of subscribers. If you would like to unsubscribe,
Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.