Interesting hypothesis: "The first conscious AI systems will have rich and complex conscious intelligence, rather than simple conscious intelligence." Why? Well, because although "AI systems are -- presumably! -- not yet meaningfully conscious, not yet sentient..." nonetheless "they do already far exceed all non-human animals in their capacity to explain the ironies of Hamlet and plan the formation of federally tax-exempt organizations... Let's see a frog try that!" What would it take? That's trickier. According to Eric Schwitzgebel, developers would need to "create rich and complex representations or intelligent behavioral capacities." Maybe, but I don't agree. I don't think consciousness depends on representation. Consciousness is experience, and when we experience things, we experience them directly, not representationally.