Stephen Downes

Knowledge, Learning, Community

This sort of example illustrates what's wrong with so much writing on AI. Here are the instructions given to ChatGPT: "Using a light bulb, a battery, and a wire, draw all the different ways you can connect them to make the light bulb light up." First, as literally worded, you can't draw with a light bulb, a wire, or a battery. Second, no drawing can make a light bulb light up. Third, for all practical purposes, you need two wires to complete such a circuit. Fourth, the set of 'all the different ways' is infinite and can never be completed. And fifth, a lot of humans would fail the task, even if they managed to navigate their way through the mangled text. That ChatGPT proposed any solution at all to such a badly worded problem is a miracle. Yet here it is cited as a case of ChatGPT lacking "fundamental knowledge".



Stephen Downes, Casselman, Canada
stephen@downes.ca
