What GenAI May Take Away
Sean Michael Morris,
Feb 08, 2024
It's interesting how you can tell when people reach the limit of their understanding by the vocabulary they use. Here, Sean Michael Morris describes AI that "floats hypothetical information based upon existing knowledge." This isn't so much wrong as it is meaningless. Vocabulary can also indicate expertise, as when a useful distinction is made between 'literacy' and 'mastery', as Morris does here: "Literacy demands a deep understanding of a language, mechanism, technology, endeavour, profession, etc., whereas mastery is an exhibition of a learned set of skills. Literacy does not always lead to using a technology, or ascribing to a set of practices, whereas mastery is pointed directly at adoption." And then there's the middle ground, as when he asks, "Do we need generative AI? My answer is 'no,' we don't need it." Maybe, but we do need answers to "ongoing challenges from teacher shortages and crowded classrooms to democratizing access to higher education through lower-cost options," and saying we don't need AI implies there's some other (as yet undiscovered?) solution at hand.