

Why GPT-3 Matters

Leo Gao, Jul 21, 2020

I've run a few posts on GPT-3, and it makes sense to include this item to put them into context. The first thing that matters is its size: it is "an entire order of magnitude larger" than the previously largest model, so large that "loading the entire model's weights in fp16 would take up an absolutely preposterous 300GB of VRAM." At that scale, language models become "few shot learners"; that is, they can "perform a new language task from only a few examples or from simple instructions." That's why GPT-3 can continue a Shakespearean sonnet after being given only the first few lines: it recognizes what you're trying to do and is able to emulate it. We're not quite at the point where artificial intelligence can write new open educational resources (OER) on an as-needed basis, but with GPT-3 we're a whole lot closer.
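To make the few-shot idea concrete, here is a minimal sketch in Python of how such a prompt is assembled. The prompt-builder function and the translation examples are illustrative assumptions on my part, not anything from the article, and no model API is called; the sketch also checks the memory claim by simple arithmetic, 175 billion parameters at 2 bytes each in fp16.

    # A minimal sketch of few-shot prompting. No real model API is
    # assumed; build_few_shot_prompt is a hypothetical helper that only
    # assembles the text a model would be asked to continue.

    def build_few_shot_prompt(examples, query):
        """Concatenate worked examples and a new query into one prompt.

        The model is never fine-tuned on the task; the examples embedded
        in the prompt are its only "instruction" for what to do.
        """
        lines = []
        for source, target in examples:
            lines.append(f"Input: {source}")
            lines.append(f"Output: {target}")
        lines.append(f"Input: {query}")
        lines.append("Output:")
        return "\n".join(lines)

    # Three English-to-French pairs; a few-shot learner is expected to
    # infer the task and continue the pattern for the final input.
    examples = [
        ("sea otter", "loutre de mer"),
        ("peppermint", "menthe poivrée"),
        ("cheese", "fromage"),
    ]
    print(build_few_shot_prompt(examples, "plush giraffe"))

    # Back-of-envelope check on the size claim: 175 billion parameters
    # at 2 bytes each (fp16) comes to roughly 350 GB, the same order of
    # magnitude as the ~300GB of VRAM quoted in the article.
    params = 175e9
    print(f"fp16 weights: ~{params * 2 / 1e9:.0f} GB")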



Stephen Downes, Casselman, Canada
stephen@downes.ca
