Downes.ca ~ Stephen's Web ~ IAB Workshop on AI-CONTROL (aicontrolws)

Stephen Downes

Knowledge, Learning, Community

Robots.txt (a.k.a. the Robots Exclusion Protocol) is a small file a web server provides to tell web crawlers, such as Google's search crawler, which parts of a site they may access and which they may not. The Internet Engineering Task Force (IETF), which creates the protocols for the internet, is considering the use of robots.txt to manage what crawlers used by AI companies can do. This page is a set of submissions to that workshop, including contributions from OpenAI, Creative Commons, the BBC, Elsevier, and more. Most of the submissions are pretty short and all of them are interesting reading. Via Ed Summers.
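As a rough illustration of the idea (the crawler name below is just an example, not drawn from any of the submissions), a site wanting to block one AI crawler while leaving the rest of the web alone might publish a robots.txt like this:

    User-agent: GPTBot
    Disallow: /

    User-agent: *
    Allow: /

Read plainly: the crawler identifying itself as GPTBot may not fetch anything from the site, while all other crawlers may fetch everything. The workshop submissions are largely about whether this simple mechanism is enough for AI use cases.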



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Oct 07, 2024 11:29 a.m.

Creative Commons License.
