There are two parts to the claim referenced in the title: first, do children actually learn as described here, and second, can an AI do the same thing? The answers to both questions are interesting.

First up: how children learn. "Even before they speak their first words," we are told, "human babies develop mental models about objects and people." These models go beyond appearance and include "inferences about what other agents are doing or wish to accomplish."

Next, can an AI do this? The paper under discussion proposes the Action, Goal, Efficiency, coNstraint, uTility (AGENT) test "to assess how well AI systems can mimic this basic skill."

What's key here is that all of this is pre-linguistic. The interesting question is whether we need traditional (and usually language-based) tools like models, and whether we need cognitivist concepts like beliefs and intentions. I'm saying no.