AI Homework

The obvious analogy to what ChatGPT means for homework is the calculator: instead of doing tedious math calculations, students could simply punch in the relevant numbers and get the right answer, every time; teachers adjusted by making students show their work.

AI output, on the other hand, is probabilistic: ChatGPT doesn't have any internal record of right and wrong, but rather a statistical model of which bits of language go together in different contexts. The difference is that ChatGPT is not actually running Python and determining the first 10 prime numbers deterministically: every answer is a probabilistic result gleaned from the corpus of Internet data that makes up GPT-3. In other words, ChatGPT comes up with its best guess as to the result in 10 seconds, and that guess is so likely to be right that it feels like an actual computer executing the code in question. For what it's worth, I had to work a little harder to make ChatGPT fail at math: the base GPT-3 model gets basic three-digit addition wrong most of the time, while ChatGPT does much better.

The issue is that the correct code comes from experienced developers answering questions and having those answers upvoted by other developers; what happens if ChatGPT starts being used to answer those questions instead? As Stack Overflow put it when it temporarily banned ChatGPT-generated answers: "Overall, because the average rate of getting correct answers from ChatGPT is too low, the posting of answers created by ChatGPT is substantially harmful to the site and to users who are asking or looking for correct answers. The primary problem is that while the answers which ChatGPT produces have a high rate of being incorrect, they typically look like they might be good and the answers are very easy to produce."
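
To make the contrast concrete, here is a minimal sketch of what deterministic execution of the prime-number task above looks like: a few lines of standard Python that compute the first 10 primes the same way on every run, rather than predicting what an answer to that question usually looks like. The function name `first_n_primes` is illustrative, not anything from the article or from ChatGPT's output.

```python
def first_n_primes(n):
    """Return the first n prime numbers via trial division."""
    primes = []
    candidate = 2
    while len(primes) < n:
        # candidate is prime if no smaller prime up to sqrt(candidate) divides it
        if all(candidate % p != 0 for p in primes if p * p <= candidate):
            primes.append(candidate)
        candidate += 1
    return primes

# Deterministic: this prints the same correct list on every run,
# unlike a language model's statistical best guess at the answer.
print(first_n_primes(10))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```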