What Will Transformers Transform?

Together, the OpenAI and Wolfram reports give a very good technical understanding of the state-of-the-art GPTs from OpenAI. For the last few months there has been lots of excitement about the 175 billion parameter GPT-3 from the company OpenAI. It was set up, under the name ChatGPT, so that people could query it: type in a few words and have it "answer" the question. Microsoft attached GPT to its search engine Bing at around the same time.

OpenAI's own report concedes the limitations: "Despite its capabilities, GPT-4 has similar limitations to earlier GPT models [1, 31, 32]: it is not fully reliable, has a limited context window, and does not learn from experience." And if you are interacting with the output of a GPT system and didn't explicitly decide to use a GPT, then you're the product being hoodwinked.

Using an unreliable system sounds like a bad idea, but in August 2021 I had a revelation at TED in Monterey, California, when Chris Anderson was interviewing Greg Brockman, the Chairman of OpenAI, about an early version of GPT. Brockman said that he regularly asked it questions about code he wanted to write, and it very quickly gave him ideas for libraries to use, and that was enough to get him started on his project (a query of that kind is sketched below).

There will be surprising things built with GPTs, both good and bad, that no one has yet talked about, or even conceived. But no matter how big, and how many parameters, GPTs are not going to do that themselves.
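To make Brockman's use case concrete, here is a minimal sketch of that kind of query, assuming the openai Python package (the pre-1.0 interface from the ChatGPT era), an API key in the OPENAI_API_KEY environment variable, and the gpt-3.5-turbo model name. The model choice and the question are illustrative assumptions, not details from the interview.

```python
# Minimal sketch: ask a GPT model for library suggestions, the way
# Brockman described getting started on a coding project.
# Assumes: `pip install openai` (pre-1.0 interface) and an API key in
# the OPENAI_API_KEY environment variable. Model name and question
# are illustrative.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model family behind the original ChatGPT
    messages=[
        {
            "role": "user",
            "content": "What Python libraries could I use to parse PDF files?",
        }
    ],
)

# Treat the reply as a starting point to verify, not a guaranteed-correct
# answer; this is the "not fully reliable" caveat in action.
print(response.choices[0].message.content)
```

Run it a few times and the suggestions will vary, which is rather the point: the output is a prompt for ideas that a person then checks and builds on, not an oracle.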