- cross-posted to:
- [email protected]
Bill Gates feels generative AI has plateaued, says GPT-5 will not be any better::In an interview with the German newspaper Handelsblatt, the billionaire philanthropist shared his thoughts on artificial general intelligence, climate change, and the future scope of AI.
The next big steps coming right now are models trained on synthetic data, agents that act more autonomously (rather than waiting for a prompt, they take an action, like searching the web, and use the result to better complete the goal), and better-indexed data, so generated output can be informed by and cite sources in the moment.
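The agent idea above can be sketched as a simple act-observe loop. This is a toy illustration only; `stub_search` and `stub_llm` are hypothetical stand-ins for a real search tool and model call, not any actual API.

```python
# Minimal sketch of an agent loop: the "model" decides on an action
# (a stubbed web search), the result is fed back into its context,
# and it repeats until it can answer or runs out of steps.
# stub_search and stub_llm are hypothetical stand-ins, not real APIs.

def stub_search(query: str) -> str:
    """Stand-in for a real web-search tool call."""
    return f"search results for: {query}"

def stub_llm(context: list[str]) -> str:
    """Stand-in for a model call; picks the next action from context."""
    if not any("search results" in c for c in context):
        return "ACTION: search recent AI news"
    return "ANSWER: summary grounded in the retrieved sources"

def run_agent(goal: str, max_steps: int = 3) -> str:
    context = [f"GOAL: {goal}"]
    for _ in range(max_steps):
        step = stub_llm(context)
        if step.startswith("ACTION: search"):
            query = step.removeprefix("ACTION: search").strip()
            context.append(stub_search(query))  # act, then observe
        else:
            return step                         # goal satisfied
    return "gave up"
```

The point is only the control flow: the agent acts on its own intermediate results instead of waiting for another prompt.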
This has already been shown to degrade the output very quickly.
I think the wall generative AI is hitting is the lack of new training data. The whole web has already been scraped to get it to where it is today, and more and more content on the web is itself AI-generated, which makes it not only useless but harmful as training data.
https://www.newscientist.com/article/2382519-ais-trained-on-ai-generated-images-produce-glitches-and-blurs/
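A toy numerical analogue of the degradation described in that article: repeatedly fit a Gaussian to samples drawn from the previous generation's fit. The fitted spread tends to drift downward across generations, a much-simplified stand-in for models collapsing when trained on their own outputs (the parameters here are arbitrary choices for the sketch).

```python
# Toy "model collapse" analogue: each generation is fit only to samples
# produced by the previous generation's fit, so estimation error compounds
# and the fitted spread tends to shrink over many generations.
import random
import statistics

def refit(mu: float, sigma: float, n: int = 50) -> tuple[float, float]:
    """Fit a Gaussian to n samples drawn from the previous generation."""
    samples = [random.gauss(mu, sigma) for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

def simulate(generations: int = 50) -> float:
    """Chain refits starting from a standard normal; return final spread."""
    mu, sigma = 0.0, 1.0
    for _ in range(generations):
        mu, sigma = refit(mu, sigma)
    return sigma
```

Any single run is noisy, but over many runs the final spread typically ends up below the starting value of 1.0.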
Orca 2 is an example of an open-source model built to make better use of synthetic data: https://www.microsoft.com/en-us/research/publication/orca-progressive-learning-from-complex-explanation-traces-of-gpt-4/
The case I think being made is that training LFMs on the Internet gets you roughly to an average internet user's level of output; with reinforcement learning you can further curate the outputs, and then, using those curated outputs, you can train even tighter, higher-quality models.
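The curation step above can be sketched as sample-then-filter: draw several completions from a larger "teacher" model, score them with a reward or quality function, and keep only the best as synthetic training data for a smaller model. `teacher_sample` and `quality_score` are hypothetical stand-ins, not the actual Orca pipeline.

```python
# Sketch of curating synthetic training data: over-sample from a teacher,
# rank candidates by a quality/reward score, keep only the top few.
# teacher_sample and quality_score are hypothetical stand-ins.
import random

def teacher_sample(prompt: str, n: int) -> list[str]:
    """Stand-in for sampling n completions from a large teacher model."""
    return [f"{prompt} -> completion #{i}" for i in range(n)]

def quality_score(text: str) -> float:
    """Stand-in for a reward model; deterministic pseudo-score here."""
    return random.Random(text).random()

def curate(prompt: str, n: int = 8, keep: int = 2) -> list[str]:
    """Keep only the top-scoring candidates as synthetic training data."""
    candidates = teacher_sample(prompt, n)
    ranked = sorted(candidates, key=quality_score, reverse=True)
    return ranked[:keep]
```

The design choice is that quality comes from the filter, not the generator: a noisy teacher plus a decent scorer can still yield a tight dataset.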
It’s interesting stuff for sure.