OpenAI’s next-generation Orion model is hitting a serious bottleneck, according to a new report – and here’s why
- OpenAI reportedly has issues with Orion in certain areas, such as coding
- Progress is slower than expected due to a dwindling supply of high-quality training data
- The next-gen model could also be more expensive
OpenAI is running into trouble with Orion, the next-gen model set to power its AI. The company is reportedly struggling to achieve the hoped-for performance gains in certain areas with this successor to GPT-4.
This comes from a report by The Information citing OpenAI staff, who claim that the quality improvement offered by Orion is ‘much smaller’ than the leap seen in the move from GPT-3 to GPT-4.
We’re also told that some OpenAI researchers say Orion is “not reliably better than its predecessor [GPT-4] when performing certain tasks.” What tasks would those be? Apparently coding is a weaker area, with Orion possibly failing to surpass GPT-4 there, although Orion’s language skills are said to be stronger.
So for general-purpose searches, and for tasks like summarizing or rewriting text, it sounds like things are going (relatively) well. However, these rumors don’t sound so hopeful for those looking to use AI as a coding tool.
What’s the problem here?
By all accounts, OpenAI is hitting a wall when it comes to the data available to train its AI. As the report makes clear, there is a “dwindling supply of high-quality text and other data” for LLMs (Large Language Models) to work with in pre-release training, which is what hones their ability to solve tougher problems, such as fixing coding errors.
These LLMs have already picked off a lot of the low-hanging fruit, and finding good-quality training data is now a significantly more difficult process, slowing progress in some respects.
Furthermore, this training will become more intensive in terms of computing resources, meaning that developing (and using) Orion, and future AI models beyond it, will become much more expensive. Of course, AI users will ultimately have to foot that bill one way or another, and there is even talk of more advanced models becoming “financially unfeasible” to develop.
Not to mention the environmental impact in the form of ever-larger data centers drawing more and more power from our grids, all at a time of increasing concern about climate change.
Although we should approach this report with caution, these are troubling rumors that point to a serious reality check for the future development of AI.
The Information further notes that a different approach may be taken to continually improving AI models after their initial training, and by the sound of it, this may even become a necessity. We’ll see.
Orion is expected to debut in early 2025 (and not imminently, as some rumors have hinted), and it may not be called ChatGPT-5, with OpenAI potentially making a wholesale change to its AI naming scheme with this next-gen model.