- OpenAI is reportedly having problems with Orion in certain areas such as coding
- Progress is slower than expected due to quality issues with training data
- Next-gen model could also be more expensive
OpenAI is reportedly having difficulty with Orion, its next-generation AI model. The company is said to be struggling to achieve the performance improvements expected from the successor to GPT-4 in certain areas.
This comes from a report by The Information, which quotes OpenAI employees who claim that the quality increase seen with Orion is “much smaller” than the leap from GPT-3 to GPT-4.
We're also told that some OpenAI researchers say Orion is “no better than its predecessor [GPT-4] in handling certain tasks.” Which tasks would those be? Coding is apparently a weak point, and Orion may not outperform GPT-4 in this area, although Orion's linguistic abilities are noted to be stronger.
So for general-purpose queries (and for work like summarizing or rewriting text), things seem to be going (relatively) well. However, these rumors aren't so hopeful for those looking to use AI to aid in coding.
So what's the problem here?
By all indications, OpenAI is hitting something of a wall when it comes to the data available to train its AI. As the report makes clear, there is a “dwindling supply of high-quality text and other data” that LLMs (Large Language Models) can work with in pre-release training to hone their powers in solving more complicated problems, such as resolving coding errors.
These LLMs have gobbled up much of the low-hanging fruit, and finding good-quality training data is now considerably more difficult, slowing progress in some respects.
On top of that, this training will be more intensive in terms of computing resources, meaning that developing (and running) Orion (and other AI models in the future) will be much more expensive. Of course, the AI user will end up footing that bill, one way or another, and there is even talk that developing more advanced models will effectively become “financially unviable.”
Not to mention the impact on the environment in terms of larger data centers running and absorbing more energy from our networks, all at a time of growing concern about climate change.
While we should take this report with due caution, these are worrying rumors that portend a serious reality check for AI development in the future.
The Information further notes that a different approach can be taken, improving AI models continuously after their initial training; in fact, this may become a necessity from the looks of it. We'll see.
Orion is expected to debut in early 2025 (and not imminently, as some rumors have hinted), and may not be called ChatGPT-5, with OpenAI possibly changing its AI naming scheme entirely with this next-generation model.