Sakana AI Redefines the AI Frontier: Multi-Model Teams Outperform Single Models by 30%
Sakana AI, a research lab based in Japan, has unveiled a technique in which multiple large language models (LLMs) cooperate on a single task. The method assembles an AI ‘dream team’ that tackles a shared problem more effectively than any individual model, outperforming single-model performance by roughly 30%. The technique, named Multi-LLM AB-MCTS, has drawn significant attention across the AI and enterprise communities.
The technique offers a path toward more robust and adaptable AI systems. Rather than depending on a single model or provider, businesses can dynamically draw on the strengths of different frontier models according to the demands of the task, delivering better results than any one model alone.
Beyond the emphasis on cooperation among models, Sakana AI describes the underlying algorithm as an “inference-time scaling” technique. While much of the AI field has focused on ‘training-time scaling’, i.e., making models larger and training them on bigger datasets, this approach is a departure: it improves performance by allocating more computational resources after a model is fully trained, at the moment it is asked to solve a problem.
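As a loose illustration (not Sakana AI's implementation), the simplest form of inference-time scaling is repeated sampling: spend extra compute at answer time by generating several candidate answers from an already trained model and keeping the best one. The `generate` and `score` functions below are hypothetical placeholders for an LLM call and an answer evaluator.

```python
# Minimal sketch of inference-time scaling via repeated sampling (best-of-N).
# `generate` and `score` are hypothetical stand-ins for an LLM call and an
# answer evaluator; they are not part of Sakana AI's released code.

def best_of_n(task: str, generate, score, n: int = 8) -> str:
    """Spend more inference compute by sampling N answers and keeping the best."""
    candidates = [generate(task) for _ in range(n)]
    return max(candidates, key=score)
```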
The approach draws on ideas from reinforcement learning and repeated sampling. At each step the system decides whether to keep refining a promising answer or to branch out and try something new, searching wider or deeper as the task requires (see the sketch below). This behavior comes from the core algorithm, Adaptive Branching Monte Carlo Tree Search (AB-MCTS). Building on MCTS, the decision-making algorithm best known for its role in DeepMind’s AlphaGo, AB-MCTS balances these two search strategies effectively. In essence, Sakana AI is pushing the technological frontier and opening a path to stronger AI capabilities.
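The “wider versus deeper” decision can be sketched as a loop that either samples a fresh answer from scratch or refines the current best one. The sketch below is a simplified illustration under assumed `generate`, `refine`, and `score` placeholders; the actual AB-MCTS maintains a search tree and picks nodes via Monte Carlo estimates rather than this toy coin-flip rule.

```python
import random

# Simplified illustration of the "go wider vs. go deeper" choice in
# adaptive branching search. `generate`, `refine`, and `score` are assumed
# placeholders; real AB-MCTS builds a search tree and selects branches
# probabilistically rather than with this greedy heuristic.

def adaptive_search(task, generate, refine, score, budget: int = 16):
    best = generate(task)                 # initial candidate answer
    best_score = score(best)
    for _ in range(budget - 1):
        if random.random() < 0.5:         # go wider: brand-new candidate
            candidate = generate(task)
        else:                             # go deeper: refine the current best
            candidate = refine(task, best)
        s = score(candidate)
        if s > best_score:                # keep whichever answer scores higher
            best, best_score = candidate, s
    return best
```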
- Sakana AI’s TreeQuest: Deploy multi-model teams that outperform individual LLMs by 30% (venturebeat.com, 04-07-2025)