By Max A. Cherney
(Reuters) – Artificial intelligence chip startup SambaNova Systems announced a new semiconductor on Tuesday, designed to allow its customers to use higher-quality AI models at a lower overall cost.
The SN40L chip is designed to run AI models more than twice the size of the one the advanced version of OpenAI’s ChatGPT is said to use, the Palo Alto, California-based company said.
“SN40L is specifically built for large language models running enterprise applications,” SambaNova CEO Rodrigo Liang said. “We’ve built a full stack that has allowed us to really understand the enterprise use case really well.”
Big businesses looking to deploy AI in novel ways face a different and more complex set of considerations than those addressed by consumer software like ChatGPT, Liang said.
Security, accuracy and privacy are all areas in which AI technology must be designed differently to be useful for enterprise customers.
Nvidia dominates the market for AI chips, but a surge in demand triggered by interest in generative AI software made the coveted chips difficult for some companies to obtain. Intel, Advanced Micro Devices and startups like SambaNova have moved to fill the void.
The new SambaNova chip is capable of powering a model with 5 trillion parameters and includes two advanced forms of memory. Memory can sometimes be a bottleneck to crunching AI data. The company said its combination of hardware enables customers to run larger AI models without trading size for accuracy.
Taiwan Semiconductor Manufacturing Company manufactures the chip for SambaNova.
(Reporting by Max A. Cherney in San Francisco; Editing by Michael Perry)