News

The Cerebras Wafer Scale Engine is effectively the world's largest and fastest chip. “There is no supercomputer on earth, regardless of size, that can achieve this performance,” said Andrew ...
At over 2,500 t/s, Cerebras claims to have set a world record for LLM inference speed on the 400B parameter Llama 4 ...
Cerebras CEO Andrew Feldman said his hope is that the AI chipmaker goes public in 2025 after a delay last year.
Cerebras launched its AI inference service last August. Inference is the process of running live data through a trained AI model to make a prediction or solve a task, and high performance is ...
The speed and cost advantages of the inference service derive principally from the design of the company's WSE-3 chip, the third generation of Cerebras's processor, unveiled this year.
On Tuesday, Cerebras Systems, the company primarily known for wafer-scale processors for AI, filed for an IPO on the Nasdaq under the symbol CBRS. Cerebras, which offers both servers based on its ...
SAN FRANCISCO, Aug 27 (Reuters) - Cerebras Systems launched on Tuesday ... than industry-standard Nvidia (NVDA.O) processors. Access to Nvidia graphics processing units (GPUs ...
Cerebras, which makes processors for artificial intelligence workloads, filed to go public in September but hasn't provided an update on the expected size or timing of an offering. In March ...
Cerebras "designs processors for AI training and inference," the filing said, as quoted on investors.com. "We build AI systems to power, cool and feed the processors data. We develop software to ...