This brings Cerebras's total funding to more than $200 million. The company showed off its CS-1 system at the Supercomputing conference in Denver last year; the system is based on its Wafer Scale ...
Cerebras runs the DeepSeek-R1-Distill-Llama-70B AI model on its wafer-scale processor, delivering inference speeds 57x faster than GPU solutions and challenging Nvidia's AI chip ...
Cerebras Systems today announced what it described as record-breaking performance for DeepSeek-R1-Distill-Llama-70B inference, ...
The upstart AI chip company Cerebras has started offering China’s market-shaking DeepSeek on its U.S. servers.
The DeepSeek-R1-Distill-Llama-70B model is available immediately through Cerebras Inference, with API access offered to select customers through a developer preview program. For more information ...
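For developers admitted to the preview program, access is through an API. As a minimal sketch, assuming the Cerebras Inference endpoint is OpenAI-compatible (the base URL, environment variable, and model identifier below are illustrative assumptions, not details confirmed in this article), a chat-completion request against the 70B distill might look like this:

```python
# Minimal sketch of calling an OpenAI-compatible inference endpoint.
# The base URL, env var name, and model identifier are assumptions for
# illustration; consult the developer-preview documentation for the
# actual values.
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",   # assumed endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],  # assumed env var
)

response = client.chat.completions.create(
    model="deepseek-r1-distill-llama-70b",   # assumed model id
    messages=[
        {"role": "user", "content": "Summarize wafer-scale inference in one sentence."},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Because the request shape follows the standard chat-completions format, existing client code can typically be pointed at such an endpoint by changing only the base URL, API key, and model name.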