The AI inference chip specialist will run DeepSeek R1 70B at 1,600 tokens/second, which it claims is 57x faster than any R1 ...
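Those two figures together imply a GPU baseline of roughly 28 tokens/second (1,600 ÷ 57); a quick sanity check of that arithmetic:

```python
# Quick check of the implied GPU baseline behind the claimed speedup.
cerebras_tps = 1600   # claimed DeepSeek R1 70B throughput on Cerebras, tokens/second
speedup = 57          # claimed advantage over GPU solutions

implied_gpu_tps = cerebras_tps / speedup
print(round(implied_gpu_tps, 1))  # -> 28.1 tokens/second implied baseline
```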
For a slew of AI chip companies champing at the bit to dethrone Nvidia, DeepSeek is the opening they’ve been waiting for.
The upstart AI chip company Cerebras has started offering China’s market-shaking DeepSeek on its U.S. servers. Cerebras makes ...
Cerebras Systems today announced what it said is record-breaking performance for DeepSeek-R1-Distill-Llama-70B inference, ...
The 70B AI model runs on its wafer-scale processor, delivering 57x faster speeds than GPU solutions and challenging Nvidia's AI chip ...
In contrast to Blaize, though, Cerebras focuses on data center chips. Blaize going public is ultimately a bet on a future where AI chips move from those centralized data centers to being more ...
Cerebras makes uncommonly large chips that are particularly good at speedy inference—that is, running rather than training AI models. The company claims that its hardware runs the midsize 70 ...
January 30, 2025--(BUSINESS WIRE)--Cerebras Systems, the pioneer in accelerating generative AI, today announced record-breaking performance for DeepSeek-R1-Distill-Llama-70B inference, achieving ...
The DeepSeek-R1-Distill-Llama-70B model is available immediately through Cerebras Inference, with API access available to select customers through a developer preview program. For more information ...
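For developers admitted to the preview, a request might look like the sketch below, assuming Cerebras Inference exposes an OpenAI-style chat-completions endpoint. The endpoint URL, model identifier, and `CEREBRAS_API_KEY` environment variable here are illustrative assumptions, not details confirmed by the announcement:

```python
import json
import os

# Hypothetical endpoint and model name -- assumptions for illustration,
# not confirmed by the announcement.
API_URL = "https://api.cerebras.ai/v1/chat/completions"
MODEL = "deepseek-r1-distill-llama-70b"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat-completions payload for the model."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Actually sending the request would need a preview API key, e.g.:
#   import urllib.request
#   req = urllib.request.Request(
#       API_URL,
#       data=json.dumps(build_request("Hello")).encode(),
#       headers={"Authorization": f"Bearer {os.environ['CEREBRAS_API_KEY']}",
#                "Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())

payload = build_request("Explain wafer-scale inference in one sentence.")
print(payload["model"])
```

The payload shape follows the widely used chat-completions convention, so existing OpenAI-compatible client code would likely need only a base-URL and key change.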