Listen on: Spotify | Player FM | YouTube

Almost everyone remembers the tale of David and Goliath, one of the greatest underdog stories. We have seen the story repeat itself in business, across sectors from food and car companies to consumer products and tech. But what about AI? The AI computer chip wars are heating up (on fire, actually) as companies attempt to topple the Goliath that Nvidia has become. Our David for this episode is Cerebras Systems, a rapidly growing AI chip powerhouse that has built some of the world’s fastest supercomputers. And the increase in speed isn’t just a few percent, either; tests show Cerebras’ systems running up to 75 times faster than their competitors’.

Cerebras CMO Julie Choi joins this episode of The Reboot Chronicles Show for an insider’s look at how the company created the next-gen technology driving its record-breaking success. Julie has seen Cerebras through $700 million in funding, a filing to take the company public and raise $1B, and strategic partnerships with organizations like the Mayo Clinic and the Department of Energy. Now on a mission to accelerate generative AI by building a new class of AI supercomputer that could change the course of progress, Julie shares stories about both the nerdy side of chips and the future of technology as the industry hits breakneck speeds around the globe.

A Chip As Big As Your Head

Previously on The Reboot Chronicles, we talked about Silicon Box and their work on making smaller chips that can be combined into a larger system based on your needs. Cerebras has gone a different route and built a chip that is larger than your head. Literally: Julie came prepared for the interview with their new Wafer Scale Engine (WSE for short) on hand, and when she held it up to the camera, she almost disappeared behind it. While holding it up, she noted, “We’re the first and only company to have invented a chip of this size.” The WSE is what has launched Cerebras to the record-breaking speeds mentioned in the intro, and it is the slingshot that might topple Nvidia.

Building The Machine To Train More Machines

With the massive leap in hardware that Julie and her team have made, the question is: what are they doing with it, and who are they giving it to? One of their primary customers is the Mayo Clinic, which is using Cerebras to build and train genomic foundation models, and that is just one example of what Cerebras’ WSE technology can do. Not only does Cerebras give companies the hardware to create their own models, it also runs a services side of the business, taking on clients who use the hardware Cerebras has already deployed to advance their businesses.

The Future Of AI And Chip Technology

Cerebras has been around for nine years, and in that time they have made a major splash in the AI industry. Over the coming decade, expect to see a lot more change in AI hardware, software, and even the physical architecture used to house all of these supercomputers. As AI continues to take on more and more data, data centers have to find ways to keep up, or else the AI advancements will mean nothing. “We have a limited real estate on Earth to build all of these AI data centers,” Julie says, and those data centers constantly need to think about efficiency as they build. So expect massive innovation in data center cooling, efficiency, and data movement speeds. And if we don’t see that, maybe Julie and her team will need to do it themselves and break even more records.
