CEREBRAS SYSTEMS



Cerebras and G42 Break Ground on Condor Galaxy 3, an 8 exaFLOPs AI Supercomputer

Retrieved on: 
Wednesday, March 13, 2024

Cerebras Systems , the pioneer in accelerating generative AI, and G42 , the Abu Dhabi-based leading technology holding group, today announced the build of Condor Galaxy 3 (CG-3), the third cluster of their constellation of AI supercomputers, the Condor Galaxy.

Key Points: 
  • The Cerebras and G42 strategic partnership already delivered 8 exaFLOPs of AI supercomputing performance via Condor Galaxy 1 and Condor Galaxy 2, each amongst the largest AI supercomputers in the world.
  • Located in Dallas, Texas, Condor Galaxy 3 brings the current total of the Condor Galaxy network to 16 exaFLOPs.
  • “By doubling the capacity to 16 exaFLOPs, we look forward to seeing the next wave of innovation Condor Galaxy supercomputers can enable.”
  • At the heart of Condor Galaxy 3 are 64 Cerebras CS-3 Systems.

Cerebras Selects Qualcomm to Deliver Unprecedented Performance in AI Inference

Retrieved on: 
Wednesday, March 13, 2024

“These joint efforts are aimed at ushering in a new era of high-performance low-cost inference and the timing couldn’t be better. Our customers are focused on training the highest quality state-of-the-art models that won’t break the bank at time of inference,” said Andrew Feldman, CEO and co-founder of Cerebras. “Utilizing the AI 100 Ultra from Qualcomm Technologies, we can radically reduce the cost of inference – without sacrificing model quality -- leading to the most efficient deployments available today.”

Key Points: 
  • Leveraging the latest cutting-edge ML techniques and world-class AI expertise, Cerebras will work with Qualcomm Technologies’ AI 100 Ultra to speed up AI inference.
  • NAS service from Cerebras: using Network Architecture Search for targeted use cases, the Cerebras platform can deliver models optimized for the Qualcomm AI architecture, yielding up to 2x higher inference performance.
  • For more information on the Qualcomm Technologies and Cerebras AI training and inference solutions, please visit the Cerebras blog .
  • The Cerebras CS-3 for AI training and Qualcomm AI 100 Ultra for inference at scale will be available in Q2/Q3 2024.

Cerebras Systems Unveils World’s Fastest AI Chip with Whopping 4 Trillion Transistors

Retrieved on: 
Wednesday, March 13, 2024

The latest Cerebras Software Framework provides native support for PyTorch 2.0 and the latest AI models and techniques such as multi-modal models, vision transformers, mixture of experts, and diffusion.

Key Points: 
  • “We could not be more proud to be introducing the third generation of our groundbreaking wafer-scale AI chip,” said Andrew Feldman, CEO and co-founder of Cerebras.
  • “WSE-3 is the fastest AI chip in the world, purpose-built for the latest cutting-edge AI work, from mixture of experts to 24 trillion parameter models.”
  • Condor Galaxy 3 will be built with 64 CS-3 systems, producing 8 exaFLOPs of AI compute, one of the largest AI supercomputers in the world.

Cerebras and G42 Break Ground on Condor Galaxy 3, an 8 exaFLOPs AI Supercomputer

Retrieved on: 
Wednesday, March 13, 2024

SUNNYVALE, Calif., March 13, 2024 /PRNewswire/ -- Cerebras Systems, the pioneer in accelerating generative AI, and G42, the Abu Dhabi-based leading technology holding group, today announced the build of Condor Galaxy 3 (CG-3), the third cluster of their constellation of AI supercomputers, the Condor Galaxy. Featuring 64 of Cerebras' newly announced CS-3 systems – all powered by the industry's fastest AI chip, the Wafer-Scale Engine 3 (WSE-3) – Condor Galaxy 3 will deliver 8 exaFLOPs of AI with 58 million AI-optimized cores.

Key Points: 
  • Featuring 64 Cerebras CS-3 Systems, Condor Galaxy 3 Doubles Performance at Same Power and Cost
  • The Cerebras and G42 strategic partnership already delivered 8 exaFLOPs of AI supercomputing performance via Condor Galaxy 1 and Condor Galaxy 2, each amongst the largest AI supercomputers in the world.
  • Located in Dallas, Texas, Condor Galaxy 3 brings the current total of the Condor Galaxy network to 16 exaFLOPs.
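The headline figures above imply straightforward per-system numbers. A quick back-of-the-envelope check (the 8 exaFLOPs and 58 million cores are the release's rounded totals, so the per-system results are approximate):

```python
systems = 64          # CS-3 systems in Condor Galaxy 3
total_exaflops = 8.0  # total AI compute, per the release
total_cores = 58e6    # AI-optimized cores, rounded in the release

# Each CS-3 contributes 1/64 of the cluster's compute and cores.
per_system_petaflops = total_exaflops / systems * 1000  # 125 petaFLOPs per CS-3
per_system_cores = total_cores / systems                # roughly 0.9 million cores per WSE-3
```

The same arithmetic across Condor Galaxy 1, 2, and 3 gives the 16 exaFLOPs network total the release cites.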

Cerebras Systems and Barcelona Supercomputing Center Train Industry-Leading Multilingual Spanish Catalan English LLM

Retrieved on: 
Wednesday, January 31, 2024

Cerebras Systems, the pioneer in accelerating generative AI, today announced that the Barcelona Supercomputing Center (BSC) has completed training FLOR-6.3B, the state-of-the-art English-Spanish-Catalan large language model.

Key Points: 
  • FLOR-6.3B continues Cerebras’ leading work on multilingual models, a trend that started with the introduction of Jais , the leading Arabic English model.
  • Because Catalan has only a fraction of the data typically needed to train a model, innovative AI training techniques were developed.
  • Catalan and Spanish are low and mid-resourced languages relative to English.

Cerebras Collaborates with Mayo Clinic to Advance AI in Healthcare

Retrieved on: 
Monday, January 15, 2024

Cerebras Systems , a pioneer in accelerating generative AI, today announced a collaboration with Mayo Clinic as its first generative AI collaborator for the development of large language models (LLMs) for medical applications.

Key Points: 
  • To create the first truly patient-centric healthcare AI, Mayo Clinic selected Cerebras for its proven experience in designing and training large scale, domain-specific generative AI models.
  • Mayo Clinic and Cerebras seek to develop a similar model for other disease states.
  • The Cerebras CS-2, powered by the WSE-2, is purpose-built for generative AI and delivers a rare combination of world-leading AI compute, software and AI expertise.

Cerebras Systems Appoints Shirley Li as General Counsel

Retrieved on: 
Thursday, January 4, 2024

Cerebras Systems , the pioneer in accelerating generative AI, today announced the appointment of Shirley Li as General Counsel, reporting directly to CEO Andrew Feldman.

Key Points: 
  • With more than a decade of experience as an advocate and operator at leading technology companies, Li brings a wealth of business and legal experience to Cerebras.
  • Earlier, Li held several legal roles at Cadence Design Systems, Inc., most recently serving as Associate General Counsel overseeing corporate legal, securities, governance, M&A, strategic investments, and global stock administration.
  • For more information on Cerebras Systems and the company’s current employment opportunities, please visit https://cerebras.net/careers/ .

Cerebras, Petuum, and MBZUAI Announce New Open-Source CrystalCoder and LLM360 Methodology to Accelerate Development of Transparent and Responsible AI Models

Retrieved on: 
Monday, December 11, 2023

While previous models were suitable for either English or coding, CrystalCoder achieves high accuracy for both tasks simultaneously.

Key Points: 
  • Trained on Condor Galaxy 1, the AI supercomputer built by G42 and Cerebras, CrystalCoder-7B has been released under the new reproducible LLM360 methodology that promotes open source and transparent, responsible use.
  • CrystalCoder and the LLM360 release methodology are available now on Hugging Face.
  • “Petuum and MBZUAI are excited to announce the release of the CrystalCoder-7B large language model (LLM).”

Cerebras Systems Hires Industry Luminary Julie Shin Choi as Senior Vice President and Chief Marketing Officer

Retrieved on: 
Wednesday, November 29, 2023

Cerebras Systems , the pioneer in accelerating generative AI, today announced that it has hired industry expert Julie Shin Choi as its Senior Vice President and Chief Marketing Officer.

Key Points: 
  • Choi will also join the Cerebras executive team where she will help accelerate the growth and adoption of Cerebras technology into existing markets, as well as new ones.
  • View the full release here: https://www.businesswire.com/news/home/20231129709810/en/
  • Choi joins Cerebras from MosaicML, where she was Chief Marketing and Community Officer and the company’s first executive hire.
  • Previously, Choi held product marketing roles at HPE, Mozilla, and Yahoo, where she focused on enterprise customers and developers.

Cerebras Systems Announces 130x Performance Improvement on Key Nuclear Energy Simulation over Nvidia A100 GPUs

Retrieved on: 
Monday, November 13, 2023

Cerebras Systems , the pioneer in accelerating generative AI, today announced the achievement of a 130x speedup over Nvidia A100 GPUs on a key nuclear energy HPC simulation kernel, developed by researchers at Argonne National Laboratory.

Key Points: 
  • This result demonstrates the performance and versatility of the Cerebras Wafer-Scale Engine (WSE-2) and ensures that the U.S. continues to be the global leader in supercomputing for energy and defense applications.
  • This kernel represents the most computationally intensive portion of the full simulation, accounting for up to 85% of the total runtime for many nuclear energy applications.
  • “These published results highlight not only the incredible performance of the CS-2, but also its architectural efficiency,” said Andrew Feldman, CEO and co-founder of Cerebras Systems.
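The 130x figure applies to the kernel alone; because that kernel accounts for up to 85% of total runtime, Amdahl's law bounds the end-to-end gain. A minimal sketch of that arithmetic (the runtime share is from the release; the formula is standard Amdahl's law, not a claim from Cerebras or Argonne):

```python
f = 0.85   # fraction of total runtime spent in the accelerated kernel, per the release
s = 130.0  # kernel-level speedup of the CS-2 over an Nvidia A100

# Amdahl's law: overall speedup when only fraction f is accelerated by factor s.
overall = 1.0 / ((1.0 - f) + f / s)  # roughly 6.4x end-to-end
```

The remaining 15% of unaccelerated runtime dominates once the kernel is this fast, which is why kernel-level and application-level speedups differ so sharply.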