Elon Musk Launches Colossus, the World's Most Powerful AI Training System. What's Known About It?

Elon Musk's artificial intelligence startup xAI has built a massive data center in Tennessee. The SpaceX founder's new AI supercomputer is called Colossus, a nod to one of the first electronic computers, which was used to decipher encrypted messages during World War II. Colossus, as Nvidia notes, runs on more than 100,000 of its chips, more than any other AI system on the planet. As the billionaire recently announced, Colossus has finally been put into operation: assembling the world's most powerful AI training system took just 122 days, a record-breaking feat to date. Here are the details!

Elon Musk is working on a new AI called Colossus. Image: googleusercontent.com

Nvidia graphics processing units (GPUs) are specialized chips that accelerate AI training, allowing models to learn more efficiently.

Contents

  • 1 Elon Musk and Artificial Intelligence
  • 2 Colossus – what do you need to know?
  • 3 Battle for the throne
  • 4 Quantum computers

Elon Musk and artificial intelligence

Elon Musk is known for his innovative approach at companies such as Tesla and SpaceX, and he takes an active interest in the development of artificial intelligence. His views on AI can be described as cautious, even as the entrepreneur himself helps create technologies that bring this new tool into everyday life.

One of the milestones of Musk's career in the AI space was the co-founding of OpenAI in 2015, which aimed to advance the development of safe AI that could benefit humanity. However, in 2018 the billionaire stepped down from OpenAI's board of directors to avoid a conflict of interest with his work at Tesla, which uses AI to develop self-driving cars.

Elon Musk has been a frequent critic of OpenAI's leadership. Image: wired.com

In the years since, Musk has spoken out about the potential dangers of AI, emphasizing the need for government regulation. The entrepreneur also signed an open letter demanding that the development of AI systems more powerful than GPT-4 be halted.

More on the topic: Super AI Will Appear in 2027. True or False?

In 2023, the billionaire announced the creation of a new company called xAI, whose main goal is to develop next-generation AI. Musk said the new platform will help optimize the work of other projects he leads, including SpaceX and the Boring Company.

Colossus – what do you need to know?

Musk's AI company works closely with X Corp (the parent company of the social network X, formerly Twitter) and with Nvidia, whose chips helped power OpenAI's breakthrough in 2023. xAI's employees are currently focused on the company's flagship project, the Grok family of neural networks (Grok-1, 1.5, 2 and mini), as well as an integrated development environment (IDE) for prompt engineering and interpretability research. Information about this can be found on the company's official website.

On September 2, 2024, Musk announced the launch of an AI training cluster called “Colossus,” which is essentially a huge data center. It uses more than 100,000 Nvidia H100 GPUs, the billionaire reported on his X account.

“This weekend, the xAI team brought the Colossus training cluster online. The work was done in 122 days. At the moment, Colossus is the most powerful AI training system in the world. In a few months, the number of Nvidia chips will grow to 200,000,” Musk wrote.

Musk's new company is breaking all conceivable and inconceivable records. Image: trendspider.com

At the moment, the entrepreneur's project surpasses all existing clusters. For example, Google is reported to use about 90,000 Nvidia GPUs and OpenAI about 80,000. As a reminder, Nvidia's chips are the most in-demand in the industry and are used to train and run artificial intelligence systems such as chatbots and image generators.


The fact that in the coming months Colossus will double in size to 200,000 chips, 50,000 of which will be newer H200 GPUs, matters because the H200 offers nearly twice the memory capacity of the H100 and roughly 40% more memory bandwidth.
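For a rough sense of what that upgrade means, the back-of-envelope sketch below compares the two chips. The spec figures (around 80 GB of HBM3 and 3.35 TB/s for the H100, around 141 GB of HBM3e and 4.8 TB/s for the H200) are assumptions taken from Nvidia's public datasheets rather than from xAI's announcement, and the 150,000/50,000 split is inferred from Musk's post.

```python
# Rough sanity check of the H200-vs-H100 comparison.
# Spec figures are approximate public datasheet values (assumed),
# not numbers published by xAI.
H100 = {"memory_gb": 80,  "bandwidth_tbs": 3.35}   # ~80 GB HBM3, ~3.35 TB/s
H200 = {"memory_gb": 141, "bandwidth_tbs": 4.8}    # ~141 GB HBM3e, ~4.8 TB/s

mem_ratio = H200["memory_gb"] / H100["memory_gb"]         # ~1.76x, "almost twice"
bw_ratio = H200["bandwidth_tbs"] / H100["bandwidth_tbs"]  # ~1.43x, "~40% more"
print(f"Memory per GPU:    {mem_ratio:.2f}x the H100")
print(f"Bandwidth per GPU: {bw_ratio:.2f}x the H100")

# Aggregate HBM if the cluster grows from 100k H100s to a 150k/50k mix
# (the exact split is an inference from the announced totals).
now_pb = 100_000 * H100["memory_gb"] / 1e6
planned_pb = (150_000 * H100["memory_gb"] + 50_000 * H200["memory_gb"]) / 1e6
print(f"Cluster memory: ~{now_pb:.0f} PB today -> ~{planned_pb:.0f} PB planned")
```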

The fact that Musk's company has managed to catch up in technological capability with leaders in the field such as OpenAI and Microsoft, which had a long head start, is impressive, to say the least. Provided, of course, that the billionaire is not exaggerating.

Grok is one of xAI's main projects. Image: digital-report.ru

As Fortune notes, Nvidia counts Musk among its best customers, as he had already purchased tens of thousands of GPUs for Tesla for between $3 billion and $4 billion before partnering with xAI.

Don't miss: How will AI change in 2024?

What's more, some of the acquired chips, originally intended for training Tesla's Full Self-Driving system, will be used to train an early version of Grok.

Battle for the throne

Despite its impressive achievements, the launch of Colossus, which can be broadly described as a supercomputer, has been plagued by controversy. In late August, Memphis residents who live near the Tennessee data center complained about the “unbearable levels of smog” created by Musk's company, which could lead to further conflicts in the future.

These problems are likely to be the least of xAI's concerns, as its title as the most powerful AI training system is certainly at risk. It's unlikely that OpenAI, Microsoft, Google, and Meta will rest on their laurels, waiting for a competitor to triumph. And some of the leaders in the field already have hundreds of thousands of their own GPUs.

Elon Musk has spent billions of dollars on his new project. Image: laecuaciondigital.com

For example, Microsoft plans to assemble 1.8 million AI chips by the end of the year (though the figure sounds overly optimistic), and in January, Mark Zuckerberg announced that Meta intends to acquire another 350,000 Nvidia H100s by the same date.

For now, however, Colossus remains unmatched in raw computing power. Journalists note that the cluster will be used to train Grok-3, which Musk intends to release in December.

You might be interested in: Neural networks have learned to lie and they do it intentionally

Quantum computers

Unlike Colossus, which is a classical GPU cluster, quantum computers use qubits (quantum bits) instead of traditional bits. Qubits can exist in two states at once, that is, in a quantum superposition. This property allows quantum computers to tackle certain classes of problems far faster than classical machines. In addition, qubits can become entangled with each other, regardless of the distance between them.
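In standard textbook notation (nothing here is specific to Colossus or to any particular machine), a single qubit in superposition and a maximally entangled pair of qubits look like this:

```latex
% A single qubit in superposition: both basis states at once,
% with outcome probabilities |alpha|^2 and |beta|^2 summing to one.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]

% A Bell state, the simplest entangled pair: measuring one qubit
% immediately fixes the outcome for the other, whatever the distance.
\[
  \lvert \Phi^{+} \rangle = \tfrac{1}{\sqrt{2}} \bigl( \lvert 00 \rangle + \lvert 11 \rangle \bigr)
\]
```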

Needless to say, quantum computers require special algorithms to operate. For this reason, Google has developed a unique set of tools and software that allows scientists and engineers to create and test quantum algorithms.
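The article does not name Google's toolkit, but it is most likely referring to open-source frameworks such as Cirq, Google's library for building and simulating quantum circuits. Below is a minimal sketch, assuming Cirq is installed (pip install cirq), that reproduces the superposition-plus-entanglement situation described above:

```python
import cirq

# Two qubits on a line.
a, b = cirq.LineQubit.range(2)

# Hadamard puts qubit `a` into superposition; CNOT entangles it with `b`,
# producing the Bell state (|00> + |11>)/sqrt(2).
circuit = cirq.Circuit(
    cirq.H(a),
    cirq.CNOT(a, b),
    cirq.measure(a, b, key="result"),
)

# Simulate 1,000 runs: the histogram is keyed by the integer value of the
# measured bitstring, so roughly half the counts land on 0 (|00>) and half
# on 3 (|11>), while 1 (|01>) and 2 (|10>) never appear in an ideal,
# noiseless simulation.
samples = cirq.Simulator().run(circuit, repetitions=1000)
print(samples.histogram(key="result"))
```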

Quantum computers, like artificial intelligence, are changing the world. Image: wired.com

One of the main problems with quantum computers is quantum decoherence – the loss of the qubits' quantum state due to interaction with the environment, which requires complex cooling and isolation systems to minimize any external influences.

Read also: Supercomputer turned back the cosmic clock

In addition, increasing the number of qubits to scale quantum computers remains a technical challenge. However, despite the current difficulties, the prospects for the emergence of such computers are quite encouraging. With their help, revolutionary changes will occur in many areas of science, including pharmaceuticals and AI.

Meanwhile, scientists around the world continue to look for new ways to optimize and scale quantum technologies, work that could eventually produce machines capable of solving the most complex problems. In short, let's wish them and Elon Musk good luck and wait for the news!

