American AI chip manufacturer Groq has launched a data center in Helsinki, Finland, its first in Europe. According to CNBC, the new facility will enable Groq to expand its AI inference services across the continent.
Demand for AI Inference
The US chipmaker has been planning to expand internationally, and the Helsinki launch will let it serve rising demand for AI inference services in Europe. Currently valued at $2.8 billion, the company makes a chip it calls the language processing unit (LPU). Groq says the new European facility will reduce latency, improve data governance, and speed up response times.
“As demand for AI inference continues at an ever-increasing pace, we know that those building fast need more – more capacity, more efficiency, and with a cost that scales,” Groq CEO Jonathan Ross said.
Groq's LPU is designed for AI inference rather than training. Inference is the step in which a pre-trained model interprets live data and generates results, such as the responses produced by chatbots like ChatGPT.
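As a rough sketch of what consuming inference as a service looks like in practice, the snippet below posts a prompt to a hosted, OpenAI-compatible chat endpoint of the kind Groq exposes; the endpoint URL, model name, and environment variable are illustrative assumptions rather than details taken from this article.

```python
# Minimal sketch of an inference call: a pre-trained model interprets a
# live prompt and returns a generated response, much like a chatbot.
# The endpoint URL, model name, and environment variable are illustrative
# assumptions, not details confirmed in this article.
import os
import requests

API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed OpenAI-compatible endpoint
API_KEY = os.environ["GROQ_API_KEY"]  # hypothetical env var holding an API key

payload = {
    "model": "llama-3.1-8b-instant",  # illustrative model name
    "messages": [
        {"role": "user", "content": "Summarize why data-center proximity reduces latency."}
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Each such request is served by an already-trained model, which is the high-volume workload the new Helsinki facility is meant to handle closer to European users.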
Groq, which is backed by Cisco and Samsung, has partnered with Equinix to set up the Helsinki data center. Equinix calls Finland an excellent choice for the new facility, citing its reliable power grid, free cooling, and sustainable energy policy.
“Combining Groq’s cutting-edge technology with Equinix’s global infrastructure and vendor-neutral connectivity solutions enables efficient AI inference at scale,” Equinix Managing Director for the Nordics Regina Donato Dahlstrom said.
The Nordic region has become a popular location for data centers due to its cool climate and proximity to renewable energy. US chip makers have increased investments in the region in recent years. In June this year, Nvidia CEO Jensen Huang signed multiple infrastructure deals in Europe, including some that relate to data centers.
Staying Ahead of Competition
Nvidia’s strength lies in producing graphics processing units that support AI model training. A number of startups, including Ampere, SambaNova, Fractile, and Cerebras, are now competing with the AI chip giant for a share of the AI inference market.
Groq CEO Jonathan Ross says his company stands out from chip manufacturing rivals like Nvidia in several ways. Speaking to CNBC, Ross said Nvidia’s chips depend on costly components such as high-bandwidth memory, whose supply is currently limited. Groq’s LPUs, on the other hand, don’t rely on those components, and the company’s supply chain is based mostly in North America.
“We’re not as supply limited, and that’s important for inference, which is very high volume, low margin. And the reason that we’re so good for Nvidia’s shareholders is, we’re happy to take that high volume but lower margin business and let others focus on the high-margin training,” Ross added.
Groq can also deploy its technology quickly. The California-based chipmaker decided to set up the Helsinki data center only about a month ago and is currently moving server racks into the facility.
“We expect to be serving traffic by the end of this week. That’s built fast and so it’s a very different proposition from what you see in the rest of the market,” Ross said.
In recent months, politicians in the European Union have been advocating for sovereign AI, the idea that data centers serving European users should be located within the region. Placing data centers closer to users reduces latency and speeds up service.
Groq develops AI accelerator application-specific integrated circuits (ASICs) and has already established data centers that use its LPU technology in Canada, the US, and Saudi Arabia. Equinix, for its part, interconnects cloud service providers such as Google Cloud and Amazon Web Services, making it easier for enterprises to work with multiple vendors. Groq will install its LPUs in the Equinix data center to give companies access to AI inference capabilities.