d-Matrix gets $44 million to fund efficient AI chips for servers

Another day, another American chip startup has raised tens of millions of dollars.

This time it’s one you probably haven’t heard of yet: d-Matrix, which raised $44 million from Microsoft, SK Hynix, Marvell Technology, and others to fund a new type of chip designed for efficient AI in data center servers.

The Santa Clara-based startup unveiled its funding and initial roadmap last week, claiming it will use an “innovative digital in-memory computing” architecture to build chiplet-based processors that, it says, deliver faster AI inference performance than CPUs and GPUs for large transformer models.

As such, d-Matrix seeks to win business from so-called hyperscalers like Meta and Alphabet, which rely on transformer models with billions of parameters to power popular applications, and to become the latest beneficiary of the billions pouring into chip funding.

d-Matrix is targeting cloud computing use cases such as recommendation, text classification, social media analysis, search, and content moderation. The startup also hopes to pack edge data centers with its chips, which it says offer a major advantage in computational efficiency thanks to their digital in-memory computing architecture.

For d-Matrix, the edge opportunity spans both mainstream enterprises, with targeted use cases such as chatbots and document processing, and 5G networks, with use cases such as search, voice, and “Metaverse AI”.

“The hyperscale and edge data center markets are approaching performance and power limits, and it is clear that a breakthrough in AI computational efficiency is needed to match the exponentially growing market,” said Sasha Ostojic, venture partner at Playground Global, the venture capital firm that led the funding round alongside Nautilus Venture Partners and Entrada Ventures.

Ostojic said the startup’s other differentiation comes from its software, which d-Matrix markets on its website as “open, simple, scalable and frictionless for easy adoption”. Its software capabilities include the ability for users to “seamlessly map existing trained models” onto its hardware.

“d-Matrix is a defensible new technology that can outperform traditional CPUs and GPUs, unlocking and maximizing efficiency and power utilization through their software stack,” Ostojic added.

The startup’s initial roadmap consists of its first silicon chiplet, named Nighthawk, and a follow-up called Jayhawk, which it says will be released “soon”. Investor funding will help d-Matrix develop this roadmap and expand its current team of 50 people.

Using a chiplet design, d-Matrix said it can integrate multiple processing engines, Lego-like, in a single package. The startup assembles these chiplets using an advanced packaging technology it calls hetero-modular organic packaging, which it says “allows chiplet heterogeneity and scalability, while being readily available and cost-effective”.

“d-Matrix has embarked on a three-year journey to build the world’s most efficient computing platform for large-scale AI inference,” said Sid Sheth, co-founder and CEO of d-Matrix. “We have developed a revolutionary computational architecture that is fully digital, making it practical to implement while advancing AI computational efficiency far beyond the memory wall it has hit today.”

Sheth and his co-founder, Sudeep Bhoja, are both veterans of the semiconductor industry. The two engineers were previously executives at broadband interconnect maker Inphi Corporation, which Marvell Technology, an early investor in d-Matrix, acquired in a $10 billion deal in 2020.

Before that, Sheth was director of marketing for network connectivity at Broadcom, which he joined through the company’s acquisition of NetLogic Microsystems in 2012. Earlier in his career, he was an engineer in the Pentium III processor group and a networking group at Intel.

Bhoja also worked at Broadcom, as Technical Director of Broadband Interconnects. Previously, he was chief architect at optical networking startup Big Bear Networks, which was acquired in 2005 by Finisar Corporation, now known as II-VI Inc. He also worked at Lucent Technologies and Texas Instruments.

d-Matrix is entering a somewhat crowded market dominated by Nvidia, so it has a lot to prove before making any serious headway. Another inference chip startup, Esperanto Technologies, only recently began sampling its silicon with customers, despite being founded in 2014. ®
