San Francisco: For the past year, Jean Paoli, CEO of the artificial intelligence startup Docugami, has been scrounging for what has become the hottest commodity in tech: computer chips. In particular, Paoli needs a type of chip known as a graphics processing unit, or GPU, because it is the fastest and most efficient way to run the calculations that allow cutting-edge AI companies to analyze enormous amounts of data.
So he’s called everyone he knows in the industry for help. He’s applied for a government grant that allows access to the chips. He’s tried making Docugami’s AI technology more efficient so it requires fewer GPUs. Two of his scientists have even repurposed old video gaming chips. “I think about it as a rare earth metal at this point,” Paoli said of the chips.
More than money, engineering talent, hype or even profits, tech companies this year are desperate for GPUs. The hunt for the essential component was kicked off last year when online chatbots like ChatGPT set off a wave of excitement over AI, leading the entire tech industry to pile on and creating a shortage of the chips. In response, startups and their investors are now going to great lengths to get their hands on the tiny bits of silicon and the crucial “compute power” they provide.
The dearth of AI chips has been exacerbated because Nvidia, a longtime provider of the chips, has a virtual lock on the market. Inundated with demand, the Silicon Valley company – which has surged to a $1 trillion valuation – is expected to report record financial results next week.
Tech companies typically buy access to AI chips and their compute power through cloud computing services from the likes of Google, Microsoft and Amazon. But the AI explosion has created long wait lists – stretching to almost a year in some cases – to access these chips at the cloud companies. The backlog is an unusual roadblock at a time when the tech industry sees nothing but opportunity and boundless growth for businesses building generative AI, which can create its own images, text and video.
The largest tech firms can generally get their hands on GPUs more easily because of their size, deep pockets and market positions. That has left startups and researchers, who typically do not have the relationships or spending power, scrambling.
Their desperation is palpable. On social media, blog posts and conference panels, startup founders and investors have started sharing highly technical tips for navigating the shortage. Some are gaming out how long they think it will take Nvidia’s wait list to clear. There’s even a groan-worthy YouTube song, set to the tune of Billy Joel’s “We Didn’t Start the Fire,” in which an artist known as Weird A.I. Yankochip sings “GPUs are fire, we can never find ’em but we wanna buy ’em.”
Some venture capital firms are now using their connections to buy chips and then offering them to their portfolio companies. Entrepreneurs are rallying startups and research groups together to buy and share a cluster of GPUs.
At Docugami, Paoli weighed the possibility of diverting GPU resources from research and development to his product, an AI service that analyzes documents. Two weeks ago, he struck gold: Docugami secured access to the computing power it needed through a US government program called Access, which is run by the National Science Foundation, a federal agency that funds science and engineering.
The strain recently prompted two founders, Evan Conrad and Alex Gajewski, to start the San Francisco Compute Group, a project that plans to let entrepreneurs and researchers buy access to GPUs in small amounts. After hundreds of emails and a dozen phone calls to cloud companies, equipment makers and brokers, they announced last month that they had secured 512 of Nvidia’s H100 chips and would rent them out to interested parties.
The announcement went “hilariously viral,” Conrad said, and resulted in hundreds of messages from founders, graduate students and other research organizations.
Conrad and Gajewski plan to raise $25 million in a specialized kind of debt that uses the computer chips as collateral. Their vendor, whom the founders declined to name for fear that someone would swoop in and buy the GPUs out from under them, has promised access in around a month. The duo said they hoped to help startups save money by letting them buy only the computing power they need to experiment, rather than making large, yearslong commitments. “Otherwise, the incumbents all win,” Conrad said. NYT