
San Francisco Compute Co. to Build Computing Power Trading Platform

San Francisco Compute Co. has reportedly raised $12 million in an early funding round to launch a trading platform for computing power.

With the platform, the company aims to help companies working with artificial intelligence (AI) meet the challenge of getting access to the semiconductors they need, Bloomberg reported Tuesday (July 16).

“If you aren’t one of the holy few, you are effectively priced out of the market,” Evan Conrad, who co-founded San Francisco Compute Co. with Alex Gajewski, said in the report. “There is no option for you without major funding.”

Tech entrepreneur Jack Altman, whose firm Alt Capital led the funding round, said in the report that San Francisco Compute Co.’s goal is to “allow regular startups to use humungous amounts of compute for a short period of time.”

With the funding, the company will double its staff to 30 people and build out the trading platform, according to the report.

The company joins other startups that provide fractional access to computing power and infrastructure, including Lambda, Vast.ai, RunPod and CoreWeave, per the report.

Training generative AI requires either owning hardware or renting time on it, along with significant data storage and intensive energy consumption, according to the PYMNTS Intelligence and AI-ID collaboration, “Understanding the Future of Generative AI.”

The cost of simply training OpenAI’s GPT-3 — the version before the one employed in ChatGPT — was more than $5 million.

Companies interested in developing their own generative AI solutions will come up against the cost of training them before dealing with the cost of running them.

The high cost of AI, driven primarily by the computing power an AI model requires — which grows in step with the number of customers using the product — is an uncomfortable and expensive reality that businesses need to adapt to in order to remain competitive, PYMNTS reported in October 2023.

Analysts estimated at the time that Microsoft’s Bing AI chatbot, which is powered by OpenAI, requires at least $4 billion in infrastructure just to do its job. OpenAI spends up to $700,000 a day on its underlying infrastructure and server costs, and the company recorded total losses of $540 million in 2022.


