NVIDIA H100 Tensor Core GPU (Source: NVIDIA)
In the era of large AI models, computing power plays an increasingly important role and has become a key form of productivity.
On December 20th, Titanium Media App learned at the first "AI Tech Day" conference that the "White Paper on the Development of China's Intelligent Computing Industry in 2023," released by iResearch, shows that demand for large-model computing power exploded in 2023: industries related to large AI models accounted for 58.8% of China's total demand for intelligent computing power, close to 60% and the largest source of demand. Algorithmic recommendation came second at 14.2%.
The report also shows that in 2022 intelligent computing power accounted for 22.8% of China's overall computing power structure, while in the same period AI servers made up only 6.8% of China's server shipments. This means intelligent computing resources are scarce, and building intelligent computing centers will effectively improve the supply-and-demand structure of intelligent computing resources in China.
Xu Fanlei, head of iResearch's Industrial Digital Research Institute, said that artificial general intelligence (AGI) technology will continue to develop and drive up overall demand for intelligent computing. Meeting that demand will require sufficient high-end intelligent computing power, continuously optimized software-hardware integration solutions, an ecosystem that brings together upstream and downstream participants across the full chain and from many fields, and a barrier to use that is low and convenient enough.
In fact, over the past year or so, generative AI technology represented by ChatGPT has swept the world. NVIDIA, which dominates the global AI training chip market with a 95% share, has become the biggest winner of this round of AI competition, and its A100/A800, H100/H800 and other AI chips have become the "hot products" of the AI boom.
As NVIDIA itself put it: "GPUs have become the rare metal, even the gold, of artificial intelligence, because they are the foundation of today's generative AI era."
From a technical standpoint, GPUs outperform CPUs, especially in parallel computing and energy efficiency. Together with the CUDA ecosystem, their high computing power and scalability have made NVIDIA GPUs the first choice in the AI accelerator chip market.
According to a recent report released by Stanford University, GPU performance has increased roughly 7,000-fold since 2003, and price-performance has improved roughly 5,600-fold over the same 20 years. The report also points out that the GPU is a key driving force behind the progress of AI technology.
Today, computing power is accelerating applications in government affairs, industry, transportation, healthcare and other sectors, promoting the deep integration of the internet, big data and artificial intelligence with the real economy, and greatly stimulating the innovative vitality of data elements.
According to public data, the compound annual growth rate of data center racks in China has exceeded 30% since 2018, and by the end of 2022 more than 6.5 million standard racks were in use. According to calculations by the China Academy of Information and Communications Technology, China's core computing power industry reached 1.8 trillion yuan in 2022, and every 1 yuan invested in computing power drives 3 to 4 yuan of GDP growth.
Meanwhile, as demand from AI models in China keeps growing, the shortfall of NVIDIA GPU computing chips has reached 432,000 units. So how can these "gold-like" GPU chips deliver more value?
At the first "AI Tech Day" meeting of InBev Mathematics Department held on December 18th, InBev Mathematics Department announced a strategic upgrade, from positioning a large model computing service provider in stage 1.0 to positioning an "AGI full-stack ecological service platform" in stage 2.0-providing a series of computing and full-stack solutions including IaaS infrastructure layer, PaaS platform layer, data engineering service, data store and model store for customers in the entire industrial chain of AGI, forming a "full-stack solution". In addition, InBev Mathematics Department has also launched "Bobo Cloud" service, as well as commercial landing projects such as "AGI Pan-Entertainment Alliance", "Wutong Project" and "MM Bean Project".
According to public information, Hongbo Co., Ltd. is a listed company whose main business is bill printing. About a year ago it established InBev Digital, a wholly-owned subsidiary engaged in the supply of NVIDIA server products, after which Hongbo became a "computing power" concept stock and its share price soared.
On August 10th, 2022, Hongbo Co., Ltd. signed a four-party cooperation agreement with the Zhongguancun Zhongheng Cultural Science and Technology Innovation Service Alliance, NVIDIA and Beijing InBev Digital Technology Co., Ltd. to establish the "Beijing AI Innovation Empowerment Center," with InBev Digital, the company's wholly-owned subsidiary, acting as the operating entity of the center.
On December 4th this year, Hongbo disclosed a major-contract announcement: InBev Digital recently signed a Cloud Service Agreement with Baichuan Intelligent, the company founded by Wang Xiaochuan. During the agreement period, InBev Digital will provide Baichuan with services such as high-performance computing and GPU computing technology. The total transaction amount under the agreement is estimated at 1.382 billion yuan.
Zhou Wei, vice president of Hongbo Co., Ltd. and CEO of InBev Digital, said that moving from AI to AGI requires deep interdisciplinary integration. The company should not only build a closed loop of one-stop services, but also use this ecosystem to bring together top technical partners in China and worldwide, focusing on innovative computing solutions on the hardware side and on data, model and algorithm optimization and upgrading on the software side, thereby reducing its current reliance on the single-brand path of high-end intelligent computing equipment.
"Take a service provider who serves 15-20 small and medium-sized customers every year as an example. Through the calculation service fee and the sales rebate of Boboyun’s value-added service fee, combined with resource locking tools such as prepaid leverage, it is roughly calculated that every year (customers) can achieve gross profit income of hundreds of thousands." Zhou Wei said.
It is reported that the smallest computing power rental unit InBev Digital offers is a complete server, with a minimum lease term of 3 months; each machine is billed separately every month, with rent paid monthly in advance. Target customers are AI technology companies with large demand for intelligent computing power, such as those doing multimodal large-model training and development, cloud rendering, autonomous-driving inference, and quantitative trading model training. Taking the monthly computing service of a single H800 machine as an example, the price for strategic partners is 96,000 yuan per unit per month, and for prospective customers 256,000 yuan per unit per month.
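To make the pricing concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes only the figures quoted above (96,000 / 256,000 yuan per server per month and a 3-month minimum lease); the server counts and lease lengths in the example are hypothetical, not reported figures.

```python
# Back-of-the-envelope rental cost sketch based on the prices quoted in the article.
STRATEGIC_PARTNER_PRICE = 96_000      # yuan per H800 server per month (strategic partners)
PROSPECTIVE_CUSTOMER_PRICE = 256_000  # yuan per H800 server per month (prospective customers)
MIN_LEASE_MONTHS = 3                  # reported minimum lease term


def rental_cost(servers: int, months: int, price_per_server_month: int) -> int:
    """Total rent for `servers` machines over `months`, billed per machine per month."""
    if months < MIN_LEASE_MONTHS:
        raise ValueError(f"minimum lease term is {MIN_LEASE_MONTHS} months")
    return servers * months * price_per_server_month


if __name__ == "__main__":
    # Hypothetical example: 4 H800 servers leased for 6 months at each price tier.
    print(rental_cost(4, 6, STRATEGIC_PARTNER_PRICE))     # 2,304,000 yuan
    print(rental_cost(4, 6, PROSPECTIVE_CUSTOMER_PRICE))  # 6,144,000 yuan
```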
To date, InBev Digital's partners include leading companies such as Baichuan Intelligent, MiniMax, Luchen Technology, Wuxinxin Dome, Jingneng International, Anqing Computer, ProLogis, iSoftStone, Century Internet, Zhongke Langke and Chinese Online, as well as chip leaders such as NVIDIA, Intel and AMD.
"We want to make computing power more efficient and universal, like water and electricity." InBev Mathematics Department said.
(This article was first published on the Titanium Media App; author: Lin Zhijia)