Global Technology Arms Race

October 10, 2023 – Woody Preucil, Senior Managing Director at 13D Research and Strategy, discusses the global arms race underway over technology, which is helping to drive a trillion-dollar upgrade cycle expected over the next 5 years or more. To sign up for 13D's weekly reports and investment ideas, go to 13D Research & Strategy.

Transcript:

Cris Sheridan: There's a global arms race taking place over technology today, and the two largest players are the U.S. and China. But the major focal point of this competition is advanced semiconductors, or computer chips, because of the foundational role they play in our modern digital world, especially when it comes to the evolution of artificial intelligence and generative AI. To discuss this and more with us today is Woody Preucil. He is the Senior Managing Director at 13D Research and Strategy, and you can follow all of their work at 13D.com.

Cris Sheridan: Woody, thanks for joining us today.

Woody Preucil: Pleasure to be here, Cris. Thank you.

Cris Sheridan: So Woody, we've been discussing this global tech arms race with you now for a number of years. I know this is something you began highlighting at least as far back as 2016, talking about what was happening with artificial intelligence, semiconductors, even quantum computing, a number of areas that were a little bit off the radar for most of the public. But things are continuing to move in this direction. So let's start things off today. You wrote a recent piece talking about the role that generative AI is playing in all of this.

Woody Preucil: Well, generative AI really represents a step change in how artificial intelligence is used. You probably remember we had a discussion back in 2020 about how GPT-3 could change the world. And then in November 2022, they introduced ChatGPT, and it became the fastest-adopted consumer application, with 100 million users within two months. And it's driving innovation across the technology landscape. This includes fraud detection and credit scoring in finance, and image enhancement and the generation of individual treatment plans in healthcare.

Woody Preucil: It's helping accelerate drug discovery, and improve predictive maintenance, quality control, and supply chain optimization in manufacturing. And it's helping redefine vital aspects of the industrial supply chain. Generative AI is also poised to play a critical role in warehouse and inventory management, optimizing stock levels, reorder points, and overall efficiency by simulating various inventory scenarios and balancing carrying costs against service levels. McKinsey did a study recently estimating that generative AI will add somewhere between $2.6 trillion and $4.4 trillion in economic value to the global economy by 2030, by which time artificial intelligence will be a $15 trillion market. And two factors are really driving this. One is that the large language models, which are foundational for generative AI, are doubling in complexity every few months.

Woody Preucil: And at the same time, AI training hardware and software costs are falling at about a 70% annual rate. McKinsey also estimates that generative AI has the potential to automate work activities that account for 60% to 70% of knowledge workers' time. And they've reduced the midpoint for when this will occur to 2045, which is nearly a decade earlier than the estimate of 2053 they made back in 2016. The key point here is that prior to generative AI, most of the productivity gains we'd seen through digitization were really realized in a few major sectors like tech, media, and finance. But now, with generative AI's natural language processing capabilities, it's really going to affect all industries.

Woody Preucil: And some of the areas within businesses that are going to see the most impact are customer operations, marketing and sales, software engineering, and R&D; those areas could account for three quarters of the total annual value from generative AI. In fact, the CEO of an AI startup called Stability AI has argued that traditional human software coders could disappear in five years because ChatGPT already executes tasks at the level of a junior software engineer. And the idea here is that these models, although they make mistakes today, are improving on a Moore's Law trend line, so they're just going to keep getting better and better. Among the industries that are going to see some of the biggest impacts are banking, high tech, retail, consumer goods, and life sciences.

Woody Preucil: So in banking, generative AI could add somewhere between $200 billion and $340 billion annually just by improving efficiencies, by taking on lower-value tasks in risk management like reporting, monitoring regulatory developments, or collecting data. In retail and consumer packaged goods, you could see a $400 to $600 billion per year incremental value add by automating functions like customer service, marketing, sales, and inventory and supply chain management. And in life sciences, McKinsey estimates a $60 to $110 billion incremental value add through drug discovery. In fact, Gartner, a market research firm in the technology space, estimates that generative AI will be used in half of all drug discovery and development initiatives by 2025.

Cris Sheridan: It's amazing when you're talking about all these different potential applications of generative AI, because I think the public thinks of OpenAI's ChatGPT or Google Bard, the large language models that we have the ability to basically chat back and forth with, or ask to create an article. I use them for editing, checking, summarizing information, research, all sorts of stuff. I would say the broader public does have a general familiarity now with generative AI, but particularly through the large language models. As you're pointing out, though, generative AI extends well beyond that and really touches nearly every industry. How is it that it can do all of these different things?

Woody Preucil: Well, these models are getting so big that even though training costs are falling on a per-unit basis, the overall costs of training these models are still increasing. You saw that GPT-3 had 175 billion parameters, and GPT-4 has something like 1.7 trillion parameters. This is driving innovation in smaller, more specialized models that are targeted for specific purposes, and also in open-source generative AI models that are making it easier to leverage the technology. There are something like 1,500 smaller, fine-tuned models out there. Generative AI leverages lots of different technologies and aspects of deep learning, such as reinforcement learning and unsupervised learning.

Woody Preucil: The key breakthrough came in 2017 with the transformer model. Instead of analyzing one word at a time, the transformer model, which was invented by Google researchers, could look at whole sentences or paragraphs or even whole articles to understand the context better and deliver more accurate and insightful content.
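As a toy illustration of that shift, here is a minimal Python sketch of scaled dot-product self-attention, the core operation inside the transformer. The token vectors are invented for the example, and real models use learned projection matrices and run on GPUs; this only shows the key idea that each position's output mixes information from every position in the sequence at once, rather than one word at a time.

```python
import math

def softmax(xs):
    # Normalize raw scores into positive weights that sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    # Scaled dot-product attention: every query scores every key,
    # so each output position "sees" the whole sequence.
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Three 2-d vectors standing in for the tokens of a short sentence.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
contextualized = attention(tokens, tokens, tokens)  # self-attention
```

Each row of `contextualized` is a weighted average of all three token vectors, which is why context anywhere in the sentence can influence any word.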

Cris Sheridan: It seems to me like this is really an extension of some major technological breakthroughs which we've discussed with you in the past. But a lot of this stems as well from the fact that generative AI and most of the advancements we see currently are springing out of neural networks. And neural networks go back to, I mean, the 60s, when researchers, psychologists, were actually developing very simple models of a neuron to try to figure out how neurons work. They weren't even trying to develop artificial intelligence.

Cris Sheridan: They were just trying to figure out the human brain. They developed a very simple model of the neuron, and that extended into neural networks. Eventually, this became a key focus for researchers who wanted to actually develop artificial intelligence itself. And that has now blossomed to the point where we've developed these very large supercomputers whose complexity doesn't necessarily rival the human brain in terms of the number of connections, but certainly has a similar architecture. And I think, as you had just said, one of the key features of this is the fact that they are able to learn themselves, right?

Cris Sheridan: So we see neural networks combined with machine learning. You can feed into these electronic brains, if you will, as much information as you want, far more information than any human can process, and they will learn for themselves which variables are key to optimizing their functioning and producing a certain output. Because of that, they're now able to outcompete nearly every single human on a broad range of tests and achieve a level of intelligence that no one human could ever hope to attain. I think that's clear. And so, like you're pointing out, there are so many different applications that come from this.

Cris Sheridan: It's not just engaging in some chat.
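The idea that a model "learns for itself which variables are key" can be sketched in a few lines. Below, gradient descent fits a two-input linear model, the simplest possible stand-in for a neural network, on made-up data where the target depends only on the first input; training alone drives the weight on the irrelevant input toward zero. This is an illustration, not any production system.

```python
# Made-up data: y = 2 * x1 exactly; x2 is an irrelevant, noisy feature.
data = [((1.0, 0.3), 2.0), ((2.0, -0.8), 4.0),
        ((3.0, 0.5), 6.0), ((4.0, -0.1), 8.0)]

w = [0.0, 0.0]   # one weight per input feature
lr = 0.02        # learning rate

for _ in range(2000):
    for (x1, x2), y in data:
        pred = w[0] * x1 + w[1] * x2
        err = pred - y
        # Gradient step: each weight moves in proportion to how much
        # its input contributed to the prediction error.
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2

# After training, w[0] is close to 2 and w[1] is close to 0: the model
# has "learned" that only the first variable matters.
```

Real neural networks do the same thing across billions of weights, which is why they can sift signal from far more data than a human could inspect.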

Woody Preucil: Yeah, well, more to your point there, the big breakthrough for generative AI, the transformer model, came in 2017. But the other key breakthrough was in 2012, when Geoffrey Hinton and other scientists from the University of Toronto submitted a new neural network architecture called AlexNet to an annual competition where AI researchers competed with their models on image recognition. And Geoffrey Hinton's model beat the field that year by 10.8 percentage points. That was about 41% better than the next closest competitor.

Woody Preucil: And over the following seven-year period, the accuracy of classifying objects in these data sets rose from 71.8% to 97.3%. The key takeaway was that bigger data leads to better decisions. And that's an underlying factor in why you're seeing these large language models at 1.7 trillion parameters. I've heard that GPT-5 is going to be something like 100 trillion.

Cris Sheridan: Yeah, and that's where we're now actually getting to rivaling human brain complexity, when you're talking about 100 trillion parameters.

Woody Preucil: Yeah. The last time we talked, earlier this year, we discussed the huge energy consumption these models are taking up. So the carbon footprint of AI is really driving innovation in new chips that are more efficient, and it's also going to drive clean energy investment, et cetera. But one of the fallouts from this huge push on artificial intelligence by all countries, particularly China and the U.S., is this race for computing power.

Woody Preucil: And it's really going to be the next battlefront of the technology arms race. When you think about advances in computation capacity, the smartphone we carry today in our pocket is about 900 million times faster than the guidance computer used by NASA for the Apollo 11 mission. It's also about 5,000 times faster than the 1985 Cray supercomputer, which was super cutting-edge at the time. And over the rest of this decade, AI computing power is on track to increase something like 500-fold, reaching 105 zettaflops; the human brain can't even imagine what this is. A zettaflop is basically 10 to the 21st power floating-point operations per second. While the U.S. has been broadening these technology export restrictions, China is really doubling down on computing power.

Woody Preucil: And actually, today it was revealed that China is aiming to grow the country's computing power by more than a third in less than three years. They're aiming for 300 exaflops of computing capacity across their technology sector by 2025, up from 220 exaflops this year. Now, as of last year, the U.S. accounted for about 34% of global computing power, and China for about 33%.

Woody Preucil: So this represents a huge investment by them. They're aiming to build an additional 20 smart computing centers over the next two years, and they're going to be installing bigger optical networks and more advanced data storage. So it's a huge initiative from them in computing power. And computing power is important because it really drives a lot of these technology breakthroughs we've been seeing, such as generative AI.

Cris Sheridan: Yeah. One of the things we've been discussing on our show is how a lot of the last commodity bull cycle was led by China's admission to the WTO, them entering the global workforce, and all of the infrastructure spending they put into their manufacturing sector as they became a major global exporting powerhouse. That led a major bull market in commodities from 2000 to 2011, when that commodity cycle peaked. And we've been talking about how we believe another commodity cycle is unfolding for this decade, but this time it's not going to be over the same raw materials. It's going to be more focused on renewable energy and digital technology, and that includes a wide range of different critical materials and minerals, rare earth metals, even uranium.

Cris Sheridan: Of course, they're building nuclear power plants to power all their electricity needs. It's going to be copper. It's going to be graphite. There are a number of different things that are likely to participate in this bull market in the years ahead. So they are spending a lot of money.

Cris Sheridan: They're just not doing it in the same places they did before. They're putting it into advancing and modernizing their economy.

Woody Preucil: Right. And a couple of areas on the compute power front: they're aiming to build something like a supercomputer Internet. This project, led by China's Ministry of Science and Technology, would improve research capabilities by building out a network of connected supercomputers, and that will help with drug discovery and all sorts of different technologies. China is already a leader in supercomputers.

Woody Preucil: They have 162 known supercomputers, versus 126 in the U.S., among the world's 500 fastest systems, according to a German ranking. And China was the first to build an exascale supercomputer, in 2021. They plan to have ten in operation by 2025. Exascale computers are systems that can handle 10 to the power of 18 calculations per second, much faster than previous supercomputers.

Woody Preucil: And China is also doubling down on quantum technologies. Chinese scientists recently revealed that one of the nation's quantum computers can perform AI tasks something like 180 million times faster than the world's most powerful supercomputer. The Chinese government has allocated about $15 billion for quantum research and advanced applications. That's roughly four times the amount the U.S. government has allocated and double the total EU commitment.

Woody Preucil: And over the last decade, China has garnered 84,000 patents related to quantum technologies, and that's excluding university groups. More than 30 Chinese companies are now developing quantum systems.

Cris Sheridan: Yeah, it's amazing. Quantum computing still seems like sci-fi, I think, for most people, even though there are leaps and bounds being made in this area, and China, it seems, is really taking the lead. I'm not sure if it's more neck and neck between the U.S., China, and even Europe, but it seems China is really making some big strides in quantum computing.

Woody Preucil: You know, I would say neck and neck. I think we talked about this before, but there was a study done that estimated, based on scientific research papers published, high-impact papers, that China was in the lead in 37 of 44 critical technologies. They are generally considered to be the most advanced country in quantum communications. And while the U.S. is still probably ahead in quantum computing, increasingly it's a neck-and-neck race with China in a lot of these different technologies.

Woody Preucil: And I would say one of the other big implications of all the innovation going into generative AI is that it's ushering in a trillion-dollar upgrade cycle in data center architecture in the Western world. They're calling this accelerated computing, and it's going to help revolutionize AI development. Three key factors are driving this unprecedented demand for computing power and data storage globally. One is the rapid adoption rate of generative AI.

Woody Preucil: The second is the exponential expansion in AI model parameters that we were talking about a few minutes ago, and the third is just the computing burden necessary to design, test, and create these next-generation semiconductors, which takes enormous amounts of capital, and energy too. So the conventional cloud and data center architectures are inadequate to keep up with this exponential growth in data generation and compute power. Accelerated computing is a novel technology because it separates the computational needs and allows the computing power to be streamlined rather than wasted.

Woody Preucil: So it does this by leveraging specialized hardware like graphics processing units (GPUs), Nvidia chips and so on, and, from other companies, application-specific integrated circuits, tensor processing units, and field-programmable gate arrays. This specialized hardware can speed up complex computing workloads by orders of magnitude while reducing energy consumption by up to 42 times. So accelerated computing helps solve two of the biggest challenges coming out of this exploding use of computing: the huge carbon footprint that we talked about previously, and the need for high-performance computing. Accelerated computing can dramatically reduce computation times, and it achieves this by offloading the data-intensive parts of applications and processes to a GPU while leaving the control functionality with the standard central processing unit.
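The offload pattern Woody describes can be sketched schematically. This is a hypothetical illustration, not a real GPU library: the "kernel" is faked as a plain Python function so the sketch runs anywhere, but the division of labor is the point. Control flow, branching, and scheduling stay on the CPU; the bulk data-parallel math is handed to the accelerator.

```python
def gpu_kernel(batch):
    # Stands in for a data-parallel kernel, e.g. an elementwise multiply
    # launched across thousands of GPU threads via CUDA or similar.
    return [x * 2.0 for x in batch]

def cpu_control(batches):
    # Control functionality stays on the CPU: iteration, branching,
    # and deciding which work gets offloaded at all.
    results = []
    for batch in batches:
        if batch:  # skip empty work rather than launching a kernel
            results.append(gpu_kernel(batch))
    return results

processed = cpu_control([[1.0, 2.0], [], [3.0]])  # -> [[2.0, 4.0], [6.0]]
```

The speedups come from the kernel step: the same arithmetic applied to huge arrays is exactly what GPUs, TPUs, and FPGAs execute far more efficiently than a general-purpose CPU.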

Woody Preucil: And so, as a result, tasks that previously took hours or days to complete with preexisting computing capabilities can be done in much shorter time frames. The other key aspect is that accelerated computing systems are scalable, allowing for the addition of powerful processing nodes as needed to handle increased demands and workloads. Overall, you can see the global data center industry is at the front end of a potential trillion-dollar infrastructure upgrade over the next few years. DigitalBridge Group CEO Marc Ganzi summarized the equation: in the last decade, about $5 trillion was invested in the public cloud, but over the next few years, we're going to be investing $6 to $7 trillion in building out the AI infrastructure.

Woody Preucil: This is because AI requires three times more compute power, and requires more infrastructure and more facilities, than the public cloud, just to give you a sense of what's coming down the pike. So it's a huge investment on the data center side over the next several years. And Nvidia's CEO remarked recently on their earnings call that you're seeing the beginning of, call it, a ten-year transition to basically recycle or reclaim the world's data centers and build them out as accelerated computing.

Cris Sheridan: Woody, can you give the numbers you gave one more time? I just wasn't sure if I heard you correctly in terms of the amount that is anticipated to be invested here.

Woody Preucil: Yes. So this is DigitalBridge Group CEO Marc Ganzi, and this is a quote from him: "In the last decade, building the public cloud, $5 trillion was invested, and we're going to spend $6 to $7 trillion building AI. AI requires three times more compute power and requires more infrastructure and more facilities than the public cloud." Others have made other estimates.

Woody Preucil: Goldman Sachs foresees the upgrade cycle reaching $200 billion by 2025. Another research group, Dell'Oro Group, foresees data center capex growing to $400 billion by 2027. So either way you slice it, it's a big capex upgrade cycle.

Cris Sheridan: Yeah. And it's kind of an invisible world for most people, I think. That was something Guillaume Pitron discussed in his most recent book, The Dark Cloud, touching on just the amount of energy and the types of metals and minerals that are required for all of these data centers and for the fabrication of these advanced semiconductors. There's a lot that goes into this, and it's largely unknown to most of the public, because when we think of infrastructure, we think of buildings, we think of concrete, we think of steel.

Cris Sheridan: But in this case, we're talking about graphite, which is a huge component of a lot of this stuff. We're talking about copper, for the electricity. We're even talking about water, because there's a lot of cooling that needs to take place. There are a number of things that all come together not just for the fabrication of a semiconductor, but for the building of a data center. And like you said, China is building them hand over fist, along with nuclear power and coal power and all their electricity needs, which continue to expand.

Cris Sheridan: So they're taking an all-hands-on-deck approach there. But we're seeing this data center upgrade cycle and build-out not just in China, but in the U.S. and globally as well, because, like you said, generative AI is becoming a key foundational technology not just for consumers, but also for major businesses around the globe, across all of these different applications, like you mentioned earlier.

Woody Preucil: Right. And the increasing tensions with China have big implications, as you touch upon. China dominates the supply of a wide range of key materials. They refine 60% of the world's lithium, 68% of nickel, 40% of copper, 80% of the world's cobalt, and 68% of global graphite, each of which is a core input to a lot of these technologies. They also account for over 90% of global production of downstream rare earth products and technologies, which are also key to the whole technology landscape.

Woody Preucil: Many of the technologies that we take for granted, like smartphones, depend on them.

Cris Sheridan: Well, Woody, as we close out today's show, I want to thank you again for coming on and giving us an update on some of the recent pieces you've been writing about generative AI, compute power, where data centers fit into this, and some of the expectations for what many believe is going to be an extremely large build-out in this area, not just for the months ahead, but for many years to come. Especially given the global competition that we see between the U.S. and China over this, it's likely going to be a trend that's with us for some time.

Cris Sheridan: And I know that you write about all of these subjects and many more. You also have various indices that you've constructed based on baskets of companies that are exposed to these various areas. So as we close, would you mind telling our listeners some of the ways they can follow and gain access to your excellent research?

Woody Preucil: Sure. Well, if you go to our website, 13d.com, that's 13D as in delta, dot com, you'll see "Join 13D" in the top right corner. If you click the button and fill out the form, an account representative will follow up. We follow a lot of different investment themes and asset classes. Recently we've written a lot about uranium.

Woody Preucil: We've followed that sector since 2018. We see uranium transitioning from a twelve-plus-year bear market into a new supercycle bull market, driven by the global push to decarbonize economies and improve energy security. We also follow longevity science and clean energy, the obstacles there like grid transmission bottlenecks, and then the whole implications of growing water scarcity and antiquated infrastructure around much of the world. We're really also at the front end of a new global supercycle in water infrastructure investment. So there's a lot going on.

Woody Preucil: We follow a lot of different themes and so check out our website and send us a message and we'll get you going.

Cris Sheridan: And I do want to give you credit. I think you are one of the earliest researchers or analysts that I follow to recommend Nvidia as one of the key players and ways to gain exposure to the advancements and the swelling investment in artificial intelligence. I believe that's been one of your core holdings for quite some time.

Woody Preucil: Yeah, I think we first recommended it back in 2009. The split-adjusted price was something like $3.50. And there's a growing number of technology companies that will benefit from this whole global AI investment thrust that we're seeing.

Cris Sheridan: Right, right. And obviously Nvidia isn't the only way you gain exposure; you have a large number of different companies diversifying across this space, not putting all your eggs in one basket, of course, because Nvidia has seen such a big move, and there are a number of other major players in this space. So yeah, as we close, for any of you who aren't already following 13D's research, I highly recommend it. You can do so at 13d.com, and definitely reach out.

Cris Sheridan: You're going to find some excellent pieces there. And Woody, you produce What I Learned This Week every week, right? Is that basically the frequency of your writing schedule?

Woody Preucil: Yes. We have two publications every week. One is What I Learned This Week, published on Thursdays, which looks at about ten different themes or topics at any given time. And then we also have a publication that comes out on Sundays called What Are the Markets Telling Us?, which is basically technical analysis of what's happening in the market.

Cris Sheridan: Perfect. All right. Well, Woody, as always, it's a pleasure to speak with you on our show.
