AI, or artificial intelligence, has taken the world by storm since the launch of ChatGPT in late 2022 — shaking the windows and rattling the walls of every institution, from governments and universities to Fortune 500 companies and trade unions.
Proponents claim it will usher in a new era of economic productivity, prosperity, and human flourishing, while critics worry it will cause chaos and instability in every sector of the job market — from call center staff to Hollywood screenwriters — exacerbating the wealth divide and turning millions of workers out on the street.
Some AI “doomers” worry that artificial intelligence will become self-aware and turn on its human creators, à la Skynet in “The Terminator” and other sci-fi works.
But what everyone seems to agree on is that AI will create tremendous wealth for the companies and individuals building and harnessing the technology.
Nvidia, which designs the leading computer chips powering the AI revolution, has become the poster child for how a company can unlock billions, or even trillions, in value in this new age.
Like AI itself, Nvidia is an “overnight success story” many decades in the making.
Founded in 1993 by CEO Jensen Huang and co-founders Chris Malachowsky and Curtis Priem, Nvidia initially aimed to bring 3D technology to gaming and media.
In 1999, the company released the first-ever GPU (graphics processing unit), a powerful chip that could render 3D graphics in real time.
The first Xbox gaming console and the PlayStation 3 ran on Nvidia's formidable chip technology.
Later, Nvidia chips became the backbone of blockchain networks like Ethereum, which, until recently, relied on the raw computing power of GPUs to secure transactions and store data.
Nvidia was also a pioneer in building a software toolkit called CUDA that made it easier for programmers to apply its chips to all manner of tasks, kind of like equipping a Formula 1 race car with an automatic transmission and cruise control.
Compared to traditional chips like CPUs — the workhorses that have powered computers from the mainframes of the 1950s to today's PCs and smartphones — GPUs are generally superior at training AI models and powering responses to our endless AI prompts.
That's because the machine learning algorithms behind popular AI models require computers to perform enormous numbers of similar calculations at once.
GPUs, unlike CPUs, can break down AI tasks into smaller chunks and run them concurrently, dramatically improving speed and performance.
Think of a CPU as a decathlete who competes in 10 different track and field events.
Like a CPU, a decathlete can do many different tasks very well, but only in sequence; you can't throw the shot put while running the 1,500 meters.
A GPU, on the other hand, is more like a soccer team.
Each player may not be as versatile as the decathlete — and they may not even be as fast or strong.
But working together, they achieve something the decathlete never could: advancing the ball up the field as a team and scoring.
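For readers who want to see what that parallelism looks like in practice, here is a minimal, illustrative sketch written with Nvidia's CUDA toolkit (not production code from Nvidia or any AI lab). Each GPU thread adds one pair of numbers, so a million additions that a CPU loop would perform one after another are spread across thousands of cores at once.

```cuda
// Illustrative sketch only: a tiny CUDA program that adds two vectors.
// Each GPU thread handles a single element, so the additions run in
// parallel; a CPU would typically walk through them one at a time.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vector_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // each thread gets a unique index
    if (i < n) {
        out[i] = a[i] + b[i];                       // thousands of these run at once
    }
}

int main() {
    const int n = 1 << 20;                 // about a million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);          // unified memory visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, out, n);  // launch the work across the GPU
    cudaDeviceSynchronize();               // wait for the GPU to finish

    printf("out[0] = %.1f\n", out[0]);     // expect 3.0
    cudaFree(a);
    cudaFree(b);
    cudaFree(out);
    return 0;
}
```

The point is not the arithmetic itself but the shape of the work: machine learning boils down to vast numbers of small, similar calculations like these, which is exactly what a GPU is built to churn through all at once.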
Today, GPUs are the dominant chip in AI — and Nvidia is the dominant player.
According to Mercury Research, in Q3 of 2023, Nvidia sold $11.1 billion in chips, cards, and related hardware, representing a 99.7% share of GPU systems in data centers worldwide.
So was Nvidia just lucky to have developed the best-in-class GPU at the exact moment it was needed to power the AI revolution?
CEO Jensen Huang pointed to foresight in a March 2023 interview: the company, he said, "had the good wisdom to go put the whole company behind it" a decade ago. The reality is somewhere in between. Luck, after all, is the combination of preparation and good timing.
In 2012, when Nvidia released its first AI product, it could hardly have anticipated that, a decade later, AI would become the phenomenon it is today.
On the other hand, the company could never have seized this moment if it hadn’t started investing in AI long before its peers.
Other chip makers like Intel could have prepared better for the AI age but simply chose not to.
From 3D graphics to PC gaming to blockchains to AI, Nvidia has often found itself at the forefront of the biggest paradigm shifts in technology. That has translated into a historic windfall for shareholders.
The day ChatGPT launched in November 2022, Nvidia was a $400 billion company — enormous even then, thanks in large part to the success of its gaming and graphics business.
But since then, its market value has swelled by a staggering trillion dollars, roughly the market capitalization of four Bank of Americas, to more than $1.4 trillion in just over a year.
Much of this is driven by lofty expectations that the company will continue to grow at breakneck speed.
Wall Street analysts estimate the company will more than double revenue between 2023 and 2024, and nearly double again in 2025.
Growth of that magnitude for such a large company is nearly unheard of.
This raises several questions.
For one, is such growth actually achievable?
And if so, won’t other chip makers race to build Nvidia killers?
It would be a mistake to think Nvidia has been alone in investing in AI.
Indeed, more than a dozen other companies crowd the market with their own offerings, including legacy chipmakers like AMD, Intel, and IBM, and lesser-known upstarts such as Graphcore and Groq.
Furthermore, big tech platforms, which all have huge computing needs, have been developing their own chips.
For example, the Google Cloud TPU (tensor processing unit) was first deployed in 2015 and has gone through several generations since, including a major update in 2021.
Today, it powers Google’s Bard chatbot and the company’s many other AI applications.
Amazon, the global leader in cloud computing — those data centers that power the computing needs of companies from Pfizer to McDonald's — has its own AI-focused chips, known as Trainium and Inferentia, launched in 2020.
Chinese technology conglomerate Alibaba announced its own AI chip, the Hanguang 800, back in 2019.
As industry analyst Ben Bajarin wrote on X shortly after Microsoft announced its own AI hardware offering, “Those serious about platforms need to be serious about silicon.”
Nvidia is swimming in a pond with many other powerful companies full of smart people and seemingly endless resources. And when you are at 100% market share (or 99.7%, in the case of Nvidia's share of GPUs in data centers), there's nowhere to go but down.
Still, it would be a mistake to assume that legacy players or Nvidia's customers can simply spend their way to parity; Nvidia won't surrender its lead without a fight.
Furthermore, if AI prognosticators are correct, AI chips will power an entirely new industrial revolution, which will allow Nvidia to succeed even as competition grows.
Henry Ford sold far more cars in 1929 than in 1919, when half the automobiles on the road were Model Ts, even though competitors had by then consolidated and copied his methods. His share of the market went down, but total sales went up, as did the value of the company.
The same could happen for Nvidia. After all, it doesn't just make money from AI; last year, the company generated about $12 billion from non-AI products.
Still, Nvidia’s long-term success is far from guaranteed.
Fairchild Semiconductor was once the standard-bearer for American innovation.
It supplied many of the chips for the Apollo lunar guidance system.
Its success helped transform a quiet valley of California fruit orchards into Silicon Valley, the heart of the global tech industry.
Today, Fairchild no longer exists. For a period in the 1960s and '70s, Digital Equipment Corp. (DEC) was one of the biggest computer companies in the world on the strength of its so-called minicomputers, but it too got swept away, this time by the rise of the personal computer, or PC.
In 2007, Nokia sold one in every two phones globally.
That same year, the iPhone came out. Today, Nokia accounts for just 3% of all phone sales.
And let's not forget about BlackBerry — remember them?
In technology, the only constant is change. Nvidia’s future success will rest on its ability to navigate that change.
Alex Tapscott is the managing director of the Ninepoint Digital Asset Group at Ninepoint Partners and the author of the book “Web3: Charting the Internet’s Next Economic and Cultural Frontier.”