

The Evolution of Computer Science in One Infographic



We take computing power for granted today.

That’s because computers are literally everywhere around us. And thanks to advances in technology and manufacturing, the cost of producing semiconductors is so low that we’ve even started turning things like toys and streetlights into computers.

But how and where did this familiar new era start?

The History of Computer Science

Today’s infographic comes to us from Computer Science Zone, and it describes the journey of how we got to today’s tech-oriented consumer society.

It may surprise you to learn that the humble and abstract groundwork of what we now call computer science goes all the way back to the beginning of the 18th century.


Incredibly, the history of computing goes all the way back to a famous mathematician named Gottfried Wilhelm Leibniz.

Leibniz, a polymath living in the Holy Roman Empire in an area that is now modern-day Germany, was quite the talent. He independently developed the field of differential and integral calculus, developed his own mechanical calculators, and was a primary advocate of Rationalism.

It is arguable, however, that the modern impact of his work mostly stems from his formalization of the binary numerical system in 1703. He even envisioned a machine of the future that could use such a system of logic.
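Leibniz’s binary system underlies all modern computing. As a minimal illustration (my own, not from the infographic), here is how any whole number reduces to 0s and 1s through repeated division by two:

```python
# Illustration only: Leibniz showed that any number can be written
# using just the symbols 0 and 1.
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary string by
    repeatedly dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next lowest bit
        n //= 2
    return "".join(reversed(bits))

# 1703: the year Leibniz published his binary arithmetic
print(to_binary(1703))  # prints 11010100111
```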

From Vacuum Tubes to Moore’s Law

The first computers, such as the IBM 650, used vacuum tube circuit modules for logic circuitry. In use until the early 1960s, these machines consumed vast amounts of electricity, failed often, and needed constant inspection for defective tubes. They were also the size of entire rooms.

Luckily, transistors were invented and then later integrated into circuits – and 1958 saw the production of the very first functioning integrated circuit by Jack Kilby of Texas Instruments. Shortly after, in 1965, Gordon Moore (then at Fairchild Semiconductor, and later a co-founder of Intel) predicted that the number of transistors per integrated circuit would double every year, a prediction now known as “Moore’s Law”.

Moore’s Law, which implies exponential growth, held for roughly 50 years before it began bumping up against its upper limits.

It can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens.

– Gordon Moore in 2005
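Moore’s observation is simple compound doubling. A small sketch of the arithmetic, using assumed illustrative inputs – roughly 2,300 transistors (the Intel 4004 of 1971) and the later, revised two-year doubling period:

```python
# A sketch of Moore's Law as compound doubling. The starting count
# (~2,300 transistors, Intel 4004, 1971) and the two-year doubling
# period are assumptions for illustration only.
def transistor_count(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count assuming one doubling per period."""
    return initial * 2 ** (years / doubling_period)

# Fifty years of doubling every two years is 25 doublings:
print(f"{transistor_count(2300, 50):,.0f}")  # prints 77,175,193,600
```

That result – tens of billions of transistors – is roughly the scale of today’s largest chips, which is what makes the exponential so striking.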

It’s now been argued by everyone from The Economist to the CEO of Nvidia that Moore’s Law is over for all practical purposes – but that doesn’t mean it’s the end of the road for computer science. In fact, it’s just the opposite.

The Next Computing Era

Computers no longer take up rooms – even very powerful ones now fit in the palm of your hand.

They are cheap enough to put in refrigerators, irrigation systems, thermostats, smoke detectors, cars, streetlights, and clothing. They can even be embedded in your skin.

The coming computing era will be dominated by artificial intelligence, the IoT, robotics, and unprecedented connectivity. And even if things are advancing at a sub-exponential rate, it will still be an incredible next step in the evolution of computer science.



Nvidia Joins the Trillion Dollar Club

America’s biggest chipmaker Nvidia has joined the trillion dollar club as advancements in AI move at lightning speed.




Chipmaker Nvidia is now worth nearly as much as Amazon.

America’s largest semiconductor company has vaulted past the $1 trillion market capitalization mark, a milestone reached by just a handful of companies including Apple, Amazon, and Microsoft. While many of these are household names, Nvidia has only recently gained widespread attention amid the AI boom.

The above graphic compares Nvidia to the seven other companies that have reached the trillion dollar milestone.

Riding the AI Wave

Nvidia’s market cap has more than doubled in 2023 to over $1 trillion.

The company designs semiconductor chips: slices of silicon etched with specific patterns. Just as you turn on a light at home by flipping a switch, these chips contain billions of tiny switches that process complex information simultaneously.

Today, they are integral to many AI functions—from OpenAI’s ChatGPT to image generation. Here’s how Nvidia stands up against companies that have achieved the trillion dollar milestone:

Company     Joined Club   Market Cap (trillions)   Peak Market Cap (trillions)
Apple       Aug 2018      $2.78                    $2.94
Microsoft   Apr 2019      $2.47                    $2.58
Aramco      Dec 2019      $2.06                    $2.45
Alphabet    Jul 2020      $1.58                    $1.98
Amazon      Apr 2020      $1.25                    $1.88
Meta        Jun 2021      $0.68                    $1.07
Tesla       Oct 2021      $0.63                    $1.23
Nvidia      May 2023      $1.02                    $1.02

Note: Market caps as of May 30th, 2023

After posting record sales, the company added $184 billion to its market value in one day. Only two other companies have exceeded this number: Amazon ($191 billion) and Apple ($191 billion).

As Nvidia’s market cap reaches new heights, many are wondering if its explosive growth will continue—or if the AI craze is merely temporary. There are cases to be made on both sides.

Bull Case Scenario

Big tech companies are racing to develop generative AI capabilities like OpenAI’s. These models require vastly more computing power, especially as they become more sophisticated.

Many tech giants, including Google and Microsoft, use Nvidia chips to power their AI operations. Consider that Google plans to use generative AI in six products in the future, each of which has over 2 billion users.

Nvidia has also launched new products in the days since its stratospheric rise, spanning from robotics to gaming. Leading the way is the A100, a powerful graphics processing unit (GPU) well-suited for machine learning. Additionally, it announced a new supercomputer platform that Google, Microsoft, and Meta are first in line for. Overall, 65,000 companies globally use the company’s chips for a wide range of functions.

Bear Case Scenario

While extreme investor optimism has launched Nvidia to record highs, how do some of its fundamental valuations stack up to other giants?

As the table below shows, its price-to-earnings (P/E) ratio is second only to Amazon’s, at 214.4. This ratio shows how much a shareholder pays relative to the earnings of a company. Here, the company’s share price is over 200 times its earnings on a per share basis.
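The P/E arithmetic itself is straightforward. A minimal sketch, with hypothetical figures chosen only to land near the ratio cited above:

```python
# P/E sketch: price per share divided by earnings per share (EPS).
# The share price and EPS below are hypothetical, picked only to
# produce a ratio close to the 214.4 cited in the text.
def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    return share_price / earnings_per_share

print(round(pe_ratio(400.0, 1.87), 1))  # prints 213.9
```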

(Table: P/E ratio and net profit margin (annual) for Nvidia and its big tech peers.)

Consider how this looks when Nvidia’s revenue is compared to that of other big tech names:

For some, Nvidia’s valuation seems unrealistic even in light of AI’s prospects. While Nvidia projects $11 billion in revenue for the next quarter, the stock would still trade at significantly higher multiples than its big tech peers. This suggests the company is overvalued at current prices.

Nvidia’s Growth: Will it Last?

This is not the first time Nvidia’s market cap has rocketed up.

During the crypto rally of 2021, its share price skyrocketed over 100% as demand for its GPUs increased. These specialist chips help mine cryptocurrency, and a jump in demand led to a shortage of chips at the time.

As cryptocurrencies lost their lustre, Nvidia’s share price sank over 46% the following year.

By comparison, AI advancements could have more transformative power. Big tech is rushing to partner with Nvidia, potentially reshaping everything from search to advertising.
