
The Evolution of Computer Science in One Infographic


We take computing power for granted today.

That’s because computers are literally everywhere around us. And thanks to advances in technology and manufacturing, the cost of producing semiconductors is so low that we’ve even started turning things like toys and streetlights into computers.

But how and where did this era of ubiquitous computing begin?

The History of Computer Science

Today’s infographic comes to us from Computer Science Zone, and it describes the journey of how we got to today’s tech-oriented consumer society.

It may surprise you to learn that the abstract groundwork of what we now call computer science was laid all the way back at the beginning of the 18th century.


Incredibly, the history of computing goes all the way back to a famous mathematician named Gottfried Wilhelm Leibniz.

Leibniz, a polymath living in the Holy Roman Empire in an area that is now modern-day Germany, was quite the talent. He independently developed differential and integral calculus, built his own mechanical calculators, and was a leading advocate of Rationalism.

It is arguable, however, that the modern impact of his work mostly stems from his formalization of the binary numerical system in 1703. He even envisioned a machine of the future that could use such a system of logic.
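To get a feel for what Leibniz formalized, here's a minimal sketch in Python (our own illustration, not anything from the infographic) of how a whole number breaks down into the 0s and 1s of binary:

```python
# A minimal sketch of the idea Leibniz formalized: any whole number
# can be written using only two digits, 0 and 1.

def to_binary(n: int) -> str:
    """Return the binary representation of a non-negative integer."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2                  # shift to the next place value
    return "".join(reversed(bits))

# For example, 1703 – the year Leibniz published his formalization:
print(to_binary(1703))  # -> 11010100111
```

This two-digit system of logic is exactly what the on/off switches of modern circuitry would eventually implement.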

From Vacuum Tubes to Moore’s Law

Early computers, such as the IBM 650, used vacuum tube circuit modules for their logic circuitry. In use until the early 1960s, these machines consumed vast amounts of electricity, failed often, and needed constant inspection for defective tubes. They were also the size of entire rooms.

Luckily, transistors were invented and later integrated into circuits – and 1958 saw the production of the very first functioning integrated circuit by Jack Kilby of Texas Instruments. Shortly after, in 1965, Gordon Moore (who would go on to co-found Intel) predicted that the number of transistors per integrated circuit would double every year – a prediction now known as “Moore’s Law”.

Moore’s Law, which describes exponential growth, held for roughly 50 years before it began bumping up against its physical limits.

It can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens.

– Gordon Moore in 2005
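To make the arithmetic behind that warning concrete, here's a short Python sketch of what doubling implies (the starting count and doubling period below are illustrative assumptions, not figures from the infographic):

```python
# Moore's Law as arithmetic: doubling every fixed period is exponential
# growth, i.e. N(t) = N0 * 2 ** (t / doubling_period).

def projected_transistors(n0: int, years: float, doubling_period: float = 2.0) -> int:
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return int(n0 * 2 ** (years / doubling_period))

# Assume a chip with 2,300 transistors – roughly the scale of the
# earliest microprocessors circa 1971 – and a two-year doubling period:
for years in (10, 20, 50):
    print(f"after {years} years: {projected_transistors(2_300, years):,}")
# after 10 years: 73,600
# after 20 years: 2,355,200
# after 50 years: 77,175,193,600
```

Fifty years of doubling takes a few thousand transistors to tens of billions – which is why exponentials can't run forever, just as Moore warned.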

It’s now been argued by everyone from The Economist to the CEO of Nvidia that Moore’s Law is over for all practical purposes – but that doesn’t mean it’s the end of the road for computer science. In fact, it’s just the opposite.

The Next Computing Era

Computers no longer take up rooms – even very powerful ones now fit in the palm of your hand.

They are cheap enough to put in refrigerators, irrigation systems, thermostats, smoke detectors, cars, streetlights, and clothing. They can even be embedded in your skin.

The coming computing era will be dominated by artificial intelligence, the Internet of Things (IoT), robotics, and unprecedented connectivity. And even if things advance at a sub-exponential rate, it will still be an incredible next step in the evolution of computer science.


Charted: The Jobs Most Impacted by AI

We visualized the results of an analysis by the World Economic Forum, which uncovered the jobs most impacted by AI.




Large language models (LLMs) and other generative AI tools haven’t been around for very long, but they’re expected to have far-reaching impacts on the way people do their jobs. With this in mind, researchers have already begun studying the potential impacts of this transformative technology.

In this graphic, we’ve visualized the results of a World Economic Forum report, which estimated how different job departments will be exposed to AI disruption.

Data and Methodology

To identify the job departments most impacted by AI, researchers assessed over 19,000 occupational tasks (e.g. reading documents) to determine whether they relied on language. If a task was deemed language-based, researchers then assessed how much human involvement was needed to complete it.

With this analysis, researchers were then able to estimate how AI would impact different occupational groups.

Department          Large impact (%)    Small impact (%)    No impact (%)
IT                  73                  26                  1
Finance             70                  21                  9
Customer Sales      67                  16                  17
Operations          65                  18                  17
HR                  57                  41                  2
Marketing           56                  41                  3
Legal               46                  50                  4
Supply Chain        43                  18                  39

In our graphic, large impact refers to tasks that will be fully automated or significantly altered by AI technologies. Small impact refers to tasks that have a lesser potential for disruption.
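As a rough illustration of that aggregation step – our own hypothetical reconstruction in Python, not the WEF's actual code or task data – the share calculation looks something like this:

```python
from collections import Counter, defaultdict

# Hypothetical task records: (department, impact bucket), where the bucket
# comes from the language-exposure assessment described above.
tasks = [
    ("IT", "large"), ("IT", "large"), ("IT", "small"),
    ("Finance", "large"), ("Finance", "none"),
    # ...the actual study assessed over 19,000 tasks
]

def impact_shares(records):
    """Percentage of tasks in each impact bucket, per department."""
    by_dept = defaultdict(Counter)
    for dept, bucket in records:
        by_dept[dept][bucket] += 1
    return {
        dept: {bucket: round(100 * n / sum(counts.values()))
               for bucket, n in counts.items()}
        for dept, counts in by_dept.items()
    }

print(impact_shares(tasks))
# -> {'IT': {'large': 67, 'small': 33}, 'Finance': {'large': 50, 'none': 50}}
```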

Where AI Will Make the Biggest Impact

Jobs in information technology (IT) and finance have the highest share of tasks expected to be largely impacted by AI.

Within IT, tasks that are expected to be automated include software quality assurance and customer support. On the finance side, researchers believe that AI could be significantly useful for bookkeeping, accounting, and auditing.

Still interested in AI? Check out this graphic which ranked the most commonly used AI tools in 2023.
