Will humans or computer algorithms be the future arbiters of “truth”?
Today’s infographic from Futurism sums up the ideas that academics, technologists, and other experts propose implementing to stop the spread of fake news.
Below the infographic, we raise our concerns about these methods.
While fake news is certainly problematic, the solutions proposed to penalize articles deemed to be “untrue” are just as scary.
By centralizing fact checking, we create a system that is inherently fragile, biased, and prone to abuse. Furthermore, axing websites deemed to be “untrue” limits independent thought and discourse, while allowing legacy media to remain entrenched.
It could be argued that the best thing about the internet is that it decentralizes content, allowing for any individual, blog, or independent media company to stimulate discussion and new ideas with low barriers to entry. Millions of new entrants have changed the media landscape, and it has left traditional media flailing to find ways to adjust revenue models while keeping their influence intact.
If we say that “truth” can only be verified by a centralized mechanism – a group of people, or an algorithm written by a group of people – we are welcoming the idea that arbitrary sources will be preferred, while others will not (unless they conform to certain standards).
Based on this mechanism, it is almost certain that well-established journalistic sources like The New York Times or The Washington Post will be the most trusted. By the same token, newer sources (like independent media, or blogs written by emerging thought leaders) will not be able to get traction unless they are referencing or receiving backing from these “trusted” gatekeepers.
This centralization is problematic – and here’s a step-by-step reasoning of why that is the case:
First, either method (human or computer) must rely on preconceived notions of what is “authoritative” and “true” to make decisions. Both will be biased in some way: humans will lean towards a particular consensus or viewpoint, while computers must rank authority based on factors like PageRank, backlinks, source recognition, or headline/content analysis.
Next, there is a snowball effect involved: if only posts referencing these authoritative sources of “truth” can get traction on social media, then these sources become even more authoritative over time. This creates entrenchment that will be difficult to overcome, and new bloggers or media outlets will only be able to move up the ladder by associating their posts with an existing consensus. Grassroots movements and new ideas will suffer – especially those that conflict with mainstream beliefs, government, or corporate power.
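This snowball effect can be seen directly in link-based ranking algorithms like PageRank, one of the factors mentioned above. Here is a minimal power-iteration sketch over a hypothetical three-site link graph (all site names and links are invented purely for illustration): a newcomer that cites the incumbents, but receives no links back, stays pinned at the minimum possible rank.

```python
# Minimal PageRank power iteration over a hypothetical link graph.
# Site names and links are invented for illustration only.
links = {
    "nytimes": ["washpost"],
    "washpost": ["nytimes"],
    "newblog": ["nytimes", "washpost"],  # the newcomer cites the incumbents...
    # ...but nothing links back to "newblog"
}
nodes = list(links)
d = 0.85  # standard damping factor
rank = {n: 1 / len(nodes) for n in nodes}

for _ in range(50):  # power iteration until (approximate) convergence
    new = {}
    for n in nodes:
        # Sum of rank flowing in from every page that links to n
        inbound = sum(rank[m] / len(links[m]) for m in nodes if n in links[m])
        new[n] = (1 - d) / len(nodes) + d * inbound
    rank = new

# The unreferenced newcomer bottoms out at the floor value (1 - d) / N,
# while the mutually-linked incumbents split nearly all remaining rank.
print(rank)
```

With no inbound links, `newblog` can never rise above the damping floor no matter how many incumbents it cites — a compact picture of the entrenchment dynamic described above.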
Finally, this raises concerns about who fact checks the fact checkers. Forbes has a great post on this, showing that Snopes.com (a fact checker) could not even verify basic truths about its own operations.
Removing articles deemed to be “untrue” is a form of censorship. While it may help to remove many ridiculous articles from people’s social feeds, it will also impact the qualities of the internet that make it so great in the first place: its decentralized nature, and the ability for any one person to make a profound impact on the world.
The Future of 5G: Comparing 3 Generations of Wireless Technology
See how 5G compares to older iterations of wireless technology, and why it’s poised to change the way the modern world uses data.
Wireless technology has evolved rapidly since the turn of the century. From the voice-only capabilities of 2G to the internet-enabled connections of 3G, today’s ecosystem of wireless activity is built on the reliable connectivity of 4G.
Fifth-generation wireless network technology, better known as 5G, is now being rolled out in major cities worldwide. By 2024, an estimated 1.5 billion mobile users – accounting for 40% of current global mobile activity – will be using 5G wireless networks.
Today’s chart highlights three generations of wireless technology in the 21st century, and the differences between 3G, 4G, and 5G networks.
5G: The Next Great Thing?
With over 5 billion mobile users worldwide, our world is growing more connected than ever.
Data from GSMA Intelligence shows how rapidly global traffic could grow across different networks:
- 2018: 43% of mobile users on 4G
- 2025: 59% of mobile users on 4G, 15% of mobile users on 5G
But as with any new innovation, consumers should expect both positives and negatives as the technology matures.
- IoT Connectivity
5G networks will significantly optimize communication between the Internet of Things (IoT) devices to make our lives more convenient.
- Low latency
Also known as lag, latency is the time it takes for data to be transferred over networks. Users may see latency rates drop as low as one millisecond.
- High speeds
Real-time streaming may soon be a reality through 5G networks. Downloading a two-hour movie takes a whopping 26 hours over 3G and roughly six minutes over 4G, but only about 3.6 seconds over 5G.
- Distance from nodes
Walls, trees, and even rain can significantly weaken or block 5G wireless signals.
- Requires many nodes
Many 5G nodes will need to be installed to offer the same level of coverage found on 4G.
- Restricted to 5G-enabled devices
Users can’t simply upgrade their software. Instead, they will need a 5G-enabled device to access the network.
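The download-time comparisons above come down to simple arithmetic. The sketch below reproduces them, assuming a 6 GB two-hour movie and rough sustained throughputs per generation (both the file size and the throughput figures are illustrative assumptions, not numbers from the original chart):

```python
# Rough download-time arithmetic for a two-hour movie.
# File size and per-generation throughputs are illustrative assumptions.
FILE_SIZE_GB = 6.0  # assumed size of a two-hour HD movie

# Approximate sustained throughput per network generation (megabits per second)
throughput_mbps = {"3G": 0.5, "4G": 135.0, "5G": 13_000.0}

def download_time_seconds(size_gb: float, mbps: float) -> float:
    """Convert file size (decimal GB) to bits, then divide by throughput in bits/s."""
    bits = size_gb * 8 * 1000**3
    return bits / (mbps * 1e6)

for gen, mbps in throughput_mbps.items():
    secs = download_time_seconds(FILE_SIZE_GB, mbps)
    if secs > 3600:
        print(f"{gen}: {secs / 3600:.1f} hours")
    else:
        print(f"{gen}: {secs:.1f} seconds")
```

Under these assumed speeds, the arithmetic lands close to the article’s figures: roughly 26–27 hours on 3G, about six minutes on 4G, and under four seconds on 5G.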
Global 5G Networks
5G still has a way to go before it reaches mainstream adoption. Meanwhile, countries and cities are racing to install the infrastructure needed for the next wave of innovation to hit.
Since late 2018, over 25 countries have deployed 5G wireless networks. South Korea became the first country to launch 5G wireless technology, in April 2019, while Switzerland boasts the highest number of 5G network deployments, currently at 225 and counting.
To date, China has built roughly 350,000 5G sites (compared to fewer than 20,000 in the U.S.) and plans to invest an additional US$400 billion in infrastructure by 2023. Chinese mobile providers plan to launch 5G services starting in 2020.
What Does This Mean For 4G?
4G isn’t going anywhere anytime soon. As 5G gradually rolls out, 4G and 5G networks will need to work together to support the wave of IoT devices entering the market. This network piggybacking also has the potential to expand global access to the internet in the future.
The race to dominate the wireless waves is even pushing companies like China’s Huawei to explore 6G wireless innovation, before they’ve even launched their 5G networks.
Visualizing the Evolution of Consumer Credit
See how consumer credit has evolved through the ages — from its ancient origins, to the use of game-changing technologies like artificial intelligence.
The origin of credit dates all the way back to ancient civilizations.
The Sumerians and later the Babylonians both used consumer loans in their societies, primarily for agricultural purposes. The latter civilization even had rules about maximum lending rates engraved in the famous Code of Hammurabi.
But since then, consumer credit — and how we calculate creditworthiness — has grown increasingly sophisticated. So much so, in fact, that the technology now used in modern credit scoring would seem completely alien to people living just a few decades ago.
Video: Consumer Credit Through the Ages
Today’s motion graphic video is powered by Equifax, and it shows the evolution of consumer credit over the last 5,000 years.
The video highlights how consumer credit has worked both in the past and in the present. It also dives into the technologies that will be shaping the future of credit, including artificial intelligence and the blockchain.
A Brief History of Credit
We previously visualized the 5,000-year history of consumer credit, and how it dramatically changed over many centuries and societies.
What may have started as agricultural loans in Sumer and Babylon eventually became more ingrained in Ancient Roman society. In 50 B.C., for example, Cicero documented a transaction, writing “nomina facit, negotium conficit” — or, “he uses credit to complete the purchase”.
Modern consumer credit itself was born in England in 1803, when a group of English tailors came together to swap information on customers who failed to settle their debts. Eventually, extensive credit lists of customers were being compiled, and lending boomed in the 20th century as consumers began buying big-ticket items like cars and appliances.
Later, the innovation of credit cards came about, and in the 1980s, modern credit scoring was introduced.
The Present and Future of Credit
The modern numeric credit score came about in 1989, and it uses logistic regression to assess five categories related to a consumer’s creditworthiness: payment history, debt burden, length of credit history, types of credit used, and new credit requests.
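A logistic regression model of this kind maps those five categories to a probability of default via a weighted sum passed through a sigmoid. The sketch below shows the mechanics only — the weights, bias, and applicant values are invented for illustration, since real scorecard coefficients are proprietary:

```python
import math

# Hypothetical weights for the five scoring categories (illustrative only;
# real scorecard coefficients are proprietary).
weights = {
    "payment_history": -2.0,        # a strong history lowers default odds
    "debt_burden": 1.5,             # heavier debt raises them
    "credit_history_length": -0.8,
    "credit_mix": -0.4,
    "new_credit_requests": 0.6,
}
bias = -1.0

def default_probability(features: dict) -> float:
    """Logistic regression: sigmoid of a weighted sum of the feature values."""
    z = bias + sum(weights[k] * features[k] for k in weights)
    return 1 / (1 + math.exp(-z))

# Hypothetical applicant, with each category normalized to [0, 1]
applicant = {
    "payment_history": 0.9,      # mostly on-time payments
    "debt_burden": 0.3,
    "credit_history_length": 0.5,
    "credit_mix": 0.6,
    "new_credit_requests": 0.1,
}
p = default_probability(applicant)
print(f"Estimated default probability: {p:.1%}")
```

A lender would then translate this probability into a score band; the signs of the weights are what encode the intuition that on-time payments help while heavy debt and frequent new credit requests hurt.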
However, in the current era of big data and emerging technologies, companies are finding new ways to advance credit models — and these changes will affect how consumers get credit in the future.
Consumer credit is already changing thanks to new methods such as trended data and alternative data. These both look at the bigger picture beyond traditional scoring, pulling in new data sources and using predictive methods to more accurately encapsulate creditworthiness.
In general, the future of credit will be shaped by five forces:
- Growing amounts of data
- A changing regulatory landscape
- Game-changing technologies
- Focus on identity
- The fintech boom
Through these forces, new credit models will integrate artificial intelligence, neural networks, big data, and more complex statistical methods. In short, credit patterns can be more accurately predicted using mountains of data and new technologies.
Finally, the credit landscape is set to shift in other ways, as well.
Regulatory forces are pushing data to be standardized and controlled directly by consumers, enabling a range of new fintech applications to benefit consumers. Meanwhile, the industry itself will be focusing on identity to build trust and limit fraud, using technologies such as biometrics and blockchain to prove a borrower’s identity.