People have competed since time immemorial for resources or precious commodities. But throughout much of history, those most sought after were physical: food, water, gold, silks, spices, oil, and so on. In our modern era, however, perhaps the most precious resource of all is one you can’t even lay your hands on: data. And some of the largest, most profitable companies ever built specialise in collecting, moving, storing, studying and deriving insight from this data. They sit on digital goldmines.
And unlike so many of our physical resources, there is no theoretical cap on the volume of data in the world, and that volume is growing exponentially as we speak. People spend more and more of their lives on digital devices, and these devices have a computational power that would have been unthinkable within living memory: in the mid-80s, a supercomputer cost $32 million, whilst today a smartphone with the same processing power will cost you just a few hundred dollars; the iPhone in your pocket processes instructions tens of thousands of times faster than the computers that guided rockets to the moon. All of this both requires and produces data.
As with so many other technological trends, lockdown has acted as an accelerant, herding us into virtual, device-enabled lifestyles. The evolution of the Internet of Things (IoT) is speeding up, with the number of connected devices multiplying both in the home and outside it, in places like factories and hospitals. In the not-so-distant future, machines, appliances, autonomous cars and even traffic lights could be connected and communicating with one another, generating enormous amounts of data. How much data are we talking about? The Information Overload Research Group estimates that around 2 quintillion bytes – circa 2 exabytes – of data are created every day. There are 18 zeroes in a quintillion. That’s a lot of data. And the rate of growth is skyrocketing: 90% of the world’s data has been created in the past two years alone.
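To keep those units straight – a quick sketch of our own, not a figure from the article – a quintillion is 10^18, so quintillions of bytes are measured in exabytes, and a zettabyte is a thousand times larger still:

```python
# SI byte scales: each prefix is a factor of 1,000 larger than the last
EXABYTE = 10**18      # one quintillion bytes
ZETTABYTE = 10**21    # one sextillion bytes, i.e. 1,000 exabytes

daily_bytes = 2 * EXABYTE  # circa 2 quintillion bytes created per day

# a quintillion (10^18) is written as a 1 followed by 18 zeroes
assert str(10**18) == "1" + "0" * 18

print(f"Daily data volume: {daily_bytes / EXABYTE:.0f} exabytes")  # prints "Daily data volume: 2 exabytes"
```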
Machines making data
And it isn’t only humans pumping out data. In fact, research suggests that machine-generated data accounted for 40% of internet data generated in 2020. Whereas traditional Artificial Intelligence (AI) is software code written by humans, Deep Learning (DL) is in effect code written by data and machines themselves, and we have only scratched the surface of its capabilities. DL is creating the next generation of computing platforms – conversational computers, self-driving cars, and consumer apps like TikTok, which uses DL for video recommendations – each capable of exponentially increasing the software’s capability. For instance, smart speakers like Amazon’s Alexa answered 100 billion voice commands in 2020 – 75% more than in 2019. Indeed, 2020 was the breakthrough year for conversational AI: for the first time, AI systems could understand and generate language with human-like accuracy. And conversational AI requires 10x the computing resources of computer vision.
Turning data into gold
But this data is useless without someone – or something – to capture, organise and harness it for human benefit. We seek out companies that can turn this superabundance of data into something useful. Wolters Kluwer is a software company whose expert solutions endeavour to make sense of huge quantities of data for its customers, providing them with the right information, packaged in the right way, at the right time, and at the point of maximum impact. Nowhere is that clearer than in its healthcare end market, where its expert solution UpToDate puts the latest science and evidence into the hands of doctors to deliver the best outcomes. UpToDate draws on physician-authored clinical data points and synthesises them into trusted, evidence-based recommendations that are proven to improve patient care, quality and outcomes.
Wolters Kluwer is fully entrenched in its customers’ processes, and its solutions deliver enormous added value at a relatively low cost. That makes them very hard to displace; customers are sticky, giving the company a very stable, recurring revenue base from which to grow.
Consultancy firm Accenture also works with clients to sift through data, identify what is valuable, and help them leverage it. For example, Accenture worked with a global fast-food chain to help it use its data bank to serve more customers with more personalised experiences. Accenture sifted through over 300bn rows of data, analysing it on a per-store basis, looking at how demand changed on different days; for different protein types; and how variables such as weather, traffic density and holiday seasons affected customer and restaurant behaviour. Through digesting all of this, Accenture was able to generate forecasting models giving the restaurant chain real-time insight into how best to target customers.
Powering the big data revolution
Semiconductors are the essential technology enablers sitting at the heart of the data-centred trends reconfiguring the world. They store data and provide processing power, acting as the near-invisible building blocks of the digital world. Without them, all sorts of consumer electronics, communications systems, big data processing, AI, and machine learning are impossible. Taiwan-based TSMC is the largest contract chip maker in the world, controlling more than half of the global made-to-order chip market, and its market dominance only grows in the more advanced semiconductor categories.
TSMC sits right on the bleeding edge of making smaller, faster, more powerful chips. The Financial Times reports that, with 5-nanometre chips already commercially produced, TSMC is turning its attention to 3-nanometre chips – devices containing transistors 1/20,000th the width of a human hair – which will be used in devices from smartphones to supercomputers. TSMC brings a scale and manufacturing excellence to the production of these chips that make it the go-to supplier for tech companies, from Apple to Qualcomm. As machine learning and AI become ever more computationally intense, we believe TSMC will be at the forefront of producing chips capable of processing and storing the requisite data.
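A quick back-of-the-envelope check on that comparison, assuming a typical human hair is about 70 micrometres wide (our own assumption, not a figure from the FT report):

```python
# Sanity check: 1/20,000th of a human hair's width, in nanometres
hair_width_m = 70e-6                 # assumed hair width: ~70 micrometres
transistor_width_m = hair_width_m / 20_000

print(f"{transistor_width_m * 1e9:.1f} nm")  # prints "3.5 nm" - roughly the 3-nanometre scale TSMC is targeting
```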
A final note… on the hidden costs of technology
Those are scary numbers, but let’s put them into perspective – the same study found that while data centres’ computing output jumped 6x from 2010 to 2018, their energy consumption rose only 6%. Data generation and usage will keep growing, but the relationship with energy consumption is not linear, for a number of important reasons:
Efficiency is the name of the game: Processor efficiency has improved, idle power has fallen, storage drive density has increased, and server growth has slowed.
Hyperscale my data: The shift to cloud computing relies on hyperscale data centres – the largest and most efficient type of data centre, run by the likes of Amazon Web Services, Google and Facebook. They usually boast the most cutting-edge technology, which helps reduce energy costs. The shift from on-premises to large-scale data centres has in fact been shown to reduce the workload carbon footprint by 88% for the median surveyed US enterprise data centre.
Tech to the rescue! AI plays a crucial role in reducing the energy used for cooling – in 2016, for example, Google’s DeepMind AI cut the energy used for cooling at one of Google’s data centres by 40%, using only historical data collected from sensors within the data centre.
Location, location, location! Setting up data centres in cold regions helps reduce energy usage by removing the need for cooling equipment altogether. Microsoft is said to be experimenting with innovative technologies such as installing data centres on the seabed.
Renewables: Electricity accounts for c.70% of data centres’ total operating costs. As renewables become cost-competitive with fossil fuel alternatives, tech firms are committing to boosting their renewables exposure. Facebook, Google, Microsoft and Amazon were amongst the 10 largest buyers of renewable energy in the US in 2019.
As ever, rather than a threat, data centres arguably present an opportunity to accelerate the sustainable energy transition, because they sit at the nexus of energy efficiency, renewable energy and the burgeoning data economy enabled by digitalisation.
“On average, hyperscale data centres only require 16% of the power as compared to on-premises infrastructure for the same amount of computations.”
The value of investments and any income derived from them can go down as well as up and investors may not get back the original amount invested.
The information, opinions, estimates or forecasts contained in this document were obtained from sources reasonably believed to be reliable and are subject to change at any time.
Views and opinions have been arrived at by BMO Global Asset Management and should not be considered to be a recommendation or solicitation to buy or sell any companies that may be mentioned.
Meet Harry Waight
Harry joined BMO GAM in 2013, and works as a portfolio manager on the Global Equities team. He focuses on Japan, as well as global stocks contributing to Technological Innovation and Health and Well-Being. Outside of work he is interested in understanding history, as well as mastering Jiu-Jitsu – both of which are proving equally challenging.