Thursday, November 21, 2024

Energy utilities and cloud adoption – accelerators and impediments

The utility industry has historically been shy about jumping on the cloud computing bandwagon. It wasn’t long ago that a conversation about migrating to the cloud would be met with scepticism at best. There was plenty of reasoning and comparison with the success of the financial industry, a similarly complex, heavily regulated business handling sensitive data. “But we are unique” – that was the standard answer. And that position wasn’t entirely inaccurate.

Fast forward a few years and the outlook has changed. More than 75% of utilities now use the cloud, with significant investments planned over the next few years. Enterprise applications such as Enterprise Resource Planning (ERP) or Human Capital Management (HCM) delivered as SaaS were the most obvious places to start, but the mix is shifting towards more sensitive and complicated workloads.

There are several factors driving this change.

Customer Experience

A utility, at the end of the day, provides a service, and it no longer has the protection of being a monopoly. It is not hard for today’s customer to compare it with their favorite ride-sharing, food delivery, hospitality, or retail service and expect the same level of quality. Customers expect their utilities to deliver an outstanding customer experience – one that is fluid, intuitive, rapid, and interactive. The only way to keep up with a level of experience that has been partly enabled by the hyperscalers is to adopt a hyperscaler yourself. Digital transformation is no longer a choice, and the agility a hyperscaler provides is hard to match with a legacy data centre.

Energy Transition

Today’s customer is also incredibly informed and socially conscious – energy transition and sustainability are no longer mere discussions but have become a basic expectation. Combine that with changes in regulation and government policy around the world, and the industry finds itself at the centre of a seismic shift – whether you are a willing participant or not, change is coming. The energy consumption pattern is changing, the grid is changing, the consumer is changing – managing supply and demand and planning for reliability is much more complex than before. Smart devices, microgrids, home generation, vehicle-to-grid, and the Internet of Things are transforming the average consumer into a prosumer. The need for flexible, scalable computing power for complex forecasting, modeling, and planning makes this a textbook use case for cloud providers.
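
To see why this is such a natural fit for elastic compute, consider a toy Monte Carlo peak-demand study in Python. The sketch below is illustrative only: the load figures are made up, and process-level parallelism stands in for the fleets of short-lived cloud workers a real planning run would fan out to.

```python
# Toy Monte Carlo peak-demand study: embarrassingly parallel, so it maps
# naturally onto short-lived cloud workers. All figures are illustrative.
import random
import statistics
from concurrent.futures import ProcessPoolExecutor

HOURS = 24 * 365          # one year of hourly intervals
BASE_LOAD_MW = 900        # assumed average system load
EV_ADOPTION = 0.15        # assumed share of homes charging EVs overnight

def simulate_peak(seed: int) -> float:
    """Simulate one year of hourly load and return the annual peak (MW)."""
    rng = random.Random(seed)
    peak = 0.0
    for hour in range(HOURS):
        weather = rng.gauss(1.0, 0.1)                          # demand swing from weather
        ev = EV_ADOPTION * 200 if (hour % 24) >= 22 else 0.0   # overnight EV charging
        load = BASE_LOAD_MW * weather + ev + rng.gauss(0, 25)
        peak = max(peak, load)
    return peak

if __name__ == "__main__":
    # Fan 1,000 scenarios out across processes; on a hyperscaler this loop
    # could instead dispatch to as many transient workers as the deadline needs.
    with ProcessPoolExecutor() as pool:
        peaks = list(pool.map(simulate_peak, range(1000)))
    peaks.sort()
    print(f"median peak: {statistics.median(peaks):.0f} MW")
    print(f"1-in-20 peak: {peaks[int(0.95 * len(peaks))]:.0f} MW")
```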

The Analytical Utility

Utilities are investing billions of dollars to make the devices in the power grid remotely IP-addressable. According to Navigant Research, nearly 1.1 billion smart meters were expected to be installed by 2022, and the number of smart meters is only one indicator of the growth in sensors on an average grid. All of this acceleration points to an unprecedented amount of data being produced. Data is power, the key to unlocking value, and a utility that sits on its data will be eaten up by fast movers.
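
A rough back-of-the-envelope calculation shows how quickly that data piles up. The meter count is the Navigant figure above; the 15-minute reading interval and 50-byte record size are my own illustrative assumptions.

```python
# Back-of-the-envelope sizing of smart meter interval data.
# The meter count comes from the Navigant figure above; the 15-minute
# interval and 50-byte record size are illustrative assumptions.
METERS = 1.1e9
READINGS_PER_DAY = 24 * 4        # one reading every 15 minutes
BYTES_PER_READING = 50

daily_readings = METERS * READINGS_PER_DAY
daily_bytes = daily_readings * BYTES_PER_READING

print(f"readings per day: {daily_readings:.2e}")                  # ~1.06e11
print(f"raw volume per day: {daily_bytes / 1e12:.1f} TB")         # ~5.3 TB
print(f"raw volume per year: {daily_bytes * 365 / 1e15:.1f} PB")  # ~1.9 PB
```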

The world’s largest taxi company is not really a taxi company, the world’s largest hotel chain is not really a hotel chain, and the fastest growing car company is fundamentally a data company (Tesla anyone?). And the utility of the future need not be a conventional utility company.

Case in point: Google. Google’s data centres consume more power than the State of Hawaii. Between 2010 and 2018, the amount of computing done in Google data centres globally increased by 550%, yet energy consumption increased by only 6%. To top it all, since 2017 Google has matched 100% of its total electricity use with renewable energy purchases, largely through Power Purchase Agreements (PPAs) at scale. Google relies on hourly modeling and advanced analytics to forecast its global energy requirements and to predict the carbon versus carbon-free content of the grid by location. Google has also committed itself to solving the technology challenge of optimizing supply and demand and to accelerating the commercialization of next-generation resources. A DeepMind system is being used to better predict and use wind power, and ML is being used for carbon-intelligent load shifting (a fancy name for shifting flexible compute tasks to align with greener hours on the grid). The utility of the future will not be a conventional utility company. Mind you, I have not even considered the massive disruptions that Elon’s gigafactories might usher in once Tesla, SpaceX and Twitter consume less of his attention.
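
Carbon-intelligent load shifting is simple to sketch, even if Google’s production system is far more sophisticated. The snippet below is a minimal greedy version with invented hourly carbon-intensity numbers: deferrable batch work is simply placed into the greenest forecast hours of the day.

```python
# Minimal sketch of carbon-aware scheduling: place deferrable batch jobs
# into the hours with the lowest forecast grid carbon intensity.
# Intensities (gCO2/kWh) are made-up hourly values, not real grid data.
forecast_intensity = [
    420, 410, 400, 390, 380, 360, 300, 250,   # 00:00-07:00
    200, 170, 150, 140, 150, 170, 210, 260,   # 08:00-15:00 (solar hours)
    320, 380, 430, 450, 440, 430, 425, 420,   # 16:00-23:00
]

def schedule(hours_needed: int, intensity: list[float]) -> list[int]:
    """Return the hours of the day chosen for a deferrable job needing N hours."""
    ranked = sorted(range(len(intensity)), key=lambda h: intensity[h])
    return sorted(ranked[:hours_needed])

chosen = schedule(6, forecast_intensity)
avg = sum(forecast_intensity[h] for h in chosen) / len(chosen)
print(f"run during hours {chosen}, average intensity {avg:.0f} gCO2/kWh")
```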

Reliability and Resiliency

For the grid, reliability is paramount – and this translates into reliability and resiliency requirements not just for crucial grid management software, but also for the software and applications used to forecast, model, plan, and interact with customers. Not long ago, three 9s (99.9%) was an acceptable standard. Now you are not even competitive unless you are at five 9s or seven 9s – another textbook use case for a hyperscaler. Gone are the days when you would worry about keeping a primary and a secondary data centre up, performing periodic switch-overs, or risking overloading and tripping a power line because a power unit in your data centre failed. All of that nightmare is now the hyperscaler’s to own, allowing you to focus on your job: balancing the grid, operating your business, and keeping your customers happy.
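
For context, the jump between those “nines” is bigger than it looks. The quick calculation below converts each availability level into allowed downtime per year – plain arithmetic, with no utility-specific assumptions.

```python
# Allowed downtime per year for a given number of "nines" of availability.
MINUTES_PER_YEAR = 365 * 24 * 60

for nines in (3, 5, 7):
    availability = 1 - 10 ** -nines           # e.g. 3 nines -> 99.9%
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{nines} nines ({availability:.5%}): "
          f"{downtime_min:.2f} minutes of downtime per year")
# 3 nines -> ~8.8 hours; 5 nines -> ~5.3 minutes; 7 nines -> ~3 seconds
```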

Security

The question to ask is not whether the cloud is secure, but whether you are using the cloud securely. A subtle difference, but a perspective-changing question. Utilities have grappled with this notion significantly in the past. “I cannot risk a cyber security event” or “my data is too precious to be shipped away out of my sight” were common concerns a few years ago. But utility leaders are waking up to the fact that, no matter the size of your cyber security budget, you are not going to match the billions of dollars that cloud providers are pouring into making their services secure. It is better to trust the best in the business and focus your attention on sensible policy definitions and on enforcement of those policies.
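
In practice, “sensible policy definitions and enforcement” often takes the shape of policy-as-code: machine-checkable rules evaluated continuously against your cloud inventory. The sketch below is deliberately provider-agnostic – the inventory format and the two rules are invented for illustration, not a real cloud API.

```python
# Generic policy-as-code sketch: evaluate simple security rules against a
# cloud resource inventory. The inventory format and rules are invented for
# illustration; real tooling would pull this data from the provider's APIs.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    kind: str
    public: bool
    encrypted: bool

POLICIES = [
    ("no-public-storage", lambda r: not (r.kind == "bucket" and r.public)),
    ("encrypt-at-rest",   lambda r: r.encrypted),
]

inventory = [
    Resource("meter-readings", "bucket", public=False, encrypted=True),
    Resource("outage-reports", "bucket", public=True,  encrypted=False),
]

for resource in inventory:
    for policy_name, rule in POLICIES:
        if not rule(resource):
            print(f"VIOLATION: {resource.name} fails {policy_name}")
```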

Although all of this looks positive, there are several impediments to this growth.

Regulatory

Throughout history, utilities have been rewarded for investing in capital expenditures (capex) – long-lived physical assets like transmission and distribution infrastructure, pipelines, meters, and power plants – by being allowed to collect a set rate of return on that spending. By contrast, operating expenses (opex) – ongoing costs like fuel, maintenance, and supplies – cannot earn a return: a utility can recover those costs from customers, but no more. Among the unintended consequences of this incentive structure is a preference for capitalized “on-premises” costs over the opex costs of cloud computing. Despite this disincentive, utilities have slowly shifted some computing needs to the cloud, driven by the need for digital transformation. Several solutions have been proposed to this regulatory conundrum, including notable proposals in New York and Illinois, and leading accounting firms have published studies discussing multiple accounting approaches along with the merits and demerits of each.
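
A stylised cost-of-service calculation makes the incentive concrete. The dollar figures and rates below are invented for illustration; the shape of the formula – a return on the capitalised rate base plus recovery of expenses at cost – is the one described above.

```python
# Stylised cost-of-service comparison: capitalised on-premises hardware earns
# a return on rate base, while cloud subscription fees are passed through at
# cost. All dollar figures and rates are illustrative, not from any filing.
CAPEX = 10_000_000        # on-prem data centre build, capitalised
LIFE_YEARS = 10           # depreciation life
ALLOWED_ROR = 0.09        # allowed rate of return on rate base
CLOUD_OPEX = 1_200_000    # annual cloud subscription, expensed

# Year-1 revenue requirement if the utility capitalises the asset:
depreciation = CAPEX / LIFE_YEARS
return_on_rate_base = CAPEX * ALLOWED_ROR
on_prem_revenue_req = depreciation + return_on_rate_base   # includes earned return

# Year-1 revenue requirement if the same need is met with cloud opex:
cloud_revenue_req = CLOUD_OPEX                             # recovered at cost, no return

print(f"on-prem: ${on_prem_revenue_req:,.0f} "
      f"(of which ${return_on_rate_base:,.0f} is earned return)")
print(f"cloud:   ${cloud_revenue_req:,.0f} (no earned return)")
```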

Many regulators still do not allow utilities to earn a rate of return on their cloud investments, and utilities want significantly more clarity from regulators on this point. Regulatory acceptance continues to be a barrier to cloud adoption.

Talent

Good talent is hard to find, and good talent to migrate your workloads is even harder. Cloud migration cannot be approached as a mere lift and shift. If all you are doing is recreating your data centre in the cloud, you have done yourself a big disservice. Many adopters fail at exactly that – transplanting the data centre into the cloud, dusting off their hands, and calling it a success – thanks in part to self-appointed experts. A recent PwC survey found that 53% of companies aren’t realizing substantial value from their cloud investments, in part due to the unavailability of digital talent. Another survey, by 451 Research, indicated that 90% of organizations are experiencing a shortage of cloud skills.

Complex workloads

Current progress along the migration roadmap acts as an impediment as well. You started with a migration strategy – a quadrant that showed the easy-to-execute, low-hanging fruit: the straightforward migrations. Now that the low-hanging fruit has been taken care of, where do you go? Moving complex workloads requires complex strategies – re-engineering, re-platforming, and complex transformations – and those are not exactly easy to execute. Combined with the lack of talent, this represents a significant impediment.

Regardless of the impediments, innovation always finds a way – that is its very nature. The forces accelerating adoption are far more powerful than the impediments slowing it down. The next decade will redefine what it means to be a utility, and the cloud and its native capabilities will prove to be the enabling factor in that redefinition.

Disclaimer: Any views or opinions expressed in this article are personal and do not represent those of people, institutions or organizations that the owner may or may not be associated with in a professional or personal capacity. Any views or opinions are not intended to malign any religion, ethnic group, organization, company or individual.
