Why Technology Still Doesn’t Matter Today

While digital strategy and transformation are top agenda items for CEOs and executive teams, the reality is that most companies are still struggling to realize the projected impact of their current technology investments. Research shows that industries and companies have a long way to go before technology truly matters, and much work remains to answer the key question of how to make it matter.

“Technology doesn’t matter” – this was my opening line in a “Future of IT” presentation to the IT executive team of one of the largest oil companies in the world. As you can imagine, coming from someone who used to lead Technology Strategy practices at leading consulting firms, it was an eye-popping way to open a speech.

But I was serious! Technology doesn’t matter for many of my clients. There are too many technologies they are not actively exploring and deploying. They might be experimenting with emerging technologies such as AI, but those experiments are isolated in small groups and projects and haven’t been scaled across the enterprise. Their IT projects don’t always have business cases associated with them. Multi-year IT programs are often over budget and behind schedule. They struggle to keep the systems running. And for many companies, it is a daunting task just to tally up total technology spending across the enterprise. Some can’t even quickly tell me how much they spent on IT the year before, and I get a different figure every time I ask!

My experiences were not isolated incidents. Let’s look at some sobering statistics: 

  • The McKinsey Global Institute in May 2019 estimated that digital maturity among traditional incumbent firms is still only about 25% of its potential. “Only 26 percent of worldwide sales were made through digital channels, only 31 percent of operations volume was being digitally automated, and 25 percent of interactions in supply chains was being digitized.”
  • A 2019 State of Cloud survey found that while 90% of companies now use cloud technology, more than 60% of companies spend less than $1M on cloud. Cloud adoption is still nascent at many companies. Yet even at this early stage, respondents already see waste: they estimated that 27 percent of their cloud spend is wasted.
  • While AI is a hyped buzzword seen everywhere, its adoption is in reality also just emerging. A WSJ article found that 30% of organizations are only conducting AI pilots, and just 21% report using AI across multiple business functions. AI investments are still quite small, with 58% of respondents saying that less than one-tenth of their digital budgets goes toward AI.
  • Data, the oil that fuels the future, has significant quality problems. MIT Sloan estimated that employees waste up to 50% of their time on mundane data-quality tasks. A Dun & Bradstreet survey found that 41% of companies cite inconsistent data across technologies (CRMs, marketing automation systems, etc.) as their biggest challenge.

Healthcare is an industry in which I have done a lot of recent client work. One of its key priorities has been deploying Electronic Health Records (EHRs) to drive better-quality care. While EHR adoption across hospitals has jumped significantly, the full benefits have yet to be realized:

  • In its 2018 update to Congress, HHS reported that as of 2015, 96 percent of non-federal acute care hospitals and 78 percent of office-based physicians had adopted certified health IT. However, not everyone who needs the data has access to it: patients, healthcare providers, and payers alike.
  • While EHRs could potentially support all facets of hospital care, their usage remains limited. For example, in a recent ONC study of hospital EHR usage, only 60% of hospitals used EHR data to identify gaps in patient care, only 59% used it to assess adherence to clinical guidelines, and only 51% used it to develop an approach to query for patient data. Furthermore, while some EHR vendors achieved significant adoption, users of certain other EHR products were more likely not to use EHR data at all.
  • In another recent study, researchers found extensive use of EHR workarounds: nonstandard procedures often used to compensate for gaps in system or workflow design. Workarounds included handwritten notes, emails, verbal discussions, and patient summary reports, all of which can introduce significant quality issues into EHR records.

Not only are technologies not being used effectively; they are also often liabilities. For example, a 2017 Accenture report indicated that 26 percent of U.S. consumers had had personal medical information stolen from healthcare information systems, and half of them incurred approximately $2,500 in out-of-pocket costs per incident. In 2019 alone, companies such as Wells Fargo, Delta, Southwest Airlines, Kroger, and Target all had widely reported system outages.

In a 2003 HBR article, Nicholas Carr famously declared: “IT doesn’t matter.” His argument was that what makes a resource truly strategic is scarcity, not ubiquity. By 2003, IT had become available and affordable to all; it therefore no longer mattered, because it was fast becoming infrastructure that every company could access. However, as he correctly pointed out in the same article, there are proprietary technologies that can be owned, actually or effectively, by a single company. As long as they remain protected, proprietary technologies can be the foundation for long-term strategic advantage, enabling companies to reap higher profits than their rivals. Unfortunately, not many companies have truly invested in creating proprietary technologies and building competitive differentiation through technology. One question I often ask my clients is how many patents their IT organization has developed. Patent count may not be the best indicator of competitive differentiation, but it is an easy number to grasp, and the information is publicly available. Unfortunately, for many of my clients so far, the number of IT-created patents is zero.

So many companies are not widely adopting available technologies. They are having a difficult time preventing technology from becoming a liability. And they are not using technology to create competitive differentiation. For those companies, does technology really matter?

In the meantime, many companies are declaring that they are becoming technology companies. They realize that without technology, staying relevant and competitive in the Fourth Industrial Revolution will be mission impossible.

What are the root causes of the disconnect between the ambition to become a technology company and the reality of making technology actually matter? How do companies make technology matter more and get better outcomes from the technology investments they are already making? What does becoming a technology company really mean? How can companies accelerate that transformation? What do they need to do differently? On this website, I will share my observations and propose some actions. I look forward to hearing your feedback!

Copyright © 2024 Parker Shi. All rights reserved.
