Implementing AI and overcoming barriers in GBS and Shared Services
There’s plenty of hype around AI, and for good reason. It’s the fastest-moving technology development of my lifetime, and what’s possible with AI changes week by week. Used well, AI can speed up manual, repetitive work, save time, and free service teams to focus on value-added work. Yet despite all the buzz, it’s clear that many businesses are stuck on the puzzle of how to realistically implement AI in their organization.
Last month, I hosted a think tank at the Shared Services & GBS Leaders Exchange and heard firsthand the barriers people on the ground are facing when trying to introduce AI.
More than 20 global business leaders were present, spanning shared service industries from manufacturing to finance to e-commerce. We split the attendees into two groups based on where they were in their AI journey:
- Stormers: those who are already implementing AI and seeing results
- Strivers: those who want to implement AI but haven't yet
AI barriers are preventing adoption
It was telling that 75% of attendees fell into the 'Striver' group, citing barriers including:
- Security and compliance concerns
- Buy-in from the board/ROI concerns
- Understanding the right AI tool for use cases
- IT infrastructure/data readiness
- Business resistance
Although attendees were unanimous in their desire to implement AI, a range of considerations were holding them back.
Before you can overcome barriers, you really need to understand what you want to use AI for. It’s all very well having amazing technology at your fingertips, but if you don’t know what to use it for, you’re not going to get far. Once you know what you want to achieve with AI, then you can make a genuine case for it and iron out the issues.
Know your use case
The 25% of think tank attendees who fell into the 'Stormer' team have already had early AI success with use cases such as e-commerce reconciliation, P2P chatbots, invoice capture, and dispute management.
It’s essential to know what problem you want to solve with AI, rather than thinking, “this shiny new tool looks impressive, I’d better start using it.” In my opinion, the best way to identify AI use cases is to examine your end-to-end service processes for steps that involve a lot of manual, repetitive work, and consider where an AI co-pilot could help. For instance, if you handle a lot of forms, you could introduce Intelligent Document Processing (IDP) to scan and populate information that would otherwise be keyed in manually. Similarly, if you deal with large volumes of service emails, email triage and sentiment analysis are great implementations for speeding up the work.
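To make the email-triage idea concrete, here is a minimal, illustrative sketch. It uses simple keyword rules and a crude negative-word count as a stand-in for sentiment analysis; a real deployment would swap these heuristics for a trained model or a hosted AI service, and the queue names and keywords below are purely hypothetical.

```python
# Minimal email-triage sketch: route service emails by keyword rules and a
# crude sentiment score. The keyword lists and queue names are illustrative
# stand-ins for what a trained model or AI service would provide.

NEGATIVE_WORDS = {"unacceptable", "angry", "complaint", "refund", "escalate"}
ROUTES = {
    "invoice": "P2P team",
    "payment": "P2P team",
    "dispute": "Dispute management",
    "order": "E-commerce reconciliation",
}

def triage(subject: str, body: str) -> dict:
    text = f"{subject} {body}".lower()
    # Route to the first matching queue, else a general inbox.
    queue = next((q for kw, q in ROUTES.items() if kw in text),
                 "General service desk")
    # Crude sentiment: count negative keywords; flag for priority handling.
    negatives = sum(word in text for word in NEGATIVE_WORDS)
    return {"queue": queue, "priority": "high" if negatives >= 2 else "normal"}

print(triage("Invoice dispute",
             "This is unacceptable, please escalate my complaint."))
# → {'queue': 'P2P team', 'priority': 'high'}
```

Even a rules-based sketch like this shows the shape of the win: the routing decision that a person makes dozens of times a day becomes a function call, and the AI model simply replaces the heuristic inside it.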
Recognise what is and isn’t AI
AI has become a catch-all buzz term, but it’s important to make a clear distinction between artificial intelligence and other types of automation. The way I see it, AI falls into one of the following categories:
- Specific AI models: predictive models that data scientists build on your own data.
- Narrow-field AI: a product with built-in AI models trained for a specific purpose, e.g. IDP to extract invoice data.
- Generic generative models: Bard, Midjourney, ChatGPT, Stanford Alpaca, or other general-purpose models.
If you are thinking about implementing RPA, Rule Engines, iPaaS, or Low Code, these are not AI and belong in a different conversation.
Introduce GenAI in low-risk scenarios
While you will understandably want to exercise caution when using AI in front-line operations, there are certain roles and departments for which using AI poses low risk. If your job centres on creating, e.g. Graphic Designer, Coder, or Copywriter, then using GenAI right now is low-risk. Even in our own organization, our Content Team uses AI for tasks such as proofreading, and our Coders use AI for the first draft of the code they write. These teams, which already have embedded working processes of testing, quality control, and proofs, are a no-brainer for speeding up repetitive tasks.
The way I'd recommend approaching generative AI is:
- Identify everyone in the organisation whose job involves creating (writing, designing, and building).
- Create task forces for each skill and let them find the best AI co-pilot for their task, e.g. Midjourney for Graphic Designers or ChatGPT for Copywriters.
- Test and procure low-risk tools to support individuals in your organisation.
Employ a Chief AI Officer to handle the serious stuff
If your job title includes words like 'delivery', 'process', or 'execute', it’s important to put guardrails around GenAI and manage risk, which is where something like orchestration comes in. You need a way of measuring the outcome you’re expecting, and you need a crystal-clear policy around data, how to manage it, and what the appetite is to use your organization’s data in training other models.
I would go as far as to say that you will need to appoint someone in your business to be responsible for AI safety. For organizations that want to get serious about AI, a Chief AI Officer, with that very specific skill set, is going to be required very soon.
Orchestrate your service lines
If you are working in any kind of service delivery team, orchestration is the first layer needed to really get your operational ducks in a row and see where automation can bring the most benefit. Introducing orchestration enables you to view, run, and manage end-to-end processes using one single source of truth, with the capability to plug and play various automation and AI technologies. By being able to step back and see your whole process from start to finish, it will be much easier to identify gaps and bottlenecks and work out the potential for AI and automation to shine. It will also allow all your processes to be connected under one system rather than having disparate bits of automation and processes that slow everything down and are prone to human errors and inconsistencies.
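The "single source of truth with plug-and-play automation" idea above can be sketched in a few lines. This is an illustrative model only, not Enate's implementation: one process definition holds the end-to-end flow and an audit trail, and an automation or AI handler can be plugged into any individual step without changing the rest of the process. All names here are hypothetical.

```python
# Illustrative-only sketch of orchestration: one process definition is the
# single source of truth, each step can be manual or handled by plugged-in
# automation/AI, and an audit trail gives end-to-end visibility.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Step:
    name: str
    handler: Optional[Callable[[dict], dict]] = None  # None = manual step

@dataclass
class Process:
    name: str
    steps: list[Step]
    audit: list[str] = field(default_factory=list)  # end-to-end visibility

    def run(self, case: dict) -> dict:
        for step in self.steps:
            if step.handler:                       # plugged-in automation/AI
                case = step.handler(case)
                self.audit.append(f"{step.name}: automated")
            else:                                  # routed to a person
                self.audit.append(f"{step.name}: manual")
        return case

# Hypothetical example: swap the capture step from manual work to an IDP
# handler without touching the rest of the process definition.
idp_extract = lambda case: {**case, "invoice_no": "INV-001"}  # stand-in for IDP
p2p = Process("Invoice handling",
              [Step("Capture", idp_extract), Step("Approve"), Step("Pay")])
result = p2p.run({"source": "email"})
print(p2p.audit)  # → ['Capture: automated', 'Approve: manual', 'Pay: manual']
```

The point of the sketch is the design choice: because the flow and the automation are decoupled, you can see exactly which steps are still manual (the gaps and bottlenecks) and upgrade them one at a time.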
At Enate, we recently introduced AI into our orchestration platform so that customers can automatically take advantage of sentiment analysis, email triage, data extraction, intelligent document processing, and query automation simply by switching these features ‘on’ in the marketplace. There’s no need for complex machine learning models, as we’ve built our own private brain that seamlessly wraps around your data. Our security is watertight: we offer Microsoft Azure-level compliance, and your data never leaves the private cloud it’s stored on.
If you’d like to know more about using orchestration and AI to solve specific business cases, you can book some time with us here.
AI in Operations
'Operational soup' is a term we use for work that is being carried out while the business has little idea how much there is, who is doing it, or exactly how it is processed.
Start orchestration in departments with strong use cases to deliver value quickly. Good examples can often be found in back- and middle-office process areas with high variation and complexity, such as finance or HR operations. Recent intelligence sourced through process mining suggests that more than 80% of the work performed in a shared services organization happens not in ERP systems but in Excel or Outlook. This is where orchestration thrives.
"Having orchestration implemented across your departments can be likened to having x-ray vision into your operations." - Global Head of Operations at TMF
Almost half of banking and investment CIOs (49%) and insurance CIOs (44%) indicated that they would increase their automation investments in 2021 (Source: Gartner, 2021).