How companies can ready their teams to harness the power of AI

Getting the most value out of AI comes from making sure employees feel ready and willing to use the technology

Less than two years later, more than 35,000 employees regularly use Telus’s in-house gen AI tools in their work, with applications ranging from analyzing reports to writing code to generating images. To get to that point, the company had to ensure employees felt ready and able to use the technology without fear of failure, said Hesham Fahmy, the company’s chief information officer.

After rolling out its first tool, Telus created a sandbox of internal, secure and private programs that employees can experiment with and find uses for in their own roles.

“Embracing gen AI is not just about adopting the technology; it’s also about developing a skilled, digitally savvy workplace so we can truly harness (its) power,” Fahmy said.

About 60 per cent of Canadian businesses have adopted AI in some form, according to KPMG’s 2023 global technology report. Lewis Curley, a partner with KPMG Canada’s people and change practice and the firm’s national human resources transformation leader, said organizations’ ability to get the most value out of AI comes from making sure employees feel ready and willing to use the technology.

Workplace AI tools fall into two camps, he said: those built for specific use cases that employees will have to use as part of their jobs, and assistant-type programs that aren't required but can help employees do their jobs more efficiently. Gen AI applications largely fall into the latter group.

“Whether or not I use it, and how well I use it, comes down to my understanding of how to use the technology and my willingness to use the technology,” Curley said.

Avi Goldfarb, a professor of marketing at the University of Toronto’s Rotman School of Management and the Rotman Chair in Artificial Intelligence and Healthcare, said employers should hold information sessions to explain how new AI programs work and how they can and cannot be used in the workplace.

He said employees don’t need to understand the engineering behind gen AI tools, but they should be made aware of their capabilities and limitations.

“It has language, but it doesn’t think. It’s not a good idea to anthropomorphize it too much,” he said. “People have to understand that the information coming out of it … is built from data and it makes predictions, and its predictions are not perfect. Once you understand they’re imperfect, you’re going to use them more effectively.”

Curley said training is important, but he believes one of the most effective ways to drive adoption is providing tips and tricks on how large language model (LLM)-backed programs can be most beneficial.

“‘Oh, that’s cool, but it doesn’t really help me’ is not going to make people go back to it,” he said.

Terri Griffith, a professor of innovation and entrepreneurship at Simon Fraser University’s Beedie School of Business, said organizations should have clear rules around what data can be fed into a gen AI product, particularly if they’re using off-the-shelf tools or if their internal tools are powered by an externally created LLM.

She added that companies with any federally protected data will have to be particularly clear about these expectations.

A woman reads outside a Bell Canada office in Toronto, Ont. Photo by Brent Lewin/Bloomberg files

Bell’s customer simulator was trained on historical customer service data and allows agents to select the type of call or customer and work through the simulated interaction until they resolve the issue, said Michel Richer, the company’s senior vice-president of enterprise solutions, data, engineering and AI.

But before they can use the tool, employees have to complete mandatory training covering how it should and shouldn’t be used, security and privacy, and a reminder that the interface is an assistant “but (employees are) responsible for the end product (they’re) producing,” he said.

Richer’s team has also hosted well-attended voluntary training sessions, and they promote any new functions they add. He said there are a few thousand daily sessions on the interface.

“We were pleasantly surprised by the uptake,” he said.

Since the tool launched, Richer said employees have been sharing ideas with his team about opportunities they see for integrating AI into their roles.

“It’s becoming a lot more of a dialogue, of employees actually pulling on the AI teams to provide them with tools — a lot more than we’ve seen in the past,” he said.

Curley at KPMG said workforce-readying initiatives should also include planning for the additional time employees stand to gain if the technology assists them as intended.

“A really interesting place that organizations can start to look at for generative AI is … freeing up people’s time to go and launch a new product, service or business line,” he said.

Fahmy said that as Telus’s employees across the organization, from frontline roles to the boardroom, use gen AI, the technology has freed up time for “things that really drive value: creativity, strategic thinking and solving complex problems in ways we had never thought possible.”
