May 16, 2005
New Communication and Information Technologies Emerging in the Workplace
Darlene Burt (CIBC)
What does your organization want or expect from your learning or training department? Comments: not experience, but outcomes or results. Proven ROI. Short-as-possible learning curve. High usage, good stuff, cheap price, increased sales, flexibility. Growing the employees.
We wanted all of that, and when we looked at it, we weren't there. We weren't as proactive as we wanted to be. In the future, we wanted to be proactive, to be able to report on results.
Focus on performance improvement, on measurement. What is your definition of 'performance improvement'? Comments: It's situational and context-dependent. Eg., in a contact center, based on call volumes, call lengths. Elsewhere, perhaps to increase employee satisfaction, management abilities. Also: moving from current performance to desired performance. In a regulated environment: conformance to a performance audit, eg., accounting.
How do you know that you get there? Through measurement. How do you measure that? What do you think about that? Comment: it's difficult, depending on the skill. Eg., if sales go down it won't necessarily be fixed by a training initiative. Also: they're useful as indicators, but not to be relied on as the ultimate truth. Also: zero-error environment, eg., airlines. What if you don't train? The cost is huge.
Does anybody have any measurement strategies? Comments: we budgeted a maintenance cycle into our training programs. We already have the money set aside. It almost forces the hand of the training department.
We determined we had a need, to show performance improvement. We wanted to show the value of what we bring to the organization, and we know the organization wants results. We want to be more proactive in determining needs and then responding to them. What was important was executive sponsorship; there is a cost involved, time involved. Our VP wanted to show business results. And he wanted to invest in employees, which would result in meeting those goals. Comment: so he already believed that this works. Yes. Comment: executive support not enough, you need engagement at the learning level.
Comment: why was he behind it? Good question. He is a people person; he believes in the employee. He has seen the link between employee satisfaction and customer satisfaction. Experience in the non-profit sector. Experience in what it means to be human: that's what really drove it. Maybe we were lucky. He knows that if you invest in people it's going to show. Comment: there has to be a process in place to channel the work you're doing into the outcome you want.
What we had been doing is a lot of classroom training, some directed consultative studies (managers delivering learning), usually based on a new process, new application, something that has to get out.
We wanted to make things a bit more regular, faster. The common denominator with training is that it takes the employees away from the phones for extended periods of time. That's why we can't deliver training for training's sake.
We introduced an e-learning tool that interfaces with the workforce management system and the automatic call distributor (ACD) and our quality management system. The tool works with these three systems to deliver training directly to the representatives' desktops, based on periods of low call volume. It can measure representatives' performance. And it generates reports to evaluate individuals' traits and performance capabilities.
Why did we choose it? It can proactively deliver individualized just-in-time content to designated users. Not only is it a delivery tool, it provides a 'student today' page - listing any assigned courses, pre-requisites, alerts and product change info, and a thing called "today's question", which reflects on, say, last week's learning. We also have on the 'student today' page something called 'quick tips'. We're compiling a library of best practices, tips, performance techniques; any representative can click on that. It can deliver a variety of content: text, audio, video, interactive, etc.
The tool: from a company in the States called Knowledgent. They've found the niche of the contact center, delivering content based on call volume downtimes. The tool itself is quite flexible; we can design the content, the duration of the learning breaks, the instructional design, etc. What we deliver is up to us. Comment: Did you need to do an IT assessment? Response: very much so. We have a pretty complex system, and when you consider the workforce management system on top, and when the learning system says it can integrate with them all, we had to know: how are you going to do that?
Comment: The last thing I would want is training during my downtime. Response: I see what you're saying. But it is company time. But there's still that human factor. That's something we considered, and it shaped the design of the courses; the employees really understand this and they want to build up their skills - that was one of their top-listed priorities. Comment: what is the employees' option to control the timing? Response: hopefully it's more than just-in-time; hopefully the training will be scheduled in advance. Comment: user choice? Response: users are assigned a number of courses that they are required to take. They have a set of optional courses they can take in their off-time.
Comment: would I eventually be able to log in at home? Response: probably not - firewalls, security, etc. We looked at doing that over dial-up. We looked at it for pavilion employees. But it's hard; any time could be interrupted by someone coming in. Comment: that could be tracked. Response: yes, we take account of that. Courses are 15-20 minutes long. Also, there is a system message that says 'Call volumes are exceeding thresholds, your learning is being interrupted'. And that will all be tracked.
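The scheduling behaviour described here - deliver training only during low call volume, interrupt a course when volumes exceed a threshold, and track every interruption - could be sketched roughly as follows. All class, method, and threshold names are my own assumptions for illustration, not Knowledgent's actual API.

```python
# Hypothetical sketch of threshold-based learning-break scheduling.
# Names and the threshold value are assumptions, not the vendor's API.

from dataclasses import dataclass, field

@dataclass
class LearningBreakScheduler:
    # Calls-in-queue level above which learning is interrupted (assumed value)
    call_volume_threshold: int = 10
    # Interruptions are tracked, as described in the talk
    interruptions: list = field(default_factory=list)

    def can_start_break(self, calls_in_queue: int) -> bool:
        """Only deliver training during periods of low call volume."""
        return calls_in_queue < self.call_volume_threshold

    def check_interrupt(self, rep_id: str, calls_in_queue: int):
        """If volumes exceed the threshold mid-course, interrupt and log it."""
        if calls_in_queue >= self.call_volume_threshold:
            self.interruptions.append(rep_id)
            return ("Call volumes are exceeding thresholds, "
                    "your learning is being interrupted")
        return None

sched = LearningBreakScheduler()
sched.can_start_break(calls_in_queue=4)        # quiet period: break allowed
msg = sched.check_interrupt("rep-017", calls_in_queue=15)  # spike: interrupted
```

The design point is simply that the learning system is driven by the ACD's live queue state rather than by a fixed timetable.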
Aside from the scheduling, the tool allows us to do simulation training. Done properly, it's probably one of the best ways to illustrate particular processes. It can incorporate every aspect of a customer call. It can be based on best practices. There are five simulations we use:
- Watch an expert: they are watching, not interacting. Their screen is a simulated environment of what they would experience during the call.
- Learn how and why. In addition to the above, little text boxes pop up and explain what's going on.
- You talk. They listen to the customer, they can talk to the customer, and the desktop does its own thing.
- You type. The converse; they listen to the conversation, and they do the desktop typing work.
- You do it all. Fully simulated environment: they speak, they work the desktop tools, etc.
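The five modes above differ only in which parts of the call the learner handles: the voice side, the desktop side, both, or neither (with or without explanations). A minimal sketch of that structure - my own illustrative model, not the vendor's data model:

```python
# Illustrative model of the five simulation modes; the flags capture
# which side of the call the learner handles. Names paraphrase the talk.

from dataclasses import dataclass

@dataclass(frozen=True)
class SimMode:
    learner_speaks: bool   # learner handles the voice side of the call
    learner_types: bool    # learner works the desktop tools
    explanations: bool     # explanatory text boxes pop up

SIMULATION_MODES = {
    "watch an expert":   SimMode(False, False, False),
    "learn how and why": SimMode(False, False, True),
    "you talk":          SimMode(True,  False, False),
    "you type":          SimMode(False, True,  False),
    "you do it all":     SimMode(True,  True,  False),
}
```

Seen this way, the modes form a progression from pure observation to full rehearsal of the call.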
As mentioned, tracking is built into all of this. Statistical information, who completed what and when. Additionally, we have reporting. Reports include things like course usage, representatives' performance, review (test) performance, time in training, and learning break reports. These are customizable, so we can review specific courses, specific learners, score ranges, even right down to individual questions.
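The customizable reporting described here amounts to filtering tracked records by course, learner, score range, or individual question. A minimal sketch under assumed field names:

```python
# Sketch of report filtering over tracked results. Record field names
# ("course", "learner", "score", "questions") are illustrative assumptions.

def filter_results(records, course=None, learner=None,
                   min_score=None, max_score=None, question=None):
    """Return tracked records matching the given report criteria."""
    out = []
    for r in records:
        if course is not None and r["course"] != course:
            continue
        if learner is not None and r["learner"] != learner:
            continue
        if min_score is not None and r["score"] < min_score:
            continue
        if max_score is not None and r["score"] > max_score:
            continue
        if question is not None and question not in r["questions"]:
            continue
        out.append(r)
    return out

records = [
    {"course": "C01", "learner": "rep-01", "score": 85, "questions": {"q1": True}},
    {"course": "C01", "learner": "rep-02", "score": 62, "questions": {"q1": False}},
]
# E.g., a report on everyone who scored below the pass mark on course C01:
low_scores = filter_results(records, course="C01", max_score=79)
```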
Implementation: a six-stage process: discovery process, learning strategy, learning council, development process, launch, challenges.
The discovery process: to align key business objectives with learning objectives. First, identify key business objectives - we needed to find those that training representatives could impact. So we looked at the representatives that were exceeding our expectations to see what they were doing that was setting them above the bar, identified root causal factors, and designed training accordingly. Then we needed to identify an approach to monitor and evaluate the learning.
The learning strategy: we analyzed the representatives' performance, from which we set learning objectives. We used the ADDIE approach. We set up learning breaks, animations, etc. We implemented these learning breaks and evaluated them through tracking and recording.
Learning council: this was to ensure that we were on the right track. A representation of all stakeholders, including team leaders, representatives, workforce quality people, executive and leadership. They look at marketing, today's questions and quicktips, and performance improvement selections. Membership on the council is on a rotating basis (except for senior team). There's constantly input from every division. From the representatives, there was an application process.
Development process: we had to determine the number of these 15-minute pieces we needed. It was hard to condense content into 15-minute chunks; it was a tough go. We then needed to identify subject matter experts; we had them in-house, which was great. And they were stoked about the new approach. And then we had the easy task of acquiring and analyzing content. There's a pretty linear progression through the courses; there are (say) 24 courses, and they will be assigned by the week (I'm not going to assign 24 courses at once).
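The weekly assignment scheme mentioned here - a linear sequence of (say) 24 courses released a few per week rather than all at once - can be sketched in a few lines. The batch size of three per week is my assumption, chosen only because it yields an 8-week schedule:

```python
# Sketch of weekly course assignment: split an ordered course list into
# weekly batches. The per-week batch size is an illustrative assumption.

def weekly_assignments(courses, per_week=3):
    """Return the ordered courses grouped into weekly batches."""
    return [courses[i:i + per_week] for i in range(0, len(courses), per_week)]

weeks = weekly_assignments([f"course-{n:02d}" for n in range(1, 25)])
# 24 courses at 3 per week gives an 8-week schedule
```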
Implementation: we outsourced course requirements to an external solutions provider and used in-house instructional designers and developers.
Launch: we employed a staggered launch approach. Launch A, for the customer contact center (phone center). The team leads see the courses a week ahead, which gives them a head start. The launch period will last 8 weeks; this will be our yardstick. We identified a control group to measure impact. Launch B, for the pavilions, starts one week after; we identified five pavilions as our treatment group.
Challenges: New team members - we went from 8 to 24; they needed to know bank terms, acronyms, jargon, etc. New technology - not just for the users, this is also a new authoring tool. Communication - we needed to get everybody on board. If we don't have buy-in it's not going to work. Outsourcing - we had to get them on board with bank terms, bank culture. And we had to get them on the same page with respect to quality. The 'little things'.
A focused, consistent delivery of learning. No matter where they are in the system, we want them to have a consistent experience, whether in call center or pavilion.
Retention of knowledge and skills: we focused on the importance of the role of team leaders, coaching and the use of this tool.
Ongoing investment in people development.
Measurement performance instrument: we set as an objective to have 100 percent of representatives take 100 percent of courses, with an 80 percent pass rate. The actual target is an 8-10 percent increase in sales of one product: learning how to talk about the product, to market the product. The objectives were based on Knowledgent's previous work.
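These targets - full course completion, an 80 percent pass rate, and an 8-10 percent sales lift on one product - are simple enough to check mechanically. A sketch, with a hypothetical helper and made-up inputs:

```python
# Sketch of checking the stated measurement targets. The helper and its
# inputs are illustrative assumptions, not the actual instrument.

def targets_met(completed, assigned, passed, taken, sales_before, sales_after):
    """Check the three stated targets; returns one flag per target."""
    completion_ok = assigned > 0 and completed == assigned   # 100% completion
    pass_rate_ok = taken > 0 and passed / taken >= 0.80      # 80% pass rate
    lift = (sales_after - sales_before) / sales_before
    sales_ok = lift >= 0.08                                  # 8-10% lift target
    return completion_ok, pass_rate_ok, sales_ok
```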
Comment: how much did this cost per employee? Response: I don't know.
We also wanted to focus on people development and improvement. Employee training was a key factor; every year employees identify training as a top priority in their employee surveys.