"If you go back a year, probably 20% of the projects that we were receiving were generative AI projects. Today that's 50-50. The generative AI side is accelerating even more, so that's where we see the growth."
– Donald MacDonald, Head of Group Data Office, OCBC Bank
- About This Video
- Transcript
As one of the first banks in the region to scale generative artificial intelligence (AI) applications, OCBC’s Head of Group Data Office, Donald MacDonald, discusses how the bank transformed its processes, technologies, and employees to enable the rapid adoption of generative AI.
INFocus Series
INFocus provides exclusive insights and trends from experts and leaders across the Asia Pacific region, exploring the forces, opportunities, and challenges shaping its future.
Donald MacDonald
We've definitely had to scale up and change the way we work because we see huge demand for generative AI across the organization. To handle that growth, we've had to transform our people, our technology, and our process.
[series intro clip]
Na Zhou
I'm Na Zhou, partner at Oliver Wyman, and AI lead for the Asia Pacific region. I'm here with Donald.
Donald
I’m Donald MacDonald, head of group data office at OCBC.
Na
Donald, pleasure to have you here. Today we want to talk about AI. But out of everything we could discuss about AI, I want to focus on something relevant for practitioners scaling AI for impact.
Now, OCBC is probably one of the first banks in the region to have scaled up generative AI applications to all of your 30,000 employees. So, my question to you – how were you able to move so fast?
Donald
I think we were able to move fast because we were very quick to identify the transformative power of generative AI, but also, we weren't starting from scratch. We had a lot of the key building blocks in place already from work we had done in the past. If you go back to 2019, OCBC was already using the first generation of large language models, things like BERT and FinBERT, and we were using those to read the news, to identify bonds at risk of default, identify credit early-warning signals, even make stock recommendations to our customers.
It was a space the team was actively monitoring and keeping abreast of. We also had capabilities in place: our team had skills in deep learning and natural language processing, which allowed us to move quickly to generative AI. We had on-premise GPU clusters, and we had quite a mature MLOps practice as well, which allowed us to bring in these models, build applications, and deploy them out to end users. Having those kinds of basics in place was very, very applicable to the generative AI world.
Because we were actively monitoring the space, the team very quickly identified, when ChatGPT came out at the end of 2022, that this was better than what we were already using. We could see that it allowed us to do a lot of things much, much faster that would have taken a long time with traditional AI in the past, but it also allowed us to do things that we couldn't do before. Things like our coding assistant, for example. Two years ago, I could never have imagined we could have built that, let alone built it within one week of effort. Those kinds of things showed us that generative AI was truly transformative, and that we wanted to double down and go big on this space.
With that in mind, we basically identified about ten use cases across the organization we thought would be valuable. We took it to the top level of management from the CEO down and basically presented the strategy to them and got them to buy into going in big on generative AI.
Na
So a good foundation to build on is important. The next step was taking it upstairs to get this decisive strategic vision from the CEO herself. Maybe tell us a little bit more about the strategy itself, and also the impact you see as an outcome.
Donald
On the strategy: very quickly, we decided that we were going to focus on internal use cases within OCBC, and we decided that for a couple of reasons.
Firstly, we could see there were a lot of potential benefits in applying generative AI to internal processes. Secondly, generative AI was still a nascent technology – there were potential risks there. So, we wanted to experiment with it in a less risky environment, working with our employees first, building experience, building the guardrails, before ultimately taking it and applying it to customer-facing use cases. Our strategy was really twofold. The first thing we did was roll out a POC, a very quick win, something called OCBC GPT. Back in Q2 last year, we realized people were already using ChatGPT on their own devices. Basically, we wanted every employee to have the power of ChatGPT, but within the OCBC environment, through an interface that they were familiar with, which in our case was Microsoft Teams.
In May last year, we rolled out OCBC GPT to OCBC employees. But OCBC GPT on its own is not transformative – to me, that's table stakes. Our generative AI strategy takes two prongs. The first prong is building a suite of what we call universal assistants. These are tools designed to help every OCBC employee be more productive. We have things like OCBC GPT to help people draft content and come up with ideas. We have Buddy, which is our RAG-powered knowledge assistant. It searches across 400,000 documents within OCBC every day, and if an employee has a question on policies, procedures, or acronyms, Buddy helps them find answers within OCBC's knowledge base. Another example would be Document AI, which is effectively a tool where you can drag and drop any document, summarize it, and ask questions of it without having to read the whole thing. These universal assistants are really designed to help every single employee be slightly more efficient every day.
The second focus is more vertically aligned. What we've been doing is building role-specific copilots, which we're rolling out for different job functions across OCBC. It's about how we can take generative AI and embed it directly into the way that your team works and into your processes. We have things like OCBC Wingman, which is our coding assistant for the IT team, to help them develop software faster.
For sales teams, we have the RM copilots, which help them pull together customer information and generate talking points when speaking to a customer. We're increasingly rolling these out in areas like the contact center, our operations teams, compliance teams, and ultimately even legal. Every team in OCBC should have a copilot that helps them be more efficient every day.
Na
The universal tools helping every employee to be a little bit more efficient, that’s great, but that's not truly transformative. I think the path you just described certainly would be familiar for a lot of the practitioners, industry leaders like you.
The next big question is, when you're building these specific vertical use cases, it's all about how you cleverly build generative AI into the process itself. You're probably facing a thousand use case requests. You want to deliver a thousand times faster; you want to do a thousand times more. The question here is, how do you actually scale up to meet this demand? And what are some of the big step changes you're willing to make?
Donald
We've definitely had to scale up and change the way we work because we see huge demand for generative AI across the organization. If you go back a year, probably 20% of the projects that we were receiving were generative AI projects, but today that's 50-50. If anything, the generative AI side is accelerating even more, so that's where we see the growth. To handle that growth, we really had to transform in three areas: our people, our technology, and our process.
When it comes to people, we've really done two things. The first thing we did was prepare every employee in OCBC to be able to leverage generative AI. We built our own internal training courses that teach every employee what generative AI is, how it works, and what some of its limitations are – and then teach everyone to be an expert prompter. Now more than 10,000 employees across the organization, including our CEO, have taken our prompt engineering training, and they're able to maximize the value from generative AI.
The second thing we've done on the people side is look at the data team itself. When we first started, we focused all of the generative AI development in one squad; they were really our center of excellence for generative AI. Today, because there's so much demand, we cannot focus everything in one team – they just can't scale. Now all of the data science teams have been cross-trained: they do traditional AI and they also do generative AI. At the same time, we still maintain a small center of excellence, responsible not only for being the experts when it comes to generative AI, but also for building the frameworks and utilities that are then leveraged by all of the other data science teams across the organization. Things like our guardrails and our agent frameworks, for example, would be done by the central team.
The second thing we transformed was the technology. One guiding principle we have is that we should do generative AI through one platform. We don't have multiple teams across the organization competing and bringing in their own LLMs. My team is responsible for evaluating all of the LLMs as they get released in the market, and we spend a lot of our time using open-source LLMs. We have structured frameworks that allow us to evaluate the models to see if they are better than what we're already using, and also to look at things like toxicity and gender bias, understanding what the limitations of these models are. We do that centrally. My team also builds things like the guardrails and the RAG system, where we understand the context of material being put into the models, the hallucination detection, even things like hallucination rectification through FLARE. We build that centrally and deploy it as services, and anyone in OCBC who's leveraging our LLMs-as-a-service capability then gets access to it. That allows us to accelerate the rollout of generative AI across OCBC quite significantly.
The third area we transformed is really the way we deploy generative AI into the business – it's all about the process. When we first started with those universal assistants, they were kind of disparate, quite isolated on people's desktops. Today our focus is on what we call AI-powered platforms: looking at the key platforms used by OCBC employees across the business and making sure we embed generative AI directly into the workflows within those platforms. So, when every employee is doing their day-to-day work, generative AI can be there behind the scenes, helping everyone be slightly more productive, even if they're not aware that they're actually interacting with generative AI.
Na
Fantastic – a really multifaceted approach, touching people, technology, and process, with certain elements centralized to provide consistency and the ability to accelerate across the organization. Thank you, Donald, for your insight and for your time. Thank you so much.
Donald
Pleasure. Thank you very much.