Joerg Guenther, Managing Director, Global Co-Head of Securities Services Operations & Technology at Citi, discusses the lessons learned by both providers and clients, and their implications for future initiatives.
While the start of the pandemic involved a massive and sudden change in work environments, it also coincided with significant market volatility. Was that something you’d factored in or was that an additional challenge?
We definitely saw unprecedented volumes and volatility at the time, but a long-standing commitment to digitalisation meant that we had enough headroom factored into our systems. As such, we were exceptionally well positioned to handle the trading volume spikes and were able to process everything with close to zero impact. After that experience, however, we have added capacity to create headroom in preparation for any future spikes.
Are there any lasting lessons you have learned from reviewing your operations during the pandemic?
I’d say there are two aspects to this: one related to meeting client servicing requirements and another to addressing the expectations of our own workforce. As far as the former is concerned, one thing that came to the fore for the industry as a whole was location risk in processing. It’s sometimes easier to say, “Let’s just move everything to a single low-cost location”. Citi does not have that single-location risk: we operate multiple sites across all geographies, so we can easily follow the sun. While optimisation remains important, firms now realise they want choices in how they spread workload capacity, with backup locations available should one or more sites be adversely affected by a crisis.
Going forward, with regard to our own workforce, Citi will be employing a hybrid model. Our current view is that we will not be exclusively remote, nor will we be exclusively in the office. That will impact us on an ongoing basis. In a way, you test your business continuity every single day, which just makes you so much more ready for the unexpected. Overall, I’d say risk awareness has definitely sharpened during this period.
Does that apply to your clients as well?
It was a very mixed bag. We have clients of all shapes and sizes. Not all clients were equally well prepared and a few had to come up with a plan of action very quickly. We set up a monthly “lessons learned” consultation with our clients around what we were doing to address some of these risks. There was a lot of collaboration across the teams and it has made our relationship with our clients much stronger.
Do you feel you’re now in a position from a technology perspective to help your clients with likely future challenges?
In terms of the pandemic and any future situations that are similar in nature, we’ve learned a lot from what unfolded and incorporated this into our future plans.
I think the ability to allow our clients some flexibility in terms of how they interface with us is really important. Our architectural vision is all about freedom of choice with different channels to send us data and different channels to receive data from us. We will absolutely move much more to real-time dissemination of this data. We’ve already made a lot of progress in the use of APIs.
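The multi-channel idea described above can be sketched as a small dispatch layer that fans each event out to whichever delivery channels a client has chosen. This is an illustrative sketch only; the channel types, payload fields and values here are assumptions for the example, not Citi's actual interfaces.

```python
import json
import queue

# Illustrative settlement-status payload; field names are assumptions.
event = {"trade_id": "T-1001", "status": "SETTLED", "market": "XNYS"}

class FileChannel:
    """Batch-style delivery: accumulate events for an end-of-day file."""
    def __init__(self):
        self.batch = []
    def publish(self, payload):
        self.batch.append(json.dumps(payload))

class StreamChannel:
    """Real-time delivery: push each event onto a queue as it occurs."""
    def __init__(self):
        self.stream = queue.Queue()
    def publish(self, payload):
        self.stream.put(payload)

def disseminate(payload, channels):
    """Fan one event out to every channel the client has opted into."""
    for ch in channels:
        ch.publish(payload)

f, s = FileChannel(), StreamChannel()
disseminate(event, [f, s])
realtime = s.stream.get()  # the streaming client sees the event immediately
print(len(f.batch), realtime["status"])
```

The point of the abstraction is that adding a new channel (for example an API push) only means adding another `publish` implementation, without touching the producers.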
Not all clients have thought through what their operating model will be in this digitised framework. Others have very solid plans, and they challenge us all the time. And then you have a few that are not that far along in the journey.
Has the client attitude to outsourcing changed at all over the past 18 months?
Clients certainly realise the value of having a service provider that has a greater level of resilience and flexibility built in. We are seeing a general interest from clients in broadening the relationship to include more value-added services such as tax filing and regulatory reporting.
In addition to the range of services they can provide, there has been a realisation for a while that custodians are sitting on enormous amounts of data. The question remains how that can be turned into something actionable for clients.
The discussion around data is very interesting. From an analytics perspective, you really want to provide clients with the ability to take data from any source (their own data, data from a custodian or other third party) and enable them to perform their own analytics. And this is what we are offering to our clients – they have access not only to analytical front ends and dashboards, but also the underlying APIs, so that if they want to receive the data to do their own analytics, they have that choice.
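The "bring your own analytics" choice described above can be sketched as a client pulling positions from a custodian's data API and joining them with its own reference data before valuing them. Everything here, field names, ISINs, quantities and prices, is hypothetical example data, not a real API response.

```python
# Sketch: a client combines custodian-held data with its own data
# and runs its own analytics. All values are hypothetical.

custodian_positions = [  # as if retrieved from a custodian data API
    {"isin": "US0378331005", "quantity": 100},
    {"isin": "US5949181045", "quantity": 250},
]

own_reference_data = {  # the client's internal pricing data
    "US0378331005": {"price": 150.0},
    "US5949181045": {"price": 300.0},
}

def enrich(positions, reference):
    """Join custodian positions with client-held prices to value them."""
    enriched = []
    for pos in positions:
        price = reference[pos["isin"]]["price"]
        enriched.append({**pos, "market_value": pos["quantity"] * price})
    return enriched

valued = enrich(custodian_positions, own_reference_data)
total = sum(p["market_value"] for p in valued)
print(total)  # 90000.0
```

The same join could equally run against a third-party data source; the client's choice of where the analytics happen is the point.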
Data science is still evolving and I think cloud-based solutions, where you can marry and correlate different data elements, are likely to appear.
In that regard, do you see the emergence of standards or is this likely to be an area of competitive differentiation?
Many custodians, fund accountants and middle office providers are looking closely at new offerings based on the data they hold. I don’t know how differentiated that’s really going to be because everybody is moving in that direction. I think the differentiation is really going to come from the actual analytical content and the ability to choose components of these analytical elements and assemble them with your own data and with other third-party data. Our recent partnership with Snowflake aims to standardise data models and establish a cloud-native best practice for these use cases.
What do you do with legacy technology in this digitisation drive?
I’ve seen many attempts at a complete rewrite of legacy systems, a big bang approach, and not one of them has been really successful. You need to de-risk that modernisation using an incremental approach. We don’t necessarily view legacy technology as a bad thing, but we do recognise the need to evolve and we are committed to a multi-year effort to build the right architecture.
For us, it is all about greater agility, being nimbler and faster to market with enhancements, while at the same time focusing on an incremental approach that de-risks the deployment and release of these changes. Our blueprint, our vision, is to use event-driven integration and loosely coupled systems to bridge the divide between legacy and new technology. This “Strangler Fig Pattern” approach involves introducing a layer of abstraction, which ultimately allows legacy applications to communicate with newer solutions such as microservices and cloud-based solutions. This enables us to incrementally replace and evolve away from the legacy without the inherent risk of a big bang approach.
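The Strangler Fig Pattern described above can be illustrated with a minimal routing facade: an abstraction layer sits in front of the legacy system, and the set of operations served by new components grows incrementally. This is a toy sketch under assumed names (`legacy_handler`, `modern_handler`), not a representation of any actual Citi system.

```python
# Minimal sketch of the Strangler Fig Pattern: a facade routes each
# request either to the legacy system or to a new service, and the
# set of migrated operations grows one cut-over at a time.

def legacy_handler(request):
    """Stands in for the existing legacy application."""
    return f"legacy:{request}"

def modern_handler(request):
    """Stands in for a new microservice or cloud-based replacement."""
    return f"modern:{request}"

class StranglerFacade:
    """Abstraction layer placed in front of the legacy system."""
    def __init__(self):
        self.migrated = set()  # operations already moved to new services

    def migrate(self, operation):
        """Cut one operation over to the modern implementation."""
        self.migrated.add(operation)

    def handle(self, operation, request):
        handler = modern_handler if operation in self.migrated else legacy_handler
        return handler(request)

facade = StranglerFacade()
before = facade.handle("settle", "T-1")  # still served by legacy
facade.migrate("settle")                 # incremental cut-over
after = facade.handle("settle", "T-2")   # now served by the new service
print(before, after)
```

Because callers only ever see the facade, each cut-over is invisible to them, which is what removes the big-bang risk: any single migration can be rolled back by removing the operation from the migrated set.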
You have to look at it holistically: the right architectural blueprint, the right operating model, the right organisational structure, the right resources, and the right delivery method. And that is exactly what we’re looking to do.
*Reprinted from Global Custodian 2021 © 1989-2021 Tungsten Publishing Inc. All Rights Reserved