
By Karan Kapoor, Head of Regulatory Change and Technology, Delta Capita

As debates on key CSDR Settlement Discipline issues remain unresolved and a delay looks increasingly likely to be confirmed, the industry understandably remains in a state of flux. Regardless of precisely what happens with the timing of the regulation, however, now is the time to get more disciplined about settlements, a case strengthened further by the latest ESMA Trends, Risks and Vulnerabilities (TRV) report.

The study shows a dramatic surge in the level of settlement fails during the second half of March. According to the report, fails climbed to around 14% for equities and close to 6% for government and corporate bonds.

One can argue that COVID-19-induced volatility and the adaptation to new working environments drove what the study describes as the most significant rise in European trade settlement fails since 2014. The truth, however, is that these numbers put into sharp focus the longstanding operational and structural issues that have hung over trade settlement processes like a dark cloud for far too long.

Identifying and remediating the root cause of any settlement delay or failure in time to avoid CSDR consequences wastes, on average, four to six hours of capacity. Market participants therefore need to shift their attention away from dealing with trade fails after the fact and towards pre-emptively reducing the number of transactions that fail to settle, through internal efficiency improvements and collaborative approaches.

Achieving the above objective is by no means a straightforward task, as most financial institutions still operate with an inherited legacy technology architecture and highly complex operating models.

Firms should explore solutions that could address the issue of settlement fails that do not require wholesale changes to their existing architecture at unmanageable costs.

In the post-CSDR era, every hour will matter when it comes to settlement efficiency, as the luxury of ‘another day’ to resolve a failing trade will no longer exist. Increasing the control organisations have over their trade lifecycles and identifying settlement delay or potential failure warning signs on T+0 will give participants a considerable advantage. Detecting and being alerted to transaction event anomalies in real-time will give impacted teams valuable time to prevent failures before CSDR consequences materialise.
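The kind of T+0 warning described above can be sketched as a simple rule-based check over trade lifecycle events. This is a minimal illustration only: the event names, milestones and SLA windows below are assumptions for the example, not a real settlement data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Trade:
    """Hypothetical trade record: lifecycle milestones recorded as timestamps."""
    trade_id: str
    executed_at: datetime
    milestones: dict = field(default_factory=dict)  # e.g. {"matched": datetime(...)}

# Illustrative SLA windows: how long after execution each milestone should occur.
SLA = {"allocated": timedelta(hours=2), "matched": timedelta(hours=4)}

def flag_at_risk(trades, now):
    """Return (trade_id, milestone) pairs that have missed their SLA on T+0."""
    alerts = []
    for trade in trades:
        for milestone, window in SLA.items():
            deadline = trade.executed_at + window
            if milestone not in trade.milestones and now > deadline:
                alerts.append((trade.trade_id, milestone))
    return alerts

# Usage: one trade allocated but not yet matched past its window, one still inside it.
now = datetime(2021, 1, 4, 15, 0)
t1 = Trade("T1", datetime(2021, 1, 4, 9, 0),
           milestones={"allocated": datetime(2021, 1, 4, 10, 0)})
t2 = Trade("T2", datetime(2021, 1, 4, 14, 30))
print(flag_at_risk([t1, t2], now))  # [('T1', 'matched')]
```

In practice such checks would run continuously against live event feeds, with statistical or ML-based anomaly scoring layered on top of static SLAs, but the principle of raising the alert on trade date rather than after settlement failure is the same.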

Improving the discipline around internal settlement processes and encouraging counterparts to follow suit through collaboration, incentives or slaps on the wrist where relevant, is where we see the market trending. 

Whether or not the European Commission confirms the delay of CSDR will ultimately be irrelevant, provided the industry commits to finding solutions that ensure penalties and buy-ins do not occur in the first place.

If you want to discover more about CSDR or want to know how we can help you with your CSDR transformation then click on the banner below or email us at marketing@deltacapita.com.

How banks can finally reduce their Karbon footprint

By Gary McClure, CEO KYC Business Services

Imagine this: you are the COO of a major bank, managing hundreds, sometimes thousands, of people globally on Know Your Customer (KYC) and Anti-Money Laundering (AML) tasks. Time that could otherwise be spent making better risk decisions is instead used to run an army of people gathering information on clients, before inputting that information (often incorrectly) into multiple systems.

For those who work on this daily, it will no doubt sound familiar, but when one steps back and looks at the bigger picture, all KYC really amounts to is harvesting data, inputting it into a system, and then deciding whether or not to onboard or retain a customer. It sounds simple when put in these terms, so why do so many banks still have KYC processes in place that cost millions per year in additional technology and operational expenditure?

Read more

High-Touch in a Low-Touch Environment Series: Wealth Management – Webinar Summary & Key Takeaways

By Rezwan Shafique – Head of Consulting, UK

The time is ripe for high-touch and low-touch environments to converge for the wealth management sector. On May 20, I had the pleasure of participating on the panel of Delta Capita’s High-Touch in a Low-Touch Environment webinar. Hosted by Delta Capita’s commercial officer, Julian Eyre, I was joined by industry experts Anand Rajan from UBS Wealth Management U.S., Hugh Adlington from Close Brothers Asset Management and Barclays’ James Penny. (Please click here to hear the full recording of the webinar).

Julian did a great job of chairing the panel. As I expected, the panelists shared similar views about convergence between high-touch, low-touch and the future of digital client services in the wealth management industry.  We discussed strategies to reduce costs and increase revenues in a wider digital engagement context. We also pondered what tools are missing from our current portfolios to bridge the gap between low-touch and high-touch client engagement, and the roles of mobile, machine learning and AI. Furthermore, we talked about how the recent Covid-19 lockdown affected our client engagements and we introduced our digital client engagement services.

Following are some of my takeaways, with credit to my friends and colleagues’ insights.

Read more

The banking industry operating model is broken and needs dramatic reinvention for banks to survive in the longer term.

The clear parallels

As we look at the banking industry operating model today, it resembles that of the airline industry in the late 1980s. The entire business value chain was owned and operated by the airlines themselves, including non-core functions such as ticketing, luggage handling, ground services and catering. This model resulted in little to no recurring investment in the non-core functions, which were considered low-value cost centres; in reality, these functions were subject to continuous cost challenge without investment. This approach did little for client experience, operational efficiency or risk management. Morale among the staff who had the thankless task of managing and maintaining service continuity in these functions was extremely low (sound familiar?). Further, this model diverted investment and management focus away from delivering differentiation in the airline’s core proposition. The outcome: poor-quality services, customer dissatisfaction, a consistent failure to deliver a strategic return on equity and, in some cases, business failure.

Thankfully, with the benefit of hindsight, we can also look to the airline industry for the solution: adoption of a supply-chain model, with integration into an ecosystem of specialist providers and strategic business partners delivering the non-core services and functions. By developing standardised services on multi-tenanted platforms serving multiple airlines, these specialist providers were able to achieve utility scale and a level of platform investment not possible for a single airline operator. In turn, this allowed the airlines to focus on differentiating their propositions. The outcome: improved cost-income ratios, better-quality support services and enhanced customer experiences, resulting in improved returns on equity.

However, despite these clear lessons, banking executives’ agreement on the topic, industry body research and advice from leading strategic advisory firms over the years, the industry remains lethargic and little action has been taken.

Read more

The IBOR transition is both an opportunity and a threat to every financial institution. The opportunity is that a clean, well-managed IBOR transformation will enable a firm to take advantage of the possibilities that arise from changing one of the foundations of current financial markets. The threat is that a poorly managed, lagging transition will at best cost a firm market share and at worst incur significant regulatory attention and intervention.

The FCA as lead regulator is making it clear that Friday December 31, 2021 (YE 2021) is the hard deadline for the end of LIBOR. Industry bodies (ISDA etc.) and working groups are currently finalising the new fallback and trigger provisions for each product to create a transition path away from IBOR. The current proposals differ significantly across products, countries, and timeframes and the transition from unified IBOR term rates to multiple differing solutions is creating many issues, some of which have yet to be identified due to the uncertainty involved.

Want to learn more? Click here to receive the full, free white paper directly to your email inbox.

In a recent article published in the Securities Lending Times SFTR 2019 Annual, Jonathan Adams (Managing Principal, Delta Capita) questions whether SFTR will ultimately deliver more efficiency and transparency to the securities finance market.

When the first Securities Financing Transactions Regulation (SFTR) regulatory technical standard (RTS) was released to market participants for consultation, it was met with dubious disdain. Of the 153 fields, few could imagine how more than 30 could realistically be reported. Settlement matching had previously been based on dates, security identifiers, counterparty information and economic terms.

It begs the question, how can a reporting regime of this complexity, fraught with the risk of matching failure, serve the regulator in determining the potential for systemic failure? Moreover, how can it benefit market participants?

Click here to read the full article.

James Proudman, Executive Director for UK Deposit Takers Supervision at the Bank of England, spoke on 4th June at the FCA Conference on Governance in Banking about the implications of artificial intelligence (AI) and machine learning (ML) for governance in banking.

Proudman told the audience that the governance of AI adoption is “a topic of increased concern to prudential regulators” since “governance failings are the root cause of almost all prudential failures” and that managing the associated risks is an increasingly important strategic issue for boards of financial services firms.

While Proudman made sure to highlight the potential benefits of AI applications in areas such as securities trading, anti-money laundering (AML), fraud detection and credit risk assessments, he stressed that as a prudential regulator, the Bank of England has a need to understand “how the application of AI and ML within financial services is evolving”, the implications on the risks to firms’ “safety and soundness”, and in turn, how these risks can be mitigated through the banks’ internal governance, systems, and controls.

Speaking in reference to a survey of AI/ML adoption in finance currently being conducted by the Bank of England and the FCA, he stated that there is general agreement that although AI and ML can reduce risks, “some firms acknowledged that, incorrectly used, AI and ML techniques could give rise to new, complex risk types”.

Proudman suggested that the retrieval, processing, and use of data may pose a significant challenge, pointing to three potential causes of data-related risk:

  • the expanding scale of managing problems related to poor data quality as data availability and sources balloon,
  • ethical, legal, conduct, and reputational issues associated with the use of personal data, and
  • distortions resulting from biases in historical data and assumptions built into ML algorithms.

His insistence on “the need to understand carefully the assumptions built into underlying algorithms” and “the need for a strong focus on understanding and explaining the outcomes generated by AI/ML” sends a clear signal to firms to incorporate ML explainability tools into their model development and validation workflow. Directly applied to an ML model, such tools allow modellers and testers to understand both why any individual decision was taken and how the inputs to a model interact to make it work the way it does as a whole. ML explainability tools can also be applied after AI/ML is approved for use. As Proudman notes, governance has a role to play during the deployment and evaluation stages as well as in correcting erroneous machine behaviour. To ensure proper oversight, a ‘human in the loop’ can make use of explainability tools to support their decision for or against shutting down an algorithm, for example.
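One widely used, model-agnostic explainability technique of the kind described above is permutation importance: shuffle one input feature at a time and measure how much the model’s performance degrades. The sketch below is illustrative only; the synthetic data and feature names are assumptions for the example, not any firm’s actual model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in for a credit-decision dataset; feature names are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))            # columns: income, debt_ratio, age
y = (X[:, 0] - X[:, 1] > 0).astype(int)  # decision driven by income and debt_ratio only

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Permutation importance: shuffle one feature at a time and record the drop
# in model score -- a model-agnostic view of which inputs the decision
# actually depends on.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in zip(["income", "debt_ratio", "age"], result.importances_mean):
    print(f"{name}: {score:.3f}")
```

Here the irrelevant feature (age) should score near zero while the two decision-driving features score highly, which is exactly the kind of evidence a ‘human in the loop’ needs to judge whether a model is relying on the inputs it should.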

Proudman further proposed that regulations designed to deal with human shortcomings, such as “poorly aligned incentives, responsibilities and remuneration” or “short-termism” remain as relevant in an AI/ML-centric work environment and that it will be crucial to ensure clear individual accountability for machine-driven actions and decisions.

The implication that individual employees, including senior management, may be held responsible for actions or decisions taken by a machine reinforces the case for facilitating human-friendly model explainability. Boards should think about how the right tools best enable their workforce to comprehend the reasons for, say, a rejected mortgage application, and whether the model that made that decision did so because of built-in human biases. Since the person responsible will not necessarily be proficient in the language of AI/ML, it is crucial that these tools facilitate human-friendly interpretations and, in turn, informed decision-making.

Proudman also affirmed that he sees increased execution risks arising from the acceleration in the rate of AI/ML adoption and proposed that boards should ensure that firms possess the skill sets and controls to deal with these risks.

Boards should heed Proudman’s call to align their governance structures with the challenges of AI/ML. In addition to the obvious benefits to the business, having knowledge of what the models are doing and being able to explain how they work may prove invaluable when it comes to anticipating new rules for transparency and interpretability requirements of ML models.

Other related issues, such as data privacy, also have implications for corporate governance which can be addressed using AI/ML tools. As an example, sending human voice data to the cloud through voice-activated mobile applications may expose users to risks of illegitimate data use and can cause distrust in a firm’s data practices. To avoid this, model compression tools can be applied to reduce the size of speech recognition models and consequently allow voice data to be processed locally, so that it never leaves the device.
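The simplest form of the model compression mentioned above is post-training quantization: storing float32 weights as int8 values plus a scale factor, cutting memory fourfold. The sketch below is a toy illustration of the idea; production speech models use more sophisticated schemes (per-channel scales, quantization-aware training).

```python
import numpy as np

def quantize(weights: np.ndarray):
    """Map float32 weights onto int8 via a single symmetric scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# A random weight matrix standing in for one layer of a speech model.
w = np.random.default_rng(1).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize(w)
w_hat = dequantize(q, scale)

print("size ratio:", q.nbytes / w.nbytes)  # 0.25 (int8 vs float32)
print("max abs error:", float(np.abs(w - w_hat).max()))
```

The reconstruction error is bounded by half the scale factor, and the 4x size reduction is what makes it feasible to keep a model, and therefore the voice data it processes, on the device.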

______

Alexander Klemm

Consultant

Delta Capita

In this latest white paper, Edward Adcock & Thuy Nguyen (Data Science Consultants, Delta Capita) discuss machine learning model challenges and introduce the Delta Capita DC MINT platform, which helps clients address these challenges.

Click here to read the white paper in full.

Much has been written and discussed about the Three Lines of Defence model. Some have theorised on its implementation, many have discussed the challenges organisations have faced, and a few have outlined why it may not be appropriate. That all said, the FCA’s 2017 review of Compliance found that all firms participating in the survey had adopted the Three Lines of Defence model.

In this article, David Long, Charanpal Matharu and Nick Wilcock outline some key insights observed by the Non-Financial Risk Practice at Delta Capita. This may prompt organisations to review the effectiveness of their framework.

Read more

Payment glitches that cause shopping chaos during peak periods like Black Friday have become an all-too-common experience for South African consumers. South Africans have been bitten by the Black Friday bug, which strikes every year in late November and is typically followed by Cyber Monday, and the craze grows bigger every year.

During Black Friday 2017, Standard Bank’s transaction volumes on its credit and debit cards spiked more than 100 percent compared with the same event the previous year. In 2016, Absa said its total customer issuing spend for the Black Friday weekend was just over R1 billion.

It’s unsurprising then that with high transaction volumes comes higher risk.

In this article, Earl McCausland (DC Managing Director, South Africa) & Trevor Belstead (DC Head of Transaction Banking Solutions) discuss bloated legacy systems, and why banks should ultimately consider utilising transaction monitoring technology, which allows them to instantly locate missing payments and proactively avoid payment outages.

Click here to read the full article.