Research

Analytics Platform and Program: Keys to Success for Regulatory Compliance in Financial Services

By David Macdonald and Robert Morison, Jan 11, 2017


The Big Ideas

  • Advanced analytics are at the heart of regulatory compliance processes in financial services.

  • Fiduciary responsibilities can be inadvertently compromised when open source analytics technologies are used to address regulatory requirements.

  • Penalties for noncompliance can be severe, and financial services leaders must pay close attention to the quality and efficiency of their analytics platforms and programs.

  • A best-of-both-worlds approach leverages open source technologies, especially in DevOps, while delivering production models with proven technologies that satisfy the regulators.

Introduction

In financial services and other highly regulated industries, regulatory compliance depends more than ever on a company’s analytical capabilities. With increasing regulatory requirements and scrutiny, reliable analytics are essential to timely, accurate reserves calculations, stress tests, and due diligence. The analytics tools and technologies employed, the flexibility of the technology platform, and the comprehensiveness of the overall analytics program can all accelerate, or compromise, the business processes for regulatory compliance.

A critical decision today is how much to rely on open source technologies. Popular with many analysts and data scientists, they offer the advantages of low initial cost and new functionality in specific domains. But they can introduce problems and risks that often go unforeseen:

  • They may improve individual tasks at the expense of complicating the overall analytics process.

  • They may lack the scalability to become production analytics models without extensive reprogramming.

  • Most important, they may lack the transparency, auditability, and track records that regulators demand in verifying analytical calculations and the business decisions made with them.

Analytics technology selection can be trickier and far more consequential than people recognize. Both business and technology leaders in financial services should regularly review their analytics platforms and programs with an eye toward both the accuracy of models and the efficiency of the regulatory compliance process. This research brief discusses the challenges and recommended approaches.

Formidable Challenges

Data Enormity

Technological challenges of regulatory compliance start with the already enormous and still growing amount of data that must be analyzed (new rules and regulations never ask for less data). Companies need the computing power to run models in a timely and frequent fashion. Inefficiencies arise when collected and stored data must wait for computing resources before an analysis can be completed or an investigation initiated. The computing platform has to be able to scale up as data volumes and analytical demands grow.

Data Preparation

A supervised or unsupervised model’s performance and predictive power depend heavily on the features created from the raw data. In fact, how the data is presented to the algorithm determines the functional and operational performance of the model. Successful modelers and Kaggle competitors often attribute the bulk of their effort to feature engineering. In other words, the modeler who has carefully prepared the data for the algorithm is more likely to build less complex models that run faster, fit better, and are easier to understand and maintain. Data preparation in model building takes many forms, including data transformation, imputation, attribute creation, feature extraction, feature selection, and variable reduction. Whether you are dealing with images, text, signals, or time series data, the first order of business is to determine the best representation of the data for a particular algorithm.
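To make the idea concrete, here is a minimal sketch of two common preparation steps, imputation and attribute creation, in Python with pandas and scikit-learn. The transaction fields, values, and derived features are hypothetical illustrations, not taken from any particular compliance model.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Hypothetical raw transaction data; column names are illustrative only
raw = pd.DataFrame({
    "amount":       [120.0, 98000.0, None, 45.5],
    "n_prior_txns": [14, 2, 230, 7],
    "country_risk": [0.1, 0.8, None, 0.2],
})

# Imputation: fill missing numeric values with the column median
imputer = SimpleImputer(strategy="median")
prepared = pd.DataFrame(imputer.fit_transform(raw), columns=raw.columns)

# Attribute creation: derived features often let a simpler, more
# transparent model fit as well as a complex one fed raw inputs
prepared["log_amount"] = np.log1p(prepared["amount"])
prepared["amount_per_txn"] = prepared["amount"] / (prepared["n_prior_txns"] + 1)

print(prepared.head())
```

The point is not the specific transformations but that they are chosen for the algorithm at hand; a tree ensemble, for example, can ignore the log transform, while a linear model may depend on it.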

Flexible Platform

The computing platform and data acquisition methods must be flexible and responsive enough to handle unanticipated or unscheduled demands. This calls for an open and accommodating technical platform. Many requests from regulators arrive ad hoc, and investigations and queries need to be carried out in new ways. For example, when the Panama Papers revealed how wealthy individuals kept information about their financial assets hidden, many companies scrambled to assess their exposure through named entities and to develop risk mitigation strategies. The need arose suddenly, and the response had to be implemented quickly. The situation is not unlike a natural disaster or a major shock to a particular market. Rather than being purely reactive, companies need the ability to plan ahead and test the impact of various scenarios, such as changed assumptions about capital, liquidity, and reserves.

Comprehensive Program for Transparency

Implementing a comprehensive program for controlling and managing the ways in which data, analytics, and models are used poses major challenges. The program tracks what is happening and why, the methods used, and how results are developed. Each part of the program requires the degree of transparency necessary for regulators to assess the accuracy, reliability, and soundness of the analysis. Companies must be able to document and justify compliance processes to auditors and regulators, such as those supporting capital reserves calculations and suspicious activity reports. This documentation is also vital to improving the performance of models and supporting future efforts to detect irregular activity.

A program is best structured around the entire lifecycle of analytical models: development and testing, implementation in production, evaluation and enhancement, and eventual retirement. Good processes to manage usage (who is allowed to use the model and for what purpose) and relationships among models (including the sequence in which they are run) are essential, as sketched below. All this ensures the integrity of the underlying dependencies and the overall compliance process.
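As a rough illustration of what such lifecycle tracking can look like, the sketch below records model metadata, approved usage, and run-order dependencies in plain Python. The stage names, fields, and example model are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date

# Lifecycle stages from the text: development and testing, production,
# evaluation and enhancement, retirement (names here are illustrative)
STAGES = ("development", "testing", "production", "evaluation", "retired")

@dataclass
class ModelRecord:
    name: str
    stage: str
    owner: str
    approved_users: list   # who may run the model, and for what purpose
    depends_on: list       # upstream models that must run first
    last_validated: date

    def promote(self, new_stage: str) -> None:
        # Enforce forward-only movement through the lifecycle
        if STAGES.index(new_stage) <= STAGES.index(self.stage):
            raise ValueError(f"cannot move {self.name} back to {new_stage}")
        self.stage = new_stage

# Hypothetical registry entry for a reserves model and its dependency
registry = {
    "capital_reserves_v3": ModelRecord(
        name="capital_reserves_v3", stage="testing", owner="risk-team",
        approved_users=["risk-team"], depends_on=["exposure_agg_v1"],
        last_validated=date(2016, 11, 30)),
}
registry["capital_reserves_v3"].promote("production")
```

In practice this metadata lives in a governed model management system rather than application code; the value is in making stage, usage rights, and dependencies explicit and auditable.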

Technological Choices

A fully controlled, programmatic, and automated approach to managing models seems ideal, but things are not that simple. Two complicating forces are at work. The first is the need for a flexible and open platform to encourage innovation. With the rapid growth of big data and advanced analytics, new technologies, tools, and analytical methods are appearing all the time. The most prolific analysts and data scientists are curious; they want to understand and use the best tools. Companies want to provide the flexibility to use the best tools for the task at hand, and next month’s best tools may well be different from today’s, especially in DevOps environments.

The second force is the preferences of data scientists and analysts, especially recent graduates. Many of them favor and have experience with open source tools, like Python for programming and R for statistical functions. Given the continuing shortage of analytics professionals, companies need to accommodate these preferences to maximize recruitment, retention, and overall analyst productivity.

This tension, between the flexible and the auditable, the fixed-size and the scalable, and open source and commercial tools, plays out in several ways. Departmental expedience may conflict with overall productivity and risk control. For example, assume an analytical process runs in 10 individual steps, and new tools are available that make steps two and six faster, cheaper, or more convenient. Individually, the new tools improve those steps, but the added work of integrating them with the adjoining steps can make the end-to-end process less efficient. It is important to keep in mind the bigger picture of the analytical life cycle.

Another common challenge is the need to rewrite algorithms and reprogram models, either for scale or to move them from development into production. Some analytical problems must be tackled with multiple threads and multiple nodes to handle the data volumes and process in reasonable time, an added consideration when using languages that are natively single threaded, such as Python. In other cases, the key to speed and efficiency is to avoid moving large volumes of data; instead, the analytics are moved to run inside the database or data lake, as illustrated below. Depending on the tools in use, code may need to be rewritten. Today, many models developed in a high-level language are reprogrammed in lower-level languages like C++ so they run more efficiently in production.
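To illustrate moving the analytics to the data rather than the data to the analytics, here is a minimal Python sketch contrasting the two approaches against a SQL database. It assumes a SQLite file with a transactions table; the table and column names are hypothetical.

```python
import sqlite3
import pandas as pd

# Hypothetical database assumed to contain a 'transactions' table
conn = sqlite3.connect("transactions.db")

# Data to the analytics: pull every row across the wire, then aggregate locally
all_rows = pd.read_sql("SELECT account_id, amount FROM transactions", conn)
totals_local = all_rows.groupby("account_id")["amount"].sum()

# Analytics to the data: push the aggregation into the engine
# so only the small result set moves
totals_pushed = pd.read_sql(
    "SELECT account_id, SUM(amount) AS total FROM transactions "
    "GROUP BY account_id",
    conn,
)
```

On a table of millions of rows the two produce the same totals, but the second moves kilobytes instead of gigabytes; the same principle underlies in-database and in-data-lake scoring.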

Importantly, the preferences of analysts may not align with those of the regulators. Open source tools can lack the transparency and traceability that auditors and regulators typically rely on. There are cases where models built with open source tools are duplicated with SAS technology and run in parallel to validate the original versions. The SAS algorithms have come to serve as the yardstick because they are published, widely known, and accepted among regulators as doing what they are documented to do.

Best of Both Worlds

The issues discussed above are unlikely to be resolved in favor of either extreme. SAS provides integrated analytics platforms and automated processes for end-to-end model management. Analytical models can be deployed from development into production in a consistent environment without rewriting and with the governance, control, and transparency around data and models that regulators demand.

At the same time, SAS is committed to giving organizations the flexibility around tools and methods that they require. SAS algorithms and other technology can be wrapped and embedded into models programmed in other languages, such as Python, or assembled with other common open source tools. Organizations can combine new tools with the proven algorithms and core intellectual property that commercial software provides. DevOps environments can use the latest ad hoc tools, while core analytics remain consistent, more easily scaled and ported to production, and more readily validated by auditors and regulators. Rewriting, retraining, and revalidating of models are minimized. It is a best-of-both-worlds approach.
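As one concrete sketch of this wrapping, SAS publishes an open source Python package, saspy, that lets Python code drive a SAS session. The example below assumes a configured SAS deployment; the configuration name, input file, and model variables are hypothetical.

```python
import pandas as pd
import saspy

# Connect to a configured SAS session (configuration names are site-specific)
sas = saspy.SASsession(cfgname="default")

# Move a pandas DataFrame into SAS as a data set
df = pd.read_csv("transactions.csv")   # hypothetical input
txns = sas.df2sd(df, table="txns")     # lands in the WORK library

# The production model logic stays in SAS, consistent and auditable,
# while Python orchestrates the workflow around it
result = sas.submit("""
    proc logistic data=work.txns;
        model flagged(event='1') = amount velocity country_risk;
    run;
""")
print(result["LOG"])                   # the SAS log doubles as an audit trail
```

Here the open source layer handles data wrangling and orchestration while the regulated calculation runs on the commercial engine, which is the division of labor the approach recommends.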

When companies take this approach, we advise them to be attentive to the mechanics and climate of open source. Open source products and applications enjoy the benefits of crowdsourcing, but not the quality control and warrantied performance of commercial software. There are implicit obligations to the open source community around both intended use and giving back, and individual analysts build their online resumes by publishing code. How does this affect a company’s intellectual property protection and other necessary controls in the sensitive arena of regulatory compliance?

Above all, we advise companies to keep their eyes on the prize. Take advantage of flexibility around tools and subprocesses, but do not compromise the cohesion and performance of the overall analytics program. Do not waste time and resources reinventing the wheel: if it is possible to automatically generate a dozen variations of a forecast and build on the optimal results, take that early lead. Do not rewrite the basics; make choices that accelerate the overall process and generate the best outcomes. The game is not over when the models are built, but when compliance is validated. Even then, it is over only until the next compliance cycle begins or the next requirement comes in from the regulators.

Serious Consequences

Regulatory compliance is very high on the agendas of the executives and boards of directors of financial services companies for good reason. Companies that lack the analytics to validate compliance with both risk management and financial crime prevention regulations are subject to regulatory actions of varying severity, including financial penalties, matters requiring attention (MRA), matters requiring immediate attention (MRIA), and consent orders that place conditions on how they conduct business going forward. For example, a bank may be barred from merger and acquisition activity until it has resolved its compliance problems. Even in the most benign cases, companies can incur significant effort and expense demonstrating their compliance processes. One global bank is currently budgeting an estimated $7 billion per year to address compliance as part of a Deferred Prosecution Agreement.

Since the financial crisis, noncompliant companies have paid hefty fines and settlements. In 2012, Wells Fargo, JPMorgan Chase, Citigroup, Bank of America, and Ally Financial collectively incurred over $25 billion in fines. In 2013, JPMorgan Chase alone was fined almost $15 billion. In 2014, Credit Suisse paid $2.6 billion. The list goes on. A company can otherwise seem well run and profitable, yet its profit can be wiped out by penalties for noncompliance.

Regulatory penalties affect shareholders if stock prices decline. They affect everyday operations if reputational damage leads to customer defections. There can also be individual penalties, including fines or imprisonment, for those who willfully circumvent compliance requirements or blatantly breach the Bank Secrecy Act.

Alternatively, there are rewards for those who demonstrate compliance with analytical precision. A major East Coast bank held a large amount of capital against the contingency of failing regulatory stress tests and audits. When it passed with flying colors, it recalibrated its reserve requirements and released the extra capital for more productive use.

Meanwhile, compliance itself is big business. In 2000, a global bank had roughly 2,000 employees working in its compliance and regulation program. Today it has four times as many, at a cost of over $8 billion a year. When analytical technologies automate or streamline compliance activities, companies gain efficiencies and reduce costs.

Leaders’ Responsibilities

The fiduciary responsibility of financial services leaders requires a keen understanding of the technology choices and tradeoffs with respect to all costs, risks, business constraints, and ultimately shareholder value. This applies not just to regulatory compliance, but to the business at large.

Executives must understand what data is available, as well as the continuing change in how much data and how many types of data can be analyzed. Data volumes are thousands of times greater than just a few years ago. Leaders must appreciate what advanced analytics technologies can do with that data, especially with self-learning techniques, and recognize that today’s applications are probably just scratching the surface of what is possible. Consider, for example, how quickly the capabilities of financial robo-advisors have advanced.

Together, business and technology leaders are wise to calibrate a best-of-both-worlds approach to analytics for regulatory compliance. The goal is to encourage a judicious mix of innovation, control, open platforms, and productive processes. Leaders can provide data scientists, statisticians, and other analytics professionals with a flexible platform, a choice of tools, organized data, and means of collaboration. At the same time, they can protect the company from avoidable risks in the use of immature or siloed technologies. They can also establish governance, integrate the platform, manage how models move through the life cycle, and ultimately facilitate the use of analytics both to identify hidden risks and to manage known risks more efficiently.

About the authors


David Macdonald is the vice president and general manager of the US Financial Services Business Unit at SAS where he is responsible for growing business relationships with institutions focused on retail and wholesale banking, consumer credit, capital markets, investment banking, life and P&C insurance, and institutional investments. Under his leadership the business has grown consistently for 14 years despite the challenges of the financial downturn that began in 2007. Leveraging creativity, inspiration and over 30 years of experience, Macdonald leads a highly skilled team of subject-matter experts and technical presales and sales teams to position strategic solutions to solve the complex business problems financial services institutions face.

He also plays a key role in the strategy and development of SAS solutions for the financial services industry, including compliance and regulatory offerings, enterprise risk, customer intelligence, and enterprise/big data analytical platforms. He has a deep passion for using his technical knowledge and business acumen to translate complex technological advances into terms that convey business value to the industry. These skills are critical in financial services, where early adoption of new technologies and solutions is perpetual.

Before his role in financial services, Macdonald formed the CRM Practice at SAS, leading many teams focused on selling and supporting SAS Customer Intelligence Solutions across multiple industries. This was a natural progression after joining SAS from Intrinsic. Macdonald co-founded Intrinsic Ltd. (UK), where he pioneered a marketing automation product and service solution before Intrinsic was acquired by SAS in 2001.

Macdonald received his bachelor’s degree from Dundee University in Scotland. He lives in Cary, N.C., with his wife, Anne, and their four daughters.



Robert Morison serves as IIA’s Lead Faculty member. An accomplished business researcher, writer, discussion leader, and management consultant, he has been leading breakthrough research at the intersection of business, technology, and human asset management for more than 20 years. He is co-author of Analytics at Work: Smarter Decisions, Better Results (Harvard Business Press, 2010), Workforce Crisis: How to Beat the Coming Shortage of Skills and Talent (Harvard Business Press, 2006), and three Harvard Business Review articles, one of which received a McKinsey Award as best article of 2004. He holds an A.B. from Dartmouth College and an M.A. from Boston University.

