A bold new experiment is taking place across a number of federal agencies to identify and address systemic risk before the next financial collapse occurs. You may be familiar with the Securities and Exchange Commission’s Division of Risk, Strategy, and Financial Innovation.
Over the last three years, the SEC has revamped this division into a “think tank” staffed by a multidisciplinary team of professionals. This is not your father’s SEC: the team is made up of 35 PhD financial economists, financial engineers, programmers, MBAs, and other experts.
Likewise, the Treasury Department has set up a new Office of Financial Research, created under the Dodd-Frank Act in 2010 to support the Financial Stability Oversight Council – the group responsible for coordinating the efforts of the top financial regulators.
Richard Berner, the newly appointed head of the OFR, is tasked with finding threats to financial markets BEFORE they occur. Berner, a trained economist, has some experience looking around corners: as chief economist for Morgan Stanley, he and a colleague revised their forecast of economic growth in 2007 to predict the coming recession before many on Wall Street saw the signs of economic trouble.
There is an arms race of data analytics unfolding among economists and researchers to create tools to recognize and, hopefully, avoid the next crisis. Berner is leading this charge and is now building a new forecasting model with the help of academics and financial engineers. Many market watchers give Berner kudos for these efforts; however, some question whether a financial model is capable of capturing the complexity of global financial markets.
Berner faces the same challenge as the providers of Big Data solutions: how do you standardize all sorts of records into a common data set that everyone agrees on, so that the numbers are comparable? There is no common data taxonomy across different firms!
The Office of Financial Research may not be able to see the future and avoid every risk event in financial markets, but it does mark a new era in how risk management will be conducted going forward.
What role does GRC play in a world dominated by predictive analytics? What new skills will risk practitioners need in the future? Berner didn’t see or understand the systemic risks inherent in a correlated global market, and he missed how risks in US markets might impact our European counterparts overseas. “There are still pretty big gaps in our knowledge,” Berner said during his interview for the article.
What is becoming clear is that, regardless of your business, you will be expected to understand your data and develop a governance model for it. Attempting to tackle this effort alone, in isolated silos, would be self-defeating. The best course of action is to begin socializing the need for data management with key stakeholders in your firm. Agreeing on a common set of definitions and a common taxonomy helps create a framework for defining important data and understanding where the gaps exist.
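To make the idea concrete, the exercise of agreeing on a taxonomy and then hunting for gaps can be sketched in a few lines of code. This is a minimal illustration, not a real data-governance tool, and every field name, department name, and definition below is hypothetical.

```python
# Hypothetical sketch: a shared taxonomy, per-silo field mappings,
# and a function that reports which canonical terms each silo lacks.

# Common taxonomy agreed on by stakeholders: canonical term -> definition
taxonomy = {
    "counterparty_id": "Unique identifier for a trading counterparty",
    "notional_amount": "Face value of the position in USD",
    "trade_date": "Date the trade was executed (ISO 8601)",
}

# How each silo's local field names map to the canonical terms
silo_mappings = {
    "trading_desk": {"cpty": "counterparty_id", "notional": "notional_amount"},
    "risk_dept": {"counterparty": "counterparty_id", "trade_dt": "trade_date"},
}

def find_gaps(taxonomy, silo_mappings):
    """For each silo, list the canonical terms it has no local field for."""
    gaps = {}
    for silo, mapping in silo_mappings.items():
        covered = set(mapping.values())
        gaps[silo] = sorted(set(taxonomy) - covered)
    return gaps

print(find_gaps(taxonomy, silo_mappings))
```

Here the trading desk has no field mapped to the trade date and the risk department has none mapped to the notional amount; the point is that once definitions are shared, finding gaps becomes mechanical rather than a matter of opinion.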
Resist the temptation to discuss risks at this stage of discovery. Trust the process to reveal new information and potential risks as you learn more about how data is used and managed across your firm. Rushing to define risks may predetermine outcomes and prevent you from discovering gaps you would not have anticipated beforehand.
You may not be able to see around corners when you complete this exercise, but you may begin to ask new questions and gain a better understanding of the data bottlenecks that prevent you from achieving higher levels of performance. Early success is the key to how far you decide to push the envelope in your data analysis.
Regulators are building a formidable store of information on organizations, one that will grow and become more sophisticated. Risk professionals should be prepared with an equally robust set of data to demonstrate that they are building the same level of proficiency in understanding their business.