This article was written by Jason Demby, the Director of Business Development, Financial Services, Datameer.
By now, you know that the financial services industry pendulum has swung to the extreme of massive risk reduction, intensifying regulatory compliance requirements, and tightening revenue margins. As a result, most banks are completing intense ‘front-to-back’ business line reviews with an eye towards reducing operational risk and optimizing the way each business operates.
These assessments span from front office trader activities to back office trade processing – and their findings result in massive transformational technology and operations projects. Operational risk reduction is a top priority, and these associated projects are the ones receiving budget approval. Many of these projects address big data architectures and the analytics needed to support data-driven management decisions.
Operational risk is…
For those new to the topic, operational risk is all risk beyond financial, systematic, or market-wide risk, and it often comes down to the risk inherent in human dependency and error. This ranges from risk introduced by mistakes made during manual process intervention to risk introduced by traders purposely engaging in subversive activities. How do you compare and contrast these risks and turn the related data into measurable and actionable indicators?
How do you measure it?
Thought leaders from major advisory firms are absorbing shifting mandates from the Basel Committee on Banking Supervision (BCBS) and helping banks crack the code on how to measure and monitor operational risk.
What key risk indicators (KRIs) are applicable to different lines of business and disparate processes? How do you standardize the capture of relevant data and set up the processes to repeatedly measure, analyze, and visualize those KRIs? This moment of massive operational restructuring is the perfect opportunity to define the data processes and architectures that will help measure and monitor operational risk.
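To make the idea of a KRI concrete, here is a minimal sketch of what "standardizing the capture of relevant data and repeatedly measuring" one might look like in practice. All field names and the trade-break KRI itself are illustrative assumptions, not prescriptions from any standard:

```python
# Hypothetical sketch: computing a simple KRI (trade-break rate per business
# line) from standardized operational event records. Field names ("line",
# "status") and the event shape are illustrative assumptions.
from collections import defaultdict

def trade_break_rate(events):
    """Return breaks / total trades per business line as a KRI."""
    totals = defaultdict(int)
    breaks = defaultdict(int)
    for event in events:
        totals[event["line"]] += 1
        if event["status"] == "break":
            breaks[event["line"]] += 1
    return {line: breaks[line] / totals[line] for line in totals}

# A few sample records, as they might arrive from a trade-processing feed.
events = [
    {"line": "equities", "status": "settled"},
    {"line": "equities", "status": "break"},
    {"line": "fx", "status": "settled"},
    {"line": "fx", "status": "settled"},
]
print(trade_break_rate(events))  # {'equities': 0.5, 'fx': 0.0}
```

The point is less the arithmetic than the discipline: once events are captured in a consistent shape, the same measurement can be re-run on every cycle and compared across lines of business.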
According to PwC regarding operational risk modeling, “such techniques increasingly include analytical tools and models designed to support management decisions, rather than regulatory capital calculations. Hence, while the demise of the AMA (Advanced Measurement Approach) may spell the end of internal models for capital purposes, it may well free up analytical capabilities to develop different internal models that could arguably be far more useful to the management of operational risk than the AMA.”
Historically, operational risk management was based on past loss events and reserving the right amount of capital to ensure that the bank stayed secure if such events recurred. After completing their front-to-back assessments, financial services institutions now need to look forward and embed the right analytics within their target-state operating model to drive management decisions.
This needs to be done with analytical tools that are both business analyst friendly and support massive amounts of disparate structured and unstructured operational data.
What are those analytical tools?
The industry investment in big data infrastructure and supporting analytical tools continues to grow. With business structures and processes constantly changing, it is important to measure and reduce operational risk with analytical platforms that:
– Enable rapid and iterative analytics by the business analysts that know the business the best
– Handle massive amounts of structured and unstructured operational data
– Tightly integrate with existing application and data platforms
– Empower full automation and scheduling to reduce manual intervention
– Perform during times of operational or market stress
– Offer enterprise-grade governance and security
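The automation and scheduling point above can be sketched in a few lines: rather than an analyst eyeballing a spreadsheet, a scheduled job checks each measured KRI against a configured threshold and flags breaches for escalation. The KRI names and threshold values here are made-up assumptions for illustration:

```python
# Hypothetical sketch: a scheduled KRI threshold check that flags breaches
# for escalation instead of relying on manual review. KRI names and
# threshold values are illustrative assumptions.

KRI_THRESHOLDS = {
    "trade_break_rate": 0.02,   # max acceptable share of trades that break
    "manual_touch_rate": 0.10,  # max acceptable share of manually handled trades
}

def breached_kris(measurements, thresholds=KRI_THRESHOLDS):
    """Return the KRIs whose measured value exceeds the configured threshold."""
    return {
        name: value
        for name, value in measurements.items()
        if name in thresholds and value > thresholds[name]
    }

# A scheduler (cron, a workflow engine, etc.) would call this each cycle
# with the latest measurements and route any breaches to risk managers.
today = {"trade_break_rate": 0.035, "manual_touch_rate": 0.08}
print(breached_kris(today))  # {'trade_break_rate': 0.035}
```

Keeping the thresholds in configuration rather than in code is what lets business analysts, not just IT, evolve the indicators as the business evolves.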
This ecosystem will include products that track business processes and throughput, tightly integrated with high-performing Hadoop infrastructures and analytical tools.
While strong IT partnership is critical, putting these tools in the hands of the business analysts will be key to successfully evolving these analytics as the business evolves. Rogue traders and accidental Excel keystrokes beware – with this new mindset of business process aligned operational risk measurement, banks will proactively implement the analytics to make decisions and prevent costly operational loss events in the future.