
Data Streaming Technology Transforms Risk Management via Real-Time Information Processing

In risk management, the need for speed has never been more pressing than during the current crisis. Batch processing has been deeply embedded in the banking industry for decades: from end-of-day batch runs in core systems (e.g. calculating interest) to sending data downstream to feed the general ledger, risk systems, reporting systems, and more. Batch processing is so deeply embedded that almost all organizations have built their operating models around it. For many years this approach served organizations well, both technologically and operationally, despite its challenges.


This article explores how a new wave of data streaming technology, pioneered by technology giants such as LinkedIn, Google, and Netflix, is challenging the status quo and forcing organizations to rethink their operating models around technology that processes and delivers information in real time.

Key Challenges With The Status Quo

Information Delay

  • Information is only available on a T+1 basis
  • Timely risk analysis is constrained by system processing times
  • Decisions are based on estimates and assumptions derived from workaround solutions
  • Management overlays are required

Cost of Ownership

  • The high cost of implementation
  • The high cost of maintenance
  • The high cost of vendors
  • The high cost of change (both in time and in dollars)
  • Performance driven by hardware, leading to a high total cost of ownership (TCO)

System and Process Complexity

  • Process and operating models are complex due to system constraints
  • Large applications require large IT support teams
  • Workaround solutions (e.g. spreadsheets) have been introduced
  • Duplication of business logic in various applications (e.g. cash flow projection)


Batch Processing Is Approaching Its Sunset

In recent years, however, a new wave of technology has begun disrupting the traditional model. Tools and architectures developed by technology giants such as LinkedIn, Google, Netflix, and Facebook to process vast amounts of data every day are being applied in the banking world to transform end-of-day and T+1 processes into near real-time processes. These architectures are built for scale from the ground up, which is precisely where traditional systems fall short.

This new wave of banking technology is based on the fundamental principles of:

  1. Data Streaming Tech, i.e. revolutionizing how data is passed, processed, and written to databases: moving away from batch transfers and central-database processing toward event-based updates flowing into multiple databases interconnected through orchestration engines (see the producer sketch after this list)
  2. Targeted Scalability, i.e. automatically scaling a specific microservice up or down based on demand
  3. Microservices Architecture, i.e. small, manageable chunks of code that each perform a specific task, as opposed to one complex application; each chunk becomes a reusable service within the ecosystem of systems and other consuming services
  4. Polyglot Architecture, i.e. coding-language and OS-agnostic development frameworks using containerization
  5. Open Source Adoption, i.e. constantly tapping into innovation and being able to adapt quickly
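
To make the first principle concrete, below is a minimal sketch of event-based data updates, assuming a Kafka broker reachable on localhost and the open-source kafka-python client; the topic name and event fields are illustrative assumptions, not taken from any particular banking system.

```python
import json

from kafka import KafkaProducer

# Serialize each business event as JSON before writing it to the stream.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Instead of accumulating transactions for an end-of-day batch file, the core
# system publishes each event as it happens; downstream risk, finance, and
# reporting services subscribe to the topic and update their own databases.
event = {"event_type": "loan_drawdown", "account_id": "A-1001", "amount": 250000.0}
producer.send("core-banking-events", value=event)
producer.flush()  # block until the broker has acknowledged the event
```

Each subscriber maintains its own view of the data, which is how multiple interconnected databases stay current without a central overnight batch job.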

Banks have started to embrace this new breed of technology in their frontline systems as well as their digital services. Most banks now offer mobile apps, trading platforms, and internet banking solutions that are either already built on modern technology architectures or planned for transition soon, to maintain a competitive edge and reap the obvious business benefits.

However, the middle- and back-office functions within banks are yet to reap the benefits of this technology. Many areas within a bank could be transformed by real-time processing of information and an agile technology stack that can quickly adapt to business or regulatory change. Some notable areas that are bound eventually to embrace this wave of technology-driven disruption are:

  • Transaction processing, e.g. payments (already occurring in Australia via the New Payments Platform, NPP), loan drawdowns, etc.
  • Stress testing, simulations, and calculations for managing financial risk
  • Management reporting and regulatory reporting
  • Markets and trading platforms that receive instant updates from core systems on business transactions, e.g. loan drawdowns

Transforming The World Of Risk Management Via Real-Time Information Processing

Real-time risk management is a pressing topic for banks in the current environment.

During the COVID-19 crisis, banks have felt operational pain because their existing systems and processes were not geared up to meet the demand from internal and external stakeholders for timely insights and management action. A few examples of this are:

  • Recalibrating models to accommodate macro-economic changes
  • Running scenario tests and reviewing the results in an unconstrained environment
  • Accommodating regulatory changes such as the repayment deferral schemes
  • Generating additional reports for management and regulators

The root cause has usually been attributed to process ineffectiveness, the use of spreadsheets, legacy and rigid systems, and heavy reliance on external vendors, all exacerbated by working-from-home arrangements. One area that has received limited attention, however, is the need for speed in the current environment.

What if:

  • Model updates could be made instantaneously?
  • Calculations could run in minutes rather than hours?
  • Simulations could be run, with the results reviewed and presented to management the same day?
  • All of this could be achieved at a significantly reduced cost?

The introduction of data streaming technology, coupled with modern development architectures, will change the world of risk management. The operating model will look very different from how things are executed today.

  • Data from source systems could be fed into risk systems continuously, in real time, rather than overnight
  • Finance and risk professionals could run calculations during the day and review results instantly
  • Models and assumptions could be updated on demand
  • Several simulations and stress tests could be run within a day, followed by management review
  • Calculations such as RWA, ECL, and IRRBB that rely on large data sets and complex statistical formulas could be processed within minutes, with numbers ready for review and analysis almost instantly (see the ECL sketch after this list)
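
As an illustration of the kind of calculation involved, here is a minimal sketch using the simplified expected-credit-loss formula ECL = PD x LGD x EAD; the column names and sample figures are assumptions for the example, not real portfolio data.

```python
import pandas as pd

# A small illustrative portfolio; figures are made up for the example.
exposures = pd.DataFrame({
    "account_id": ["A-1001", "A-1002", "A-1003"],
    "pd":  [0.02, 0.05, 0.10],            # probability of default
    "lgd": [0.45, 0.40, 0.60],            # loss given default
    "ead": [250000.0, 80000.0, 15000.0],  # exposure at default
})

# Simplified ECL = PD x LGD x EAD, vectorized over the portfolio. On a
# streaming platform the same formula could be applied per exposure as each
# record arrives, instead of waiting for the complete end-of-day data set.
exposures["ecl"] = exposures["pd"] * exposures["lgd"] * exposures["ead"]
print(exposures[["account_id", "ecl"]])
```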

Not everything will change overnight, of course; adoption will occur at a different pace across systems, with some still bound by upstream batch dependencies. But many of the benefits of streaming technologies apply even when they are adopted at the level of a single application. New systems can realize these benefits today, and the benefits grow as the overall ecosystem changes and adapts.

The 4 Key Pillars Of The Data Streaming Tech Revolution

This new wave of technological change enabling high-performance data streaming is not the result of a single platform. It is the combination of years of innovation in architecture, technology stack, and development approach, glued together to deliver an optimal platform for processing data at high speed. The flow of data from one system to another may conceptually appear the same; what has changed is the engine under the hood.


Four key pillars are driving this revolution.

Microservices-Based Architecture

This is an architectural framework built on the principles of modular, independent, and easily communicating services that interact with one another and with external consumers to deliver a business solution. The idea is to move away from large, monolithic applications developed and supported by large IT teams, an approach that has created complexity and rigidity in systems. A microservices-based approach creates the flexibility needed to adapt quickly. It also organically distributes accountability within IT teams, and with smaller teams, throughput increases.
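
As a sketch of the idea, the hypothetical service below does exactly one job, daily interest accrual, and exposes it over HTTP so any other service can reuse it. It assumes the open-source FastAPI framework; the endpoint and payload shapes are illustrative.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Account(BaseModel):
    balance: float      # current account balance
    annual_rate: float  # nominal annual interest rate, e.g. 0.035

# One small, single-purpose service: it can be developed, deployed, and
# scaled by a small team independently of the rest of the estate.
@app.post("/accrue-interest")
def accrue_interest(account: Account) -> dict:
    daily_interest = account.balance * account.annual_rate / 365
    return {"daily_interest": round(daily_interest, 2)}
```

Run with an ASGI server such as uvicorn (e.g. `uvicorn interest_service:app`). Packaged into a container, a service like this can also be scaled up or down on its own when demand surges, which is the targeted scalability discussed below.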


Containerisation Based Development

Containerisation is a deployment framework that packages services into logical containers for quick and easy deployment on any OS or cloud. It not only removes deployment dependencies for services written in different coding languages, but also enables targeted auto-scaling of individual services when demand surges.

Rather than the traditional approach of beefing up CPU and hardware for more performance, containerization makes optimal use of existing CPU, delivering higher performance than traditional systems. It also gives a service the independence to be deployed in any operating environment, e.g. cloud, bare metal, or a different OS, making the transition to the cloud (or back) much easier than before.

Data Stream Processing Technology

In essence, data streaming is a conceptual term for an unbounded flow of data from one service to another, designed to create minimal interruption or dependencies. The result is that information is delivered (streamed) to users as soon as it is processed. The most familiar examples are video streaming apps such as Netflix and YouTube, which play video while it is still buffering. The same concept is now being applied to other forms of data, e.g. structured or unstructured text.


The core shift relative to traditional approaches is the use of an event-based messaging model for data consumption, processing, and updates. The streaming platform takes full control of the utilization of its services and the sequencing of events, continuously delivering processed information to users as it is completed. Coupled with containerization and a microservices-based architecture, the platform delivers exceptional performance while remaining efficient in its use of CPU.

The other key element of this technology is that it enables per-record processing as opposed to processing batches of records. This can be hugely beneficial in use cases where a single record can trigger a critical business process, reducing lag, e.g. a credit assessment per customer rather than across the whole portfolio.
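
Below is a minimal consumer sketch of per-record processing. It assumes the same broker, topic, and open-source kafka-python client as the earlier producer example; the credit-assessment trigger is a hypothetical placeholder.

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "core-banking-events",
    bootstrap_servers="localhost:9092",
    group_id="risk-engine",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Each record is handled the moment it arrives, so a single customer's event
# can trigger a credit assessment immediately, rather than waiting for the
# nightly batch run over the whole portfolio.
for record in consumer:
    event = record.value
    if event.get("event_type") == "loan_drawdown":
        print(f"Re-assessing credit for account {event['account_id']}")
```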

Use of Open Source Technologies and Standards

Open-source technologies have existed for decades, but companies have never adopted them at the pace of the current era of software development. Especially for software built on containers and data stream processing platforms, most leading technology companies base their platforms on an open-source technology stack. Some prominent examples:

  • Docker: Containerisation platform (used by Netflix, PayPal)
  • Kubernetes: Container Orchestration and Load Balancing (originally designed by Google, used by Booking.com, Spotify)
  • Apache Kafka: Data Stream Processing Platform (originally developed by LinkedIn, used by Uber, LinkedIn)

Conclusion

Advances in technology are enabling organizations to process vast amounts of data in real time, opening up opportunities for the Financial Services sector to derive actionable insights, create new business opportunities, and improve services to customers. This wave of technological change will spread into the middle- and back-office systems of banks – it is only a question of “WHEN”, not “IF”. The world of risk management, which is under extraordinary pressure to create visibility for boards, regulators, and the market, will need to step back and re-examine its business processes and technology landscape to evaluate whether it is set up for success.