Unglamorous batch processing is ‘the new sexy’ for financial services IT

April 1, 2011
Grazed from Computer Weekly. Author: Sebastien Quoirin.

Financial services organisations have long been recognised as leaders in the adoption of the latest technologies – especially those that offer them the ability to drive their competitive advantage and profitability, writes Sebastien Quoirin, general manager at ORSYP. From the automation of stock exchanges in the late 1980s to the banking sector’s early adoption of online services, much of this new technology investment has improved business productivity and contributed to profit growth to some degree. Nevertheless, based on some very surprising examples I have come across, I wonder whether the sector’s focus has become skewed towards "sexier" technologies at the expense of less glamorous but essential basics.

One such area is the essential yet unsung discipline of "workload automation", also known as "job scheduling" or batch processing. It seems that even the most forward-looking organisations within the financial sector have, at times, neglected this underlying technology that would help them to drive efficiencies, reduce costs and understand how their systems work (making them more robust).

Financial services IT playing catch-up

Gartner has said that 70% of IT processes are still performed in batch rather than in real time, demonstrating that workload automation is a key component in delivering critical business services. To put this into the financial services context, a large financial organisation will run over a million jobs per day for its critical applications, covering anything from end-of-day processing to trading applications.

While scheduling is clearly strategic and critical to the business, many financial services organisations continue to queue and launch jobs manually, engaging programmers in simple tasks that could (and should) be automated to allow them to focus on other, more complex projects.

One example is a major financial institution that was still manually processing jobs for its customer information products. Its former system was based on web menus and required a full-time team of 10 people to execute jobs in order to deliver the information required by clients.

Not only was this hugely inefficient, but the level of manual interaction meant that there was the possibility of inaccuracies in the data produced – data that its customers were using to make large-scale investment decisions. As the requirements and number of customers grew, however, the risks and inefficiencies also scaled, so the situation was not sustainable.

This particular organisation had opted for a state-of-the-art web menu solution, yet did not focus on the basic foundation. Implementing a job scheduler had a massive impact on its business. By converting web menus to batch scripts, one person could do the job of 10 and the organisation was able to reduce risks of human error, increase efficiency, ensure service levels, and deliver information fast to its customers. Moreover, it could scale immediately in relation to its ever-increasing number of customers.
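As a rough illustration of the kind of change described here, the sketch below shows how a formerly menu-driven extract might be expressed as an unattended batch script that a scheduler can launch and monitor. The job name, fields, paths and data are invented for illustration, not the institution's actual system; the point is simply that a clear exit code and a log line are all a scheduler needs to run the work without a person clicking through menus.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: a customer-information extract written as an
unattended batch job. All names, paths and data are illustrative."""

import csv
import logging
import sys
from datetime import date
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

OUTPUT_DIR = Path("extracts")  # assumed delivery location for the sketch


def build_extract(records, run_date: date) -> Path:
    """Write the day's customer-information extract and return its path."""
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    out_file = OUTPUT_DIR / f"customer_info_{run_date:%Y%m%d}.csv"
    with out_file.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["customer_id", "product", "balance"])
        writer.writerows(records)
    return out_file


if __name__ == "__main__":
    # In a real job the records would come from the source system;
    # a stub keeps the sketch self-contained.
    sample = [("C001", "bond-fund", "125000.00"), ("C002", "equity-fund", "98000.00")]
    try:
        path = build_extract(sample, date.today())
        logging.info("Extract written to %s", path)
        sys.exit(0)  # zero exit code tells the scheduler the job succeeded
    except Exception:
        logging.exception("Extract failed")
        sys.exit(1)  # non-zero exit lets the scheduler raise an alert
```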

You can teach an old technology new tricks

Job scheduling itself is nothing new but, as a basic requirement for efficient processing, it seldom has its time in the spotlight. Evolving from the centralised mainframe world, job schedulers prioritised and scheduled computing tasks within the constraints of the environment, ensuring delivery against business demands. As environments became more distributed and heterogeneous, the enterprise job scheduler was born, capable of directing jobs across complex infrastructure and application landscapes.
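At its core, what such a scheduler does is run interdependent jobs in a valid order. The minimal sketch below, with an invented end-of-day dependency graph, illustrates just that idea; a real enterprise product layers calendars, priorities, retries, service-level alerts and cross-platform agents on top of it.

```python
"""Minimal sketch of the core of a job scheduler: running interdependent
jobs in a valid order. The job names and dependencies are invented."""

from graphlib import TopologicalSorter  # Python 3.9+

# Each job lists the jobs that must finish before it may start.
dependencies = {
    "load_trades": [],
    "load_prices": [],
    "end_of_day_valuation": ["load_trades", "load_prices"],
    "client_reports": ["end_of_day_valuation"],
    "regulatory_feed": ["end_of_day_valuation"],
}


def run_job(name: str) -> None:
    # A real scheduler would launch the job on the right host here.
    print(f"running {name}")


if __name__ == "__main__":
    for job in TopologicalSorter(dependencies).static_order():
        run_job(job)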

Just as workload automation evolved successfully from the mainframe to the distributed world, it is now beginning to evolve to suit the hybrid world. Driven by the competitive need to adapt to the business constraints of this new order, naturally cloud-enabled technology is emerging that will allow workloads to scale from core processing into hybrid environments. I am convinced that this evolution will deliver a reduction in cost and risk, increased efficiency and improved visibility.

An unglamorous yet strategic focus for financial services IT

The globalisation of business, mergers and acquisitions, and a profusion of products have led to a truly heterogeneous world, and increasingly that world will be supported by distributed or cloud IT environments.

Financial services companies, in particular, will benefit from revisiting their focus in light of these developments, not least because choosing the right automation tool for the environment and systems they are running will help to ensure more efficient and reliable delivery times for the business.

Workload automation is no longer restricted to its unglamorous reputation of processing last night’s batches. Today, it is about event-driven, real-time, full automation of business processes. Whereas new and exciting technologies receive the hype, workload automation has embraced modernity all the way to the cloud.
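To make "event-driven" concrete, here is a small sketch of a job that fires as soon as an upstream feed lands rather than waiting for a fixed overnight window. The directory, file pattern and handler are all illustrative assumptions, and the simple polling loop stands in for the filesystem or message events a real product would use; it runs until interrupted.

```python
"""Hedged sketch of event-driven automation: trigger work when an input
file arrives instead of at a fixed batch time. Names are illustrative."""

import time
from pathlib import Path

INCOMING = Path("incoming")  # assumed landing directory for upstream feeds
processed: set[str] = set()


def process(feed: Path) -> None:
    # A real implementation would submit the downstream job to the scheduler.
    print(f"triggering downstream job for {feed.name}")


if __name__ == "__main__":
    INCOMING.mkdir(exist_ok=True)
    while True:  # simple polling loop; real tools react to filesystem or queue events
        for feed in INCOMING.glob("*.csv"):
            if feed.name not in processed:
                process(feed)
                processed.add(feed.name)
        time.sleep(5)
```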

With this in mind, and particularly in the financial services sector, we will begin to see a new emphasis on technologies, such as workload automation, which have demonstrated true ROI.

This does not mean that organisations will turn away from new and hyped technologies, as innovation will always remain a business driver. But the focus will shift to core strategic and critical elements which will be instrumental in delivering the ultimate payoffs – significantly reduced costs, improved IT efficiency and increased revenues.