The complexity challenge

The economic and regulatory environment of recent years has placed powerful downward pressure on real investment within banks and other financial services firms, whilst at the same time diverting a significant portion of the remaining resources to coping with rapidly evolving regulatory requirements.

This situation has not been helped by the fact that many banks have complex application portfolios, populated with heterogeneous legacy applications left over from previous acquisitions and from an unstructured approach to gaining a competitive edge. This ‘technical debt’ typically features significant amounts of functional overlap, tactical solutions, data quality issues and complex integrations. The problem is further compounded by the historic tendency to buy or build best-of-breed solutions for each business line across the bank.

The impact of complexity is intuitively negative, but very hard to measure. However, it seems reasonable to assume that a complex IT environment drives excessive IT cost, creates non-IT operational overhead, degrades both the user and customer experience, and adds to operational risk. Of equal importance is the effect that complexity has in reducing the value of every IT investment.

So, how is this problem being tackled? In addition to typical cost reduction measures, some banks have embarked on simplification programmes, and most are redeveloping elements of their application stack. However, despite these efforts, it is far from clear that the actual complexity of IT portfolios (applications and infrastructure) and the attendant issues are being effectively measured, managed and reduced. In this paper, we look at why complexity needs to be tackled and examine some practical steps to managing a successful simplification programme.

To discuss this topic further, please get in touch with our specialists