The Challenging Times of Delivering on OpenBanking, Part 2

In Part 1 of this two-part series, I discussed the challenges of the open sharing of data and of the consent management model. Now I'll wrap it up with the need for a common authentication model and for provable shareability of data, and we'll start with the most important part: who's paying for all of this?

Banks have to foot the bill
The evolving demands of Open Banking mean that established institutions must absorb the cost of the complex infrastructure required to support it. Challenger banks and fintechs, by contrast, often follow a cloud-first strategy: they are free of the burden of maintaining complex legacy systems and can cherry-pick the most profitable services to offer their customers, without having to secure the underlying customer accounts. That responsibility remains with the member bank where the account originally resides.

Most banks typically run their IT infrastructure at around 80% utilisation, with limited spill-over capacity for annual peak-traffic events. This infrastructure is scaled against known service-usage models, such as the number of currently registered customers. The open sharing of banking customer data, however, is expected to drive a steady increase in API traffic, placing stress on already stretched resources, including mainframes, journaling systems and front-end systems. To address this concern, greater emphasis is now being placed on API platforms that can scale automatically with traffic volume and intelligently manage customer data at the edge, rather than adding pressure on resource-limited downstream systems. Such platforms must be able to scale into the cloud regardless of form factor, support modern infrastructure initiatives such as a bring-your-own-Kubernetes server farm, and be highly flexible in how they are deployed.
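The pressure-relief idea, serving repeated reads at the edge rather than forwarding every call to the core banking system, can be sketched as a simple TTL cache. This is a hypothetical illustration, not any vendor's gateway; the names and the downstream `fetch` call are invented for the example:

```python
import time

class EdgeCache:
    """Tiny TTL cache standing in for an API gateway's edge-caching layer."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}          # key -> (value, expiry timestamp)
        self.downstream_calls = 0 # how often we actually hit the "mainframe"

    def get(self, key, fetch):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[1] > now:
            return hit[0]         # served at the edge, no downstream load
        value = fetch(key)        # expensive call into the core banking system
        self.downstream_calls += 1
        self._store[key] = (value, now + self.ttl)
        return value

cache = EdgeCache(ttl_seconds=60)
fetch_balance = lambda account: {"account": account, "balance": 1000}

# Three identical API calls within the TTL produce only one downstream hit.
for _ in range(3):
    cache.get("acct-1001", fetch_balance)
print(cache.downstream_calls)  # 1
```

A real gateway adds cache invalidation on writes and per-consumer quotas, but the principle is the same: repeated reads never reach the capacity-limited back end.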

The need for a common authentication model
Each member bank sharing customer data through commonly accepted APIs typically enforces multiple layers of security, including flexible user-authentication models. The result is an 'alphabet soup' of standards and mechanisms adopted globally, which serves only to frustrate those trying to offer a consolidated, modern user experience. Among the most common approaches are one-time passwords (OTPs) distributed via SMS or push notification, biometrics, and risk-based analytics, but each mechanism imposes significant demands on the integrators consuming the banking data. Notably, when a transaction above a specified value threshold is executed, the responsibility, and the associated risk, of authenticating that a valid user initiated the request falls on the end-member bank. This frustrates the user experience, especially for customers who transact with several banking institutions, as each will in turn enforce a different authentication model and set of rules. In addition, SMS and push-notification delivery of OTPs is a flawed authentication model: the time-bound code often fails to arrive in time or is sent insecurely, particularly when the customer is away from their normal country of residence.

A consolidated authentication model is required: one that provides a high degree of both user and device authentication without relying on the distribution of one-time security codes. One suggested approach is being formulated by the FIDO2 standards committees. The user authenticates on their device, using facial or voice recognition, for example, which unlocks a private key held in a soft token; that key seamlessly signs a digital challenge issued by the authenticating bank. This approach supports flexible on-device authentication methods the customer is already familiar with, and simplifies which mechanisms the bank must support. Coupled with this, risk-driven analytics become possible using data collected from the customer's device, such as geo-fencing, detection of jailbroken devices and step-up authentication, reducing the risk of fraud, especially if such anonymised transaction data is shared between member banks in a federated manner.
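The challenge-and-sign flow can be sketched in a few lines. Note a deliberate simplification: real FIDO2/WebAuthn uses an asymmetric key pair (the bank holds only the public key), whereas this standard-library sketch substitutes an HMAC shared secret so it runs without a cryptography dependency; the function names are invented for illustration:

```python
import hashlib
import hmac
import secrets

def bank_issue_challenge():
    """The bank sends a fresh, unguessable challenge for each attempt."""
    return secrets.token_bytes(32)

def device_sign(challenge, device_key, biometric_ok):
    """The local face/voice check gates access to the key, which then
    signs the bank's challenge. (Real FIDO2 signs with a private key.)"""
    if not biometric_ok:
        raise PermissionError("user verification failed")
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def bank_verify(challenge, signature, device_key):
    """The bank checks the signature against the challenge it issued."""
    expected = hmac.new(device_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

key = secrets.token_bytes(32)  # provisioned when the device is registered
challenge = bank_issue_challenge()
sig = device_sign(challenge, key, biometric_ok=True)
print(bank_verify(challenge, sig, key))                   # True
print(bank_verify(bank_issue_challenge(), sig, key))      # replay fails: False
```

Because every challenge is fresh, a captured signature is useless for a later transaction, which is exactly the property SMS-delivered OTPs struggle to guarantee.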

Proven shareability of banking data
One of the obligations of the regulated OpenBanking initiative is that the affected financial institutions (4,000+ in EMEA) must prove that their API-driven data-transaction services are constantly available to the third parties wishing to consume the data. The onus is on them to publish monthly reports, via their OpenBanking management portal, summarising transaction volumes, average response times, error rates and the mean time to recover from a failure.
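The four published figures reduce to simple aggregations over call logs and outage records. A minimal sketch, with illustrative field names and sample data standing in for a real month of traffic:

```python
from datetime import timedelta

# Sample call log: response time in ms and success flag per API transaction.
calls = [
    {"ms": 120, "ok": True},
    {"ms": 340, "ok": True},
    {"ms": 95,  "ok": False},
    {"ms": 210, "ok": True},
]
# Duration of each outage, from failure detection to recovery.
outages = [timedelta(minutes=12), timedelta(minutes=4)]

report = {
    "volume": len(calls),
    "avg_response_ms": sum(c["ms"] for c in calls) / len(calls),
    "error_rate_pct": 100 * sum(not c["ok"] for c in calls) / len(calls),
    "mttr_minutes": sum(o.total_seconds() for o in outages) / len(outages) / 60,
}
print(report)
# {'volume': 4, 'avg_response_ms': 191.25, 'error_rate_pct': 25.0, 'mttr_minutes': 8.0}
```

The hard part in practice is not the arithmetic but collecting trustworthy per-transaction records across legacy systems, which is where the monitoring tools discussed next come in.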

Comprehensive enterprise-monitoring tools are therefore being sought, as traditional in-house solutions are often unable to provide detailed monitoring of APIs. These tools must complement the deployed API management platform; they can be used both to create the detailed published reports and to identify the source of ongoing issues in complex, often legacy, environments.

Coupled with this, as new service capabilities and integrations are made available, banks are increasingly challenged to test automatically the services they are ultimately exposing to the Internet. Traditionally, on-premises solutions were used, but when testing for potentially huge traffic volumes (50 million+ users) consuming services from multiple device types over varying data connections, cloud-driven testing is a necessity. API management vendors are therefore seeking to provide cloud-centric automated API-testing tools that cater for large user populations, with alerting plans that trigger when a potential failure is detected. Only through this proactive approach can banks avoid reporting poor service availability to the in-country regulator, which may result in sizeable fines.
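The shape of such a test rig, many concurrent simulated users, aggregated latency and error statistics, and a threshold that raises an alert before the regulator's report does, can be sketched as follows. Here `call_api` is a stub standing in for a real HTTP call to a sandbox endpoint; in a cloud rig it would run from many regions and device profiles:

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def call_api(user_id):
    """Stub for one simulated user's API call.

    A per-call RNG (seeded by user_id) keeps the simulation deterministic
    and thread-safe; a real test would issue an HTTPS request here.
    """
    rng = random.Random(user_id)
    latency_ms = rng.gauss(180, 40)   # simulated round-trip time
    ok = rng.random() > 0.02          # simulated ~2% error rate
    return latency_ms, ok

# Fan out 1,000 simulated users across 50 concurrent workers.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(call_api, range(1000)))

latencies = [ms for ms, _ in results]
error_rate = sum(not ok for _, ok in results) / len(results)
p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th-percentile latency

# Alerting plan: flag degradation long before a monthly report exposes it.
if error_rate > 0.05 or p95 > 500:
    print("ALERT: service degradation detected")
else:
    print(f"OK: p95={p95:.0f}ms, errors={error_rate:.1%}")
```

The thresholds (5% errors, 500 ms p95) are illustrative; in practice they would be derived from the service-level targets the bank has committed to the regulator.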

DIY OpenBanking Platform
In the early days of OpenBanking, it was common for banks to have their in-house IT consultants build a bespoke solution from a mixture of custom code, off-the-shelf products and open-source components. More often than not, this resulted in a complex web of technologies with large amounts of source code and many integrations to manage. These solutions were often insecurely designed and costly to maintain, owing to the specialised skill sets required. Compounding the problem, turnover among contracted IT consultants is often high, making it harder to retain the solution knowledge needed to deliver a high quality of service and add new capabilities.

To address this often-overlooked consideration, API vendors must provide solutions that have achieved a high degree of international security accreditation (such as Common Criteria), are easy to deploy and cloud agnostic, and do not demand coding or costly specialist skills to maintain or adapt the platform to ever-changing business needs. At the same time, the adopted platform must support modern enterprise paradigms, such as flexible scaling on premises or in the cloud, microservice models, and the ability to manage the complete life cycle of a banking API.

Conclusion

OpenBanking promises to significantly reshape the banking sector globally over the coming years. We are already seeing a slow erosion of the geographic and regulatory boundaries traditionally associated with the industry. Consumers are no longer limited to banking within legal borders, and they want to transact entirely from their mobile devices. User experience is key, as it drives loyalty and word-of-mouth advocacy among a social-media-aware consumer base.

Long-established financial institutions are therefore coming under increasing competitive pressure from these emerging digital fintechs, which can offer attractive front-line services, such as credit-card and account management and short-term loans, and react to market trends in a fraction of the time.

However, one key advantage OpenBanking brings the banking industry is access to board-level funding to drive digital-transformation programmes in-house. Traditional inter-departmental boundaries are being removed through API-first strategies, and private cloud platforms are being adopted to reduce both a bank's carbon footprint and the need to maintain costly infrastructure. Such efficiencies can only drive competitive change and enable traditional banking institutions to be more dynamic in meeting the challenges of an ever-changing global village and the time-bound demands of the next generation of consumers.

Sean O'Connell, M.Sc., CISSP

Sean O'Connell, B.Sc., M.Sc., CISSP, is the technical lead for the Broadcom API Management solution in EMEA. With a hardened background in deep-core security and cryptography, including over 11 years at Siemens, Sean has been at CA Technologies/Broadcom for over 15 years, holding senior roles in the positioning of value-centric, enterprise-class API management solutions for his customer base. His security background has been central to understanding emerging threats and to managing the complex web of evolving requirements when enterprises adopt an API-first digital-transformation programme. In his spare time, he is a keen hill walker and mountain biker, and when the weather doesn't permit, he can be found in his shed working with wood.
