To move beyond the infrastructure-led phase, the financial ecosystem must recognise that raw data is not an asset in itself. In the nascent stages of open banking, the industry celebrated the velocity of data packets across the network. But for a lender or an insurer, a raw JavaScript Object Notation (JSON) file is not a decision. The cognitive gap between a data stream and a credit decision is where actionable intelligence resides. This intelligence results from signal extraction: the ability to transform disparate, consented data points into high-fidelity behavioural patterns that reflect financial reality more accurately than legacy proxies.
Recent research into our open-finance layer suggests that this “intelligence phase” requires a departure from heuristic-based risk models. Traditional underwriting has historically relied on static, retrospective snapshots, such as credit scores, which often fail to capture the nuances of the informal economy. Actionable intelligence leverages real-time cash flow data, goods and services tax (GST) filings, and granular transaction metadata to construct a dynamic customer profile. This shift is transformative for India’s “thin-file” segments, such as small businesses and gig workers, who have long been overlooked by the rigidity of collateral-based lending. By utilising the triple-A framework of availability, accessibility, and analysis, institutions can now build predictive models that identify creditworthiness through intent and operational flow rather than just fixed assets.
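To make the idea of signal extraction concrete, the sketch below reduces raw consented transaction JSON to two simple cash-flow signals an underwriter might use: average monthly inflow and its volatility. The record format and field names are illustrative assumptions, not the actual Account Aggregator schema.

```python
import json
import statistics
from collections import defaultdict

# Hypothetical consented transaction feed; fields are illustrative only.
RAW = json.loads("""
[
  {"date": "2024-01-05", "amount": 42000, "type": "CREDIT", "narration": "UPI/settlement"},
  {"date": "2024-01-19", "amount": 8000,  "type": "DEBIT",  "narration": "rent"},
  {"date": "2024-02-03", "amount": 39000, "type": "CREDIT", "narration": "UPI/settlement"},
  {"date": "2024-03-06", "amount": 45500, "type": "CREDIT", "narration": "UPI/settlement"}
]
""")

def cash_flow_signals(txns):
    """Reduce raw transactions to dynamic underwriting signals:
    monthly inflow totals and their volatility."""
    inflows = defaultdict(float)
    for t in txns:
        if t["type"] == "CREDIT":
            month = t["date"][:7]          # e.g. "2024-01"
            inflows[month] += t["amount"]
    series = [inflows[m] for m in sorted(inflows)]
    return {
        "months_observed": len(series),
        "avg_monthly_inflow": statistics.mean(series),
        "inflow_volatility": statistics.pstdev(series),
    }

print(cash_flow_signals(RAW))
```

Even this toy reduction shows the shift the article describes: the lender reasons over a behavioural pattern (steady, recurring inflows) rather than a static score or collateral record.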
However, as complex decisioning models are brought in to synthesise this data, a trust deficit remains a systemic risk. Building trust in this high-velocity environment is no longer just about satisfying regulatory checklists; trust must be embedded into the design of every interaction. In the wake of the Digital Personal Data Protection Act (2023), the industry must transition from passive compliance to radical transparency.
Trust in an open-finance layer is effectively three-dimensional. First, there is technological trust: the baseline assurance that application programming interfaces (APIs) are resilient, secure, and performant. As transaction volumes scale, any perceived latency or security vulnerability in the open data rails erodes the collective credibility of the entire digital public infrastructure (DPI) ecosystem. The second layer is procedural integrity: ensuring that consent mechanisms move beyond binary acceptance towards a framework of granular, revocable, and informed agency for customers.
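What “granular, revocable, and informed” consent might look like in code can be sketched as a simple consent artefact: access is scoped to named data categories, time-bound, and switches off the instant the customer revokes it. The scope names and fields here are assumptions for illustration, not the DPDP Act's or any Account Aggregator's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ConsentGrant:
    """Illustrative consent record: granular scopes, an expiry, and revocation."""
    customer_id: str
    scopes: set                # e.g. {"transactions"} -- not all-or-nothing
    expires_at: datetime
    revoked: bool = False

    def permits(self, scope: str) -> bool:
        # Access is allowed only for a named scope, within the validity
        # window, and only while the grant has not been revoked.
        return (not self.revoked
                and scope in self.scopes
                and datetime.now(timezone.utc) < self.expires_at)

    def revoke(self) -> None:
        self.revoked = True    # takes effect immediately

grant = ConsentGrant(
    customer_id="cust-001",
    scopes={"transactions"},
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
print(grant.permits("transactions"))  # True: consented scope, still valid
print(grant.permits("balances"))      # False: never consented to
grant.revoke()
print(grant.permits("transactions"))  # False: revocation is immediate
```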
The third, and perhaps most critical, dimension is outcome trust. This transcends mere transactional gains. While a customer may initially share data for tangible economic benefits, such as customised interest rates or higher credit limits, genuine systemic trust is rooted in the perception of inherent fairness. If the intelligence derived from APIs results in opaque, black-box decisioning, that foundational confidence evaporates. Trust is not merely a tool for economic exchange; it is a cognitive state in which the customer feels secure within the architecture. In a mature open-finance landscape, “the computer said no” is an insufficient response. Trust is sustained when the system demonstrates technical reliability and ethical consistency, proving that data sharing leads to equitable, transparent, and predictable outcomes.
Furthermore, the convergence of heterogeneous data silos (the integration of financial records with health or commerce data) introduces complex second-order risks. While this convergence facilitates hyper-personalised financial journeys, it simultaneously expands the attack surface for sophisticated fraud. To mitigate this, the industry may pivot towards collaborative intelligence. Similar to the unified threat intelligence models utilised in cybersecurity, financial institutions can consider the merits of sharing anonymised and consented signals to identify and neutralise cross-institutional fraud patterns in real time.

Our open-finance layer has transcended its technical origins to become the economic substrate of a digital-first state. This necessitates a pivot from data volume to intelligence quality.
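One minimal pattern for such collaborative intelligence is exchanging salted hashes of flagged identifiers instead of raw personal data, so institutions can detect overlaps without revealing the identifiers themselves. The consortium salt, account formats, and bank names below are assumptions for illustration; a production scheme would use keyed hashing or private set intersection, since plain salted hashes of low-entropy identifiers can be brute-forced.

```python
import hashlib

# Hypothetical shared salt agreed by the consortium (illustrative only).
SHARED_SALT = b"consortium-demo-salt"

def fraud_token(identifier: str) -> str:
    """Deterministic, non-reversible token for a flagged identifier."""
    return hashlib.sha256(SHARED_SALT + identifier.encode()).hexdigest()

# Bank A publishes tokens for accounts it has flagged as suspect.
bank_a_flags = {fraud_token(acct) for acct in ["ACC-9001", "ACC-9002"]}

# Bank B screens its own onboarding queue against the shared token set.
bank_b_queue = ["ACC-1234", "ACC-9002"]
hits = [acct for acct in bank_b_queue if fraud_token(acct) in bank_a_flags]
print(hits)  # -> ['ACC-9002']: a cross-institutional match, no raw PII exchanged
```

The design choice mirrors threat-intelligence feeds in cybersecurity: what travels between institutions is a signal about risk, never the underlying customer data.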
The writer is cofounder and CEO, Finarkein