For more than a decade, application security teams have faced a paradox: better detection tools have produced increasingly irrelevant results. As alerts from static analysis tools, scanners, and CVE databases have surged, the promised security improvements have slipped further out of reach, giving way to alert fatigue and strained resources.
According to OX Security’s 2025 Application Security Benchmark Report, a staggering 95–98% of AppSec alerts require no action at all, and may actually harm organizations rather than help them.
Analysis of over 101 million security alerts across 178 organizations reveals a critical inefficiency in modern AppSec operations. The average organization faced nearly 570,000 alerts, of which only 202 pointed to significant security issues.
This statistic points to a broader problem: security teams are spending time and budget chasing alerts that rarely represent real threats, a dynamic that stifles genuine innovation. As Chris Hughes puts it in Resilient Cyber, “We do all this while masquerading as business enablers, actively burying our peers in toil, delaying development velocity, and ultimately hindering business outcomes.”
The Origin of the Issue: A Flood of Alerts with Little Context
In 2015, the application security landscape was far less daunting: only 6,494 CVEs were publicly disclosed that year. The focus was squarely on detection, and tools were judged by how many issues they could find, with little regard for whether those findings actually mattered.
Fast forward to 2025: cloud-native applications, rapid development cycles, and expanded attack surfaces have transformed the landscape. In the past year alone, over 40,000 new CVEs were documented, pushing the global tally beyond 200,000. Yet many AppSec tools have not adapted; they persist with detection-heavy methods, flooding systems with unfiltered, context-free alerts.
OX Security’s benchmark confirms what practitioners have long suspected: 32% of reported issues have a low probability of exploitation, 25% lack any known public exploit, and another 25% arise from unused or development-only dependencies.
This deluge of irrelevant findings does not merely slow security processes; it actively undermines them. The crucial task for teams is to identify the 2–5% of alerts that warrant immediate action. According to the report, these actionable alerts mostly involve known exploited vulnerabilities (KEV), secrets management issues, and occasionally posture management problems.
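The report does not prescribe tooling, but the KEV category is straightforward to check mechanically: CISA publishes its Known Exploited Vulnerabilities catalog as a public JSON feed. The sketch below, which assumes scanner findings arrive as simple dictionaries with a cve field, splits a finding set into KEV-listed items and everything else; the feed URL and field names reflect CISA’s published format at the time of writing.

```python
import json
import urllib.request

# CISA's Known Exploited Vulnerabilities (KEV) catalog, published as JSON.
KEV_FEED = ("https://www.cisa.gov/sites/default/files/feeds/"
            "known_exploited_vulnerabilities.json")

def load_kev_ids() -> set[str]:
    """Fetch the KEV catalog and return the set of CVE IDs it lists."""
    with urllib.request.urlopen(KEV_FEED) as resp:
        catalog = json.load(resp)
    return {entry["cveID"] for entry in catalog.get("vulnerabilities", [])}

def split_findings(findings: list[dict]) -> tuple[list[dict], list[dict]]:
    """Partition scanner findings into KEV-listed (act now) and the rest."""
    kev_ids = load_kev_ids()
    urgent = [f for f in findings if f.get("cve") in kev_ids]
    deferred = [f for f in findings if f.get("cve") not in kev_ids]
    return urgent, deferred

# Example with three hypothetical scanner findings.
findings = [
    {"cve": "CVE-2021-44228", "component": "log4j-core"},    # Log4Shell, KEV-listed
    {"cve": "CVE-2023-99999", "component": "unused-dep"},    # illustrative ID
    {"cve": "CVE-2024-11111", "component": "dev-only-lib"},  # illustrative ID
]
urgent, deferred = split_findings(findings)
print(f"{len(urgent)} urgent, {len(deferred)} deferred")
```

Even this single filter removes a large share of CVE-driven noise, since only a small fraction of the 200,000+ published CVEs ever appear in the KEV catalog.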
Implementing a Holistic Prioritization Strategy
To reverse this spiral of inefficiency, organizations must adopt a more refined, evidence-based approach to application security. That means moving from generic alert management to an evaluation model that spans the entire software development lifecycle (SDLC) and weighs multiple factors: reachability, exploitability, business impact, and cloud-to-code mapping.
Such a framework lets organizations filter out irrelevant alerts and concentrate resources on the minority of threats that pose genuine risk. This shift not only makes security measures more effective but also frees up critical resources, enabling teams to develop with confidence. A simplified version of such a scoring model is sketched below.
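As a rough illustration of how these factors might combine, the sketch below scores each finding on a handful of evidence signals. The weights, field names, and thresholds are assumptions made for the example, not values drawn from OX Security’s report.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    cve: str
    in_kev: bool           # listed in a known-exploited catalog?
    has_public_exploit: bool
    reachable: bool        # is the vulnerable code path actually invoked?
    business_impact: int   # 0 (none) .. 3 (critical system)

def priority_score(f: Finding) -> int:
    """Combine evidence signals into one rank; higher means act sooner.
    Weights are illustrative, not taken from the benchmark report."""
    score = 0
    if f.in_kev:
        score += 100       # active exploitation outweighs everything else
    if f.has_public_exploit:
        score += 30
    if f.reachable:
        score += 30
    else:
        score -= 50        # unreachable code is rarely worth an urgent fix
    score += 10 * f.business_impact
    return score

findings = [
    Finding("CVE-2021-44228", True, True, True, 3),     # KEV-listed, reachable
    Finding("CVE-2023-99999", False, False, False, 1),  # illustrative, low signal
]
for f in sorted(findings, key=priority_score, reverse=True):
    print(f"{f.cve}: {priority_score(f)}")
```

The key design choice is that negative evidence (unreachable code, no known exploit) actively pushes a finding down the queue, rather than every detection defaulting to urgent.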
OX Security tackles these challenges with Code Projection, a technology that maps cloud and runtime elements back to their code origins, enabling contextually informed decisions and dynamic risk prioritization.
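OX Security has not published Code Projection’s internals, so the following is purely an illustration of the general cloud-to-code mapping idea: linking a runtime alert on a container image back to the repository and commit that produced it, using build provenance metadata. Every name here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class BuildRecord:
    image_digest: str   # sha256 digest of the built container image
    repo: str           # source repository that produced the build
    commit: str         # exact commit the image was built from

# Hypothetical build-provenance index keyed by image digest. In practice
# this metadata could come from CI attestations (e.g. SLSA provenance).
BUILD_INDEX = {
    "sha256:ab12cd34": BuildRecord("sha256:ab12cd34",
                                   "git@example.com:payments.git", "3f9c2e1"),
}

def trace_to_code(runtime_image_digest: str) -> BuildRecord | None:
    """Map a runtime alert on a container back to its code origin, so the
    finding can be routed to the owning team and prioritized with
    source-level context (ownership, reachability, blast radius)."""
    return BUILD_INDEX.get(runtime_image_digest)

record = trace_to_code("sha256:ab12cd34")
if record:
    print(f"Runtime alert maps to {record.repo} @ {record.commit}")
```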
Quantifying Real-World Outcomes
The data tells a compelling story: with evidence-based prioritization, the daunting average of 569,354 alerts per organization can be cut to 11,836, of which a mere 202 require urgent attention.
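A quick check puts those figures in proportion, using the report’s own numbers:

```python
total, filtered, urgent = 569_354, 11_836, 202
print(f"after filtering: {filtered / total:.1%} of all alerts")  # ~2.1%
print(f"urgent:          {urgent / total:.3%} of all alerts")    # ~0.035%
```

In other words, evidence-based filtering leaves roughly 2% of the original volume to review, consistent with the report’s 2–5% figure, and well under 0.1% that is genuinely urgent.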
Industry benchmarks surface several insights about alert noise. Baseline noise levels are remarkably consistent across enterprise and commercial environments alike, irrespective of industry. Enterprise environments, however, contend with far greater complexity: larger tool ecosystems, a bigger application footprint, a higher volume of security events, and greater overall risk.
The financial sector is particularly exposed, with elevated alert volumes tied to the handling of sensitive financial data. The Verizon Data Breach Investigations Report notes that most attackers are financially motivated, which makes the sector an especially attractive target.
The implications extend far beyond the metrics themselves. If fewer than 5% of application security alerts matter to organizational safety, then much of the investment in vulnerability triage and remediation may be fundamentally misdirected. The inefficiency inflates the cost of bug-bounty programs and vulnerability management, and it feeds the friction between development and security teams that arises when fixes are demanded for non-critical vulnerabilities.
Rethinking Detection: Towards Effective Prioritization
As organizations brace for an estimated 50,000 new vulnerabilities in 2025, effective triage has never been more urgent. The antiquated “detect everything, fix later” model is not merely obsolete; it is actively dangerous.
OX Security’s report makes the case plainly: the future of application security lies not in chasing every potential vulnerability but in focusing on those that pose a genuine risk to the organization.