Security leaders push for continuous controls as audits stay manual
Security teams say they want real-time insight into controls, but still rely on periodic checks that trail daily operations. New RegScale research shows how wide that gap remains and where organizations are directing time, staff, and budget to manage it.

How organizations measure the ROI of AI tools in their GRC programs (Source: RegScale)
Manual work still shapes compliance programs
Manual processes continue to drive how organizations handle compliance. Security and risk teams spend thousands of hours each year collecting evidence, managing documents, and preparing for audits. That workload pulls staff away from control testing, security improvements, and risk response.
Organizations often delay or scale back core GRC activities such as control testing, training, and policy updates. Evidence collection remains one of the most demanding tasks, frequently consuming the equivalent of a dedicated role.
This pattern reflects a compliance model where people spend much of their time chasing artifacts across systems that do not connect well.
“There’s no sugarcoating the reality of manual GRC. Organizations are wasting thousands of person-hours annually just collecting evidence. Critical security work gets delayed because compliance eats every available resource. Teams are manually juggling frameworks and scrambling to complete audits with processes that just can’t scale,” said Dale Hoak, CISO at RegScale.
Framework sprawl increases pressure on teams
Organizations manage multiple compliance frameworks at the same time. Many juggle half a dozen or more, often layered with industry-specific requirements. Overlap between frameworks does little to reduce effort because reporting formats, evidence expectations, and control interpretations differ.
A growing share of compliance work focuses on requirements introduced in recent years. Security teams report spending more time adjusting to new rules than improving existing programs. Manual framework mapping remains common, increasing duplication across audits and assessments.
The research shows that as frameworks multiply, administrative work increases while visibility into how controls perform does not improve.
Automation adoption remains uneven
Most organizations use automation in some form, with tools that manage policies, track risks, or collect evidence. The depth of automation remains limited, however, and most programs operate through hybrid workflows that mix automated tools with manual steps.
Policy management is the area most often automated, followed by evidence collection. Audit preparation and response remain harder to automate due to the need for context, explanations, and coordination with external reviewers.
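To make the "structured tasks first" pattern concrete, the evidence collection that teams tend to automate early often amounts to scheduled scripts that pull current control state from source systems and save it as timestamped artifacts, replacing hand-gathered screenshots and exports. The sketch below is illustrative only; the endpoints, control IDs, and EVIDENCE_SOURCES mapping are assumptions, not part of the RegScale research or any vendor's product.

```python
"""Minimal sketch of scheduled evidence collection, not any vendor's implementation.

Assumes hypothetical internal endpoints that expose current control state as
JSON; the URLs and control IDs below are illustrative only.
"""
import json
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical mapping of control IDs to the systems holding their evidence.
EVIDENCE_SOURCES = {
    "AC-2": "https://internal.example.com/api/iam/accounts",      # account inventory
    "SI-2": "https://internal.example.com/api/patching/status",   # patch status
    "AU-6": "https://internal.example.com/api/logging/coverage",  # log review coverage
}

ARTIFACT_DIR = Path("evidence")


def collect_evidence(control_id: str, url: str) -> Path:
    """Fetch current state for one control and store it as a timestamped artifact."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        payload = json.load(resp)

    collected_at = datetime.now(timezone.utc)
    artifact = {
        "control_id": control_id,
        "collected_at": collected_at.isoformat(),
        "source": url,
        "data": payload,
    }

    ARTIFACT_DIR.mkdir(exist_ok=True)
    out = ARTIFACT_DIR / f"{control_id}_{collected_at.strftime('%Y%m%dT%H%M%SZ')}.json"
    out.write_text(json.dumps(artifact, indent=2))
    return out


if __name__ == "__main__":
    # Run from a scheduler (cron, a CI job) instead of gathering artifacts by hand.
    for control_id, url in EVIDENCE_SOURCES.items():
        print(f"Stored evidence for {control_id} at {collect_evidence(control_id, url)}")
```

A task like this automates cleanly because it is repetitive and needs no judgment; audit preparation and response resist the same treatment because they require context and explanation.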
The findings indicate that organizations tend to automate structured tasks first, while more complex workflows change at a slower rate.
Continuous monitoring adoption lags behind intent
Security leaders agree that continuous controls monitoring improves security and compliance outcomes. Adoption remains limited, with most organizations still relying on scheduled assessments that run weeks or months apart.
Compliance as code adoption follows a similar pattern. Some organizations integrate controls into development pipelines, while others apply compliance checks after deployment. This gap limits real-time insight into control performance in cloud and software-driven environments.
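As a hedged illustration of what compliance as code can look like in practice, a small policy check can run as a pipeline gate before deployment, and the same check can run on a schedule against live state afterward. The control references, config keys, and rules in the sketch below are assumptions chosen for illustration, not a mapping defined by any specific framework or by the research.

```python
"""Minimal compliance-as-code sketch: a pipeline gate that checks a deployment
config against a few encoded control requirements.

The config path, keys, and rules are illustrative assumptions, not a
specific framework's control catalog.
"""
import json
import sys
from pathlib import Path

# Illustrative rules: control reference -> (config key, required value).
RULES = {
    "SC-28 (encryption at rest)": ("storage_encryption_enabled", True),
    "AC-17 (remote access)": ("public_network_access", False),
    "AU-12 (audit generation)": ("audit_logging_enabled", True),
}


def evaluate(config: dict) -> list[str]:
    """Return a list of failed controls for the given deployment config."""
    failures = []
    for control, (key, required) in RULES.items():
        if config.get(key) != required:
            failures.append(f"{control}: expected {key}={required}, got {config.get(key)!r}")
    return failures


if __name__ == "__main__":
    # In CI this runs against the rendered config before release; the same check
    # can run on a schedule against live state after deployment.
    config_path = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("deploy_config.json")
    failures = evaluate(json.loads(config_path.read_text()))

    if failures:
        print("Compliance gate failed:")
        for failure in failures:
            print(f"  - {failure}")
        sys.exit(1)  # a nonzero exit blocks the pipeline stage
    print("All encoded controls passed.")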
Barriers to continuous monitoring center on budget, integration challenges, and skills. Cultural resistance appears to have declined, suggesting that willingness has outpaced implementation.
AI shows consistent gains with guardrails in place
Organizations that adopted AI in compliance workflows report positive outcomes. Respondents cite faster reporting, reduced manual effort, and improved risk detection. AI supports tasks such as evidence analysis, reporting, and monitoring.
Governance accompanies adoption. Respondents report safeguards including regular audits, employee training, and human oversight of outputs. Many also maintain internal policies, dedicated teams, and restrictions on higher-risk use cases.
Despite these results, AI adoption continues to encounter constraints tied to funding, integration, and staffing.
Automation shows measurable return on time spent
Automation produces measurable benefits across compliance activities. Organizations report time savings after automating parts of their GRC programs. Audit preparation improves, with teams responding faster and producing more consistent evidence.
Organizations that combine automation with AI report stronger gains than those relying on rule-based automation alone. These gains align with reductions in repetitive work and greater consistency across compliance artifacts.
Boards want visibility
Organizations struggle to communicate results to leadership. Tools often provide limited insight into time savings, risk reduction, or program performance, making it harder to demonstrate value and support continued investment.
Some organizations report tighter integration between GRC systems and board reporting, enabling shared views of risk and compliance status. Others continue to rely on manual aggregation and static reports.