Cloud teams are hitting maturity walls in governance, security, and AI use

Enterprise cloud programs have reached a point where most foundational services are already in place, and the daily work now centers on governance, security enforcement, and managing sprawl across environments. Hybrid and multi-cloud architectures have become routine in large organizations, bringing new operational pressures around consistency and control.

A new survey of cloud architects and enterprise cloud decision-makers found that Azure has become a dominant platform in enterprise environments, with 93.4% of respondents reporting an Azure presence. The same group reported strong adoption of resilience practices, extensive use of cloud-native security tooling, and growing dependence on AI workflows. At the same time, the data from theCUBE Research shows recurring gaps in infrastructure automation, cloud migration security, and enterprise governance over AI usage.

Multi-cloud sprawl is driving policy drift

Many organizations are now operating at a scale where cloud management complexity is tied less to platform capability and more to the number of accounts, projects, and environments they must govern. Nearly two-thirds of respondents reported operating between six and 20 cloud accounts across AWS, Azure, and Google Cloud.

This kind of sprawl increases the chances of inconsistent access controls, uneven patching and configuration standards, and untracked cloud assets. It also increases audit overhead, since compliance and risk teams need consistent reporting across a growing footprint.

Infrastructure-as-code adoption remains widespread, yet fragmentation continues to undermine consistency. The survey found that CloudFormation is used by 76% of respondents and Terraform by 55%. Azure DevOps was reported as a CI/CD platform by 83% of organizations.

These overlapping tooling patterns can create parallel automation stacks that evolve separately across cloud platforms. The result is configuration drift, duplicated work across teams, and gaps in governance enforcement.
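
As a rough illustration of how teams catch this kind of drift, the sketch below runs an out-of-band tag-policy check against deployed resources, independent of which IaC tool created them. It assumes AWS, boto3, and a hypothetical set of required tag keys.

```python
"""Minimal tag-policy drift check (illustrative tag keys, AWS assumed)."""
import boto3

REQUIRED_TAGS = {"owner", "cost-center", "environment"}  # hypothetical policy


def untagged_resources(region="us-east-1"):
    """Return ARNs missing any required tag, via the Resource Groups Tagging API."""
    client = boto3.client("resourcegroupstaggingapi", region_name=region)
    missing = []
    for page in client.get_paginator("get_resources").paginate():
        for resource in page["ResourceTagMappingList"]:
            tag_keys = {t["Key"].lower() for t in resource.get("Tags", [])}
            if not REQUIRED_TAGS.issubset(tag_keys):
                missing.append(resource["ResourceARN"])
    return missing


if __name__ == "__main__":
    for arn in untagged_resources():
        print("drift:", arn)
```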

Resilience practices are established, migrations remain risky

Operational resilience has become standard practice in many environments. A majority of respondents reported using multi-region active/passive architectures for workloads. Environment separation is also common, with 46% physically separating production and non-production systems and 46% using logical separation.

Migration activity remains heavy across enterprises, especially for data platforms. At the same time, downtime tolerance is limited. Nearly half of respondents said their organizations can accept only one to six hours of downtime for cutover during migration.

That combination creates pressure to migrate at speed while keeping data integrity intact. In regulated environments, that pressure extends to audit evidence and compliance validation, which often needs to be produced in parallel with migration execution.
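
A common way to keep that evidence trail without slowing cutover is an automated reconciliation step that compares source and target data and records the result. The sketch below is a minimal version, assuming SQLAlchemy, hypothetical connection URLs, and an illustrative list of in-scope tables; a fuller check would also compare checksums and persist the output as audit evidence.

```python
"""Sketch: compare row counts between source and target databases before cutover."""
from sqlalchemy import create_engine, text

TABLES = ["customers", "orders", "payments"]  # hypothetical tables in scope


def row_counts(url, tables):
    """Return a table -> row count mapping for the given database URL."""
    engine = create_engine(url)
    with engine.connect() as conn:
        return {t: conn.execute(text(f"SELECT COUNT(*) FROM {t}")).scalar() for t in tables}


source = row_counts("postgresql://source-host/appdb", TABLES)  # hypothetical URLs
target = row_counts("postgresql://target-host/appdb", TABLES)

for table in TABLES:
    status = "OK" if source[table] == target[table] else "MISMATCH"
    print(f"{table}: source={source[table]} target={target[table]} [{status}]")
```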

PII is nearly universal in cloud environments

Sensitive data exposure remains a major risk driver in enterprise cloud programs. The survey found that most organizations store and process personally identifiable information. With regulated data this prevalent, security and compliance become operational requirements tied directly to cloud design and migration planning.

Cloud-native managed database adoption is also high. More than half of respondents reported using managed cloud databases, and a third reported using SaaS-based database services. Only 10% reported operating self-hosted databases.

This shift toward managed services reduces operational burden on infrastructure teams, but it increases reliance on identity governance, network segmentation, and application-layer security controls. It also creates stronger dependency on cloud provider logging and access models.
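
One way that reliance on identity governance shows up in practice is replacing static database passwords with short-lived, identity-issued tokens. The sketch below assumes AWS RDS with IAM database authentication enabled; the endpoint, port, and user are hypothetical.

```python
"""Sketch: short-lived IAM auth token for a managed database instead of a static password."""
import boto3

# Hypothetical endpoint and user; IAM database authentication must be enabled on the instance.
HOST = "appdb.cluster-xyz.us-east-1.rds.amazonaws.com"
PORT = 5432
DB_USER = "app_readonly"

rds = boto3.client("rds", region_name="us-east-1")
token = rds.generate_db_auth_token(DBHostname=HOST, Port=PORT, DBUsername=DB_USER)

# The token is passed to the database driver as the password (valid for roughly 15 minutes),
# so no long-lived credential sits in application configuration.
print("token acquired, length:", len(token))
```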

Security is slowing cloud migration work

Security has become the most common obstacle during cloud migrations. Half of respondents identified security as their top migration challenge.

Many organizations have already migrated low-risk workloads. Current migrations increasingly involve regulated data, internal systems of record, and applications with complex dependencies. That raises the cost of errors, especially when migrations require short cutover windows and strong compliance controls.

The survey also found significant adoption of specialized security tooling, including Aqua, Wiz, and Snyk. Secrets management maturity also appears strong, with most organizations storing secrets in Azure Key Vault.
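
In that model, applications resolve secrets at runtime rather than carrying them in configuration. A minimal sketch using the Azure SDK for Python, with a hypothetical vault URL and secret name:

```python
"""Sketch: read a secret from Azure Key Vault at runtime instead of embedding it in config."""
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://example-vault.vault.azure.net"  # hypothetical vault

credential = DefaultAzureCredential()  # managed identity, CLI login, or environment variables
client = SecretClient(vault_url=VAULT_URL, credential=credential)

secret = client.get_secret("db-connection-string")  # hypothetical secret name
print("retrieved secret version:", secret.properties.version)
```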

Security remains a primary constraint. One contributing factor is that compliance validation often happens late in the migration cycle, which can delay production cutover and increase risk exposure when remediation must occur under time pressure.

AI workloads are driving infrastructure change

AI and GPU-based workloads are becoming standard requirements across enterprise environments. The survey found that 76% of organizations are already running GPU workloads.

Development stacks also reflect this shift. Python was reported as a primary language, with Java close behind. These languages remain central to AI workflows, data engineering, and enterprise application back ends.

Machine learning adoption is also widespread, with organizations reporting that they actively train ML models. Many of these pipelines are now part of production environments, making operational continuity a priority.

At the same time, many organizations reported that their ML pipelines require migration. That creates a separate modernization track alongside traditional application migration, since ML workflows often depend on large datasets, GPU infrastructure, and repeatable validation steps tied to model behavior.
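
Those validation steps often reduce to a behavioral parity check: run the migrated pipeline against a frozen validation set and compare its predictions with output captured before the move. A minimal sketch, with hypothetical prediction artifacts:

```python
"""Sketch: check behavioral parity of an ML model before and after pipeline migration."""
import numpy as np


def validate_migration(reference_preds, migrated_preds, tolerance=1e-6):
    """Return (passed, mismatch_rate) for predictions on the same frozen validation set."""
    ref = np.asarray(reference_preds, dtype=float)
    new = np.asarray(migrated_preds, dtype=float)
    mismatch_rate = float(np.mean(np.abs(ref - new) > tolerance))
    return mismatch_rate == 0.0, mismatch_rate


# Hypothetical artifacts exported from the source and target environments.
reference = np.load("reference_predictions.npy")
migrated = np.load("migrated_predictions.npy")

passed, rate = validate_migration(reference, migrated)
print(f"parity check passed={passed}, mismatch rate={rate:.4%}")
```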

DevOps teams are carrying monitoring and response load

Monitoring and incident response work is shifting toward DevOps organizations, adding operational strain as cloud complexity grows.

The survey found that 44% of organizations reported monitoring led by DevOps teams, and 38% reported shared monitoring responsibility across groups. This workload distribution can reduce time available for application modernization and cloud governance work. It also increases burnout risk as teams spend more time responding to alerts and operational disruptions.

Monitoring is made harder by inconsistent telemetry across cloud providers, container platforms, and managed services. Organizations now treat observability as a primary operational constraint tied directly to staffing capacity.
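
One mitigation teams reach for is emitting telemetry through a vendor-neutral layer so the schema stays the same regardless of platform. The sketch below uses the OpenTelemetry Python SDK, with a hypothetical service name and a console exporter standing in for a real backend.

```python
"""Sketch: emit traces through OpenTelemetry so telemetry stays consistent across platforms."""
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Hypothetical service name; a real setup would export to the team's observability backend.
provider = TracerProvider(resource=Resource.create({"service.name": "payments-api"}))
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("payments-api")

with tracer.start_as_current_span("handle-request") as span:
    span.set_attribute("cloud.provider", "azure")  # same attribute schema on any platform
    # ... application logic ...
```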

Public AI tools are creating governance gaps

One of the most significant governance issues emerging across enterprises is the widespread use of public AI tools such as ChatGPT and Copilot. Only 20% of organizations reported enterprise-wide deployments built on a common governed framework.

This gap creates risk around prompt handling, data leakage, and regulatory exposure, especially in environments where PII is widely processed. It also introduces uncertainty around how employees are using AI tools in workflows tied to internal systems, customer records, and proprietary code.

Agentic AI use is already tied to operational tasks. Respondents reported using AI to automate repetitive tasks, optimize decisions, deploy AI assistants, and support decision-making.

Adoption is also being driven heavily by external sources. Organizations reported sourcing agentic AI capabilities through platform vendors and through IT or consulting service providers. Only a third build primarily in-house.

“Our research shows agentic AI has crossed the curiosity threshold and entered an execution phase, but enterprise readiness is lagging ambition,” said Paul Nashawaty, Practice Lead and Principal Analyst at theCUBE Research. “While nearly all respondents see value in agentic AI, only 31.5% plan to build these capabilities primarily in-house, and fewer than 30% have standardized enterprise deployments, forcing organizations to trade speed for long-term control as they rely on platforms and services to move forward.”
