[Initiative]: Cloud Native Adoption Framework
Name
Cloud Native Adoption Framework
Short description
A developer-centric framework to assess, improve, and scale Cloud Native adoption using CNCF projects, grounded in real-world workflows and community insight.
Responsible group
TOC
Does the initiative belong to a subproject?
No
Subproject name
No response
Primary contact
Mona Borham ([email protected])
Additional contacts
No response
Initiative description
Improving Developer Experience (DevX) in Cloud Native environments requires more than individual tools: it requires a shared understanding of how developers work, where friction exists, and how to iterate toward better workflows using CNCF projects.
This initiative aims to map the modern Cloud Native developer workflow and use it as a foundation for:
- DevX maturity assessments across organizations
- DevX-oriented reference architectures and technical guidance
- Shared language and patterns for adoption
Together, these form a cohesive guide and framework to improve Cloud Native Adoption & Developer Experience using CNCF building blocks.
This initiative aims to improve Developer Experience (DevX) in Cloud Native environments by grounding efforts in the real-world developer workflow and making CNCF project adoption developer-first.
It is designed to:
- Bridge developer workflows with CNCF projects: offer DevX-oriented reference architectures and playbooks that show how CNCF projects can be used together to reduce friction and support delivery.
- Support maturity assessment and improvement: provide a DevX maturity toolkit that helps organizations assess their developer experience across multiple dimensions and tie gaps to actionable next steps.
- Foster a shared language and systems view: enable development, platform, and leadership teams to align on DevX challenges using a common vocabulary, patterns, and feedback loops.
- Accelerate meaningful CNCF adoption: help organizations adopt CNCF projects with clearer DevX-first guidance, reducing shelfware risk and increasing long-term developer and platform alignment.
Deliverable(s) or exit criteria
Track 1: DevX Maturity, Performance & Technical Guidance
This track provides a capability- and outcome-based framework to help organizations evaluate, benchmark, and improve the state of Developer Experience (DevX) across the Cloud Native software lifecycle. It combines structured maturity and performance assessments with actionable technical guidance in the form of reference architectures and iterative playbooks. By aligning developer workflows with enabling capabilities and CNCF project integrations, this track helps organizations identify friction points, visualize effective patterns, and make continuous, incremental improvements. The goal is to move beyond static evaluations and provide a practical path toward developer-centric, outcome-driven Cloud Native adoption. Specifically, this track includes:
- Maturity assessment across key dimensions such as platform enablement, application team practices, technical capabilities, and organizational culture (a data-model sketch follows the table below)
- Developer-centric outcome measurement (metrics) at the individual, team, and organizational levels to track delivery performance and DevX quality over time
- Reference architectures that illustrate how CNCF projects can be composed to support common developer workflows across the inner and outer loop
- Scenario-based playbooks that provide step-by-step guidance for implementing improvements, reducing friction, and adopting CNCF projects incrementally
Dimensions & Capabilities Metrics
| Category | Sub-Dimensions | Examples of Capabilities |
|---|---|---|
| Application Team Capability | CI/CD Discipline, Deployment Practice | DORA metrics (DF, LT, CFR, MTTR), PR lead time, containerization rate |
| Technical & Cross-Cutting | Security (incl. supply-chain security), Observability, Reliability | Shift-left scanning, vulnerability MTTR, incident metrics, progressive delivery adoption |
| Platform Enablement | Developer Self-Service, Automation, Tooling | IaC coverage, Mean Time to Provision (MTTP) for environments, % self-service, standardized pipelines |
| Organizational Culture | Flow, Team Autonomy, Learning, Collaboration | Onboarding time, Westrum Culture scores, % features validated by user feedback |
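To make the dimension-to-capability mapping above concrete, here is a minimal sketch of how an assessment toolkit might represent it in code. The dimension and capability names mirror the table, but the 0–1 score normalization and the four-level maturity scale are illustrative assumptions, not decisions the initiative has made.

```python
# Illustrative sketch only: names mirror the table above, but the 0-1
# normalized scores and 4-level maturity scale are assumptions.
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class Capability:
    name: str     # e.g., "PR lead time"
    score: float  # normalized 0.0-1.0, derived from surveys or telemetry


@dataclass
class Dimension:
    category: str  # e.g., "Application Team Capability"
    capabilities: list[Capability] = field(default_factory=list)

    def maturity(self) -> int:
        """Map the mean capability score onto an assumed 1-4 maturity scale."""
        avg = mean(c.score for c in self.capabilities)
        return min(4, int(avg * 4) + 1)


app_team = Dimension("Application Team Capability", [
    Capability("Deployment Frequency", 0.7),
    Capability("PR lead time", 0.4),
    Capability("Containerization rate", 0.9),
])
print(app_team.category, "maturity level:", app_team.maturity())
```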
This will position the toolkit as both:
- A DevX Mirror: Helps organizations reflect on their current state across teams, workflows, and platform capabilities
- A DevX Compass: Guides continuous improvement through targeted capabilities, CNCF-aligned scenarios, and measurable outcomes
Outputs:
- Heatmap & Maturity Scorecard: A visual, team-aware view of DevX capabilities across the organization (see the sketch after this list)
- DevX KPIs: Actionable metrics (developer, team, org-level) to track improvements over time
- Benchmarks (optional): Community-aggregated insight through anonymized survey and assessment data
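As a rough illustration of the heatmap output, the sketch below renders per-team maturity levels (1–4) as a text grid. The team names, dimensions, and scores are all hypothetical; a real toolkit would derive them from the assessment rather than hard-code them.

```python
# Hypothetical per-team maturity levels (1-4) per dimension.
scores = {
    "payments-team": {"Platform Enablement": 3, "CI/CD Discipline": 4, "Culture": 2},
    "search-team":   {"Platform Enablement": 2, "CI/CD Discipline": 2, "Culture": 3},
}

SHADES = {1: "░", 2: "▒", 3: "▓", 4: "█"}  # darker = more mature

# Print a simple text heatmap: one row per team, one column per dimension.
dimensions = list(next(iter(scores.values())))
print(f"{'team':<16}" + "".join(f"{d:<22}" for d in dimensions))
for team, row in scores.items():
    cells = "".join(f"{SHADES[row[d]] * 3:<22}" for d in dimensions)
    print(f"{team:<16}{cells}")
```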
Reference Architecture & Playbooks
These serve as:
- A DevX Bridge: Connect capability gaps to real-world implementation patterns using CNCF projects
- A DevX Accelerator: Help teams move faster by reusing proven architectures, avoiding anti-patterns, and improving confidence in their workflows
| Capability Area | Sample Scenario | Reference Architecture Pattern |
|---|---|---|
| CI/CD Discipline | Secure, fast, reliable CI/CD pipeline for microservices | Trivy + Argo CD |
| Progressive Delivery | Safe rollout with observability and rollback | Argo Rollouts + Flagger + Prometheus + OpenTelemetry |
| Self-Service Platform | Developers provision envs with minimal YAML | Backstage + Crossplane + Argo CD with GitOps templates |
| Ephemeral Environments | Preview per PR with traceability and auto-teardown | Keptn + Argo CD + Flux + OpenTelemetry |
| Shift-Left Security | Security scanning embedded early in dev workflow | Sigstore + Trivy |
| Developer Onboarding | New engineer can ship PR with full feedback in <1 hour | Backstage + DevSpace + Inner loop tooling |
| Production Readiness | Observability-first rollout for new services | OTel SDKs + Prometheus + Jaeger + alerting + service catalog |
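One way a playbook generator could consume this table is as a lookup from a capability gap to a reference pattern. The sketch below is an assumed structure, not a committed design; it simply mirrors two rows of the table above.

```python
# Illustrative catalog: capability area -> scenario + CNCF project
# composition, mirroring the table above. A playbook generator could use
# this to suggest next steps for a gap found by the maturity assessment.
CATALOG = {
    "CI/CD Discipline": {
        "scenario": "Secure, fast, reliable CI/CD pipeline for microservices",
        "projects": ["Trivy", "Argo CD"],
    },
    "Shift-Left Security": {
        "scenario": "Security scanning embedded early in dev workflow",
        "projects": ["Sigstore", "Trivy"],
    },
}


def suggest(capability_gap: str) -> str:
    """Return a reference pattern for a capability gap, if one is cataloged."""
    entry = CATALOG.get(capability_gap)
    if entry is None:
        return f"No reference pattern yet for {capability_gap!r}"
    return f"{entry['scenario']}: try {' + '.join(entry['projects'])}"


print(suggest("Shift-Left Security"))
```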
Outputs:
- Developer-Centric Reference Architectures: CNCF project compositions mapped to common DevX scenarios across the inner and outer loop
- Modular Playbooks: Step-by-step improvement guides tailored to capability maturity levels and team context
- CNCF Integration Patterns: Real-world combinations of sandbox, incubating, and graduated projects aligned to developer workflows
- Friction Watchpoints: Highlighted anti-patterns and risks to avoid when implementing DevX patterns
- Success Metrics & Impact Signals: Suggested KPIs tied to playbook adoption to validate outcomes (e.g., time-to-feedback, % auto-provisioned PRs, rollback success rate)
Track 2: DevX Radar (Community Pulse)
A lightweight recurring publication (e.g., quarterly) capturing insights from the ecosystem.
Highlights:
- Emerging DevX challenges and solutions
- Patterns and anti-patterns observed across the ecosystem
- Tooling trends and case studies
- Community-sourced and feedback-driven
- CNCF Tie-in: Spotlights how CNCF projects are evolving to meet DevX needs
Great initiative! Would you mind creating a separate initiative for Track 2? That should make it easier to manage the progress, deliverables, and status :)
I think the main idea here is to publish the Track #1 learnings and results in the radar. That’s why I preferred to keep them together. I’m happy to adjust, though, if we all feel it makes more sense to separate them.
Fully in support of moving this forward! 🙌
- I agree that it's easier to understand when it's mapped to the common developer workflows and activities that developers perform on a day-to-day basis (inner loop, outer loop), especially since what CNCF offers is vast and existing references like the landscape focus on the technology perspective.
- I also like the way it's assessed at different layers/levels. I can imagine that in a large company or organization, both the decision to choose a particular tool and the evaluation of how effective it is would naturally happen at a larger scale rather than at the individual level.
One question I had in mind: what would the measurement metrics be? I assume this refers to the "measurable outcomes" in the DevX Compass and corresponds to the capability examples mentioned, like DORA, incident, and % self-service metrics. Would they be quantitative?
@riaankleinhans can you move this to the vote status?
@salaboy @joshuabezaleel @kdubois @graz-dev @julsemaan @cloudmelon Vote!!
@joshuabezaleel I brainstormed some KPIs that we can measure. These metrics are far from perfect, and I think it's better to think them through before defining the maturity levels for each dimension. You can find them in the following list:
Platform Capabilities
Focus: Foundational systems & boundaries, golden paths, self-service infrastructure, scalability, reliability & resilience, automation
Key Metrics:
- Mean Time to Provision (MTTP) for developer environments or services
- Percentage of infrastructure managed via IaC
- Self-service ratio (e.g., % of deployments or resources created without ticketing)
- Platform adoption rate (% of teams using platform-provided pipelines, services, or APIs)
- Platform incident count vs. uptime / availability
- Average platform support response time
Application Team Capabilities
Focus: Ability of application teams to deliver and operate modern software using platform capabilities
Key Metrics (a computation sketch follows this list):
- Deployment Frequency (DORA)
- Lead Time for Changes (DORA)
- Change Failure Rate (DORA)
- Time to Restore Service (DORA)
- % of services using containerization / serverless
- CI pipeline success rate
- Code review lead time
- Progressive delivery adoption (canary, blue/green, feature flags)
- Incident MTTR / Mean Time Between Failures (MTBF)
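To make these concrete, here is a minimal sketch of computing two of the DORA metrics from deployment records. The record shape (commit_at, deployed_at, failed) is an assumption; real data would come from CI/CD and incident tooling.

```python
# Sketch: two DORA metrics from hypothetical deployment records. The record
# shape is an assumption, not a defined schema of this initiative.
from datetime import datetime, timedelta

deployments = [
    {"commit_at": datetime(2024, 5, 1, 9),  "deployed_at": datetime(2024, 5, 1, 15), "failed": False},
    {"commit_at": datetime(2024, 5, 2, 10), "deployed_at": datetime(2024, 5, 3, 11), "failed": True},
    {"commit_at": datetime(2024, 5, 4, 8),  "deployed_at": datetime(2024, 5, 4, 10), "failed": False},
]

# Lead Time for Changes: mean commit-to-deploy duration.
lead_times = [d["deployed_at"] - d["commit_at"] for d in deployments]
mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# Change Failure Rate: share of deployments that caused a failure.
cfr = sum(d["failed"] for d in deployments) / len(deployments)

print(f"Lead Time for Changes: {mean_lead_time}")
print(f"Change Failure Rate: {cfr:.0%}")
```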
Cross-Cutting Concerns
Focus: Security, compliance
Key Metrics:
- Vulnerability detection rate (shift-left security tools)
- Policy violations caught during CI/CD or after deployment (OPA, Kyverno, etc.)
- Mean Time to Remediate (MTTR) vulnerabilities
- % of deployments using signed or SBOM-verified artifacts
- Incidents per release / release rollback rate
Organizational Culture & Product-Centric Practices
Focus: Team structure, flow of work, autonomy, user empathy, and psychological safety
Key Metrics:
- Westrum Culture Survey scores (Generative, Bureaucratic, Pathological)
- Team stability / rotation rate
- % of features driven by validated customer feedback
- Time from idea to production
- Cycle time per feature
- % of time spent on toil vs. innovation
- Internal NPS or Developer Satisfaction (DevSat)
- Onboarding lead time (how quickly new developers become productive)