Trusted by 2,200+ industry leaders
Maturity levels: Beginner, Emerging, Developing, Operational, Transformational
Categories: Strategy & Vision, Data & Infrastructure, People & Culture, Governance & Trust, Execution & Scaling, Innovation & Co-Creation
Microsoft Dynamics 365 AI Scalability Evaluator
Microsoft has launched a wave of AI capabilities and agents for Dynamics 365, and tools such as Copilot Studio and Azure AI Foundry now make it possible to build your own. But real impact doesn’t start with the model; it starts with the data, integrations, and governance behind your Dynamics 365 landscape. Our Microsoft Dynamics 365 AI Scalability Evaluator helps you understand whether Dynamics 365 and the systems connected to it are ready to support AI in day-to-day operations. In a few minutes, you’ll see how mature your organization is across strategy, data quality, integrations, governance, and ways of working, and where gaps could block AI in D365.
Start Evaluation
Strategy & Vision
1. AI is clearly connected to our core business strategy and KPIs, including how we use Microsoft Dynamics 365 (e.g. Finance & Supply Chain).
Not connected; no strategy or KPIs
Vaguely referenced; no measurable links
Linked for a few use cases/KPIs
Broad linkage with quantified targets
Fully embedded; KPI impact tracked company-wide
2. Leadership actively supports and communicates the AI vision.
No visible sponsorship
Named sponsor; limited engagement
Sponsors communicate periodically and remove blockers
Active, routine communication with ownership and resourcing
Persistent executive advocacy with clear accountability and outcomes
3. There’s a defined roadmap linking AI initiatives to tangible outcomes for processes and operations dependent on Dynamics 365.
No roadmap
High-level themes only
12-month roadmap with milestones and owners
Multi-year roadmap tied to product/IT plans and budget
Dynamic roadmap with benefits tracking and reprioritization
4. We regularly review and adjust AI goals based on business impact.
Goals not reviewed
Ad-hoc reviews after issues
Scheduled quarterly reviews; some adjustments
Data-driven reviews with corrective actions
Continuous, metric-driven optimization across portfolio
Data & Infrastructure
5. Our operational data, especially in Dynamics 365 and key connected systems, is accurate, complete, and reliable.
Quality unknown and inconsistent
Basic checks on limited datasets
Defined SLAs and routine monitoring
Automated data quality rules and alerts
End-to-end quality governance with remediation and accountability
6. Dynamics 365 and other core systems (e.g. WMS, MES, CRM, EDI platforms) share and integrate data seamlessly.
Siloed systems; manual transfers
Point-to-point integrations in a few areas
Standardized integrations across key systems
Near real-time integration with governed APIs
Enterprise integration fabric with real-time/streaming and lineage
7. We can access 12+ months of clean, usable historical data from Dynamics 365 and key integrated systems for model training or analysis.
History unavailable
Partial history; gaps common
≥12 months for priority domains
Multi-year history with retention policies
Curated historical/feature stores with versioning and time travel
8. Our technical infrastructure, including Azure, Dynamics 365, and our data/integration platforms, can support AI workloads at scale.
Not suitable for AI workloads
Prototype-level capacity only
Scales for several use cases; basic cost controls
Elastic, monitored; GPU/accelerator ready; cost governance
Enterprise-grade multi-region reliability, observability, and efficiency
People & Culture
9. Our workforce is open to experimenting with AI tools and automation.
Resistance or policy blocks usage
Limited experimentation by a few individuals
Structured pilots in some teams
Widespread experimentation with guidance and sharing
Normalized experimentation with measurable adoption and outcomes
10. Employees have access to training and upskilling around data and AI.
No formal training
Optional, generic courses
Role-based curriculum with tracking
Hands-on labs, certifications, and practice communities
Continuous learning paths tied to career progression and performance
11. We have internal AI champions or experts leading initiatives.
None identified
Informal enthusiasts only
Named champions in key functions
Cross-functional champion network with time/budget
Mature community of practice with goals, playbooks, and KPIs
12. AI is seen as a tool to enhance performance, not replace people.
Predominant fear of replacement
Mixed sentiment; unclear messaging
Leadership messaging supports augmentation
Augmentation embedded in goals and workflows
Demonstrated productivity gains with recognition and incentives
Governance & Trust
13. Data privacy and security policies are in place and actively managed across our Dynamics 365 environments, integrations, and AI solutions.
No policies specific to AI/data
Basic policies; inconsistent enforcement
Defined controls for sensitive data; periodic audits
Automated controls (DLP, access, encryption) with alerts
Continuous compliance with risk registers and board reporting
14. We have ethical guidelines or frameworks for AI development and use.
None
Draft principles; low awareness
Published guidelines referenced in projects
Guidelines operationalized (checklists, approvals)
Ethics embedded in lifecycle (risk scoring, sign-offs, oversight)
15. AI models and decisions based on our Dynamics 365 and related data are explainable, auditable, and monitored.
Opaque models; no monitoring
Limited documentation; basic logs
Model cards, explainability for key use cases; monitoring in place
Comprehensive audit trails and continuous evaluation
End-to-end observability (quality, bias, safety) with incident playbooks
16. We maintain compliance with regional and industry-specific regulations for our data, AI use, and Dynamics 365 landscape (including integrations and vendors).
Unknown obligations
Ad-hoc compliance efforts
Mapped obligations; controls for priority regulations
Formal compliance program (e.g., DPIA, supplier attestations)
Proactive readiness for new regulations; periodic external assurance
Execution & Scaling
17. We’ve already launched one or more AI pilots or proofs of concept.
None
Single small pilot
Multiple pilots with lessons learned
Pilots progressing to production with gates
Continuous pipeline from ideas → pilots → scaled deployments
18. We can measure ROI or business impact from AI initiatives.
No measurement
Output/activity metrics only
Outcome metrics for key use cases
Standardized benefit tracking and baselines
Portfolio-level ROI with unit economics and re-investment
19. There’s a repeatable process for deploying AI solutions at scale.
None; every deployment is bespoke
Draft process used by a few teams
Defined lifecycle (idea→POC→pilot→prod) with artifacts
LLMOps/MLOps (CI/CD, tests, rollback) and playbooks
Enterprise platform with reuse libraries, SLAs, and SRE integration
20. AI is embedded into operational workflows or customer experiences.
Not embedded
Limited embedding in isolated workflows
Embedded in several processes with training/support
Broad embedding with change management and adoption KPIs
Pervasive embedding with measurable impact on key journeys
Innovation & Co-Creation
21. We collaborate with partners, vendors, or startups on innovation projects.
No collaboration
Occasional vendor trials
Select partners with defined objectives
Active ecosystem with governance and clear roles
Strategic portfolio of co-innovation with IP/exit provisions
22. We actively explore new AI-driven use cases and emerging technologies.
No structured exploration
Ad-hoc scouting
Regular discovery sessions and idea intake
Funded exploration funnel with evaluation criteria
Continuous horizon scanning with rapid prototyping and kill/scale decisions
23. We’re open to co-developing AI agents or prototypes on our Dynamics 365 landscape with partners.
Not considered
Rare exceptions
Allowed under supervision and templates
Common practice with secure sandboxes and data controls
Formal co-dev program with shared roadmaps and reuse targets
24. We allocate budget and resources to innovation and experimentation.
No dedicated budget
Small, ad-hoc funds
Yearly budget for pilots/POCs
Ring-fenced fund with intake and governance
Multi-year portfolio funding with stage-gate investment
Show results
Where are you in your Dynamics 365 AI journey?
Download the report to understand:
A single AI readiness level for Dynamics 365, based on 24 statements
A breakdown across data quality & integration, governance, people, processes, and impact
Clear next steps to move from experiments to reliable AI in and around Dynamics 365
Get detailed evaluation
Microsoft Dynamics 365 AI Scalability Evaluation
Download the full report