Microsoft Fabric MCP Server: AI-Powered Data Development Preview

By BizApps Summit Team | 03 October 2025

Microsoft launched Fabric MCP Server on October 1, 2025, bringing the Model Context Protocol standard to its unified data platform. This open-source implementation enables AI agents like GitHub Copilot and Claude to interact with Fabric workloads through natural language, transforming how developers build data pipelines, query real-time analytics, and manage infrastructure. Currently in public preview, the technology comes in two flavors: a local-first API context provider for code generation and a real-time intelligence server that translates natural language into KQL queries. Early adopters report 50-70% time savings in development workflows, though preview limitations and security considerations require careful evaluation before production deployment.

Two implementations, one unified vision

Microsoft has developed two distinct MCP server implementations that serve complementary roles in the Fabric ecosystem. Fabric MCP (general) provides a local-first development framework that packages all of Fabric’s public APIs, JSON schemas for every item type, and best-practice guidance into an AI-accessible context layer. It runs entirely on your machine without connecting to live Fabric environments, making it ideal for safe code generation and learning. Fabric RTI MCP Server targets real-time intelligence workloads, acting as a bridge between AI agents and live data sources in Eventhouse and Azure Data Explorer. This server translates natural language questions into optimized KQL queries and returns results in seconds.

Both implementations are fully open source and part of Microsoft’s broader MCP initiative that includes servers for Azure, SQL, DevOps, and Microsoft 365. The architecture follows Anthropic’s Model Context Protocol specification, which was introduced in November 2024 as an open standard for connecting AI models to external tools and data sources. Think of MCP as USB-C for AI applications—a standardized interface replacing fragmented custom integrations.
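Under the hood, that standardized interface is plain JSON-RPC 2.0. A minimal sketch of the request an MCP client sends to invoke a server tool (the tool name and arguments below are illustrative, not a fixed contract):

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP 'tools/call' request using JSON-RPC 2.0 framing."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Example: ask an RTI MCP server to run a KQL query (illustrative arguments).
request = build_tool_call(1, "kusto_query", {"query": "StormEvents | take 10"})
```

Because every server speaks this same framing, a client that can emit one `tools/call` message can talk to any MCP server, which is exactly the "USB-C" property.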

The general Fabric MCP is built with .NET and requires .NET 9.x SDK, while the RTI implementation uses Python 3.10+ and is distributed via PyPI for easy installation. The local-first architecture of Fabric MCP means zero risk of credential leakage or accidental production changes. Developers review and decide when to run generated code, maintaining full control throughout the workflow.

Natural language meets enterprise data

The core capability that makes Fabric MCP transformative is natural language to query translation. With the RTI MCP Server, analysts can ask “Sample 10 rows from StormEvents table and analyze trends across the past 10 years” and receive instant results without writing a single line of KQL. The AI agent automatically selects the appropriate MCP tool—like kusto_query or kusto_sample_table_data—authenticates through Azure Identity, executes the query, and presents formatted results conversationally.
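As a rough illustration, the KQL generated for that StormEvents prompt might resemble the following; the exact query the RTI MCP Server emits can differ:

```kql
// Illustrative only — the server optimizes its own query shape.
// Sample 10 rows:
StormEvents
| sample 10

// Trend across the past 10 years:
StormEvents
| where StartTime > ago(3650d)
| summarize EventCount = count() by Year = getyear(StartTime)
| order by Year asc
```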

For security teams, this enables threat detection without technical expertise. One documented example shows an analyst asking: “I have data about user executed commands in ProcessEvents table, can you sample few rows and classify the executed commands with threat tolerance of low/med/high.” The RTI MCP Server translates this into an optimized KQL query, analyzes command patterns, and returns a categorized threat assessment table—all within seconds.

The Fabric MCP (general) implementation focuses on code generation and scaffolding. Developers can prompt GitHub Copilot with “Generate a notebook that reads from Bronze lakehouse ‘sales’ table, cleans data, and upserts to Silver lakehouse” and receive complete PySpark code with correct schemas, error handling, and upsert logic. The MCP server provides embedded OpenAPI specifications, JSON schemas for all Fabric item types (Lakehouses, pipelines, semantic models, notebooks, reports), and Microsoft’s recommended patterns for pagination, long-running operations, and retry logic.
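The retry guidance, for instance, boils down to a pattern like this generic exponential-backoff sketch (delays and attempt counts here are illustrative, not Microsoft's prescribed values):

```python
import time
import random

def with_retries(operation, max_attempts=4, base_delay=0.5):
    """Run a callable, retrying transient failures with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            # Exponential backoff with a little jitter to avoid thundering herds.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

Having this kind of pattern embedded in the MCP context means generated API-calling code tends to arrive with it already in place rather than bolted on later.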

Schema discovery happens automatically. The RTI MCP Server exposes metadata and table structures, allowing AI agents to dynamically understand your data without manual documentation. GraphQL implementations introspect schemas to let AI agents independently identify available data types, queries, and mutations. This eliminates the N×M integration problem where every AI application needs custom connectors for every data source.

Comprehensive workload integration

Microsoft Fabric MCP integrates with the full spectrum of Fabric workloads. The RTI MCP Server provides 15+ tools for Eventhouse operations including kusto_list_databases, kusto_get_table_schema, kusto_ingest_inline_into_table, and semantic search through kusto_get_shots (which requires Azure OpenAI embedding configuration). For Eventstreams, it offers list_eventstreams, get_eventstream, and get_eventstream_definition to manage real-time data processing.

The general Fabric MCP covers all item definition types: Lakehouses, Data Warehouses, Data Factory pipelines, semantic models, notebooks, reports, and Real-Time analytics workloads. It doesn’t connect to live resources but instead packages comprehensive API context so AI agents can generate valid resource definitions following Microsoft’s specifications.

A third implementation focuses on GraphQL API integration, enabling flexible queries across Lakehouses, Data Warehouses, and SQL databases. This local MCP server introspects GraphQL schemas, empowering AI agents to understand available types and operations without developers defining separate MCP tools for each GraphQL type. Queries like “Show customers and their purchase patterns filtered by region” execute through a single standardized interface.
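A query like the region-filtered example might take roughly this shape; the type, field, and filter names below are placeholders, since the real shape comes from your introspected schema:

```graphql
# Placeholder types and fields — the actual schema is introspected at runtime.
query CustomersByRegion($region: String!) {
  customers(filter: { region: { eq: $region } }) {
    items {
      customerId
      name
      purchases { items { productId amount channel } }
    }
  }
}
```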

Community developers have created additional implementations. The Augustab/microsoft_fabric_mcp repository provides 25+ tools for workspace management, including lakehouse operations, table access, shortcuts, job monitoring, and capacity management. These read-only tools ensure no risk of accidental data modifications during AI-assisted exploration.

AI agent and IDE integration

GitHub Copilot integration is the primary use case, now generally available in VS Code with agent mode support. Developers configure MCP servers in their VS Code settings.json file with simple JSON entries specifying the command, arguments, and environment variables. Once configured, Copilot’s agent mode can autonomously execute multi-step tasks, iteratively adapting based on feedback and automatically selecting the right MCP tools.

A typical configuration looks like this: specify the fabric-rti-mcp server using the uvx command, set the KUSTO_SERVICE_URI environment variable to your cluster address, and define a default database. Restart VS Code, and GitHub Copilot immediately gains access to 15+ Fabric RTI tools. The agent can list databases, sample tables, generate queries, and analyze results—all through conversational prompts in the Copilot chat interface.
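A sketch of what that entry might look like in settings.json, based on the pattern described above (the key names and cluster URI are placeholders; check the current docs for the exact schema):

```json
{
  "mcp": {
    "servers": {
      "fabric-rti-mcp": {
        "command": "uvx",
        "args": ["microsoft-fabric-rti-mcp"],
        "env": {
          "KUSTO_SERVICE_URI": "https://<your-cluster>.kusto.windows.net",
          "KUSTO_SERVICE_DEFAULT_DB": "MyDatabase"
        }
      }
    }
  }
}
```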

Claude Desktop offers another popular integration path. Developers add MCP servers to the claude_desktop_config.json file with similar configuration patterns. Claude then provides natural language access to Fabric data, making it excellent for exploratory analysis and data discovery workflows.

Visual Studio 2022 version 17.14+ includes native MCP support with direct installation capabilities and OAuth authentication. Cursor IDE supports MCP with “YOLO mode” for automatic tool execution. Cline provides full stdio and HTTP transport support. The ecosystem spans every major AI-powered development environment.

For enterprise scenarios, Azure AI Foundry offers preview support through its Python SDK. Developers create McpTool objects specifying server URLs and allowed tools, then attach them to agents. This enables dynamic tool discovery with enterprise security features like VNet integration, Managed Identity authentication, and tool approval workflows (always/never/auto policies).

Microsoft Copilot Studio reached general availability with MCP integration, featuring an onboarding wizard that connects to MCP servers with just a few clicks. Once connected, agents automatically receive the latest tools and information as systems evolve. The integration includes enhanced tracing to show which MCP server and specific tool was invoked at runtime, critical for debugging complex multi-agent workflows.

Setup patterns and implementation paths

Installation takes different paths depending on which implementation you need. For the RTI MCP Server, the easiest method is pip install microsoft-fabric-rti-mcp followed by uvx microsoft-fabric-rti-mcp to run it. VS Code users can add servers through the command palette using “MCP: Add Server” and selecting “Install from Pip.” Manual cloning from the GitHub repository provides an alternative for developers who want to inspect or modify source code.
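The quick-start steps above amount to two commands (assuming Python 3.10+ and the uv tooling are already installed):

```shell
# Install the RTI MCP Server from PyPI, then launch it with uvx.
pip install microsoft-fabric-rti-mcp
uvx microsoft-fabric-rti-mcp
```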

The general Fabric MCP requires building from source since it’s a .NET application. Clone the microsoft/mcp repository, navigate to servers/Fabric.Mcp.Server, and build with the dotnet CLI in release configuration. The global.json file may pin specific .NET SDK versions, so ensure you have the required SDK installed.
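Sketched as commands, the source build looks roughly like this (requires the .NET 9.x SDK; the repository path is as described above):

```shell
# Clone the official MCP repository and build the Fabric server in Release mode.
git clone https://github.com/microsoft/mcp.git
cd mcp/servers/Fabric.Mcp.Server
dotnet build -c Release
```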

Authentication happens through Azure Identity for live data access scenarios. The RTI MCP Server uses DefaultAzureCredential, which tries authentication methods in order: environment variables, Visual Studio credentials, Azure CLI login, Azure PowerShell, Azure Developer CLI, and finally interactive browser authentication. For most developers, simply running az login provides sufficient authentication. The system automatically discovers and uses those cached credentials without storing tokens directly.

For production scenarios, service principal authentication or managed identity offers better security. Set environment variables for AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET. When deploying to Azure Functions or Container Apps, assign managed identities to eliminate credential management entirely.

Environment variables customize behavior. The KUSTO_SERVICE_URI variable sets the default cluster, while KUSTO_SERVICE_DEFAULT_DB specifies the initial database. The AZ_OPENAI_EMBEDDING_ENDPOINT enables semantic search functionality for the kusto_get_shots tool. For HTTP deployment modes, set FABRIC_RTI_TRANSPORT to “http”, configure FABRIC_RTI_HTTP_HOST and FABRIC_RTI_HTTP_PORT, and optionally enable stateless mode.
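Put together, a shell environment for the server might look like this (all values are placeholders):

```shell
# Required: default cluster and database.
export KUSTO_SERVICE_URI="https://<your-cluster>.kusto.windows.net"
export KUSTO_SERVICE_DEFAULT_DB="MyDatabase"

# Optional: enable semantic search for the kusto_get_shots tool.
export AZ_OPENAI_EMBEDDING_ENDPOINT="https://<your-openai-resource>.openai.azure.com"

# Optional: serve over HTTP instead of stdio.
export FABRIC_RTI_TRANSPORT="http"
export FABRIC_RTI_HTTP_HOST="0.0.0.0"
export FABRIC_RTI_HTTP_PORT="8080"
```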

Configuration examples show VS Code settings with MCP servers defined under the “mcp” key. Each server entry includes a command (like “uvx”), arguments array (like “microsoft-fabric-rti-mcp”), and environment variables object. Absolute paths are critical on Windows to avoid resolution errors. The configuration persists across sessions, automatically connecting MCP clients to servers on startup.

Real-world business value

Organizations report significant development acceleration. Infrastructure-as-code workflows see 60-70% time reductions when generating Terraform templates for Fabric resources. One documented example shows developers using multiple MCP servers simultaneously: Microsoft Learn MCP provides documentation context, Terraform MCP supplies resource specifications, and Fabric MCP generates valid configurations with handlebars-style token placeholders and best practices baked in.

Data engineering teams build medallion architecture pipelines in hours instead of days. A developer prompts: “Create a notebook that reads from Bronze lakehouse ‘sales’, applies transformations, and upserts to Silver lakehouse.” The Fabric MCP retrieves schemas for both lakehouses, generates PySpark code with correct data types, implements efficient upsert logic, and includes error handling patterns. Schema mismatch errors—a common source of pipeline failures—essentially disappear because the AI works from authoritative schema definitions.
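Stripped of the Spark machinery, the upsert step the generated notebook implements is merge-by-key. A framework-free sketch of that logic (in Fabric the same step would typically be a Delta Lake MERGE in PySpark; the table shapes here are illustrative):

```python
def upsert(silver_rows, bronze_rows, key="order_id"):
    """Update Silver rows that match Bronze on the key, insert the rest."""
    merged = {row[key]: row for row in silver_rows}
    for row in bronze_rows:
        merged[row[key]] = row  # update if the key exists, insert if new
    return list(merged.values())

# Example: one updated order, one new order.
silver = [{"order_id": 1, "amount": 100}]
bronze = [{"order_id": 1, "amount": 120}, {"order_id": 2, "amount": 75}]
result = upsert(silver, bronze)
```

Because the AI generates this against authoritative schemas, the key and column names line up on both sides, which is where hand-written pipelines most often go wrong.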

Security operations centers leverage RTI MCP for threat analysis without requiring analysts to learn KQL syntax. Natural language queries like “Classify executed commands by threat tolerance and provide summary statistics” translate into optimized queries that analyze ProcessEvents tables and return categorized threat assessments. This democratizes access to security data, reducing the analyst-to-security-engineer bottleneck.

Business intelligence teams accelerate report development. Generating Power BI semantic model definitions with proper schemas takes minutes through AI-assisted workflows that reference Fabric MCP’s embedded JSON schemas. Report developers spend less time debugging configuration issues and more time designing visualizations and business logic.

The self-service analytics benefit extends data access to non-technical users. Marketing analysts query campaign performance through natural language: “Show customers and purchase patterns filtered by region and channel.” The GraphQL MCP Server introspects schemas, constructs appropriate queries, and returns filtered results—no SQL knowledge required. Data team backlogs shrink as business users independently explore data.

Preview limitations and production considerations

Both implementations carry public preview status with explicit warnings that “implementation may significantly change prior to General Availability.” Microsoft does not recommend production workloads during preview. Breaking changes are possible, and API contracts may evolve as Microsoft incorporates feedback.

Feature gaps exist in current implementations. The RTI MCP Server supports Eventhouse and basic Eventstream operations, but the roadmap includes unreleased capabilities: Activator integration for proactive insights, expanded Eventstream support, richer visualization tools, and additional RTI components. Some Fabric resources lack comprehensive documentation, requiring workarounds like manually creating resources and interrogating them with Fabric CLI to discover configuration patterns.

Technical requirements create adoption barriers. The Fabric MCP Server needs .NET 9.x SDK, while RTI MCP requires Python 3.10+. Clients must support MCP protocol—VS Code needs GitHub Copilot extensions, Claude requires Desktop application setup, and all solutions need UV package manager or equivalent tooling. Authentication demands Azure CLI installation and proper configuration. Path resolution issues on Windows require using absolute paths in MCP configurations.

Schema and data type limitations affect GraphQL implementations. Some columns in Spark Delta tables don’t appear in SQL analytics endpoints due to limited data type support. Unsupported types render as NULL. JSON strings exceeding 8KB cause formatting errors. Nested data structures often require Lakehouse shortcuts with Spark Notebooks instead of SQL endpoints. GraphQL introspection must be manually enabled by Workspace Admins, which some organizations avoid for security reasons since it exposes schema information.

Authentication complexity surfaces in enterprise scenarios. The OAuth on-behalf-of (OBO) flow requires Microsoft Entra App configuration with Federated Credentials, Azure Data Explorer API permissions, and gateway setup through Azure API Management. Token refresh limitations mean workflows must complete within one-hour windows or risk “InvalidConnectionCredentials” errors when access tokens expire.

Performance considerations include caching behavior that can serve stale data after resource changes. Developers must manually invoke clear_fabric_data_cache or clear_name_resolution_cache after modifications. Query validation timeouts limit Dataflow Gen2 operations to 10-minute execution windows per query, requiring complex transformations to be split across multiple dataflows.

Security risks require careful evaluation. Misconfigured authorization logic can expose data. OAuth token theft on local MCP servers enables credential impersonation. The MCP specification update from April 2025 recommends delegating authentication to external services like Microsoft Entra ID rather than handling tokens directly. Organizations should implement zero trust architecture, enable security monitoring, use principle of least privilege for workspace roles, and conduct thorough security reviews before broader deployment.

The local-first architecture of Fabric MCP (general) provides inherent security by never connecting to live environments. It generates code that developers explicitly review and execute, eliminating accidental production changes. However, RTI MCP connects to live data sources, requiring proper credential management and network security controls. Microsoft recommends Azure Key Vault for production secrets instead of .env files, along with VNet integration and comprehensive logging.

Ecosystem momentum and community response

The October 1, 2025 preview announcement for Fabric MCP represents rapid movement since Anthropic introduced the Model Context Protocol in November 2024. In less than a year, Microsoft developed multiple production-quality MCP servers, integrated them across VS Code, Visual Studio, and Copilot Studio, and built comprehensive documentation and samples.

FabCon Vienna in September 2025 served as the coming-out party for Fabric MCP capabilities. The sold-out conference with 4,000+ attendees featured keynotes from Amir Netz (CTO, Microsoft Fabric) showcasing MCP as part of expanded developer tooling. Workshop sessions on AI-assisted development and multi-agent orchestration drew capacity crowds. Announcements included Fabric Data Agents with MCP support, integration with Copilot Studio for multi-agent scenarios, and the Fabric Extensibility Toolkit evolution.

Microsoft’s official MCP repository on GitHub catalogs 12+ server implementations spanning Azure services, Developer tools, data platforms, and productivity applications. The repository includes Azure MCP Server (consolidating all Azure tools), Azure AI Foundry MCP, Azure DevOps MCP, SQL MCP Server, Microsoft 365 Agents Toolkit MCP, and specialized servers for Dev Box, Clarity analytics, Learn documentation, and NuGet package management. Supporting infrastructure includes mcp-for-beginners curriculum in six programming languages and planned MCP Dev Days virtual events.

Community adoption signals are strong. The Augustab/microsoft_fabric_mcp community implementation demonstrates grassroots development filling gaps in official tooling. Blog posts from Nimblelearn, Telefonicatech, Stratola, and technical community members analyze MCP capabilities and share implementation patterns. YouTube tutorials are emerging, and Stack Overflow discussions show developers actively experimenting with configurations and troubleshooting integration issues.

Developer sentiment leans positive with realistic acknowledgment of preview status. Comments describe MCP as “game-changing for AI-assisted development” and “powerful integration capabilities” while noting security considerations need attention. The “USB-C for AI” analogy resonates with developers who’ve experienced integration fragmentation. Microsoft’s open-source approach and collaboration with Anthropic receive praise, contrasting with concerns about proprietary lock-in from other platforms.

The broader MCP ecosystem includes major platform adoptions beyond Microsoft. OpenAI integrated MCP in its Agents SDK in March 2025. Google DeepMind announced support for Gemini models in April 2025. GitHub, Databricks, and Snowflake have announced integrations or partnerships. The MCP Registry launched in September 2025 as a community-driven catalog, growing to thousands of servers across databases, cloud platforms, development tools, and productivity applications. Development tools from Replit, Codeium, Sourcegraph, Zed, and JetBrains have announced or shipped MCP support.

Training and certification programs are scaling rapidly. Microsoft’s DP-600 Fabric Analytics Engineer certification is reportedly the “fastest growing certification in Microsoft history.” The DP-700 Fabric Data Engineer certification reached general availability in January. Microsoft offered 50% discount vouchers during FabCon Vienna, driving thousands of new certifications. The mcp-for-beginners curriculum provides structured learning paths across .NET, Java, TypeScript, JavaScript, Rust, and Python.

Community events are proliferating. FabCon Atlanta 2026 was announced for March 16-20. Microsoft Ignite 2025 in San Francisco will feature Fabric MCP sessions. The Microsoft Fabric Global Hackathon running through November 2025 offers up to $10K in prizes, driving experimentation and real-world implementations. Super User and MVP programs are engaging deeply with MCP capabilities, creating content and supporting community members learning the technology.

Strategic implications for enterprises

Organizations evaluating Fabric MCP should adopt a phased approach. The preview status makes this ideal for development environments, learning initiatives, and pilot programs with single teams or workspaces. Development acceleration benefits—50-70% time savings in API integration and infrastructure-as-code—are significant enough to justify adoption for non-production workflows today. Teams building Fabric solutions can immediately leverage AI-assisted code generation, schema discovery, and natural language data exploration.

Security reviews should precede broader deployment. Conduct assessments of authentication flows, evaluate token management practices, implement monitoring and logging, and establish procedures for credential rotation and access reviews. The local-first architecture of Fabric MCP (general) provides inherent safety for code generation scenarios. RTI MCP requires more stringent controls given its connection to live data sources.

Training investments should focus on MCP fundamentals and prompt engineering. Developers need to understand the client-server architecture, tool discovery mechanisms, and effective prompting strategies. Fortunately, the learning curve is gentle—developers already familiar with AI-assisted coding through GitHub Copilot or similar tools transition easily. The challenge shifts from memorizing API documentation to crafting effective natural language prompts and validating AI-generated outputs.

Production planning should wait for General Availability announcements. Microsoft hasn’t published GA timelines, but typical preview periods for Fabric features last 3-6 months. Organizations should monitor the roadmap, test implementations in development, provide feedback through official channels, and maintain traditional development approaches as fallbacks during the transition period.

The competitive positioning matters for data platform selection. Fabric MCP integrates natively with Microsoft’s ecosystem (Azure, GitHub, Visual Studio, Copilot Studio), creating a unified experience for organizations already standardized on Microsoft technologies. The open MCP standard provides investment protection—future AI models and platforms adopting MCP mean organizations aren’t locked into specific vendors. This contrasts with proprietary agent frameworks that create switching costs and vendor dependency.

Multi-agent orchestration emerges as a killer capability once MCP reaches production maturity. Imagine specialized agents working collaboratively: one agent monitors real-time data streams through RTI MCP, another generates pipeline code through Fabric MCP, a third manages infrastructure through Azure MCP, and a fourth coordinates workflow through Copilot Studio. This isn’t theoretical—the architecture exists today in preview, and early adopters are building proofs-of-concept.

Looking ahead

Microsoft’s roadmap includes expanded Eventstream support with richer visualization capabilities, Activator integration for proactive insights triggering actions based on real-time patterns, and additional RTI components for comprehensive analytics scenarios. The general Fabric MCP will likely add enhanced templates, more example patterns, and production deployment guides as it matures toward GA.

The MCP specification itself continues evolving. Recent updates added OAuth 2.1 support, Streamable HTTP Transport for efficient data transfer, and security enhancements around authentication delegation. C# SDK collaboration between Microsoft and Anthropic enables the .NET community to build MCP servers natively. SDKs in Java/Kotlin, Ruby, and PHP expand language coverage. These specification improvements flow into Fabric MCP implementations automatically.

Integration density should increase across Microsoft’s product portfolio. Expect tighter coupling between Fabric MCP and Power BI for report generation, deeper Dataverse integration through Microsoft 365 MCP servers, and cross-cloud scenarios leveraging Azure Arc and hybrid architectures. The pattern of specialized MCP servers for each service combined with cross-server orchestration creates combinatorial possibilities.

The community contribution model invites ecosystem growth. Microsoft explicitly welcomes pull requests to official repositories, accepts community-built templates and examples, and highlights community implementations in documentation. Organizations building internal MCP servers for proprietary systems can leverage Microsoft’s SDK infrastructure and reference implementations. This open approach accelerates innovation beyond what Microsoft could develop internally.

Conclusion

Microsoft Fabric MCP Server transforms how developers interact with unified data platforms, reducing API integration time by 50-70% and enabling natural language access to enterprise data. The October 2025 preview launch delivers two complementary implementations: a local-first code generation framework and a real-time intelligence server with KQL translation. Integration with GitHub Copilot, VS Code, Claude, and Copilot Studio provides immediate value for development workflows.

Preview status requires caution for production workloads, but the technology is mature enough for development environments today. Security considerations around authentication, token management, and data access deserve careful evaluation. Organizations already standardized on Microsoft technologies gain the most value from deep ecosystem integration, while the open MCP standard provides insurance against vendor lock-in.

Early adopters report transformational impacts on development velocity, data democratization, and AI-assisted workflows. As implementations mature toward general availability and the broader MCP ecosystem expands, expect Fabric MCP to become foundational infrastructure for AI-native data analytics. The question shifts from whether to adopt to how quickly organizations can capitalize on AI-assisted development advantages while competitors experiment cautiously on the sidelines.

🚀 Ready to Master Power Platform, Fabric, Dynamics 365 and BizApps?

Join us at the European BizApps Summit to dive deeper into cutting-edge technologies and transform your organization's approach to business applications.

Join 3,000+ Power Platform, Fabric, and Dynamics 365 practitioners, technology leaders, and innovators from across Europe at the premier event where the future of AI integration is shaped.

Secure Your Tickets Now

Early bird pricing available • The sooner you register, the more you save