MCP Server for Dynamics 365 (Preview)

Dataverse access for AI agents. Plan-hash safe.

Query tables, describe schemas, create/update/delete records and tables — all from Claude Desktop, Cursor, or Windsurf. Every mutation requires a SHA-256 plan hash from the matching plan tool.

Preview release. Read and plan tools are enabled by default. Start the server with --enable-writes to unlock mutations.

Claude Desktop

You:

List the tables in our Dynamics environment

Claude:

Running dynamics.list_tables...

Dataverse Tables

account          (Account)         3,847 rows

contact          (Contact)         12,094 rows

opportunity      (Opportunity)     891 rows

lead             (Lead)            2,456 rows

cr7a2_project    (Project)         234 rows [custom]

Total: 47 tables (35 system, 12 custom)

Claude Desktop / Cursor / Windsurf Config

Add this to your MCP client configuration. Read and plan tools are available by default.

{
  "mcpServers": {
    "g-gremlin-dynamics": {
      "command": "g-gremlin",
      "args": ["mcp", "serve", "--provider", "dynamics"]
    }
  }
}

To enable apply tools, add "--enable-writes" to the args array.
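With writes enabled, the same configuration looks like this (identical to the block above, with the flag appended to args):

```json
{
  "mcpServers": {
    "g-gremlin-dynamics": {
      "command": "g-gremlin",
      "args": ["mcp", "serve", "--provider", "dynamics", "--enable-writes"]
    }
  }
}
```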

Safety Contract

How Writes Are Governed

Four layers of protection between an AI agent and your Dataverse environment.

1. Default: Read-Only

4 read + 6 plan tools are registered by default. No apply tools are exposed unless explicitly enabled.

2. Apply Tools Require --enable-writes

Apply tools are only registered when the server starts with the explicit flag. You control this at the server level, not the AI agent level.

3. Mutations Are Plan → Apply

Every apply operation has a corresponding plan tool. The plan step produces a preview and a SHA-256 plan_hash. No plan, no mutation.

4. Hash Mismatch = Rejected

Apply requires the exact plan_hash from the plan step. If anything changed — different parameters, stale plan, modified payload — the hash won't match and the operation is rejected.
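The hash check can be illustrated with a short sketch. This is not g-gremlin's actual implementation, just a model of the idea: hash a canonical serialization of the plan, so any change to the payload between plan and apply yields a different digest.

```python
import hashlib
import json

def plan_hash(plan: dict) -> str:
    # Canonicalize first (sorted keys, no whitespace) so the same plan
    # always serializes to the same bytes. Illustrative scheme only --
    # not g-gremlin's actual canonicalization.
    canonical = json.dumps(plan, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

p = {"operation": "create_record", "table": "account", "data": {"name": "Contoso"}}
h = plan_hash(p)

# Any edit between plan and apply changes the digest, so apply would reject it:
assert plan_hash({**p, "table": "contact"}) != h
# An identical payload reproduces the hash exactly:
assert plan_hash(json.loads(json.dumps(p))) == h
```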

What Dynamics Teams Actually Need

Every feature exists because someone hit a wall trying to use AI with Dataverse.

AI assistants can't query Dataverse from Claude Desktop

dynamics.read_query runs OData queries against any Dataverse table. Select columns, filter, sort, and paginate.

Dataverse table schemas are buried in make.powerapps.com

dynamics.describe_table returns full metadata: columns, types, relationships. Ask Claude to map your data model.

Letting an AI create or delete Dataverse records sounds dangerous

Every mutation has a matching plan tool. Plan produces a SHA-256 plan_hash. Apply requires the exact hash. No hash, no write.

Six different mutation types, each needing safety guardrails

Symmetric plan/apply pairs for create_record, update_record, delete_record, create_table, update_table, delete_table. Same safety model on every operation.

Dataverse API calls may have billing implications

Read tools are lightweight discovery calls. Write tools are gated behind --enable-writes so you can audit costs before enabling mutations.

Prerequisites for Dataverse access are unclear

dynamics.doctor checks all prerequisites: credentials, connectivity, and runtime diagnostics. Know what's missing before you start.
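As an illustration of the OData query options dynamics.read_query exposes, here is a sketch of how such arguments map onto standard Dataverse Web API system query options. The function and parameter names are hypothetical, not the tool's documented schema:

```python
from urllib.parse import urlencode

def build_odata_query(table: str, select=None, filter_expr=None,
                      orderby=None, top=None) -> str:
    # Map illustrative tool arguments onto OData system query options.
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if orderby:
        params["$orderby"] = orderby
    if top:
        params["$top"] = str(top)
    # Keep "$" and "," readable; spaces become "+" in the encoded query.
    return f"{table}?{urlencode(params, safe='$,')}"

print(build_odata_query("accounts",
                        select=["name", "revenue"],
                        filter_expr="revenue gt 100000",
                        orderby="name asc",
                        top=50))
# accounts?$select=name,revenue&$filter=revenue+gt+100000&$orderby=name+asc&$top=50
```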

Every Write Has a Matching Plan

Six mutation types, each with a deterministic preview step. No plan_hash, no mutation.

dynamics.create_record.plan    generates plan_hash
dynamics.create_record.apply   requires plan_hash

dynamics.update_record.plan    generates plan_hash
dynamics.update_record.apply   requires plan_hash

dynamics.delete_record.plan    generates plan_hash
dynamics.delete_record.apply   requires plan_hash

dynamics.create_table.plan     generates plan_hash
dynamics.create_table.apply    requires plan_hash

dynamics.update_table.plan     generates plan_hash
dynamics.update_table.apply    requires plan_hash

dynamics.delete_table.plan     generates plan_hash
dynamics.delete_table.apply    requires plan_hash

16 MCP Tools

Structured JSON responses designed for AI agent consumption, not human terminal output.

Tier 1: Read & Discover (READ)

dynamics.doctor

Check Dataverse runtime prerequisites and credential setup

dynamics.list_tables

List Dataverse tables (EntityDefinitions)

dynamics.describe_table

Describe table metadata and attributes

dynamics.read_query

Read rows from a table using OData query options

Tier 2: Plan (PLAN)

dynamics.create_record.plan

Preview and generate plan_hash for a new record

dynamics.update_record.plan

Preview and generate plan_hash for a record update

dynamics.delete_record.plan

Preview and generate plan_hash for a record deletion

dynamics.create_table.plan

Preview and generate plan_hash for a new table

dynamics.update_table.plan

Preview and generate plan_hash for a table update

dynamics.delete_table.plan

Preview and generate plan_hash for a table deletion

Tier 3: Apply (APPLY)

dynamics.create_record.apply

Create a Dataverse record (requires plan_hash)

dynamics.update_record.apply

Update a Dataverse record (requires plan_hash)

dynamics.delete_record.apply

Delete a Dataverse record (requires plan_hash)

dynamics.create_table.apply

Create a Dataverse table (requires plan_hash)

dynamics.update_table.apply

Update a Dataverse table (requires plan_hash)

dynamics.delete_table.apply

Delete a Dataverse table (requires plan_hash)

Two-Phase Safety on Every Mutation

Nothing writes to your Dataverse environment until you've reviewed the plan. Every mutation requires a cryptographic hash.

1. Plan

Call the .plan tool with your parameters. Returns a preview and a SHA-256 plan_hash.

{
  "operation": "create_record",
  "plan": { "table": "account", "data": {} },
  "plan_hash": "sha256:b7e4d1f3..."
}

2. Apply

Call the .apply tool with the same parameters + the plan_hash. Mismatched hash = rejected.

{
  "plan_hash": "sha256:b7e4d1f3...",
  "ok": true,
  "id": "a1b2c3d4-..."
}
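To make the rejection behavior concrete, here is a minimal in-memory sketch of the gate. It is illustrative only, not the server's implementation: apply succeeds only when the submitted payload re-hashes to a plan_hash the plan step actually issued.

```python
import hashlib
import json

issued = set()  # plan_hash values handed out by the plan step

def _hash(payload: dict) -> str:
    blob = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(blob.encode("utf-8")).hexdigest()

def plan(payload: dict) -> dict:
    h = _hash(payload)
    issued.add(h)
    return {"plan": payload, "plan_hash": h}

def apply(payload: dict, plan_hash: str) -> dict:
    # Reject unknown or stale hashes, and any payload drift since planning.
    if plan_hash not in issued or _hash(payload) != plan_hash:
        return {"ok": False, "error": "plan_hash mismatch"}
    return {"ok": True}

p = plan({"operation": "delete_record", "table": "account", "id": "a1b2"})
assert apply(p["plan"], p["plan_hash"]) == {"ok": True}
# Swapping even one field after planning is rejected:
assert apply({**p["plan"], "table": "contact"}, p["plan_hash"])["ok"] is False
```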

Dataverse API Billing

External API calls to Dataverse consume Power Platform request capacity. Read tools (doctor, list_tables, describe_table, read_query) are lightweight discovery calls. Write operations may consume additional capacity depending on your licensing tier.

Review your Power Platform admin center for current API limits before enabling write tools in production.

FAQ

Common Questions

How does this relate to Microsoft's official Dataverse MCP?

Microsoft ships a Dataverse MCP proxy that requires PPAC enablement and allow-listing for non-Microsoft clients. g-gremlin-dynamics implements native Dataverse tools with the same safety model used for Salesforce and HubSpot — plan_hash verification on every mutation, gated writes, and consistent JSON response envelope.

Do I need to enable the official Dataverse MCP in PPAC?

No. g-gremlin-dynamics connects to Dataverse directly using the Web API. However, dynamics.doctor will check and report whether the official proxy is also available, in case you want to use it alongside.

Is it safe to let an AI create and delete Dataverse tables?

Every mutation has a matching plan tool that produces a SHA-256 plan_hash. The apply tool requires the exact hash. If the payload changes between plan and apply, the hash won't match and the operation is rejected. Additionally, all write/apply tools are hidden unless you start the server with --enable-writes.

Are there billing implications for Dataverse API usage?

Yes. External API calls to Dataverse consume Power Platform request capacity. Read tools (doctor, list_tables, describe_table, read_query) are lightweight. Write operations should be reviewed against your Dataverse API limits.

Which MCP clients are supported?

Claude Desktop, Cursor, Windsurf, and any MCP-compatible client. The server uses stdio transport.

How do I install it?

Run pipx install "g-gremlin[mcp]" (quote the package spec so your shell doesn't expand the brackets), configure Dataverse credentials (tenant ID, client ID, environment URL), and add the server to your MCP client config. Requires the MCP-enabled g-gremlin build — check release notes for availability. dynamics.doctor will verify everything is connected.
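A sketch of the sequence, using only the commands shown elsewhere on this page (the credential-setup step is whatever your g-gremlin build documents and is omitted here):

```shell
# Install the MCP-enabled build; quotes keep bash/zsh from globbing the brackets
pipx install "g-gremlin[mcp]"

# Start the server over stdio; read and plan tools only by default
g-gremlin mcp serve --provider dynamics

# To also expose apply tools:
g-gremlin mcp serve --provider dynamics --enable-writes
```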

Your AI assistant just got Dataverse access.

Preview is live. Start a 30-day free trial for full access.