A general-purpose connection for AI assistants and agents
Steady’s MCP server is the universal way to bring Steady into your AI tooling. If your client speaks the Model Context Protocol — and most modern AI assistants do — it can read and write your Steady data alongside whatever else it’s connected to.
This is the right starting point if your team uses Gemini, Microsoft Copilot, Cursor, or other assistants beyond Claude and ChatGPT. It’s also the right primitive if you’re building a custom agent or workflow that needs to pull from Steady.
What’s available
Reads: Daily Digest, Smart Check-ins, Goal Stories, Goal Story updates, Echoes, activities, Insights, and absences.
Writes: submit check-ins, create Goal Stories, and post Goal Story updates.
Use cases
Format-shift your data. Live in a Gemini side panel or a Cursor chat? Pull your Daily Digest into the surface you actually work in.
Conversational updates. Use any MCP-aware assistant to write check-ins and goal updates, with prior Steady context loaded automatically.
One-off questions about your team. Echoes are designed for recurring context. For one-off questions (“Has anyone mentioned the Q2 launch this week?”), an AI assistant with MCP access is a good fit.
Bridge tools that don’t natively integrate with Steady. If your team relies on a tool that doesn’t have a dedicated Steady integration, an MCP-aware assistant can pull from there and Steady, then post a synthesized update.
Move data between systems. Turn Jira Epics into Steady Goal Stories. Push a Steady digest into a Slack channel. Draft stakeholder emails from goal updates. The MCP server is general-purpose; whatever your client can chain together, you can build.
Installation
The server URL is https://app.steady.space/mcp. Add it wherever your AI client accepts a remote MCP server. Setup steps vary per application — see the AI Assistants doc for guidance on common clients including Gemini, Microsoft Copilot, GitHub Copilot, Cursor, and Codex.
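As an illustration, some clients (Cursor, for example) accept remote MCP servers through a JSON configuration file with an mcpServers map. A sketch of such an entry might look like the following — the "steady" key name is arbitrary, and the exact schema and file location vary by client, so defer to your client's own documentation:

```json
{
  "mcpServers": {
    "steady": {
      "url": "https://app.steady.space/mcp"
    }
  }
}
```

Clients that support OAuth-protected remote servers will typically open a browser window to complete sign-in the first time you connect.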
Security
Access is authenticated via OAuth 2.1 and personally scoped. An MCP client can see and write exactly the same data its authenticating user has access to in Steady. There’s no service account, no shared access, and no way for the connection to elevate beyond the user’s own permissions.
About the MCP server
The Model Context Protocol (MCP) is an open standard for connecting AI assistants and agents to the tools and data they need. Steady’s MCP server speaks the standard, which means any MCP-compatible client — Gemini, Microsoft Copilot, GitHub Copilot, Cursor, Codex, Claude, ChatGPT, or a custom agent you’ve built — can read and write your Steady data through it.
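Under the hood, MCP messages are JSON-RPC 2.0. As a rough sketch of what a client sends when it invokes a server tool — the tool name "create_checkin" and its arguments here are hypothetical, not Steady’s actual tool schema — a tool call looks like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_checkin",
    "arguments": {
      "summary": "Shipped the onboarding flow refresh."
    }
  }
}
```

Your assistant handles this exchange for you; the takeaway is that any client speaking this wire format can use Steady’s tools without Steady-specific integration code.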
Frequently asked questions
Which AI assistants can connect to Steady?
Any client that supports the Model Context Protocol. That currently includes Claude (Desktop, Code, and mobile), ChatGPT, Gemini and Gemini CLI, Microsoft Copilot, GitHub Copilot, Cursor, Codex, and a long list of others. Custom agents built with frameworks that speak MCP work too.
What can MCP clients do with my Steady data?
The current release supports reading from your Daily Digest, Smart Check-ins, Goal Stories, Goal Story updates, Echoes, activities, Insights, and absences, and creating check-ins, Goal Stories, and Goal Story updates. Future releases will add more tools, such as Quick Fill rendering and Action Items.
Where do I add the MCP server?
Add Steady’s MCP server URL — https://app.steady.space/mcp — wherever your AI assistant accepts custom connectors or remote MCP endpoints. Steps vary per app; the AI Assistants doc has guidance for common clients.
How is access secured?
Connections authenticate via OAuth 2.1, and access is personally scoped. The agent or assistant connecting on your behalf sees exactly the data you can see in Steady, and nothing else. Each user authenticates individually with their own credentials.
Why bring AI assistants into Steady at all?
Steady’s coordination tools are useful precisely because they live where your team lives. As more of that surface area moves to AI assistants, an MCP server makes sure Steady comes along for the ride — both for individual workflow flexibility and as a building block for the agent-driven coordination patterns we’re investing in.