It’s well known that a surprisingly small portion of a developer’s work week is dedicated to coding. Big chunks of their time go to administrative tasks like refining user stories, managing epics, detailing acceptance criteria, and all the other work that happens around the coding.
This administrative load reduces the time available for core development work, limiting how fast your team can move.
However, the models that power AI coding assistants like Cursor, GitHub Copilot, and Windsurf AI are the same models behind Claude, ChatGPT, and Google Gemini.
This means AI coding assistants can be used for more than just writing code.
With the right tooling and practices, AI can significantly cut the time spent writing, reviewing, and synchronising epics, stories, and acceptance criteria.
Leveraging AI for More Than Code
AI coding assistants can reduce the administrative overhead that consumes developers’ time without requiring them to switch to a different app or a browser.
It can all be done directly within the IDE. Cursor and Windsurf AI let developers create “rules” – documents that instruct the AI on how to complete specific tasks. Although rules were intended to give the coding assistant task-dependent context and guidance, they can just as easily guide it in drafting and revising project and sprint documentation, user stories, and other essential agile artefacts.
Streamlining Workflows with MCP and Project Management Integration
The coding agents within these AI-powered IDEs can also be connected to popular project management tools like Jira and Linear through the Model Context Protocol (MCP).
MCP is an open standard designed to enable two-way communication between AI applications and external data sources or tools. This protocol allows AI assistants to pull information from these project management systems and even push updates like new tickets or status changes, further automating administrative tasks.
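To make that concrete, the sketch below shows what an MCP server looks like from the tool side, assuming the TypeScript MCP SDK (@modelcontextprotocol/sdk). The server name, tool name, and parameters are illustrative, and the handler only returns a placeholder; in practice your team would more likely connect the IDE to an existing Jira or Linear MCP server than write one from scratch.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// A tiny MCP server the IDE's coding agent can launch and talk to over stdio.
const server = new McpServer({ name: "tracker-tools", version: "0.1.0" });

// Expose one tool: the assistant supplies a title and body, the handler
// creates the ticket in your tracker and reports the result back.
server.tool(
  "create_ticket",
  "Create a ticket in the team's project tracker",
  {
    title: z.string().describe("Short ticket title"),
    body: z.string().describe("Full ticket description, including acceptance criteria"),
  },
  async ({ title, body }) => {
    // Placeholder: call your tracker's API here (Jira, Linear, etc.)
    // using `title` and `body`, and capture the key it returns.
    const ticketKey = "PROJ-123"; // illustrative value only
    return {
      content: [{ type: "text", text: `Created ${ticketKey}: ${title}` }],
    };
  }
);

// Cursor and Windsurf can run a server like this as a local stdio MCP server.
await server.connect(new StdioServerTransport());
```

Once a server like this is registered in the IDE’s MCP settings, the coding agent can decide for itself when to call create_ticket, for example when a rule or prompt tells it to file the story it has just drafted.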
This integration means that an AI assistant, guided by predefined rules and connected via MCP, can:
- Draft a user story based on a brief description.
- Populate relevant fields in a Jira or Linear ticket.
- Update the status of tasks based on development progress.
- Flag inconsistencies or dependencies between different user stories or epics.
Example: Cursor Rule for Drafting a User Story
AI coding assistants like Cursor use rule files (for Cursor, .mdc files in a .cursor/rules directory, where .mdc is just Markdown with Cursor-specific metadata in the header) to guide the AI’s behaviour. These rules can define the AI’s persona, its understanding of project-specific conventions, and the desired output format for various tasks.
Here’s a very short, conceptual example of what a Cursor rule file for drafting a user story might look like:
```markdown
---
description: "User Story Generation Rule"
globs:
alwaysApply: false
---

You are an expert Agile Business Analyst. Your role is to help draft clear, concise, and actionable user stories.

### User Story Structure

When asked to draft a user story, follow this format:

**As a** [type of user],
**I want to** [perform an action],
**So that** [I can achieve a goal/benefit].

### Clarification

If the request to draft a user story does not include details about the user, action, or benefit, stop and ask for clarification on the user type, desired action, or intended benefit before drafting the story. Only the user can decide what the user story is about.

### Acceptance Criteria

For each user story, also draft a preliminary list of acceptance criteria. Start with at least three criteria.

- Acceptance Criterion 1:
- Acceptance Criterion 2:
- Acceptance Criterion 3:

### Task Generation

Suggest 2-3 initial development tasks that would be required to implement this user story.

- Task 1:
- Task 2:

### Final Step

Follow the user's instructions for any requested changes. After each change, ask the user if the User Story is complete. If they confirm it is complete, use the Jira MCP server to add the User Story to the current project.
```
This rules file instructs the AI on the standard user story format and the need for acceptance criteria and related tasks. Its final step tells the AI to use the Jira MCP server to add the completed user story to the current project.
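Behind that final step sits an MCP tool handler that actually talks to Jira. The sketch below is a hypothetical version of such a handler, assuming Jira’s REST API v2 issue endpoint and basic auth with an email and API token; an off-the-shelf Jira MCP server will implement this differently, so treat it as an illustration of the flow rather than a drop-in.

```typescript
// Hypothetical handler behind a "create_ticket"-style MCP tool, assuming
// Jira's REST API v2 and basic auth. A real Jira MCP server will differ;
// this only illustrates the end-to-end flow.
interface StoryInput {
  projectKey: string;  // e.g. "PROJ"
  summary: string;     // the one-line "As a ... I want ... so that ..."
  description: string; // full story text plus acceptance criteria
}

async function createJiraStory(input: StoryInput): Promise<string> {
  const auth = Buffer.from(
    `${process.env.JIRA_EMAIL}:${process.env.JIRA_API_TOKEN}`
  ).toString("base64");

  const res = await fetch(`${process.env.JIRA_BASE_URL}/rest/api/2/issue`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      fields: {
        project: { key: input.projectKey },
        summary: input.summary,
        description: input.description,
        issuetype: { name: "Story" },
      },
    }),
  });

  if (!res.ok) {
    throw new Error(`Jira returned ${res.status}: ${await res.text()}`);
  }

  const created = (await res.json()) as { key: string };
  return created.key; // e.g. "PROJ-123", reported back to the agent
}
```

The important point is not this particular code but that the rule’s final step bottoms out in an ordinary API call against a system your team already controls.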
The rule doesn’t make up the User Story itself. That thinking still needs to be done by a developer who understands the broader context of the project beyond the code. It does, however, rely on the AI to generate the initial acceptance criteria and tasks. How well these match your developers’ intentions depends on how well represented your product is in the AI’s training data.
Now, this rules file is just a draft. It will need tweaks to work consistently in your codebase. Use it more as a source of inspiration. What other steps in your process can you automate or streamline using the AI in your team’s coding assistant? And don’t forget that you can use the AI coding assistant to write the rules files for you.
Remember: Your Coding Assistant is an Everything Assistant
For now, the AI under the hood of your coding assistant is a state-of-the-art frontier model that can do more than just code. With the right rules files and the right MCP servers attached, your coding assistant can do everything any other AI tool can do, all from a single interface. Make the most of it to accelerate your team.