AI Field Notes

Part II · Shipping Workflows

07

Jira Ticket to Cursor to PR

Turning a Jira ticket into a structured Cursor task with Figma design context

At iO, we use an internal AI tool to generate first-draft Jira tickets from a feature description. Scope, acceptance criteria, edge cases, analytics requirements. Sounds like a small thing — but a well-structured ticket is excellent context for Cursor, and that changes the entire development workflow.

The ticket generation step

A good AI-generated ticket has:

  • Scope: what's in, what's explicitly out
  • Acceptance criteria: specific, testable conditions
  • Edge cases: the scenarios that are easy to miss
  • Analytics: what events to track, what properties to include

These are generated from a brief description, not written from scratch. The model fills in the structure and forces you to think about edge cases you might have skipped.
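The shape of that generation step is just a structured prompt wrapped around the brief description. A minimal sketch, where the function name, the section list, and the wording are all hypothetical illustrations, not iO's actual tool:

```python
# Hypothetical sketch of a ticket-generation prompt builder.
# The section names mirror the ticket structure above; the wording is illustrative.

TICKET_SECTIONS = [
    "Scope (what's in, what's explicitly out)",
    "Acceptance criteria (specific, testable conditions)",
    "Edge cases (scenarios that are easy to miss)",
    "Analytics (events to track, properties to include)",
]

def build_ticket_prompt(feature_description: str) -> str:
    """Wrap a brief feature description in a prompt that forces the
    model to fill in every section of a well-structured ticket."""
    sections = "\n".join(f"- {s}" for s in TICKET_SECTIONS)
    return (
        "Draft a Jira ticket for the feature below.\n"
        f"Feature: {feature_description}\n\n"
        "Include all of these sections, each specific and testable:\n"
        f"{sections}\n"
        "For acceptance criteria, write conditions a reviewer can verify.\n"
        "For edge cases, list scenarios that are easy to miss."
    )

print(build_ticket_prompt("Add CSV export to the reports page"))
```

The point is not the exact wording but that the structure is fixed: the model cannot skip the edge-case section, which is exactly where humans tend to under-write.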

Using the ticket as Cursor context

Once you have a well-structured ticket, paste it into your Cursor session, add the relevant code files, and the model has everything it needs.

Acceptance criteria become test cases. Edge cases become guardrails. Scope definition prevents the model from going off on tangents.
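For example, a criterion like "an inverted date range is rejected" translates almost mechanically into a test. A hypothetical sketch, where `validate_export_range` and its rules are invented for illustration:

```python
# Hypothetical example: acceptance criteria from a ticket turned into tests.
# validate_export_range and its rules are invented for illustration.
from datetime import date

def validate_export_range(start: date, end: date) -> list[str]:
    """Return a list of validation errors for a report export range."""
    errors = []
    if start > end:
        errors.append("start must not be after end")      # acceptance criterion
    if (end - start).days > 365:
        errors.append("range must not exceed one year")   # edge case from ticket
    return errors

# Acceptance criterion: a valid range produces no errors.
assert validate_export_range(date(2026, 1, 1), date(2026, 1, 31)) == []

# Edge case from the ticket: inverted range is rejected.
assert "start must not be after end" in validate_export_range(
    date(2026, 2, 1), date(2026, 1, 1))
```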

For UI tasks, grab the Figma link for the component you're implementing. In Figma, select the component or frame you want to build, right-click and choose Copy/paste → Copy link to selection (or press ⌘L). Add it to your Cursor prompt as:

Figma design: https://www.figma.com/design/abc123/...?node-id=12-34

Keep Figma Desktop open during your session — Cursor's Figma MCP server connects to the running desktop app to pull design tokens, component properties, and layout data directly into your prompt context.

Setting up the Figma MCP

The Figma MCP is what makes the link useful: it turns a URL into actual design tokens, component variants, and layout data inside your prompt context. See Figma MCP Workflows for the setup walkthrough.

The flow: ticket context + Figma design link + Figma MCP → plan → implementation → PR. The PR description writes itself from the ticket's acceptance criteria.

Model strategy

Ticket generation benefits from a capable model — you want it to think carefully about edge cases. Implementation can use a faster model once the context is set. Final review? A reasoning model.
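That split can be made explicit as a small routing table. A sketch where the model names are placeholders, not recommendations of specific models:

```python
# Hypothetical model routing for the three phases.
# Model names are placeholders; swap in whatever your team has access to.
PIPELINE = {
    "ticket_generation": {"model": "capable-reasoning-model",
                          "why": "careful edge-case thinking"},
    "implementation":    {"model": "fast-model",
                          "why": "context is already set by the ticket"},
    "final_review":      {"model": "reasoning-model",
                          "why": "subtle logic bugs, actionable feedback"},
}

def model_for(phase: str) -> str:
    """Look up which model a pipeline phase should use."""
    return PIPELINE[phase]["model"]

print(model_for("implementation"))  # prints "fast-model"
```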

Model Mixer

Compare costs and build optimized model pipelines

Example task: review a medium-sized file and get structured feedback on issues and improvements. Roughly 4k tokens in, 2k tokens out per run.

  • DeepSeek-V3.2: $0.0018/run · $1.05/mo
  • Composer 2: $0.0057/run · $3.45/mo
  • Gemini 3 Flash: $0.0065/run · $3.90/mo
  • Claude Haiku 4.5: $0.011/run · $6.90/mo
  • GPT-5.4: $0.033/run · $19.50/mo
  • Claude Sonnet 4.6: $0.035/run · $20.70/mo
  • Claude Opus 4.6: $0.057/run · $34.50/mo

DeepSeek-V3.2 is 97% cheaper than Claude Opus 4.6 for this task — $1.05 vs $34.50/mo

A fast model catches obvious issues. A stronger model finds subtle logic bugs and gives more actionable feedback — fewer missed problems means less rework later.
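The per-run numbers come from straightforward token arithmetic. A sketch with hypothetical per-million-token prices (check the linked API docs for current ones):

```python
# Cost of one run: tokens * per-million-token price.
# The prices below are hypothetical placeholders, not current list prices.

def run_cost(tokens_in: int, tokens_out: int,
             price_in_per_m: float, price_out_per_m: float) -> float:
    """Dollar cost of a single run at the given per-million-token prices."""
    return tokens_in / 1e6 * price_in_per_m + tokens_out / 1e6 * price_out_per_m

def monthly_cost(cost_per_run: float, runs_per_day: float, days: int = 30) -> float:
    """Extrapolate a per-run cost to a monthly figure."""
    return cost_per_run * runs_per_day * days

# A ~4k-in / ~2k-out review task at hypothetical prices of
# $0.30/M input and $0.45/M output tokens:
per_run = run_cost(4000, 2000, 0.30, 0.45)
print(f"${per_run:.4f}/run")                   # prints "$0.0021/run"
print(f"${monthly_cost(per_run, 20):.2f}/mo")  # prints "$1.26/mo" at 20 runs/day
```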

Prices from official API docs, verified 2026-03-20. Anthropic · OpenAI · Google · DeepSeek · Cursor

Workflow Recipe

Copy-pasteable flows and guided workflow finder


Prompt

I need to build [feature]. Here’s the spec: inputs, outputs, edge cases, constraints. Produce a plan before writing any code.

Steps

1. Spec: Write a developer spec with inputs, outputs, edge cases, and constraints.

2. Plan: Ask the model to produce a plan — files to create/modify, key decisions. Review before coding.

3. Code: Implement against the plan. One feature, one PR. Pull the model back if it goes out of scope.

4. Review: Ask the model to review its own code: "What edge cases might this miss? What would break this?"

5. Tests: Generate tests from the review’s edge cases: "Write tests for the edge cases you identified."

6. PR description: Generate the PR description from the spec and the diff. Full context produces clear descriptions.
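The numbered steps can be sketched as an ordered list of prompt templates. Everything here (the step names, the wording, the `prompt_for` helper) is an illustrative assumption, not any tool's API:

```python
# The recipe's steps as an ordered list of prompt templates.
# Placeholders like {spec} and {diff} are filled in as the flow progresses.
STEPS = [
    ("spec",   "Write a developer spec for {feature}: inputs, outputs, edge cases, constraints."),
    ("plan",   "Given this spec, produce a plan: files to create/modify, key decisions.\n{spec}"),
    ("code",   "Implement against the plan. One feature, one PR.\n{plan}"),
    ("review", "Review your own code: what edge cases might this miss? What would break this?"),
    ("tests",  "Write tests for the edge cases you identified."),
    ("pr",     "Generate the PR description from the spec and the diff.\n{spec}\n{diff}"),
]

def prompt_for(step: str, **context: str) -> str:
    """Fill a step's template with whatever context the flow has produced so far."""
    return dict(STEPS)[step].format(**context)

print(prompt_for("spec", feature="CSV export"))
```

Each step's output feeds the next step's placeholders, which is what keeps the model anchored to the original spec instead of drifting.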

Tools

Cursor · Claude Code · GitHub Actions

Guardrails

  • One feature, one PR — keep scope tight
  • Review the plan before writing code
  • Don’t let the model touch files outside scope
  • Ask before refactoring adjacent code

Expected output

Working PR with passing CI, clear description, and tests covering the identified edge cases.

Paste into CLAUDE.md, .cursorrules, or your workflow docs