Jellyfish
Profile
Jellyfish is a Software Engineering Intelligence platform built for teams that need visibility into the right indicators across all stages of the software development lifecycle (SDLC), especially as AI tools reshape developer workflows and outcomes.
It brings system-level clarity by aggregating signals across the engineering stack into a single, consistent model, so platform and engineering leaders can reduce friction, improve developer experience, and identify where AI can best be deployed to drive engineering productivity.
The result is a clearer view of what is changing, what is working, and what to do next.
Focus
AI is changing the SDLC, but most teams still lack a practical way to measure adoption, delivery outcomes, and ROI across engineering workflows. Jellyfish is designed to give clear visibility into how AI tools impact team productivity at different stages of the SDLC so leaders can optimize AI strategy and report on the metrics that matter.
Jellyfish also makes developer experience measurable by mapping developer feedback to engineering outcomes, correlating sentiment with DORA, SPACE, and system metrics so leaders can quantify how experience affects speed, quality, and impact.
Finally, Jellyfish helps connect engineering work to financial reporting by collecting engineering signals, layering in cost data, and generating software cost and capitalization reporting as a byproduct of how teams already work.
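To make the capitalization arithmetic concrete, here is a minimal sketch, assuming effort shares have already been inferred from engineering signals. Every name, figure, and category below is hypothetical, not Jellyfish's actual method or schema.

```python
# Hypothetical sketch: turning inferred effort allocation into a
# capitalization estimate. Figures and categories are illustrative only.

# Fraction of each engineer's effort per work category, as might be
# inferred from commits, pull requests, and issue activity.
effort_allocation = {
    "alice": {"new_feature": 0.6, "maintenance": 0.3, "support": 0.1},
    "bob":   {"new_feature": 0.2, "maintenance": 0.5, "support": 0.3},
}

# Fully loaded annual cost per engineer (salary, benefits, overhead).
annual_cost = {"alice": 200_000, "bob": 180_000}

# Categories treated as capitalizable under the org's accounting policy.
CAPITALIZABLE = {"new_feature"}

def capitalizable_cost(allocation, costs):
    """Sum each engineer's cost weighted by their capitalizable effort share."""
    total = 0.0
    for engineer, shares in allocation.items():
        cap_share = sum(v for k, v in shares.items() if k in CAPITALIZABLE)
        total += costs[engineer] * cap_share
    return total

print(f"Capitalizable cost: ${capitalizable_cost(effort_allocation, annual_cost):,.0f}")
# -> Capitalizable cost: $156,000  (200k * 0.6 + 180k * 0.2)
```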
Background
Jellyfish started in 2017 when its founders, Andrew Lau, David Gourley, and Phil Braden, experienced firsthand how difficult it was to communicate the value and business impact delivered by engineering teams. With Jellyfish, they set out to change that.
At its core, Jellyfish exists to help R&D teams make better decisions about engineering work. It brings clearer visibility to how work moves from planning to delivery, so teams can align effort to what matters, improve how they operate, and communicate outcomes with more confidence.
Today, that same mission extends into the AI era. Jellyfish is focused on transforming developer productivity by combining system and sentiment data from planning to delivery, and building capabilities that help teams understand how AI coding tools are changing the way software gets built.
Main features
Engineering resource allocation and delivery forecasting
Jellyfish provides capacity planning and delivery forecasting capabilities that leverage historical performance data to generate realistic projections of team capacity and completion timelines. The platform analyzes patterns from version control commits, pull requests, and issue tracking activity to determine how engineering effort is distributed across projects, initiatives, and work categories, without requiring manual time tracking. Organizations use these insights to set delivery goals aligned with actual team capacity, prevent overcommitment, and communicate realistic timelines to stakeholders. The Scenario Planner lets leaders model different resource allocation scenarios, adjusting variables like team size and project priorities to receive probabilistic forecasts of delivery impact, which translates engineering trade-offs into business language for executive decision-making.
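As a simplified illustration of how effort allocation can be inferred from system signals without time tracking, the sketch below links commits to tracked issues and normalizes activity counts into per-person shares. The data shapes, issue keys, and category names are assumptions for illustration, not Jellyfish's actual model.

```python
# Hypothetical sketch of allocation inference: attribute each commit to a
# work category via its linked issue, then normalize counts into shares.
from collections import Counter

# Commits as (author, linked_issue_key) pairs, e.g. parsed from messages.
commits = [
    ("alice", "PROJ-1"), ("alice", "PROJ-1"), ("alice", "OPS-7"),
    ("bob", "PROJ-2"), ("bob", "OPS-7"), ("bob", "OPS-9"),
]

# Issue tracker mapping from issue key to work category.
issue_category = {
    "PROJ-1": "roadmap", "PROJ-2": "roadmap",
    "OPS-7": "keep_the_lights_on", "OPS-9": "keep_the_lights_on",
}

def allocation_by_author(commits, issue_category):
    """Return each author's share of effort per category, by commit count."""
    counts = {}
    for author, issue in commits:
        cat = issue_category.get(issue, "uncategorized")
        counts.setdefault(author, Counter())[cat] += 1
    return {
        author: {cat: n / sum(c.values()) for cat, n in c.items()}
        for author, c in counts.items()
    }

print(allocation_by_author(commits, issue_category))
# alice: ~67% roadmap, ~33% keep_the_lights_on; bob: ~33% / ~67%
```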
AI coding tool impact measurement
The AI Impact module tracks adoption patterns, usage frequency, and productivity outcomes across multiple AI development assistants, including GitHub Copilot, Cursor, Claude Code, and Amazon Q. Organizations can compare the relative effectiveness of different AI tools by measuring quantitative impacts on delivery speed, code quality metrics, and developer satisfaction rather than relying on vendor claims or anecdotal evidence. The platform segments impact analysis by team, individual developer, programming language, and work type, revealing where AI assistance proves most effective and where adoption barriers persist. This capability enables data-driven decisions about scaling AI tool investments, evaluating alternatives, and demonstrating ROI through concrete metrics like throughput increases and quality improvements.
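A toy version of this kind of comparison (with synthetic data and invented field names, not the AI Impact module's implementation) might segment pull request cycle times by the assisting tool and report each tool's delta against an unassisted baseline:

```python
# Hypothetical sketch: compare median PR cycle time (hours) by AI tool.
# Records and field names are illustrative, not Jellyfish's data model.
from collections import defaultdict
from statistics import median

pull_requests = [
    {"tool": "none",    "cycle_hours": 48},
    {"tool": "none",    "cycle_hours": 40},
    {"tool": "copilot", "cycle_hours": 30},
    {"tool": "copilot", "cycle_hours": 26},
    {"tool": "cursor",  "cycle_hours": 24},
    {"tool": "cursor",  "cycle_hours": 32},
]

def median_cycle_by_tool(prs):
    """Group PRs by assisting tool and take the median cycle time."""
    groups = defaultdict(list)
    for pr in prs:
        groups[pr["tool"]].append(pr["cycle_hours"])
    return {tool: median(hours) for tool, hours in groups.items()}

medians = median_cycle_by_tool(pull_requests)
baseline = medians["none"]
for tool, med in medians.items():
    faster = (baseline - med) / baseline * 100
    print(f"{tool}: median {med}h, {faster:.0f}% faster than unassisted")
```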
Developer experience monitoring and team health analytics
Jellyfish combines quantitative system metrics with qualitative developer sentiment gathered through research-backed surveys tailored to organizational context. The platform correlates survey feedback with objective performance data from development tools, enabling leaders to distinguish actual productivity issues from perceptions that contradict the evidence. DevEx functionality provides proactive alerts that flag trends signaling burnout risk, such as excessive meeting time, high context switching, or developers spread across too many parallel projects. Organizations use the DevEx Index to track whether productivity improvement initiatives deliver measurable results, benchmark developer experience against industry peers, and validate that platform investments reduce developer friction through adoption metrics and performance correlation analysis.
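As a minimal illustration of correlating sentiment with system data (synthetic numbers, not the DevEx Index methodology), one could compute a Pearson correlation between per-team survey scores and a delivery metric such as cycle time:

```python
# Hypothetical sketch: Pearson correlation between team sentiment scores
# and median cycle time. All values are synthetic and purely illustrative.
from statistics import correlation  # requires Python 3.10+

# Per-team averages: survey sentiment (1-5 scale) and cycle time (hours).
sentiment   = [4.2, 3.1, 3.8, 2.5, 4.6]
cycle_hours = [20,  41,  28,  55,  18]

r = correlation(sentiment, cycle_hours)
print(f"Pearson r = {r:.2f}")
# Strongly negative in this toy data: lower sentiment coincides with
# longer cycle times, the kind of relationship such analysis surfaces.
```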


