Engineering success should be measured by business impact
Velocity looks great on a dashboard, but it doesn’t close deals or solve customer problems. For years, teams have measured software development success by how many tasks they completed in a sprint. The number feels objective, data-driven. It’s easy to report. But it’s not hard to hit arbitrary velocity targets and still fail to move the business forward.
Ben Matthews, Senior Director of Engineering at Stack Overflow, said it simply: velocity is a tool, but it’s not the goal. If your engineering team completes 200 points in a sprint but sales, retention, and customer value stay flat or, worse, decline, you’re moving fast in the wrong direction.
Productivity in engineering should be tied to outcomes that matter: shipping a feature users love, solving a major customer pain, or improving infrastructure so your business can scale. These outcomes align with growth. They create value. That’s what real impact looks like. Executives should expect engineering leaders to connect their teams’ work to these business objectives directly and visibly.
So let’s cut the noise. Velocity on its own doesn’t signal progress; impact does. Make impact the north star, and let everything else, the dashboards and the metrics, flow from it.
Overreliance on velocity can undermine product quality
Pushing developers to crank out more story points each sprint won’t make your product better. It just adds pressure. Faster output without purpose turns into waste. You’ll see developers skipping reviews, cutting corners, and ignoring long-term stability. Quality drops fast. Then comes burnout.
This happens because the metric becomes the mission. If you tell your engineers that higher velocity equals more value, they’ll deliver higher velocity, even if it means shipping half-baked features or taking on tech debt. And when quality problems show up later in production, morale collapses. Everyone knows deep down the work doesn’t matter if it doesn’t help the business or the user.
Ben Matthews wrote about this directly on the Stack Overflow blog: when teams are driven by arbitrary velocity targets, they stop making thoughtful decisions. Developers, and their managers, end up optimizing for the wrong thing. Velocity becomes a scoreboard, but no one’s winning.
Let’s look at the business impact. A Deloitte study shows 75% of IT, engineering, and business leaders agree developer experience is key to overall success. Burned-out, misaligned teams don’t create healthy businesses. You don’t fix bad working conditions with more metrics. You fix them by aligning teams with real outcomes and giving them the context to care.
If you’re a business leader, ask your engineering heads one question: is your team moving faster, or are they moving in the right direction? Velocity has value, but when it becomes the goal, it guarantees you’ll miss what actually matters.
Velocity retains value as a diagnostic tool rather than a standalone measure of success
Velocity isn’t useless. It just gets misused.
When used properly, velocity is a signal, a data point to help leadership understand how things are moving inside the team. It can point to problems: hiring gaps, shifting priorities, workflow friction. But by itself, it doesn’t explain what’s working or why outcomes aren’t being delivered. That’s the catch.
Ben Matthews from Stack Overflow put it straight: people often view velocity as a team’s performance score, but the smarter view is to treat it as one input among many. That’s accurate. For example, rising velocity could mean your team eliminated major blockers. Or, it could mean they’re inflating numbers to meet reporting pressure. That’s why context matters.
More importantly, velocity is highly sensitive to external factors: reorganizations, holidays, team restructuring, even changes in company leadership. These affect output but say nothing about core capability. That inconsistency makes velocity a poor foundation for strategic planning.
For executives, the takeaway is simple: track velocity, yes. But never in isolation. Combine it with signals from product adoption, customer feedback, production stability, and developer satisfaction. When reviewed in context, velocity helps leaders ask better questions, not make wholesale assumptions.
Use it to recalibrate, not to dictate direction.
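To make that concrete, here is a minimal illustrative sketch, in Python, of how a review might put velocity next to adoption, stability, and satisfaction signals and turn the combination into questions rather than verdicts. The metric names, thresholds, and numbers are hypothetical placeholders, not any particular tool’s data model.

```python
from dataclasses import dataclass

@dataclass
class SprintSignals:
    # All fields are hypothetical example metrics for one sprint.
    velocity_points: int          # story points completed
    feature_adoption: float       # share of active users on recently shipped features
    change_failure_rate: float    # share of deploys causing incidents or rollbacks
    dev_satisfaction: float       # developer survey score, 0-10

def review_questions(current: SprintSignals, previous: SprintSignals) -> list[str]:
    """Turn metric movements into questions for leadership, not verdicts."""
    questions = []
    if current.velocity_points > previous.velocity_points:
        if current.change_failure_rate > previous.change_failure_rate:
            questions.append("Velocity is up, but so are failed changes: are we cutting corners?")
        if current.feature_adoption <= previous.feature_adoption:
            questions.append("Velocity is up, but adoption is flat: are we building the right things?")
    if current.dev_satisfaction < previous.dev_satisfaction:
        questions.append("Satisfaction dropped: is the current pace sustainable?")
    return questions

# Example review: velocity rose, but quality and satisfaction signals moved the wrong way.
last_sprint = SprintSignals(120, 0.30, 0.05, 7.5)
this_sprint = SprintSignals(160, 0.28, 0.09, 6.8)
for question in review_questions(this_sprint, last_sprint):
    print(question)
```

The shape of the output is the point: questions that start a conversation, not a score to optimize.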
Success should be measured through metrics that are aligned with business objectives
If the goal is growth, alignment must be tight between what engineering creates and what the business actually needs.
Right now, too many teams ship code fast but without clarity on whether that work solves high-priority problems. That’s where failure creeps in. Performance reviews, incentives, and strategy sessions often miss this alignment. And it’s not hard to fix.
Executives should demand a metrics framework that ties engineering OKRs directly to top-line goals. That means asking questions like: Are we reducing time to deploy customer-requested features? Are we cutting defect rates that cause user churn? Are we hitting a level of speed and stability that helps sales teams close larger contracts? These are business-aligned indicators.
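As an illustration of what such a framework can look like on paper, here is a short hedged sketch in Python. The objectives, metrics, baselines, and targets are all hypothetical placeholders, not recommended values; the structure is what matters: every engineering metric maps to a business objective.

```python
# Illustrative only: hypothetical objectives, metrics, and targets.
engineering_okrs = {
    "Grow enterprise revenue": {
        "metric": "median days from customer request to production",
        "baseline": 45, "target": 20, "current": 31,
    },
    "Reduce user churn": {
        "metric": "escaped defects per release affecting paying users",
        "baseline": 12, "target": 4, "current": 7,
    },
    "Close larger contracts": {
        "metric": "monthly uptime (%)",
        "baseline": 99.5, "target": 99.95, "current": 99.9,
    },
}

def okr_report(okrs: dict) -> None:
    """Print progress toward each business objective rather than raw output."""
    for objective, kr in okrs.items():
        span = kr["target"] - kr["baseline"]
        progress = (kr["current"] - kr["baseline"]) / span if span else 0.0
        print(f"{objective}: {kr['metric']} is {kr['current']} "
              f"({progress:.0%} of the way from {kr['baseline']} to {kr['target']})")

okr_report(engineering_okrs)
```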
Dan Lines, COO and co-founder of LinearB, stated it well: different companies define impact differently. For some, it’s getting to production faster. For others, it’s fixing stability and customer satisfaction. But the common thread is that engineering isn’t separate from the core business. The work has to serve the outcomes.
The data backs this up. According to Google’s EngSat survey, developer satisfaction and productivity improve when the focus is on Speed, Ease, and Quality. These are not just developer-friendly ideas; they’re operational performance levers. When these factors improve, engineering becomes a growth function, not a cost center.
So, if you’re leading a company, make sure every engineering metric you track has a direct line to customer or business impact. That eliminates wasted motion, drives accountability, and keeps your teams building what matters.
Developers need clear visibility into how their work impacts broader business goals
Developers do their best work when they know the purpose behind it. Work without context feels transactional. That leads to disengagement. Eventually, output declines, not in quantity, but in value.
Most engineering teams are still siloed. They get tasks, complete them, and move on to the next sprint. Meanwhile, key outcomes, such as customer-base growth, product expansion, and revenue impact, remain hidden from view. That gap is a problem: it creates teams that move fast without knowing whether they’re solving the right problems.
Ben Matthews from Stack Overflow addressed this directly. He said engineers need to be treated as stakeholders, not just executors. That means giving them a seat in customer feedback loops, product meetings, and conversations with sales and support. If engineers understand which pain points matter most to the business, they can code solutions that move the needle.
This isn’t just about internal productivity. Business leaders who want real traction out of their engineering investments must ensure engineers see the full picture. The connection between developer actions and company results must be clear. That clarity fuels better decision-making, smarter prioritization, and more ownership inside the engineering organization.
Visibility isn’t optional. It’s a baseline for building teams that solve the right problems at the right times, with maximum efficiency.
Transitioning to impact-oriented metrics
Moving from output-based metrics to impact-driven ones is necessary. But it must be done with precision. Driving performance through unclear or rushed metric changes only creates confusion. To execute this shift properly, leadership needs to communicate the “why,” not just the “what.”
Teams need time to understand how success will now be measured, what will change in their day-to-day work, and how it connects to long-term goals. Without this, they won’t trust the system. This slows down buy-in and drags on momentum.
This transition also demands feedback loops. Metrics aren’t perfect. They reveal insights, but they must evolve as the business grows. Leaders need to review what’s working, what’s breaking, and adjust without hesitation. Treat dips in performance not as red flags, but as useful data about where assumptions were wrong, and use that to refine your approach.
There’s evidence this works. Itamar Gilad’s research shows that teams who invest time in customer research, set clear feature goals, and validate ideas early have far higher success rates in feature delivery. That’s what shifting to impact-based measurement looks like in practice: fewer wasted features, more sustained impact.
For executives, the takeaway is straightforward. Redefining how performance is measured is a strategic lever. Do it with structure, transparency, and regular review. That’s how teams start building with purpose, and how their output begins to consistently support revenue, retention, and scale.
Key executive takeaways
- Shift success metrics toward business impact: Velocity is not proof of value. Leaders should prioritize engineering outcomes tied directly to revenue, retention, and customer experience over raw delivery speed.
- Watch for velocity-driven burnout and quality loss: Overemphasis on sprint numbers often leads to developer fatigue and declining product quality. Executive mandates should focus on long-term value over short-term throughput.
- Use velocity as a diagnostic, not a directive: Velocity can flag performance trends but should not drive strategy. Treat it as one of several indicators to guide team health assessments and resource decisions.
- Align engineering metrics with business objectives: Metrics must reflect what truly moves the business. Leaders should tie engineering OKRs to production speed, product reliability, or customer satisfaction depending on core goals.
- Ensure developers see their impact: Developers must understand how their work supports company outcomes. Integrate engineering teams into cross-functional communication to boost engagement and informed decision-making.
- Guide teams through metric shifts with clarity: Moving to impact-focused measurement requires transparency and consistency. Executives should clearly explain the “why,” involve teams early, and revise metrics based on ongoing learning.