16 Aug 2023 • 10 minutes read
Essential metrics composition for your product
Jovica Zorić
Chief Technology Officer
Corinna Strebel
Chief Product Officer
Having metrics for your product is crucial for its success. You want to use these metrics to understand your product’s strengths and weaknesses, identify areas for improvement, and ultimately make data-driven decisions to grow your business. However, with so many different metrics available, it can be overwhelming to figure out which ones are most important for your particular product.
In addition, metrics for your product alone are not sufficient: they must also cover the infrastructure your product is deployed on and the team responsible for developing the solution. Everything counts. That’s why, in this article, we will talk about the essential metrics you can look into when starting. As mentioned, we will look at metrics from three different perspectives:
- System: the infrastructure for your product.
- Team: the people who are building the solution.
- Product: your solution to a business problem.
System perspective
Your application needs to run somewhere, be it in the cloud, on-premises, or in a hybrid setup. You need to know how your system behaves. How do you start? The answer is the four golden signals.
Golden signals are key performance indicators that provide valuable insight into the health and performance of a production system. They are used as a basis for monitoring and alerting, which can help you quickly identify and respond to issues before they escalate into major incidents. The four golden signals are:
Latency – the amount of time it takes to serve a request.
Traffic – the volume of requests that a system is currently handling.
Error rate – the number of requests that fail or return an unexpected response.
Resource saturation – the percentage of available resources being consumed.
A great resource I can recommend for diving deeper into production systems, their reliability, management, and principles is Google’s book on site reliability engineering. If you just want to check the details on the four golden signals, visit this page.
To help you monitor and gather data on the four golden signals of observability, here are a few tools to check out:
The list above is not complete, and depending on the requirements and infrastructure, some tools may be better suited than others. Also, check the pricing model of these tools and how they fit into your planning and strategy.
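To make the four signals concrete, here is a minimal Python sketch (not from the original article) that derives them from a batch of request records for one observation window. The record fields, thresholds, and resource numbers are hypothetical; in practice, monitoring tools collect and aggregate this data for you.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Request:
    duration_ms: float  # time it took to serve the request
    status_code: int    # HTTP status returned to the caller

def golden_signals(requests: list[Request], window_seconds: float,
                   cpu_used: float, cpu_total: float) -> dict:
    """Illustrative computation of the four golden signals for one time window."""
    errors = [r for r in requests if r.status_code >= 500]
    return {
        # Latency: average time to serve a request (percentiles are usually preferred in practice)
        "latency_avg_ms": mean(r.duration_ms for r in requests),
        # Traffic: requests per second handled in this window
        "traffic_rps": len(requests) / window_seconds,
        # Error rate: share of requests that failed
        "error_rate": len(errors) / len(requests),
        # Saturation: share of available CPU currently consumed
        "cpu_saturation": cpu_used / cpu_total,
    }

# Hypothetical one-second window on a host using 2 of 8 cores
sample = [Request(120.0, 200), Request(85.0, 200), Request(480.0, 503)]
print(golden_signals(sample, window_seconds=1.0, cpu_used=2.0, cpu_total=8.0))
```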
Team perspective
How do you measure a group of people, a team? What are the numbers that can give insights into team performance or success?
Usually, you want a cross-functional team with people from different departments working towards a common goal, including marketing, sales, finance, engineering, and HR. This article will focus on the engineering department and DevOps as a philosophy.
DevOps is a set of practices that aim to enhance software delivery by emphasizing collaboration, automation, and integration, leading to faster, better, and more efficient outcomes. To assess a team’s performance, we need to consider a combination of technical and soft metrics.
How do you start? You start with DORA metrics for the technical side and surveys for the soft metrics.
DORA metrics
DORA (DevOps Research and Assessment) metrics are a set of performance indicators that are used to evaluate the performance of an organization’s DevOps. Using these metrics, you can tell if your DevOps teams are “low performers” or “elite performers.”
There are four key DORA metrics:
Lead time for changes is the time it takes for a change request to go from development to production. It’s an indicator of the speed and efficiency of the software delivery process.
Deployment frequency measures the frequency of software deployments. It’s an indicator of the speed and frequency of software releases.
Time to restore service (MTTR) measures the time it takes to restore service after an incident. It’s an indicator of the reliability and stability of software releases.
Change failure rate measures the number of failed changes to production relative to the number of total changes. It’s an indicator of the reliability and stability of software releases.
You can find more details here, and run a quick check here.
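As a rough sketch of how these numbers can be derived, the Python snippet below computes the four DORA metrics from a hypothetical list of deployment records. The data model (commit time, deploy time, failure flag, restore time) is an assumption made for illustration; real DORA tooling usually pulls this from your CI/CD and incident systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Deployment:
    committed_at: datetime                   # when the change was committed
    deployed_at: datetime                    # when it reached production
    failed: bool                             # did it cause a failure in production?
    restored_at: Optional[datetime] = None   # when service was restored after a failure

def dora_metrics(deployments: list[Deployment], period_days: int) -> dict:
    """Illustrative computation of the four DORA metrics over one reporting period."""
    lead_times = [d.deployed_at - d.committed_at for d in deployments]
    failures = [d for d in deployments if d.failed]
    restore_times = [d.restored_at - d.deployed_at for d in failures if d.restored_at]

    def avg_hours(deltas: list[timedelta]) -> float:
        return sum(deltas, timedelta()).total_seconds() / 3600 / len(deltas)

    return {
        "deployment_frequency_per_day": len(deployments) / period_days,
        "lead_time_for_changes_hours": avg_hours(lead_times),
        "change_failure_rate": len(failures) / len(deployments),
        "time_to_restore_hours": avg_hours(restore_times) if restore_times else None,
    }

# Hypothetical week with two deployments, one of which failed and was restored after two hours
now = datetime(2023, 8, 1, 12, 0)
history = [
    Deployment(now - timedelta(hours=30), now, failed=False),
    Deployment(now - timedelta(hours=6), now + timedelta(days=1), failed=True,
               restored_at=now + timedelta(days=1, hours=2)),
]
print(dora_metrics(history, period_days=7))
```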
Client satisfaction
Good communication with your client/stakeholder is key, and one way of learning more about it is by conducting a survey. Regularly performed surveys will give you insightful feedback and show you where you and your client need to improve. Surveying also shows that you value their opinions and are committed to making improvements based on feedback.
As an example, you can create a survey using questions like these (use a rating scale from 1 to 5, plus free text):
How satisfied are you with the communication and collaboration between the team and stakeholders?
How satisfied are you with the team’s ability to deliver working software?
How satisfied are you with the level of transparency and visibility into the project’s progress and results?
How satisfied are you with the team’s ability to adapt to changes and handle uncertainty?
How satisfied are you with the overall results and outcomes of the project?
Any areas where you think we could improve?
Team happiness/Team resilience
Team happiness/resilience refers to a team check-in that looks into how satisfied team members are with the current project/product development. It can be anonymous or public, depending on the team’s preference.
As an example, you can create a survey using questions like these (use a rating scale from 1 to 5; a simple way to aggregate the answers is sketched after the list):
How satisfied are you with the overall project?
How effective do you feel the communication and collaboration are within the team?
How would you rate the relationship with the client?
How fairly do you feel decisions are made within the team?
How well-equipped do you feel with the resources and tools necessary for success?
How satisfied are you with the support provided by team leaders and managers?
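For either survey, a simple way to turn the answers into a trackable number is to average the scores per question and watch how they move between survey rounds. The sketch below is a minimal, hypothetical example of that aggregation; the question keys and scores are invented.

```python
from statistics import mean

# Hypothetical responses: one dict per respondent, question -> rating on a 1-to-5 scale
responses = [
    {"communication": 4, "delivery": 5, "transparency": 3},
    {"communication": 5, "delivery": 4, "transparency": 4},
    {"communication": 3, "delivery": 4, "transparency": 2},
]

def question_averages(responses: list[dict]) -> dict:
    """Average score per question; the lowest-scoring areas are candidates for improvement."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

print(question_averages(responses))  # e.g. a low transparency score flags an improvement area
```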
To conclude on team metrics, there are numerous metrics that you can collect to evaluate your team’s performance. However, it is crucial to involve your team in the whole decision-making process. If team members do not perceive the significance of a metric, they may either lack the motivation to improve it or manipulate the results to prove its irrelevance.
Therefore, team members’ input is vital for selecting metrics that align with the team’s objectives and interests, resulting in more effective performance evaluations. This leads to the following recommendations:
Avoid counting Jira tickets. Why? Because the metric is flawed, neglecting the complexity and difficulty of tasks. It fails to offer insight into team productivity and effectiveness. Plus, it can be manipulated through over-slicing tickets into tiny subtasks, whether intentionally or not.
Avoid using these metrics to make comparisons between teams. Context is vital in performance measurement and should be taken into account. Teams should communicate and learn from one another by sharing knowledge, experiences, and best practices. This way, they can identify areas of improvement and develop effective strategies to enhance their performance.
Product perspective
In today’s digital world, data is king. If you want to improve your product, you need to measure its performance. Product metrics help you understand how users interact with your product, how effective it is, and where you need to make improvements. But with so many metrics to choose from, it can be difficult to know which ones to track.
How do you start? You start with defining your North Star and understanding the concepts of leading and lagging indicators. You can then explore popular frameworks like AARRR and HEART and learn how to effectively use OKRs (Objectives and Key Results) and KPIs (Key Performance Indicators) to track progress toward your goals. Additionally, it’s important to be aware of Vanity Metrics and how to avoid them.
North Star
When starting with product metrics, the first step is establishing a North Star metric. This metric represents the product’s primary objective and serves as a guiding principle for all other metrics. For example, Facebook may track daily active users (DAU), where “active” means time spent actively engaging with the feed, Amazon “purchases per Prime subscriber,” and Netflix the “number of subscribers watching content” (see the Amplitude blog post). The North Star metric should align with the company’s vision and objectives and be measurable over time.
Once the North Star metric has been identified, it’s important to establish leading and lagging indicators. Leading indicators are early signals that provide insights into the product’s future performance, while lagging indicators are retrospective measures of past performance. Leading indicators can help identify potential issues before they occur, while lagging indicators can provide insights into the effectiveness of previous actions. Therefore, your North Star metric should always be a leading indicator that allows you to take action if needed.
If you wish to learn more about leading and lagging indicators, check out this great article – Measure the Progress of OKRs using Leading and Lagging Indicators.
After defining your North Star, you need to find other metrics and consider which ones to include. You should not rely on one metric only but challenge it with another. A simple question that can help you decide is this: What would you do differently based on this metric information? You can find a good guide on how to define your NSM here.
Popular frameworks
Check out the list below to discover popular frameworks that can aid your search for metrics:
AARRR Metrics
The AARRR framework is also known as Pirate Metrics. It focuses on the entire customer lifecycle, from initial acquisition to advocacy. By tracking metrics for each stage of the customer journey, companies can identify areas that need improvement and optimize performance.
AARRR is an acronym for:
- Acquisition metrics: how are you acquiring new customers and users (e.g., customer acquisition cost, the open rate in relation to your communication channel)
- Activation metrics: how are you engaging and retaining new users (e.g., sign-ups, subscriptions)
- Retention metrics: how are you retaining users over time (e.g., cohort analysis, monthly active users)
- Revenue metrics: track financial performance (e.g., weekly recurring revenue, conversion rate)
- Referral metrics: do users recommend your product (e.g., k-factor or viral coefficient)
To find out more about the topic, check out this blog post.
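As a rough, hypothetical illustration, the Python sketch below computes one example metric per AARRR stage; all numbers and function names are invented for demonstration.

```python
def acquisition_cost(marketing_spend: float, new_customers: int) -> float:
    """Acquisition: customer acquisition cost (CAC)."""
    return marketing_spend / new_customers

def activation_rate(signed_up: int, visitors: int) -> float:
    """Activation: share of visitors who sign up."""
    return signed_up / visitors

def retention_rate(active_end: int, new_during_period: int, active_start: int) -> float:
    """Retention: share of customers kept over one period (new ones excluded)."""
    return (active_end - new_during_period) / active_start

def conversion_rate(paying: int, active: int) -> float:
    """Revenue: share of active users who convert to paying customers."""
    return paying / active

def k_factor(invites_per_user: float, invite_conversion: float) -> float:
    """Referral: viral coefficient; above 1, the user base grows on its own."""
    return invites_per_user * invite_conversion

print(acquisition_cost(10_000, 250))    # 40.0 spent per acquired customer
print(activation_rate(300, 5_000))      # 0.06
print(retention_rate(900, 200, 1_000))  # 0.7
print(conversion_rate(90, 900))         # 0.1
print(k_factor(2.0, 0.25))              # 0.5
```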
HEART Metrics
HEART is a framework developed by Google which focuses on the user experience and aims to measure how users perceive the product. By tracking metrics related to its five categories, companies can gain insights into how to enhance the user experience and increase user satisfaction.
HEART is an acronym for:
- Happiness: Measures overall user satisfaction, usually via surveys that ask users to rate their experience on a scale from one to five or one to ten.
- Engagement: Measures the level of user involvement with the product or service, e.g., through the number of visits, time spent on the site or app, or the number of interactions with the product.
- Adoption: Measures how many users are actually using the product or service, e.g., by tracking the number of users who sign up or create accounts.
- Retention: Measures how many users continue to use the product or service over time, e.g., by tracking the number of returning users, the frequency of use, or the lifetime value of a customer.
- Task success: Measures how effectively users are able to accomplish their goals with the product or service, e.g., by tracking the completion rate of tasks or the time it takes to complete a task.
To find out more, check out the research paper here.
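To complement the AARRR sketch above, here is a small, hypothetical example of two HEART-style measurements not covered there, task success and engagement; the numbers are invented.

```python
def task_success_rate(completed: int, attempted: int) -> float:
    """Task success: share of attempts in which users reached their goal."""
    return completed / attempted

def engagement_minutes_per_user(total_session_minutes: float, active_users: int) -> float:
    """Engagement: average time users actively spend with the product."""
    return total_session_minutes / active_users

print(task_success_rate(180, 240))               # 0.75
print(engagement_minutes_per_user(12_600, 900))  # 14.0 minutes per active user
```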
OKR and KPI
OKRs (Objectives and Key Results) and KPIs (Key Performance Indicators) are useful for product management because they provide a framework for setting goals and measuring progress towards those goals.
While OKRs help ensure the product team is aligned towards a common objective and can track progress towards that objective, KPIs will mainly provide the health status and can be decoupled from the objective. Nevertheless, both can be combined since some KPIs can be perfect Key Results that will help to achieve a company’s overall goal(s) or objective. By using OKRs and KPIs, the product team can prioritize their work and make data-driven decisions to improve the product over time.
For OKRs and KPIs, it’s important to understand different characteristics in terms of focus, scope, measurement, and timeframe:
Focus: OKRs are typically used to set new goals and focus on achieving outcomes, while KPIs are used to measure the performance of specific metrics or processes that are already in place.
Scope: OKRs tend to be more broad and aspirational, while KPIs tend to be more specific and focused on particular areas of performance.
Measurement: OKRs are often qualitative and based on subjective progress assessments, while KPIs are typically quantitative and based on measurable data.
Timeframe: OKRs are typically set for longer timeframes (e.g., a quarter or a year), while KPIs may be measured more frequently (e.g., weekly or monthly).
For more information about the co-existence of OKRs and KPIs, have a look at: What Matters: The Difference Between KPIs and OKRs.
Check out the following article for a more in-depth explanation of OKRs: What Matters: OKR: Definition & Examples of Objectives & Key Results.
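As a minimal, hypothetical sketch of how OKRs and KPIs can work together, the snippet below models an objective with two key results and computes its progress; the second key result simply reuses an existing KPI (sign-up conversion rate). The objective, names, and numbers are invented.

```python
from dataclasses import dataclass

@dataclass
class KeyResult:
    description: str
    start: float    # baseline value
    target: float   # value we want to reach
    current: float  # latest measurement (this is where an existing KPI can plug in)

    def progress(self) -> float:
        """Progress towards the target, clamped to the 0..1 range."""
        return max(0.0, min(1.0, (self.current - self.start) / (self.target - self.start)))

@dataclass
class Objective:
    description: str
    key_results: list[KeyResult]

    def progress(self) -> float:
        return sum(kr.progress() for kr in self.key_results) / len(self.key_results)

# Hypothetical quarterly objective; the second key result reuses the sign-up conversion rate KPI
objective = Objective(
    "Make onboarding effortless for new users",
    [
        KeyResult("Reduce time-to-first-value from 30 to 10 minutes", 30, 10, 18),
        KeyResult("Raise sign-up conversion rate from 4% to 8%", 4, 8, 6),
    ],
)
print(f"Objective progress: {objective.progress():.0%}")  # 55%
```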
Vanity metrics
Before we wrap up, we should be aware of dangerous or “false” metrics, known as vanity metrics. These are metrics that look impressive at first glance but don’t provide much value when it comes to measuring the success of a product or business. They may make a product or business look good on paper, but they don’t necessarily indicate real growth or engagement.
Here are a few examples of vanity metrics:
- Registered users: doesn’t necessarily indicate how many of those users are actively using your product
- Pageviews: don’t provide any insight into how users are engaging with your product
- App ratings: can be easily manipulated and don’t necessarily give an accurate picture of product quality
- Followers/likes: don’t necessarily indicate engagement or revenue
While vanity metrics can provide some insight into a product’s or business’s popularity, they don’t necessarily indicate the success of the overall strategy. Instead, it’s important to focus on metrics that are more relevant to the business goals.
To find out more about vanity metrics, check out this article.
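As a small, hypothetical example of turning a vanity metric into something more actionable, the snippet below contrasts the raw registered-user count with the share of those users who were active in the last 30 days; all figures are invented.

```python
# Vanity metric: total registered users keeps growing even if nobody uses the product
registered_users = 50_000

# More actionable: how many of those registered users were active in the last 30 days?
monthly_active_users = 6_500
active_share = monthly_active_users / registered_users

print(f"Registered users: {registered_users}")     # looks impressive on its own
print(f"30-day active share: {active_share:.1%}")  # 13.0%, tells you whether users actually come back
```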
Conclusion
Before you take a deep dive into discussions about which product metrics to measure, consider whether a metric contributes to your goals and strategy, and determine how it will influence decision-making before choosing it. Actionable metrics are crucial: it must be clear how they can be reproduced, and they must be easily accessible to everyone involved in the product. Additionally, they must be auditable so that everyone understands how they were collected and how reliable they are.
Many relevant product metrics are expressed as rates, such as customer acquisition cost, conversion rate, and churn rate. Ultimately, what matters most about any metric is whether it tells you if a user benefits from using your product or not. Remember, selecting the right metric is key to making informed decisions that will keep your product on track.
Jovica Zorić
Chief Technology Officer
Jovica is a techie with more than ten years of experience. His job has evolved throughout the years, leading to his present position as the CTO at ProductDock. He will help you to choose the right technology for your business ideas and focus on the outcomes.
Corinna Strebel
Chief Product Officer
Corinna has worked for over 20 years with development teams and managed dozens of digital projects. Her role has changed significantly throughout the years, evolving from Developer to Project Manager, followed by Product Owner, Agile Coach, and Product Manager roles, which all led her to her current role as the CPO at ProductDock. She became a fan of breaking up silos and fostering communication. She says that her secret ingredient turned out to be the experimental approach, aka Product mode, in which she coaches the teams she works with.