Event cardinality in your analytics tracking plan

Event cardinality refers to the total number of distinct events being monitored. While it may be tempting to embrace the “more is better” mentality or the “let’s track now and decide later” approach, it’s imperative to ensure that the number of distinct events aligns with the specific needs and maturity of a project.

Drawing a parallel from machine learning, features that add little analytical value are often dropped to conserve computational resources, a practice known as dimensionality reduction (or, more narrowly, feature selection). Similarly, in analytics, it’s beneficial to weigh the number of tracked events against the traffic that feeds them.

The Five Stages of Project Maturity and Cardinality

  1. Peak Performance: The pinnacle where there’s ample traffic and a well-rounded set of events, providing a rich understanding of user interactions.
  2. Underutilized Analytics: High traffic but fewer event types, signifying missed opportunities for deeper insights.
  3. Preparatory Stage: High event tracking but moderate-to-low traffic, indicating a robust foundation, possibly in anticipation of future traffic surges.
  4. Exploratory Phase: Moderate events and traffic, typical of projects that are testing or launching new offerings.
  5. Foundation: Low traffic and minimal events, the starting point of many projects, setting the groundwork for subsequent growth.
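To make the stages concrete, here is a minimal Python sketch that maps monthly traffic and event cardinality to one of the five stages. The thresholds (10,000 visits, 50 distinct events, and the exploratory cut-offs) are illustrative assumptions, not recommendations; calibrate them to your own product.

```python
def maturity_stage(monthly_visits: int, distinct_events: int) -> str:
    """Map traffic and event cardinality to one of the five stages.

    The thresholds below are illustrative assumptions, not prescriptions.
    """
    high_traffic = monthly_visits >= 10_000
    high_cardinality = distinct_events >= 50

    if high_traffic and high_cardinality:
        return "Peak Performance"        # ample traffic, rich event set
    if high_traffic:
        return "Underutilized Analytics"  # traffic without enough events
    if high_cardinality:
        return "Preparatory Stage"        # events in place, awaiting traffic
    if monthly_visits >= 1_000 or distinct_events >= 10:
        return "Exploratory Phase"        # moderate on both axes
    return "Foundation"                   # the starting point
```

For example, a project with 50,000 monthly visits but only a dozen tracked events would land in “Underutilized Analytics”.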

While these stages help in gauging a project’s current position, several external and internal factors influence its journey through these stages.

Factors Shaping Event Cardinality Decisions

  1. Saturation Point: There’s a threshold beyond which adding more events doesn’t necessarily yield better insights. This overwhelming cardinality can clutter analysis rather than enhance it.
  2. Data Quality: Quality trumps quantity. It’s crucial to ensure events are consistently relevant, accurate, and actionable.
  3. Transition Dynamics: Projects often transition between the stages based on strategic decisions, market demands, or growth trajectories. Recognizing these dynamics can lead to proactive planning.
  4. Resource Allocation: Balancing resources between event tracking and other essential facets of a project is vital. Over-allocating to one can lead to missed opportunities in another.
  5. External Factors: Market dynamics, technological shifts, and regulatory changes can all influence a project’s event tracking strategy.

To draw an analogy, just as in A/B testing where one needs sufficient experimental power to derive meaningful results, understanding customer behavior through analytics also requires the right mix of traffic and event types. It’s a balance. On one end of the spectrum, having too many event types with insufficient audience can lead to “analysis paralysis.” On the other, not having enough events for a large audience can mean missed insights.

In conclusion, while the allure of tracking everything is undeniable, it’s essential to align event cardinality with the project’s phase and external influencing factors. This alignment ensures not just data richness but relevance, paving the way for actionable insights that drive business growth.

Saturation point in your analytics tracking plan

The saturation point refers to a scenario where the addition of more unique events (increasing cardinality) to the tracking plan does not yield any significant incremental value or insights, even with high volumes of traffic.

When there are too many unique events being tracked, it becomes challenging to find any meaningful patterns or insights. Think of it like trying to find a specific tree in a dense forest; sometimes, having too much to look at can be as unhelpful as having too little.


Symptoms of Saturation

  1. Overwhelming Data Volume: There’s so much data coming in that it becomes challenging to derive actionable insights from it. This can lead to the classic “paralysis by analysis” situation.
  2. Event Redundancy: Many of the additional events being tracked might be redundant or provide overlapping insights. For example, tracking both “button hover” and “button click” might not give drastically different insights in certain contexts.
  3. Maintenance Overhead: With a vast number of events, there’s an increased maintenance cost, both in terms of ensuring tracking accuracy and managing changes or updates to the event structure.


Consequences of Saturation

  1. Resource Drain: A lot of time and effort can be spent sifting through the data, setting up filters, and creating customized reports, which might not yield significantly different results from a setup with slightly lower cardinality.
  2. Cost Implications: Most analytics tools have costs associated with the volume of events being tracked. The financial cost can rise without a corresponding increase in the value derived from the additional data.
  3. Complexity in Analysis: The more events you have, the harder it becomes to identify which ones are truly important. It may lead to situations where crucial insights are missed amidst the noise of numerous events.


Strategies to Address Saturation

  1. Regular Audits: Periodically review the tracked events to identify and remove redundancies or irrelevant ones. This can help in streamlining the analytics setup.
  2. Focus on Key Metrics: Prioritize events that directly correlate with core business KPIs. By focusing on these primary indicators, one can ensure that the most critical data points are not lost in the mix.
  3. Use Advanced Analytics Tools: Employ machine learning or AI-driven tools that can sift through large datasets and automatically highlight anomalies or key insights. These tools can adapt to large cardinalities and still derive meaningful patterns.
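As a sketch of the regular-audit idea, the following Python flags two pruning candidates from a raw event log: events that fire rarely, and near-duplicate event names (a common source of redundancy). The `min_share` and `similarity` thresholds, and the event names used in the test data, are arbitrary assumptions.

```python
from collections import Counter
from difflib import SequenceMatcher

def audit_events(event_log: list[str], min_share: float = 0.01,
                 similarity: float = 0.85) -> dict:
    """Flag pruning candidates: rarely fired events and near-duplicate
    names (e.g. 'button_click' vs 'buttonClick').

    min_share and similarity are illustrative thresholds, not standards.
    """
    counts = Counter(event_log)
    total = sum(counts.values())

    # Events whose share of total volume falls below the cut-off.
    low_volume = [e for e, n in counts.items() if n / total < min_share]

    # Pairs of event names that are suspiciously similar (case-insensitive).
    names = sorted(counts)
    near_duplicates = [
        (a, b)
        for i, a in enumerate(names) for b in names[i + 1:]
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= similarity
    ]
    return {"low_volume": low_volume, "near_duplicates": near_duplicates}
```

Running such an audit periodically gives the “regular audits” strategy above a concrete, repeatable form.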

While it’s tempting to track as much as possible with the belief that more data leads to better insights, there’s a tipping point. Beyond this point, the incremental benefits of additional tracking diminish, leading to the saturation point. Recognizing and addressing this saturation can lead to more efficient and effective data analytics.

How data quality affects your digital analytics setup

Imagine listening to a radio with lots of static. Even if your favorite song is playing, the noise can make it hard to enjoy. Similarly, even with a lot of event data, if the quality is poor, it’s hard to get clear insights.

Data quality is paramount in analytics. Even with high cardinality, if the events tracked are not relevant, accurate, or consistently captured, they can lead to misleading insights.


Common Data Quality Issues

  1. Inconsistencies: This can occur when similar actions are tracked under different event names or when the same event name is used for different actions. It can also arise from tracking errors or software bugs.
  2. Irrelevance: Some events, while precisely captured, might not offer meaningful insights into user behavior or business KPIs. Tracking such events can divert attention from more critical data points.
  3. Incompleteness: Missing data or partial event captures can lead to gaps in the data, making it difficult to draw comprehensive conclusions.


Consequences of Poor Data Quality

  1. Misguided Decisions: Poor data quality can lead to incorrect analysis and insights, which in turn can result in misguided business decisions.
  2. Loss of Trust: Stakeholders might start doubting the reliability of the analytics if they encounter inconsistencies or inaccuracies.
  3. Increased Overhead: Cleaning up poor-quality data can be resource-intensive. It requires time to identify, rectify, and validate the data.


Strategies to Safeguard Data Quality

  1. Regular Audits: Periodic checks on the data being collected can help identify inconsistencies or inaccuracies early on.
  2. Define Clear Tracking Protocols: Having a well-documented and consistently followed protocol for naming and defining events can reduce ambiguities.
  3. Integrate Data Validation: Tools or scripts can be used to automatically validate data, ensuring consistency and completeness.
  4. Feedback Loop: Establishing a feedback mechanism where end-users of the data (analysts, business users) can report anomalies or issues they encounter. This helps in timely identification and rectification.
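The tracking-protocol and data-validation points above can be sketched together: a tracking plan maps each approved event name to its required properties, and a validator checks naming convention, plan membership, and completeness. The plan contents, event names, and snake_case rule below are hypothetical assumptions for illustration.

```python
import re

# Hypothetical tracking plan: approved event name -> required properties.
TRACKING_PLAN = {
    "signup_completed": {"plan", "referrer"},
    "checkout_started": {"cart_value"},
}

# Assumed naming protocol: lowercase words separated by underscores.
SNAKE_CASE = re.compile(r"^[a-z]+(_[a-z]+)*$")

def validate_event(name: str, properties: dict) -> list[str]:
    """Return a list of problems; an empty list means the event is valid."""
    problems = []
    if not SNAKE_CASE.match(name):
        problems.append(f"'{name}' violates the snake_case naming protocol")
    if name not in TRACKING_PLAN:
        problems.append(f"'{name}' is not in the tracking plan")
    else:
        missing = TRACKING_PLAN[name] - properties.keys()
        for prop in sorted(missing):
            problems.append(f"'{name}' is missing required property '{prop}'")
    return problems
```

Wiring a check like this into the ingestion pipeline catches inconsistencies and incomplete captures at the door, rather than during analysis.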

The quality of the events being tracked is at least as important as their quantity (cardinality), and often more so. Ensuring high-quality data capture ensures that the insights drawn are accurate and actionable.

Resource allocation

Resource Allocation pertains to how a project distributes its available resources (both human and technical) to ensure optimal data collection, analysis, and insight generation, particularly in relation to its stage of maturity.

Think of planning a household budget: spend too much in one category and you run short in another. In the same way, a project has to decide how much of its time, money, and people to devote to analytics versus everything else it needs to deliver.


Key Aspects of Resource Allocation

  1. Priority Setting: Deciding which events or features should be prioritized for tracking based on their potential impact and relevance to business goals.
  2. Balancing Act: Striking a balance between investing in advanced tracking mechanisms (high cardinality) versus other essential project needs.
  3. Technical Overhead: The infrastructure and tools needed to support varying levels of cardinality and traffic, and the costs associated with them.


Risks of Poor Resource Allocation

  1. Opportunity Cost: Over-investing resources in one area, like excessive event tracking, can mean missing out on opportunities in others, such as user experience enhancements or marketing.
  2. Maintenance Challenges: A complex tracking setup with high cardinality can lead to more significant maintenance demands, which requires ongoing resource commitment.
  3. Budgetary Constraints: Particularly for projects with limited budgets, allocating resources to extensively detailed tracking might strain finances that could be directed elsewhere.


Strategies for Effective Resource Allocation

  1. Data-Driven Decision Making: Using available data to make informed choices about which events are essential and allocating resources accordingly.
  2. Scalable Infrastructure: Investing in scalable analytics tools that can handle changes in traffic and cardinality without requiring frequent resource reallocation.
  3. Periodic Review: Regularly assessing the resource allocation strategy to ensure it aligns with current project needs and goals. This can help in timely reallocations.
  4. Stakeholder Collaboration: Collaborating with different departments or stakeholders to understand their data needs. This ensures that resources are channeled towards tracking the most valuable events.
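One way to make the data-driven prioritization strategy concrete is a simple scoring sketch that ranks events by a weighted blend of hand-assigned KPI relevance and observed volume. The event names, scores, and 70/30 weighting are purely illustrative assumptions.

```python
# Hypothetical per-event data: fire counts plus a hand-assigned
# KPI-relevance score (0 = vanity metric, 1 = tied directly to a core KPI).
events = {
    "purchase_completed": {"volume": 1_200, "kpi_relevance": 1.0},
    "page_scrolled":      {"volume": 90_000, "kpi_relevance": 0.1},
    "signup_started":     {"volume": 4_000, "kpi_relevance": 0.8},
}

def rank_events(events: dict, kpi_weight: float = 0.7) -> list[str]:
    """Rank events for resource allocation, most important first.

    The weighting between KPI relevance and volume is an assumption;
    adjust it to reflect your own priorities.
    """
    max_volume = max(e["volume"] for e in events.values())

    def score(e: dict) -> float:
        volume_share = e["volume"] / max_volume
        return kpi_weight * e["kpi_relevance"] + (1 - kpi_weight) * volume_share

    return sorted(events, key=lambda name: score(events[name]), reverse=True)
```

Here the high-volume but low-relevance scroll event ranks last, reflecting the point that raw volume alone should not drive where tracking effort goes.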

Effective resource allocation is central to executing your analytics tracking plan. It’s about ensuring that the project’s resources are utilized in a way that maximizes the value of its analytics while still catering to its broader objectives and constraints.

External factors

External factors encompass outside influences or conditions that can affect a project’s position among the stages described earlier, beyond just the intrinsic data and analytics considerations.

Just as a sudden change in the weather can force you to rearrange your plans for the day, shifts in market trends, technology, or regulation can force a project to rethink how it collects and uses data.


Common External Factors

  1. Market Dynamics: Changes in market conditions, such as emerging trends, consumer preferences, or competitive landscapes.
  2. Technological Shifts: Introduction of new technologies or platforms, or significant updates to existing ones.
  3. Regulatory Changes: Adjustments in data protection and privacy laws can influence what data can be collected and how it’s processed.


Challenges Posed by External Factors

  1. Adaptability Challenges: Rapid changes in external conditions can make it challenging for projects to adapt their analytics strategy quickly.
  2. Potential Data Gaps: Regulatory changes, especially, can lead to restrictions in data collection, creating potential blind spots in analytics.
  3. Competitive Pressures: If competitors adopt new technologies or strategies, it might necessitate shifts in one’s own analytics approach to stay competitive.


Strategies to Navigate External Factors

  1. Stay Informed: Regularly monitor industry news, technological updates, and regulatory environments to anticipate potential shifts.
  2. Flexible Analytics Framework: Adopt an analytics framework that can easily integrate new tools or adjust to new data collection paradigms.
  3. Scenario Planning: Consider potential external shifts and plan for multiple scenarios. This ensures that the project is not caught off guard and can react swiftly to external changes.
  4. Stakeholder Engagement: Maintain open communication with stakeholders to understand external pressures they might be experiencing, ensuring that the analytics strategy aligns with broader business considerations.

While internal considerations like traffic and cardinality are crucial, it’s vital not to overlook the impact of external factors on your analytics tracking plan.

Staying proactive and adaptive in the face of these external shifts ensures that a project’s analytics remain relevant and insightful, irrespective of changing conditions.

Transition dynamics

Projects evolve, much like seasons changing in a year. As they grow, shift, and adapt, the way they handle data and analytics needs to evolve too. It’s all about understanding and navigating these shifts smoothly.

Transition Dynamics refers to the shifts an analytics tracking plan experiences between the different stages of maturity over time, as well as the factors that drive these transitions.


Key Aspects of Transition Dynamics

  1. Evolution Path: The typical progression a project might take as it grows and matures, moving from one stage to another.
  2. Driving Factors: Elements like business decisions, market changes, technology adoption, or strategic pivots that propel a project from one stage to the next.
  3. Rate of Transition: The speed at which these transitions occur. Some projects move rapidly between stages due to aggressive strategies, while others transition more slowly.


Challenges in Managing Transitions

  1. Adaptability: Projects need to be agile. As they transition, their analytics needs and strategies will change; being able to adapt quickly is a competitive advantage.
  2. Resource Allocation: Transitioning, especially rapidly, can require significant resources, both technological and human.
  3. Continuous Learning: As projects move between stages, teams must continually reassess and learn about the new challenges and opportunities each stage presents.


Strategies to Navigate Transitions

  1. Strategic Roadmaps: Plan for potential transitions with a clear strategic roadmap. This helps in anticipating the needs and challenges of future stages.
  2. Feedback Mechanisms: Regularly gather feedback from stakeholders to understand the project’s current stage and identify signs that a transition might be imminent.
  3. Skill Development: Ensure the team has the skills and training to handle the challenges of each stage. As a project transitions, new analytics techniques or tools may become essential.
  4. Flexible Tech Stack: Use adaptable technologies that can scale and evolve as the project moves between stages, ensuring continuity in data collection and analysis.

Understanding Transition Dynamics is crucial for the long-term planning of your analytics tracking plan. Recognizing the signs of an impending transition and preparing for it ensures smoother shifts between stages, leading to more effective analytics and better-informed business decisions.