Metric and data strategies for data teams and business users – Which PowerMetrics solution is right for you?

PowerMetrics is purpose-built for the creation and consumption of metrics. A metric catalog and self-serve analysis tool, PowerMetrics enables data and business teams to work together to ensure single source of truth data for everyone.

If you’re looking for a new, better solution that bridges the gap between data and business teams, ensures data integrity, complies with security requirements, enables consumers to independently access and analyze organizational metrics, and grows with you as your data stack evolves, then this article is for you. It describes each of our solutions in detail, so you can be confident you’re making the right decision for your unique situation.

Bringing data management and business intelligence together: Overcoming challenges

Every day and everywhere data teams are experiencing various challenges and finding ways to solve them. Data team members wear many hats. They’re responsible for such tasks as optimizing system performance, ensuring data security compliance, and querying, organizing, and formatting data that’s stored in data warehouses. In addition, as managers and gatekeepers of the company’s data, they must provide business users with quick, easy access to single source of truth (SSOT) data for effective decision making, without compromising data privacy.

As we see it, here are some of the main challenges faced by modern data teams:

  • Data teams are spending too much time building dashboards and reports when they’d rather focus on what they’re best at – data management and system optimization. Business teams need quick access to up-to-date metrics to assist in daily decision-making tasks. However, if their existing BI tool is too complicated or doesn’t include the features they’re looking for, they won’t use it. Instead, they’ll ask the data team, often on an ad hoc, last-minute basis, to build them the reports and dashboards they need.

  • There are data inconsistencies across reports, dashboards, and departments. This is sometimes the result of individuals or teams using different BI tools. However, it’s more likely because they’re choosing different data, from different sources, mistakenly thinking the associated metrics measure the same thing.

  • Their current system is having difficulty adhering to the company’s strict data privacy rules.

  • Business users are accessing copies of data from several places instead of choosing the clean SSOT data that’s built and maintained by the data team.

  • Their organization has a lot of data and it's growing. Their current solution doesn’t handle large amounts of data or scale well.

  • Their current system doesn’t enable them to fine tune and improve their metric query performance.

  • They have data in different formats that come from several sources, for example, databases, local and cloud spreadsheets and files, cloud services, etc. As a result, data and business teams are dealing with multiple BI tools.

  • The organization’s data requires a lot of manual modelling and transformation to get it into a format that works for metrics.

  • They want an easier solution for retrieving data from cloud services.

  • Most services store only a limited window of historical data. When querying those services directly, anything older than that window is lost.

PowerMetrics solutions

PowerMetrics includes a data management/BI solution for virtually every situation.

What is the PowerMetrics – dbt Semantic Layer integration?

The PowerMetrics – dbt Semantic Layer integration enables you to import metrics that are defined in a dbt Semantic Layer project into PowerMetrics, where they can be visualized, analyzed, and added to dashboards.
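For orientation, a dbt Semantic Layer project defines metrics in YAML on top of a semantic model. Here's a minimal, hypothetical example of the kind of definition PowerMetrics would import (the model and field names are illustrative, not from PowerMetrics):

```yaml
semantic_models:
  - name: orders
    model: ref('fct_orders')
    defaults:
      agg_time_dimension: order_date
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: order_date
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: revenue
        agg: sum
        expr: order_total

metrics:
  - name: revenue
    label: Revenue
    type: simple
    type_params:
      measure: revenue
```

A metric defined this way (here, a simple sum of order totals over time) becomes available for import, visualization, and analysis in PowerMetrics.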

Is the PowerMetrics – dbt Semantic Layer integration right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this is the right solution for you:

  • Your data is stored in a data warehouse (for example, Snowflake, BigQuery, Redshift, and Databricks) and you’ve deployed a dbt Semantic Layer.

  • You want to centralize metric definitions in your dbt Semantic Layer.

  • You have a large amount of data that needs to be aggregated in real time.

  • You want fine-grained control over how data is aggregated and processed when querying a metric.

  • You have large amounts of existing structured data you’d like to define metrics against without having to move data to another system.

  • You have systems and processes in place already to import and resolve new data over time within a centralized data warehouse.

  • You want real-time updates from your data warehouse.

  • Your data privacy policy requires data to be stored internally in your data warehouse.

  • You have a dedicated data team that manages and optimizes data performance for your organization.

  • You process sensitive, confidential data and require a heightened level of security.

What does the PowerMetrics – dbt Semantic Layer metric workflow look like?

Here's how data flows in a PowerMetrics – dbt Semantic Layer setup (see the following diagram for a visual representation of these steps):

  1. Users import metrics from their dbt Cloud Semantic Layer project into PowerMetrics.

  2. To visualize and analyze a metric, one or more metric queries are sent to the Metric Service in Klipfolio PowerMetrics.

  3. PowerMetrics translates the queries into dbt Semantic Layer queries and forwards them to dbt Cloud to execute and retrieve data.
    As part of this process, PowerMetrics transforms the results from dbt to improve visualizations for business users; for example, when possible, we fill in missing time periods.

  4. dbt Cloud transforms the query into standard SQL and sends it to the customer’s chosen data warehouse, for example, Snowflake.
    To assist with troubleshooting, in PowerMetrics you can see the dbt SQL query we generate and the raw SQL that’s sent to your data warehouse.

  5. Once the data is retrieved from dbt Cloud for a query, we store it in a short-term, in-memory cache to avoid repetitive queries. This query cache can be cleared manually or via a webhook event.
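The short-term cache in step 5 can be pictured as a TTL-keyed, in-memory store whose `clear` method is what a webhook event (or a manual action) would trigger. This is an illustrative sketch, not PowerMetrics' actual implementation:

```python
import time

class QueryCache:
    """Illustrative short-term, in-memory cache for metric query results."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # query key -> (timestamp, result)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        ts, result = entry
        if time.time() - ts > self.ttl:
            del self._store[key]  # entry expired; force a fresh query
            return None
        return result

    def put(self, key, result):
        self._store[key] = (time.time(), result)

    def clear(self):
        """Called manually, or by a webhook when warehouse data changes."""
        self._store.clear()
```

A repeated metric query within the TTL window is served from the cache instead of generating another dbt Cloud (and data warehouse) query, which is how the cache reduces cost and load.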

dbt workflow diagram

What are the benefits of a PowerMetrics – dbt Semantic Layer integration?

Here are a few of the ways a PowerMetrics – dbt Semantic Layer integration solves the issues faced by modern data teams:

  • An all-in-one solution that combines the data modelling and management features of the dbt Semantic Layer with the visualization and analysis tools in PowerMetrics.

  • A centralized set of metric definitions, curated and managed by the data team in a dbt Semantic Layer project, makes for consistent, accurate metrics and aligned data across reports and dashboards.

  • Approved metrics. Data teams can certify metrics in PowerMetrics, ensuring all users are accessing the same, approved metrics.

  • Self-serve analytics in PowerMetrics means business users can independently process and visualize their metric data without assistance from data team members - no more ad hoc report and dashboard requests.

  • SSOT metrics, stored in your data warehouse, governed by the data team.

  • Built-in metric governance with roles and permissions set for each user at the account admin level.

  • Highly scalable. The amount of data that can be handled is only limited by the capacity of the underlying data warehouse.

  • Independent performance tuning. Performance can be improved by tuning your dbt semantic model and your underlying data warehouse.

A few considerations

Here are a few things to consider when deciding if the dbt Semantic Layer solution is right for you:

  • Query costs and solutions: Queries to your dbt Cloud Semantic Layer service and to your data warehouse service have associated costs. PowerMetrics caches your queries for a short period of time to help avoid duplicate queries, thereby reducing cost and load on your systems. Setting up a webhook to clear the cache only when data is updated in your data warehouse will help keep your data up to date and also reduce the number of queries.

  • Optimizing performance: Query performance when using dbt Semantic Layer is highly dependent on your optimization of the dbt semantic model and its underlying data warehouse.

  • Consider a hybrid solution: Using the PowerMetrics – dbt Semantic Layer integration enables you to visualize and analyze data that comes from your dbt Semantic Layer (i.e., data that’s stored in supported data warehouses). However, if you want to also connect to and create metrics for other sources and services, such as local spreadsheets and cloud applications, you can add data feed metrics alongside the dbt Semantic Layer integration in a hybrid solution.

In a hybrid solution, business users can:

  • Combine dbt Semantic Layer metrics with other metric types in multi-metric charts (in Explorer and on a dashboard). Multi-metric charts enable them to visualize and compare data for multiple metrics in a single chart. For instance, they could add a dbt Semantic Layer “Revenue” metric to Explorer and analyze it alongside one or more of our many Facebook Ads instant metrics to see the impact of their Facebook Ads campaign. In Explorer, or in a multi-metric visualization on a dashboard, up to 5 unique metrics can be combined for analysis and free-form exploration.

  • Combine dbt Semantic Layer metrics with Klipfolio’s built-in instant, custom, or calculated metric types to create calculated metrics. Calculated metrics combine metric values using an equation (with simple math: addition, subtraction, multiplication, and division) to create a metric that can be expressed as a number, a percentage, or a ratio. With calculated metrics, dbt users can create the equivalent of a ratio or derived dbt metric. As a bonus, unlike dbt ratio and derived metrics, calculated metrics can combine metrics from any data source.

  • Add any and all types of metrics as visualizations to a dashboard. All metric types behave the same way for consumers, so the decision of which type to use depends entirely on which PowerMetrics solution works better for your data scenario.
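The calculated-metric equation described above can be pictured as simple period-by-period arithmetic over two metrics' value series. A minimal sketch, with hypothetical metric names and data:

```python
def calculated_metric(numerator, denominator):
    """Divide one metric's values by another's, period by period.

    Mirrors a calculated-metric equation such as
    conversion rate = signups / visits, expressed as a ratio.
    """
    result = {}
    for period, denom in denominator.items():
        num = numerator.get(period)
        if num is None or denom == 0:
            continue  # skip periods that lack both operands
        result[period] = num / denom
    return result

# Hypothetical operand metrics, keyed by month
visits = {"2024-01": 1000, "2024-02": 1250}
signups = {"2024-01": 50, "2024-02": 75}

rate = calculated_metric(signups, visits)
# rate["2024-01"] == 0.05, rate["2024-02"] == 0.06
```

Because the operands are just metric value series, the same arithmetic works whether each operand is a dbt Semantic Layer metric, an instant metric, or a custom metric, which is why calculated metrics can mix sources.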

What are data feeds?

Data feeds enable you to extract data from any source (for example, files, REST APIs, and databases) and transform and model that data into a single, consumable table format for ingestion and visualization in PowerMetrics.

A data feed is the information channel between your source data and your metrics. When you create a data feed, you define the query to use when retrieving data. The resulting data is transformed and optimized into time series data, which is then reconciled into the metric’s history. After reconciliation, the new data becomes part of that history and is available to query for all the metrics that use the data feed.
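Reconciliation can be pictured as an upsert keyed by time period: restated values replace the stored ones, and new periods are appended to the history. An illustrative sketch (not PowerMetrics' actual implementation):

```python
def reconcile(history, new_rows):
    """Merge freshly retrieved rows into a metric's stored history.

    Rows are keyed by time period: existing periods are updated in
    place, new periods are appended, and nothing is lost.
    """
    merged = dict(history)
    for period, value in new_rows.items():
        merged[period] = value  # update existing or add new record
    return dict(sorted(merged.items()))

history = {"2024-01-01": 10, "2024-01-02": 12}
latest = {"2024-01-02": 13, "2024-01-03": 9}  # a restated value plus a new day

print(reconcile(history, latest))
# {'2024-01-01': 10, '2024-01-02': 13, '2024-01-03': 9}
```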

Users, typically account administrators, choose a refresh schedule that determines how often data is refreshed. Every time your data refreshes, the data feed is updated, building a history of the values stored within the metrics.

Are data feeds right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this is the right solution for you:

  • You need self-serve analytics.

  • You work with data that comes from spreadsheets and cloud/API services.

  • Your data isn’t stored in a data warehouse.

  • You want to collect the latest data on a regular basis from an external source and safely merge (reconcile) it into the metric’s historical data.

  • You work with aggregated data or raw data that fits within our 10MB data feed size limit. (Note that this limit applies to data feeds, not to metrics.)

  • You want to query and add the latest data to your metrics on a regular basis.

  • You want fast, self-serve access for ad hoc analysis of local files.

  • You want an out-of-the-box, pre-optimized solution with consistent query performance for all your metrics.

  • A limit of 5 to 10 dimensions per metric is sufficient for your data analysis. (The number of dimensions available depends on your pricing plan.)

  • Your external data source has query frequency limits, preventing you from interactively analyzing your data. For example, the external service only allows you to send 100 queries/day. If you’re building your own solution and running frequent queries, you’ll quickly use up your daily allowance. With data feeds, you can use those 100 queries/day to retrieve and store data in PowerMetrics. From that point, the queries are made to our service, not your external service, bypassing its query limit.
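To make the arithmetic behind that last point concrete: with a hypothetical 100-query/day source limit and a 15-minute refresh schedule (the shortest interval PowerMetrics offers), the feed itself stays within budget, while interactive analysis queries go to PowerMetrics' stored copy and don't count against the external limit at all.

```python
QUERIES_PER_DAY_LIMIT = 100      # external service's cap (hypothetical)
REFRESH_INTERVAL_MINUTES = 15    # shortest data feed refresh interval

# Queries the data feed itself makes to the external service per day
feed_queries_per_day = (24 * 60) // REFRESH_INTERVAL_MINUTES

print(feed_queries_per_day)      # 96, within the 100/day limit
assert feed_queries_per_day <= QUERIES_PER_DAY_LIMIT
```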

What does the data feed workflow look like?

Here’s how data flows in a PowerMetrics data feeds setup (see the following diagram for a visual representation of these steps):

  1. PowerMetrics connects to your raw data in the specified external data source (for example, uploaded CSV, XLS, JSON, or XML files, data that’s stored in a database, or data from cloud services) via JDBC, REST API, or GraphQL.

  2. Our data refresh service uses the connection information that was defined in step #1 to contact the external data source and retrieve data for previewing and caching.

  3. The raw data returned by the external service is brought into Klipfolio PowerMetrics and cached to be used for transformation and ingestion at a later stage in the workflow.

  4. The raw data is transformed in the data feed to ready it for ingestion.

  5. PowerMetrics ingests and queries the data. These are independent processes that can happen concurrently:
    • Ingestion builds the time series database: Data is periodically retrieved from external sources, transformed, and ingested into a time series database where it’s reconciled and stored.
      This enables us to store history and reconcile historical data based on the data feed columns (time dimension) selected when defining the metric. The reconciliation occurs on each ingestion into the time series database to ensure we add new, and update existing, records appropriately.
    • Queries retrieve data from the time series database: When users analyze metrics in PowerMetrics, each analysis is translated into a query that we send to the time series database, which returns the correct result.

  6. The data feed workflow description to this point has focused on data connection, retrieval, storage, and transformation. Equally important, however, are the steps to configure metrics once the data feeds have been created. Note: The metric configuration steps are not part of the visual workflow representation below.
    The selections made when configuring a metric, for example, which columns to use for the value and the dimensions, how time is determined (from the data or at each ingestion), the data shape, and the aggregation type, together define what data is stored in the time series database and how it will be queried.
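The configuration choices in step 6 can be summarized as a simple structure. The field names below are illustrative, not PowerMetrics' actual schema:

```python
# Hypothetical representation of the selections made when
# configuring a metric on top of a data feed.
metric_config = {
    "value_column": "order_total",        # which column supplies the value
    "dimensions": ["region", "channel"],  # columns available for slicing
    "time": {
        "source": "data",                 # from the data, or stamped at each ingestion
        "column": "order_date",
    },
    "data_shape": "transactional",        # vs. pre-aggregated
    "aggregation": "sum",                 # how values roll up over time
}

def describe(config):
    """Human-readable summary of how the metric will be stored and queried."""
    return (f"{config['aggregation']} of {config['value_column']} "
            f"by {', '.join(config['dimensions'])}")

print(describe(metric_config))
# sum of order_total by region, channel
```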

data feed workflow diagram

What are the benefits of data feeds?

Here are a few of the ways PowerMetrics data feeds solve issues faced by modern data teams:

  • Comprehensive. Connect to data from hundreds of different sources.

  • Fast track to metric visualizations. Instant metrics, created by Klipfolio PowerMetrics based on best practices and industry expertise. No data modelling, formula writing, or further configuration is required.

  • Data history. PowerMetrics stores and reconciles your data history. This is especially important for external services with API limitations on the amount of history they’ll store.

  • Built-in query and retrieval tool. Use the query builder to query and retrieve data from popular cloud services instead of working with a REST API and manually processing JSON responses.

  • Editing tool for transforming and refining raw data. With Excel-like formulas and functions and features, such as pivot, and merge/join, the data feed editor helps you organize large data sets and cleanse messy, incomplete, or incorrect source data.

A few considerations

Here are a few things to consider when deciding if the data feed solution is right for you:

  • Maximum data feed size is 10 MB. Each time the raw data is queried, it needs to fit within this limit after the transformations defined in the feed are applied. (Note that this limit applies to data feeds, not to metrics.) This restriction can make it difficult to handle data from large-scale data warehouses. However, it is possible to create specific views of data from a data warehouse that will fit within this limit.

  • Refresh frequency for data feeds depends on your pricing plan. The shortest available interval between refreshes is 15 minutes, so you can achieve near-real-time (but not true real-time) data.

  • Metric definitions reside in the PowerMetrics platform. As a result, to correctly retrieve, reconcile, and query metric data, you must first configure your metric definitions in PowerMetrics.

  • Data feeds are not particularly well-suited to transactional or individual record data.

  • Transformed data is stored in our system. If you have concerns about data privacy and how we store data, please refer to our Privacy Policy or Contact Us. If you have specific data residency or isolation requirements, ask us about our private tenancy options.

Quick comparison table

To help you decide which solution is best for you, we’ve compiled this side-by-side comparison of key factors:

How is source data accessed?

  • dbt Semantic Layer integration: Metric definitions are imported into PowerMetrics from your dbt Semantic Layer projects. Authorization for PowerMetrics to access your data is granted using your dbt Cloud Service token and Environment ID.

  • Data feeds: Data feeds, either created by PowerMetrics (for instant metrics) or by you (for custom metrics), connect to the data source and channel the data into your metrics. OAuth token authentication is typically used to authorize PowerMetrics to access your data; however, depending on the service being connected to, API key authentication is also an option.

How is data retrieved?

  • dbt Semantic Layer integration: PowerMetrics creates a metric query that the dbt Semantic Layer translates into direct queries against the data warehouse.

  • Data feeds: To retrieve data for metrics, PowerMetrics sends queries to our proprietary metrics database (where the data is stored).

How is data stored?

  • dbt Semantic Layer integration: PowerMetrics does not store your data. Data is stored in the dbt user’s data warehouse, for example, Google BigQuery or Snowflake. When importing dbt metrics, PowerMetrics directly queries your data in real time from the dbt Semantic Layer project. Query results are cached for a short time in memory and removed when the cache is cleared (either automatically with a webhook setup or manually).

  • Data feeds: For instant and custom metrics, the data we retrieve from external data sources and APIs is stored in our proprietary metrics database, which is optimized for metric-based ingestion, updates, and queries. For calculated metrics, we don’t store data for the calculated metric itself; instead, the results are generated by querying each operand in the formula and then applying the formula to the results.

How is data updated?

  • dbt Semantic Layer integration: Data is automatically updated in PowerMetrics, to align with the data in your dbt project, whenever a deployment job is successfully run (requires webhook setup). Users can also manually align data by clearing the cache for their connected dbt Semantic Layer account in PowerMetrics.

  • Data feeds: Data is queried automatically based on a schedule. However, users (typically account administrators) can also manually queue a data feed to be refreshed. Every time your data refreshes, the associated data feed is updated and the resulting data is added to the existing data for the metrics that use the feed. The metrics then incorporate the new data into their history.

Can end users edit source data in PowerMetrics?

  • dbt Semantic Layer integration: As previously mentioned, data is stored in your chosen data warehouse. To guarantee SSOT data, in PowerMetrics, users can edit a metric’s display properties only. They cannot edit a metric’s underlying data.

  • Data feeds: Editing options and access depend on the type of metric and on the user’s role in PowerMetrics. For example, instant metrics, which are created and managed by Klipfolio, have fewer editing options than custom metrics that you create and manage via data feeds. Editor users have a higher level of access and, as such, can modify visualization display settings and the underlying data that’s being represented. In comparison, users with a viewer role can personalize metric visualization settings but cannot modify a visualization’s underlying data.
    Note: Visualizations use metrics for their data and are created and edited as separate artifacts. When users edit visualizations, they’re not changing the metric, they’re changing the visualized representation of the metric’s data.

What do I need to do to start?

  • dbt Semantic Layer integration: You’ll need a dbt Cloud Team or Enterprise level account with a version of dbt that supports the dbt Semantic Layer (dbt v1.6 or higher). You’ll also need to set up the dbt Semantic Layer and build and define metrics in a dbt Semantic Layer project. Then, sign up for a free PowerMetrics account at https://www.klipfolio.com/.

  • Data feeds: Sign up for a free PowerMetrics account at https://www.klipfolio.com/.

A peek at some PowerMetrics features

Here’s a list of a few of the features you can expect in PowerMetrics, regardless of which solution you choose:

  • Metric catalog - Business users access metrics in a centralized location where, among other actions and depending on their role and set of permissions, they can open metrics for self-serve analysis and share, delete, and add metrics to dashboards. Learn more.

  • Customizable dashboards - Users can add any metric type to a dashboard. Other examples of dashboard actions include: Personalizing with colours, images, and text, sharing internally or externally (with public or passcode access), downloading content as a PDF, and applying dimensional filters and different date ranges. Learn more.

  • The metric homepage - The metric homepage presents the complete picture for a single metric. It enables business users to quickly compare progress over time, access metric details, and visualize data automatically using a professionally curated, yet customizable, template. Learn more.

  • Calculated metrics - Calculated metrics combine metric values using an equation (with simple math - addition, subtraction, multiplication, and division) to create a metric that can be expressed as a number, a percentage, or a ratio. Users can combine metrics of any type when creating calculated metrics. Learn more.

  • Multi-metrics - Multi-metric charts enable business users to visualize and compare data for multiple metrics in a single chart (in Explorer and on dashboards). As with calculated metrics, users can combine metrics of any type when creating multi-metrics. Learn more.

  • Certified metrics - Data teams can ensure alignment across reports and dashboards by certifying metrics as approved versions for end users. Learn more.

  • Personalized, free-form data analysis - Business users can add one or more metrics to Explorer to investigate and learn without affecting the metric’s underlying data or its default display for others. Learn more.

  • Metric goals - Business users can track metric progress by setting target and/or recurring goals. When a goal is reached, they (and everyone with shared access to the metric) can be notified with in-app and email notifications. Learn more.

  • Normal range and forecast analysis tools - These built-in analysis tools help business users assess past, current, and future trends in their metrics. They’ll be able to quickly identify outliers in their data (values that fall outside the metric’s normal range) and see a prediction of future metric performance. Learn more.