Metric and data strategies for data teams and business users – Which PowerMetrics solution is right for you?

PowerMetrics is purpose-built for the creation and consumption of metrics. A metric catalog and self-serve analysis tool, PowerMetrics enables data and business teams to work together to ensure single-source-of-truth data for everyone.

If you’re looking for a new, better solution that bridges the gap between data and business teams, ensures data integrity, complies with security requirements, enables consumers to independently access and analyze organizational metrics, and grows with you as your data stack evolves, then this article is for you. It describes each of our solutions in detail, so you can be confident you’re making the right decision for your unique situation.

As you read and learn about each integration, keep in mind that PowerMetrics is designed to work for everyone. The final section of this article dives into our hybrid solution – one that can combine metrics of every kind and data source. If a single solution doesn’t seem right for your organization, PowerMetrics, with its built-in flexibility and comprehensive options, may still be the perfect fit.

Bringing data management and business intelligence together: Overcoming challenges

Data teams everywhere face a common set of challenges and find ways to solve them. Data team members wear many hats. They’re responsible for tasks such as optimizing system performance, ensuring data security compliance, and querying, organizing, and formatting data that’s stored in data warehouses. In addition, as managers and gatekeepers of the company’s data, they must provide business users with quick, easy access to single-source-of-truth (SSOT) data for effective decision making, without compromising data privacy.

As we see it, here are some of the main challenges faced by modern data teams:

  • Data teams are spending too much time building dashboards and reports when they’d rather focus on what they’re best at – data management and system optimization. Business teams need quick access to up-to-date metrics to assist in daily decision-making tasks. However, if their existing BI tool is too complicated or doesn’t include the features they’re looking for, they won’t use it. Instead, they’ll ask the data team, often on an ad hoc, last-minute basis, to build them the reports and dashboards they need.

  • There are data inconsistencies across reports, dashboards, and departments. This is sometimes the result of individuals or teams using different BI tools. However, it’s more likely because they’re choosing different data, from different sources, mistakenly thinking the associated metrics measure the same thing.

  • Business users are accessing copies of data from several places instead of choosing the clean SSOT data that’s built and maintained by the data team.

  • Their organization has a lot of data and it's growing. Their current solution doesn’t handle large amounts of data or scale well.

  • Their current system doesn’t enable them to fine-tune and improve their metric query performance.

  • They have data in different formats from several sources, for example, databases, local and cloud spreadsheets and files, and cloud services. As a result, data and business teams are dealing with multiple BI tools.

  • The organization’s data requires a lot of manual modelling and transformation to get it into a format that works for metrics.

  • They want an easier solution for retrieving data from cloud services.

  • Most services retain only a limited window of history; when querying those services directly, data outside that window is lost.

PowerMetrics solutions

PowerMetrics includes a data management/BI solution for virtually every situation.

What is the PowerMetrics – Data warehouse integration?

With the PowerMetrics – Data warehouse integration, you can connect directly to data that’s stored and managed in a warehouse. Our current integrations are documented here. After connecting, your organization’s metrics are added to a centralized metric catalog where they can be accessed by business users for visualization, discovery, and dashboarding.

Note: The data warehouse integration is included in our Enterprise plan or available as an add-on in our Professional plan. We understand you might want to check it out first. When you sign up for a free PowerMetrics account, you’ll get access for 30 days to this and other premium features. When the 30-day trial period ends, you can seamlessly upgrade to the Professional plan and purchase the add-on or contact our Success Team to upgrade to the Enterprise plan.

Is the PowerMetrics – Data warehouse integration right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this may be the right solution for you:

  • Your data is stored in one of our supported data warehouse services, as documented here.

  • Your data is organized in tables and views that are optimized for metric consumption.

  • Your organization has the expertise and personnel for managing and optimizing a data warehouse for metric consumption.

  • Your data privacy policy requires data to be stored internally in your data warehouse.

  • You process sensitive, confidential data and require a heightened level of security.

  • You want real-time updates from your data warehouse.

What does the PowerMetrics – Data warehouse workflow look like?

Here's how data flows in a PowerMetrics – Data warehouse setup (see the following diagram for a visual representation of these steps):

  1. In PowerMetrics, you enter service-specific information to create an account connection that’s used to connect to the data warehouse. Metrics are defined by connecting to columns within a table in the warehouse. Metric settings, such as formatting, aggregation, and naming are set as you configure the metric.

  2. To visualize and analyze a metric, one or more metrics queries are sent to the Metrics Service in PowerMetrics via REST API.

  3. PowerMetrics translates the queries into vendor-specific SQL according to the metric definition and forwards them to the data warehouse to execute and retrieve data.
    The query result returned from the data warehouse is transformed to a PowerMetrics-specific format to be used for visualization and other consumption purposes.
    To assist with troubleshooting, in PowerMetrics you can see the SQL query we generate and send to your data warehouse.

  4. Once the data is retrieved from the warehouse for a query, it’s cached for a duration of time (as configured by you) to optimize query performance. To get up-to-date data, you can either use a webhook request (recommended) to clear the cache or set a fixed cache expiration duration.

[Diagram: Direct to data warehouse workflow]
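The caching behaviour in step 4 can be sketched as follows. This is a minimal illustration in Python, not PowerMetrics’ actual implementation; the class and method names are hypothetical.

```python
import time

class QueryCache:
    """Minimal sketch of a per-query result cache with a fixed TTL
    and a webhook-style invalidation hook (names are hypothetical)."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable for testing
        self._entries = {}          # query text -> (result, stored_at)

    def get(self, query):
        entry = self._entries.get(query)
        if entry is None:
            return None
        result, stored_at = entry
        if self.clock() - stored_at >= self.ttl:
            del self._entries[query]    # expired: fall through to the warehouse
            return None
        return result

    def put(self, query, result):
        self._entries[query] = (result, self.clock())

    def clear(self):
        """What a webhook request would trigger after data is updated."""
        self._entries.clear()

# Usage: cache warehouse results for 10 minutes unless cleared earlier.
cache = QueryCache(ttl_seconds=600)
cache.put("SELECT SUM(revenue) ...", 42_000)
assert cache.get("SELECT SUM(revenue) ...") == 42_000
cache.clear()                       # simulate the webhook firing
assert cache.get("SELECT SUM(revenue) ...") is None
```

This is why the webhook approach is recommended over a fixed expiration: the cache is dropped exactly when the warehouse changes, rather than on a timer that may be too early or too late.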

What are the benefits of a PowerMetrics – Data warehouse integration?

Here are a few of the ways a PowerMetrics – Data warehouse integration solves the issues faced by modern data teams:

  • Self-serve analytics for business users. Data consumers can independently process and analyze their metric data, including building multi-metric and calculated visualizations, without assistance from data team members - no more ad hoc report and dashboard requests.

  • Approved, single-source-of-truth metrics in a centralized catalog. Metrics are curated and certified by the data team and accessible to business users, ensuring consistent data is used across reports and dashboards.

  • No external data transfer when running queries. All queries are performed in real time on the data warehouse. Only aggregated data, as part of the query result, is sent back for analysis and visualization.

  • Secure data storage. All data is stored and secured in the data warehouse.

  • Real-time updates. The data in PowerMetrics is aligned with changes in the warehouse via a webhook.

  • Data governance. Built-in users and roles, set for each individual at the account admin level, let you control data access.

  • Highly scalable. The amount of data that can be handled is only limited by the capacity of the data warehouse.

  • Independent performance tuning. Data teams can improve performance by tuning the data warehouse and the PowerMetrics query cache strategy.

A few considerations

Here are some things to consider when deciding if the data warehouse solution is right for you:

  • Query costs and solutions: Queries to your data warehouse service have associated costs. PowerMetrics caches your queries for a user-configurable duration to help avoid duplicate queries, reducing both cost and load on your systems. Setting up a cache-clearing step in your ELT pipeline, using a webhook request that fires when data is updated in your warehouse, keeps your data up to date while further reducing the number of queries.

  • Optimizing performance: Query performance is highly dependent on your optimization of the data warehouse. Data managers require the expertise to set up, optimize, and maintain the data warehouse.

  • Data volume: If you have a large volume of data and it’s no longer feasible or performant to manage it using data feeds, it’s time to move to a PowerMetrics – data warehouse integration.
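The cache-clearing pipeline step mentioned above could look like this: the final task in an ELT run fires a POST to the webhook URL from your PowerMetrics account settings. The URL and token below are placeholders, and the request is only built, not sent.

```python
from urllib import request

# Hypothetical webhook URL; substitute the cache-clearing webhook URL and
# token from your own PowerMetrics account connection settings.
WEBHOOK_URL = "https://example.com/powermetrics/cache/clear"

def build_cache_clear_request(url, token):
    """Build (but don't send) the POST that a final ELT pipeline step
    would fire once fresh data lands in the warehouse."""
    req = request.Request(url, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    return req

req = build_cache_clear_request(WEBHOOK_URL, "my-token")
# In a real pipeline step you would then call: request.urlopen(req)
```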

What is the PowerMetrics – Semantic layer integration?

The PowerMetrics – Semantic layer integration enables you to connect to metrics from a dbt Semantic Layer project or to data from Cube for visualization, analysis, and dashboard building in PowerMetrics.

Note: The semantic layer integration is included in our Enterprise plan. We understand you might want to check it out first. When you sign up for a free PowerMetrics account, you’ll get access for 30 days to this and other premium features. When the 30-day trial ends you can contact our Success Team to upgrade to the Enterprise plan.

Is a PowerMetrics – Semantic layer integration right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this may be the right solution for you:

  • Your data is stored in a data warehouse and you’ve deployed a dbt or Cube semantic layer.

  • You have a dedicated data team that manages and optimizes data performance for your organization.

  • Your data privacy policy requires data to be stored internally in your data warehouse.

  • You process sensitive, confidential data and require a heightened level of security.

  • You want real-time updates from your data warehouse.

  • You have a large amount of data that needs to be aggregated in real time.

  • You want fine-grained control over how data is aggregated and processed when querying a metric.

  • You have large amounts of existing structured data you’d like to define metrics against without having to move data to another system.

  • You have systems and processes in place already to import and resolve new data over time within a centralized data warehouse.

What does the PowerMetrics – dbt Semantic Layer workflow look like?

Here's how data flows in a PowerMetrics – dbt Semantic Layer setup (see the following diagram for a visual representation of these steps):

  1. Users connect PowerMetrics to metrics in a dbt Cloud Semantic Layer project.

  2. To visualize and analyze a metric, one or more metrics queries are sent to the Metrics Service in Klipfolio PowerMetrics.

  3. PowerMetrics translates the queries into dbt Semantic Layer queries and forwards them to dbt Cloud to execute and retrieve data.
    As part of this process, PowerMetrics does some transformation on the results from dbt to improve visualizations for business users, for example, when possible, we fill in missing time periods.

  4. dbt Cloud transforms the query into standard SQL and sends it to the customer’s chosen data warehouse, for example, Snowflake.
    To assist with troubleshooting, in PowerMetrics you can see the dbt Semantic Layer query we generate and the raw SQL that’s sent to your data warehouse.

  5. Once the data is retrieved from dbt Cloud for a query, we cache it in a short-term, in-memory cache to avoid repetitive queries. This query cache can be cleared manually or via a webhook request.

[Diagram: dbt Semantic Layer workflow]
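The gap-filling mentioned in step 3 can be illustrated like this. A minimal sketch, not PowerMetrics’ actual transformation logic:

```python
from datetime import date, timedelta

def fill_missing_days(rows, fill_value=0):
    """Given (date, value) pairs sorted by date, insert any missing days
    so a line chart doesn't silently skip time periods."""
    if not rows:
        return []
    out = []
    day, end = rows[0][0], rows[-1][0]
    lookup = dict(rows)
    while day <= end:
        out.append((day, lookup.get(day, fill_value)))
        day += timedelta(days=1)
    return out

# Jan 2 is absent from the query result; it's filled in with 0.
filled = fill_missing_days([(date(2024, 1, 1), 10), (date(2024, 1, 3), 7)])
assert filled == [(date(2024, 1, 1), 10),
                  (date(2024, 1, 2), 0),
                  (date(2024, 1, 3), 7)]
```

Whether a missing period should be filled with zero, carried forward, or left empty depends on the metric’s aggregation type, which is one reason this transformation happens per metric.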

What does the PowerMetrics – Cube workflow look like?

Here's how data flows in a PowerMetrics – Cube setup (see the following diagram for a visual representation of these steps):

  1. In PowerMetrics, you enter your Cube host URL and API key to create an account connection that’s used to connect PowerMetrics to Cube. Metrics are defined by connecting to columns within a cube. Metric settings, such as formatting, aggregation, and naming are set as you configure the metric.

  2. To visualize and analyze a metric, one or more metrics queries are sent to the Metrics Service in Klipfolio PowerMetrics.
    As part of this process, PowerMetrics does some transformation on the results from Cube to improve visualizations for business users, for example, when possible, we fill in missing time periods.
    To assist with troubleshooting, in PowerMetrics you can see the Cube API request we generate and the raw SQL generated by Cube.

  3. Once the data is retrieved from Cube for a query, we cache it in a short-term, in-memory cache to optimize query performance. This query cache can be cleared manually, via a cache expiration strategy, or via a webhook request.

[Diagram: Cube workflow]
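The translation in step 2 produces a Cube REST API query body. The sketch below builds one such body; the member names (`Orders.totalRevenue`, `Orders.createdAt`) are hypothetical examples, and the exact payload PowerMetrics generates may differ.

```python
import json

def cube_query(measure, time_dimension,
               granularity="day", date_range="Last 30 days"):
    """Build a Cube REST API query body for a single-measure metric
    query, following Cube's documented query format."""
    return {
        "measures": [measure],
        "timeDimensions": [{
            "dimension": time_dimension,
            "granularity": granularity,
            "dateRange": date_range,
        }],
    }

body = json.dumps(cube_query("Orders.totalRevenue", "Orders.createdAt"))
# POST `body` to <your-cube-host>/cubejs-api/v1/load with your API key
# in the Authorization header.
```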

What are the benefits of a PowerMetrics – Semantic layer integration?

Here are a few of the ways a PowerMetrics – Semantic layer integration solves the issues faced by modern data teams:

  • An all-in-one solution that integrates the data modelling features of the dbt Semantic Layer, and the data management abilities of both the dbt Semantic Layer and Cube, with the visualization and analysis tools found in PowerMetrics.

  • Self-serve analytics in PowerMetrics means business users can independently process and visualize their metric data without assistance from data team members - no more ad hoc report and dashboard requests. A list of approved metrics, certified by the data team and accessible to everyone (based on user role), ensures consistent data is used across reports and dashboards.

  • Secure data storage. Data is stored in your data source, for example, a data warehouse. The dbt Semantic Layer transforms and models data and includes metric definitions for warehouse data. Cube provides a query abstraction layer (semantic layer) to simplify access to and querying of the data.

  • Metrics defined in a dbt Semantic Layer. Metrics are curated and managed by the data team in a dbt Semantic Layer project, ensuring consistent, accurate metrics and aligned data across reports and dashboards.

  • Metrics defined in PowerMetrics. Cube does not contain a metrics layer. As a result, it doesn’t include individual, atomic metric definitions. Connecting to PowerMetrics enables you to define Cube metrics according to the specific needs of your organization.

  • Real-time updates. The data in PowerMetrics is aligned with changes in the data source via a webhook request or by manually clearing the cache (dbt Semantic Layer and Cube) or by using a cache expiration strategy (Cube). We also suggest you investigate in-app caching mechanisms that may be found in dbt Semantic Layer and Cube.

  • Highly scalable. The amount of data that can be handled is only limited by the capacity of the underlying warehouse that’s being used by the semantic layer (dbt Semantic Layer and Cube).

  • Highly customizable performance tuning. Performance can be improved by tuning your dbt semantic model, Cube cache configuration, and underlying data warehouse.

A few considerations

Here are some things to consider when deciding if a PowerMetrics semantic layer solution is right for you:

  • Query costs and solutions: Queries to your semantic layer service and data warehouse service have associated costs. PowerMetrics caches your queries for a short period of time to help avoid duplicate queries, thereby reducing cost and load on your systems. Setting up a webhook (dbt Semantic Layer and Cube) or cache expiration strategy (Cube) to clear the cache only when data is updated in your data warehouse will help keep your data up to date and also reduce the number of queries.

  • Optimizing performance: Query performance is highly dependent on your optimization of the dbt semantic model, Cube, and the underlying data warehouse.

  • Complex data model: A semantic layer solution helps ensure consistency and SSOT for complex data models, as all data is centrally defined and managed. A dbt Semantic Layer implementation extends this centralization to also include defining and managing metric definitions in a single place.

What is the data feed solution?

Data feeds enable you to extract data from any source (for example, files, REST APIs, and databases) and transform and model that data into a single, consumable table format for ingestion and visualization in PowerMetrics.

A data feed is the information channel between your source data and your metrics. When you create a data feed, you define the query to use when retrieving data. The resulting data is transformed and optimized into time series data, which is then reconciled and incorporated into the metric’s history, where it’s available to query across all the metrics that use the data feed.

Users, typically account administrators, choose a refresh schedule that determines how often data is refreshed. Every time your data refreshes, the data feed is updated, building a history of the values stored within the metrics.

Is the data feed solution right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this may be the right solution for you:

  • You need self-serve analytics.

  • You work with data that comes from spreadsheets and cloud/API services.

  • Your data isn’t stored in a data warehouse.

  • You want to collect the latest data on a regular basis from an external source and safely merge (reconcile) it into the metric’s historical data.

  • You work with aggregated data or raw data that fits within our 10MB data feed size limit. (Note that this limit applies to data feeds, not to metrics.)

  • You want to query and add the latest data to your metrics on a regular basis.

  • You want fast, self-serve access for ad hoc analysis of local files.

  • You want an all-in-one, pre-optimized solution with consistent query performance across all your metrics.

  • A limit of 5–10 dimensions per metric is sufficient for your data analysis. (The number of dimensions available depends on your pricing plan.)

  • Your external data source has query frequency limits, preventing you from interactively analyzing your data. For example, the external service only allows you to send 100 queries/day. If you’re building your own solution and running frequent queries, you’ll quickly use up your daily allowance. With data feeds, you can use those 100 queries/day to retrieve and store data in PowerMetrics. From that point, the queries are made to our service, not your external service, bypassing its query limit.

What does the data feed workflow look like?

Here’s how data flows in a PowerMetrics data feeds setup (see the following diagram for a visual representation of these steps):

  1. PowerMetrics connects to your raw data in the specified external data source (for example, uploaded CSV, XLS, JSON, or XML files, data that’s stored in a database, or data from cloud services) via JDBC, REST API, or GraphQL.

  2. Our data refresh service uses the connection information that was defined in step #1 to contact the external data source and retrieve data for previewing and caching.

  3. The raw data returned by the external service is brought into Klipfolio PowerMetrics and cached to be used for transformation and ingestion at a later stage in the workflow.

  4. The raw data is transformed in the data feed to ready it for ingestion.

  5. PowerMetrics ingests and queries the data. These are independent processes that can happen concurrently:
    • Ingestion builds the time series database: Data is periodically retrieved from external sources, transformed, and ingested into a time series database where it’s reconciled and stored.
      This enables us to store history and reconcile historical data based on the data feed column selected as the primary time dimension when defining the metric. The reconciliation occurs on each ingestion into the time series database to ensure we add new, and update existing, records appropriately.
    • Queries retrieve data from the time series database: When users analyze metrics in PowerMetrics, metrics queries are generated, based on the metric definition, and sent to the Metrics Service to execute and return the correct result.

  6. The data feed workflow description to this point has focused on data connection, retrieval, storage, and transformation. Equally important, however, are the steps to configure metrics once the data feeds have been created. Note: The metric configuration steps are not part of the visual workflow representation below.
    The selections made when configuring a metric, for example, which columns to use for the value (measure) and time dimension, how time will be determined (from the data or at each ingestion), the data shape, and the aggregation type, all combine to define what data is stored in the time series database and how it will be queried.

[Diagram: Data feed workflow]
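The reconciliation described in step 5 can be sketched as an upsert keyed by the primary time dimension: new periods are appended to the history, and re-ingested periods overwrite the stored value. Illustrative only; PowerMetrics’ actual reconciliation logic is more involved.

```python
def reconcile(history, incoming):
    """Merge newly ingested rows into the stored history, keyed by the
    primary time dimension. Later ingestions win for the same period."""
    merged = dict(history)
    merged.update(incoming)
    return dict(sorted(merged.items()))

history = {"2024-01-01": 100, "2024-01-02": 110}
latest_feed = {"2024-01-02": 115, "2024-01-03": 120}   # Jan 2 restated
assert reconcile(history, latest_feed) == {
    "2024-01-01": 100,
    "2024-01-02": 115,
    "2024-01-03": 120,
}
```

This is what makes it safe to refresh a feed repeatedly: overlapping periods in successive retrievals update, rather than duplicate, the metric’s history.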

What are the benefits of a data feed solution?

Here are a few of the ways a data feed solution solves the issues faced by modern data teams:

  • Comprehensive. Connect to data from hundreds of different sources.

  • Fast track to metric visualizations. Instant metrics are created by Klipfolio PowerMetrics based on best practices and industry expertise. No data modelling, formula writing, or further configuration is required.

  • Great for spreadsheet data. If you have local data in spreadsheets (data that won’t or shouldn’t be stored in a warehouse), then data feeds are an ideal solution. In addition, by creating metrics using data feeds and spreadsheets, business users can analyze their local data against the official data warehouse metrics.

  • Data history. PowerMetrics stores and reconciles your data history. This is especially important for external services with API limitations on the amount of history they’ll store.

  • Built-in query and retrieval tool. Use the query builder to query and retrieve data from popular cloud services instead of working with a REST API and manually processing JSON responses.

  • Editing tool for transforming and refining raw data. With Excel-like formulas and functions, and features such as pivot and merge/join, the data feed editor helps you organize large data sets and cleanse messy, incomplete, or incorrect source data.

A few considerations

Here are a few things to consider when deciding if the data feed solution is right for you:

  • Maximum data feed size is 10 MB. Each time the raw data is queried, it needs to fit within this limit after the transformations defined in the feed are applied. (Note that this limit applies to data feeds, not to metrics.) This restriction can make it difficult to handle data from large-scale data warehouses. However, you can create specific views of warehouse data that fit within the limit.

  • Refresh frequency for data feeds is dependent on your pricing plan. The shortest interval available between refreshes is 15 minutes. As a result, you can achieve near-real-time data.

  • Metric definitions reside in the PowerMetrics platform. As a result, to correctly retrieve, reconcile, and query metric data, you must first correctly configure your metric definitions in PowerMetrics.

  • Data feeds are not particularly well-suited to transactional or individual record data.

  • Transformed data and data history are stored in our system. If you have concerns about data privacy and how we store data, please refer to our Privacy Policy or Contact Us. If you have specific data residency or isolation requirements, ask us about our private tenancy options.

Quick comparison table

To help you decide which solution is best for you, we’ve compiled this side-by-side comparison of key factors across the three solutions: the data warehouse integration, the semantic layer integration, and data feeds.
How is source data accessed?

Data warehouse integration: PowerMetrics connects directly to the warehouse service using an account connection you create. The authorization method used to allow PowerMetrics to access your data depends on the warehouse service but typically requires you to enter a username and password associated with the correct role/permissions.

Semantic layer integration: For dbt Semantic Layer users, metric definitions are imported into PowerMetrics from your dbt Semantic Layer projects. Authorization for PowerMetrics to access your data is granted using your dbt Cloud Service token and Environment ID. For Cube users, metrics are defined in PowerMetrics by referencing columns in cubes, and authorization is granted using an API key.

Data feeds: Data feeds, either created by PowerMetrics (for instant metrics) or by you (for custom metrics), connect to the data source and channel the data into your metrics. OAuth token authentication is often used to authorize PowerMetrics to access your data; depending on the service being connected to, API key authentication is also an option. File upload is also a popular choice for accessing local data.

How is data retrieved?

Data warehouse integration: PowerMetrics connects to the warehouse and retrieves data using an SQL query.

Semantic layer integration: For dbt Semantic Layer users, PowerMetrics creates a dbt Semantic Layer query that the dbt Semantic Layer translates into direct queries against the data warehouse. For Cube users, PowerMetrics creates a Cube API query. Cube translates this query into an SQL query, which is then executed against your data warehouse.

Data feeds: To retrieve data for data feed metrics, PowerMetrics sends queries to our proprietary metrics database (where the data is stored).

How is data stored?

Data warehouse integration: PowerMetrics does not store your data. Data is stored in the data warehouse. When the data is queried, it’s cached for a short time in memory and removed when the cache is cleared (automatically with a webhook setup, with a cache expiration strategy, or manually).

Semantic layer integration: PowerMetrics does not store your data. Data is stored in the underlying data warehouse, for example, Google BigQuery or Snowflake. When creating semantic layer metrics, PowerMetrics directly queries your data in real time from the semantic layer. The query result is cached for a short time in memory and removed when the cache is cleared: automatically with a webhook setup or manually (dbt Semantic Layer and Cube), or by using a cache expiration strategy (Cube).

Data feeds: For instant and custom data feed metrics, the data we retrieve from external data sources and APIs is stored in our proprietary metrics database, which is optimized for metric-based ingestion, updates, and queries. For calculated data feed metrics, we don’t store data for the calculated metric itself; instead, the results are generated by querying each operand in the formula and then applying the formula to the results.

How is data updated?

Data warehouse integration: Data is automatically updated in PowerMetrics to align with the data in your warehouse via a webhook. Users can also manually align data by clearing the cache or using a cache expiration strategy.

Semantic layer integration: For dbt Semantic Layer users, data is automatically updated in PowerMetrics whenever a deployment job is successfully run (requires webhook setup) or manually by clearing the cache for the connected dbt Semantic Layer account in PowerMetrics. For Cube users, Cube has its own cache strategy. In PowerMetrics, when configuring the cache expiration, we recommend choosing the shortest available TTL value.

Data feeds: Data is queried automatically based on a schedule. However, users (typically account administrators) can also manually queue a data feed to be refreshed. Every time your data refreshes, the associated data feed gets updated. The resulting data is then added to the existing data for the metrics that use the feed, and the metrics incorporate the new data into their history. Users don’t have to explicitly manage any caching; PowerMetrics expires the cache when the underlying data feed updates.

Can end users edit source data in PowerMetrics?

Data warehouse integration: As previously mentioned, data is stored in the data warehouse. To guarantee SSOT data, in PowerMetrics, users can edit a metric’s display properties only. They cannot edit a metric’s underlying data.

Semantic layer integration: As previously mentioned, data is stored in your chosen data warehouse. To guarantee SSOT data, in PowerMetrics, users can edit a metric’s display properties only. They cannot edit a metric’s underlying data.

Data feeds: Editing options and access depend on the type of metric and on the user’s role in PowerMetrics. For example, instant data feed metrics, which are created and managed by Klipfolio, have fewer editing options than custom metrics that you create and manage via data feeds. Editor users have a higher level of access and, as such, can modify visualization display settings and the underlying data that’s being represented. In comparison, users with a viewer role can personalize metric visualization settings but cannot modify a visualization’s underlying data.

Note: Visualizations use metrics for their data and are created and edited as separate artifacts. When users edit visualizations, they’re not changing the metric, they’re changing the visualized representation of the metric’s data.

What do I need to do to start?

Data warehouse integration: You’ll need data that’s managed and stored in a data warehouse. Our supported data warehouse integrations are documented here.

Semantic layer integration: dbt Semantic Layer users need a dbt Cloud Team or Enterprise level account with a version of dbt that supports the dbt Semantic Layer (dbt v1.6 or higher). You’ll also need to set up the dbt Semantic Layer and build and define metrics in a dbt Semantic Layer project. Cube users need their Cube host URL and an API key. You’ll enter this information when creating an account connection with PowerMetrics.

Data feeds: You’ll need access to your source data, for example, local files or credentials for your cloud services.

Whichever solution you choose, sign up for a free PowerMetrics account at https://www.klipfolio.com/.

PowerMetrics as a hybrid solution

As a hybrid solution, PowerMetrics supports data warehouse metrics, semantic layer metrics (dbt Semantic Layer and Cube), and data feeds – a custom modelling and hosted data solution used to create metrics with data pulled from hundreds of cloud-based services (such as Shopify, Facebook Ads, and Zendesk), files (such as Excel, Google Sheets, Smartsheet, and Airtable), and direct REST/API queries for custom data applications.

This article defines a “hybrid solution” as one that combines any or all of the solutions available in PowerMetrics. Note: The ability to create metrics based on data feeds is included in every PowerMetrics plan, enabling that option to be used alongside any other solution at no additional cost.

Is a hybrid solution right for my organization?

If any (or all) of the following describe your unique data management and business analytics needs, then this may be the right solution for you:

  • Departments in your organization rely on data that comes from various data sources. For example, your marketing team tracks customer data from cloud services, like HubSpot and Facebook Ads, and your sales team analyzes revenue data that’s stored in Snowflake or a dbt Semantic Layer project.

  • Your data is stored in more than one place. For example, you have centralized data storage and governance for core business data (stored in a warehouse) but also have specific departmental data that’s not stored in the primary system.

  • Your data infrastructure is in a transitional stage. You’re handling larger volumes of data but aren’t ready to fully commit to a data warehouse or semantic layer system.

  • You want a flexible solution that enables you to move to a different data technology stack in the future.

What are the benefits of a hybrid solution?

Here are a few of the ways a hybrid solution solves the issues faced by modern data teams:

  • Ability to connect to and visualize data from virtually any source. By combining solutions, you can use PowerMetrics to retrieve and visualize data from just about anywhere.

  • A solution that grows with you as your data stack evolves. PowerMetrics is designed to meet the needs of modern data teams today and grow with them into the future. As your data management needs change and grow, you can add a new PowerMetrics solution or shift from one PowerMetrics solution to another. For example, you might start out with data feeds and, as your data stack matures, add a data warehouse or semantic layer solution. If, at some point, all of your data is stored in a warehouse and/or accessed through a semantic layer, and you have no more need for data feeds, you can seamlessly switch the data source for your metrics from data feeds to warehouse or semantic layer sources.

  • The flexibility to combine metrics of every type. PowerMetrics includes several metric types, depending on the solution or solutions you use: Instant and custom data feed metrics (data feed solution), data warehouse metrics (data warehouse integration), dbt Semantic Layer metrics, and Cube metrics (semantic layer integration). All metrics, of every type, can be combined as calculated metrics or multi-metrics and displayed alongside each other on dashboards.

  • Combining metric types in multi-metric charts (in the Explorer and on dashboards). Multi-metric charts enable business users to visualize and compare data for multiple metrics in a single chart. For instance, they could add a Snowflake “Revenue” metric to Explorer and analyze it alongside one or more of our many Facebook Ads instant (data feed) metrics to see the impact of their Facebook Ads campaign. In Explorer, or in a multi-metric visualization on a dashboard, up to 5 unique metrics can be combined for analysis and free-form exploration.

  • Combining metric types in calculated metrics. Calculated metrics combine metric values using an equation (simple math: addition, subtraction, multiplication, and division) to create a metric that can be expressed as a number, a percentage, or a ratio. For example, you can combine instant or custom data feed metrics with dbt Semantic Layer metrics to create the equivalent of a ratio or derived dbt metric. As a bonus, unlike dbt ratio and derived metrics, calculated metrics can be made up of metrics from any data source.

  • Adding any and all metric types as visualizations to a dashboard. All metric types behave the same way for consumers, so the decision of which type to use depends entirely on which PowerMetrics solution works best for your data scenario.
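To make the calculated-metric idea concrete, here's a minimal sketch of the kind of simple math involved. The metric names, values, and the `calculated_metric` helper are hypothetical illustrations, not actual PowerMetrics objects or APIs; in the product, you'd build the equation in the app rather than in code.

```python
# Hypothetical sketch: a calculated metric combines values from two
# existing metrics with simple math (here, division, to produce a ratio).

def calculated_metric(numerator, denominator):
    """Divide paired values, e.g. Revenue / Ad Spend to get a ROAS ratio."""
    return [n / d for n, d in zip(numerator, denominator)]

revenue = [1200.0, 1500.0, 900.0]   # e.g. a Snowflake "Revenue" metric
ad_spend = [300.0, 500.0, 300.0]    # e.g. a Facebook Ads data feed metric

roas = calculated_metric(revenue, ad_spend)
print(roas)  # [4.0, 3.0, 3.0]
```

Because both inputs are just metric values, the same equation works regardless of whether each source metric comes from a data feed, a warehouse, or a semantic layer.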

A peek at some PowerMetrics features

Here are a few of the features you'll find in PowerMetrics, regardless of which solution you choose:

  • Metric catalog - Business users access metrics in a centralized location where, among other actions and depending on their role and set of permissions, they can open metrics for self-serve analysis and share, delete, and add metrics to dashboards. Learn more.

  • Customizable dashboards - Users can add any metric type to a dashboard. Other examples of dashboard actions include: Personalizing with colours, images, and text, sharing internally or externally (with public or passcode access), downloading content as a PDF, and applying dimensional filters and different date ranges. Learn more.

  • The metric homepage - The metric homepage presents the complete picture for a single metric. It enables business users to quickly compare progress over time, access metric details, and visualize data automatically using a professionally curated, yet customizable, template. Learn more.

  • Calculated metrics - Calculated metrics combine metric values using an equation (simple math: addition, subtraction, multiplication, and division) to create a metric that can be expressed as a number, a percentage, or a ratio. Users can combine metrics of any type when creating calculated metrics. Learn more.

  • Multi-metrics - Multi-metric charts enable business users to visualize and compare data for multiple metrics in a single chart (in Explorer and on dashboards). As with calculated metrics, users can combine metrics of any type when creating multi-metrics. Learn more.

  • Certified metrics - Data teams can ensure alignment across reports and dashboards by certifying metrics as approved versions for end users. Learn more.

  • Personalized, free-form data analysis - Business users can add one or more metrics to Explorer to investigate and learn without affecting the metric’s underlying data or its default display for others. Metrics can be added manually or with assistance from PowerMetrics AI, a premium feature that uses a natural language interface to interpret your questions and automatically create visualizations. Learn more.

  • Metric goals - Business users can track metric progress by setting target and/or recurring goals. When a goal is reached, they (and everyone with shared access to the metric) can be notified with in-app and email notifications. Learn more.

  • Normal range and forecast analysis tools - These built-in analysis tools help business users assess past, current, and future trends in their metrics. They’ll be able to quickly identify outliers in their data (values that fall outside the metric’s normal range) and see a prediction of future metric performance. Learn more.