Part of this equation is effectively managing and communicating the context around the asset – this includes whether it is supported by the analytics team, and its lifecycle status (i.e. in development vs. active, expiring or archived).
There is nothing worse than an end user relying on an incomplete or out-of-date dashboard and using that information to come to a flawed decision. To help you navigate this entropy, your data asset management solution should let you easily establish and communicate this context and, for the sake of your team's bandwidth, automate significant parts of the effort.
The core question to answer with contextual signposts like lifecycle and certification is simple: “how should I use this?”
For example, an active asset most likely has current data and has passed out of the development stage, while an archived asset is no longer relevant to the business or might reflect stale data. A certified asset, meanwhile, has been built or approved by the team that knows your analytics environment best. This clarity is critical for end users who might not have been involved in the process of building a given report or dashboard, and may not have the context to know if it is complete.
Clarity around which assets to use is hugely beneficial to the team building the assets as well. Less confusion for end users means fewer support requests and less time wasted answering questions.
Your team likely has many different systems where users can access and analyze data. While maintaining lifecycle and certification status for every asset in each of these systems is best practice, few analytics tools have built-in functionality to do so. And even if they do, the placement and terminology may vary from system to system. Having data builders set statuses for assets doesn’t do much good if end users don’t know where to find the status, or how to interpret it.
This is why using a data asset management solution is so powerful for managing these statuses. Every asset’s lifecycle and certification status is clearly marked and exposed wherever users consume data, providing visibility and consistency across every analytics platform you use.
Certifying an asset is a common practice for modern data teams: it indicates that the report or dashboard is supported by the data team, who stand behind its use and will maintain it. You might wonder how lifecycle management differs from certification, since a certification can seem equivalent to marking an asset as "active."
The answer is that lifecycles are more flexible and less binary than certification, and the two are best used in tandem. For example, you may have a core set of certified dashboards that get cycled out yearly or quarterly. If a user accesses one of these and sees that it is both certified and expiring, they will know that it is safe to use, but that a new version of the dashboard will most likely arrive in the near future.
The same applies to an asset that cycles back and forth between active and in development as new information or tweaks to functionality become necessary. Completely decertifying the asset sends a much less clear signal than simply leaving it certified and changing its lifecycle status.
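The two-dimensional model described above, where certification and lifecycle are independent signals, can be sketched as a small data model. This is a hypothetical illustration, not any particular platform's API; the state names and `guidance` messages are assumptions for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical lifecycle states; your platform's names may differ.
class Lifecycle(Enum):
    IN_DEVELOPMENT = "in development"
    ACTIVE = "active"
    EXPIRING = "expiring"
    ARCHIVED = "archived"

@dataclass
class Asset:
    name: str
    lifecycle: Lifecycle
    certified: bool  # certification is tracked independently of lifecycle

    def guidance(self) -> str:
        """Answer the end user's core question: 'how should I use this?'"""
        trust = "supported by the data team" if self.certified else "use with caution"
        if self.lifecycle is Lifecycle.EXPIRING:
            return f"Safe to use ({trust}), but a replacement is likely coming soon."
        if self.lifecycle is Lifecycle.ARCHIVED:
            return "No longer relevant; data may be stale."
        if self.lifecycle is Lifecycle.IN_DEVELOPMENT:
            return f"Still being built ({trust}); results may change."
        return f"Current and ready to use ({trust})."

# A dashboard can be certified *and* expiring at the same time:
dashboard = Asset("Quarterly Revenue", Lifecycle.EXPIRING, certified=True)
print(dashboard.guidance())
```

Keeping the two fields separate is the point: decertifying an asset to signal "in flux" would destroy the trust signal, whereas flipping only the lifecycle field preserves it.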
Given the importance of these contextual quality markers, teams have taken a variety of measures, such as naming conventions and dedicated folders, to keep their end users in the loop about both asset lifecycle and certification.
As different conventions develop across different teams and systems, however, even appropriately marked assets can cause confusion. Representing these contextual markers in a consistent way in your data knowledge management platform is a better way of making them available to your users.
There is significant value for both end users and data producers in having a clear picture of every asset’s lifecycle status.
However, maintaining these statuses can be time-consuming for an analytics team. Even if the team edits statuses in a well-organized analytics hub rather than navigating to each native analytics system, there is still a built-in maintenance cost: time spent, and lost opportunity to build new data products.
This is why the best data asset management solutions give you functionality to automate lifecycles. The most impactful area for this automation is the end of an asset's life. Using dynamic rules, you can set how much time should elapse before a report or dashboard is marked as expiring and, eventually, archived.
You could even set rules around how many people are viewing an asset, so that frequently used assets are not auto-expired. Over time, this type of automation results in a much cleaner data environment, as old assets no longer pile up alongside your new ones, and in far less work for the analytics team.
For asset certification, some level of automation provides improvements beyond simply saving time. Most certification is top-down, with the data team distributing their certified assets via quality markers (e.g. naming conventions) or through folders of assets. New certifications happen either via ad hoc requests or in batches as new development occurs. With an automated solution that brings certification requests into your data enablement workflows, you can democratize which assets end up with your team's seal of approval, all while maintaining the same high standards of quality.
Most teams understand the need for context when it comes to their data. By automating and centralizing this context, they can achieve this goal while providing the most consistent experience for end users, at the lowest cost in manual work for the data team.