
Data Service Level Agreement (SLA)

A data service level agreement (SLA) sets out guaranteed performance levels around quality, reliability and availability for data assets.

What is a data SLA?

A data service level agreement (SLA) sets out guaranteed performance levels around quality, reliability and availability for data assets shared between stakeholders. It ensures that all parties are aligned on data quality expectations and includes ways of measuring and monitoring adherence.

Just as service level agreements provide guarantees for the availability and quality of services, such as software uptime or response times, data SLAs set out publicly and formally what stakeholders should expect from the shared data they access, and any penalties that apply if those commitments are not met. This approach is vital at a time when data is being created in a more distributed manner and shared more widely. Data SLAs improve communication and build trust, and therefore drive greater data usage and value.

Data SLAs can be internal, such as between data producers and consumers, or external, such as between companies and the customers of their data services. They should be written and formalized to make commitments concrete and clear, using agreed language, such as around timescales for the regularity of data updates.

Why are data SLAs important?

Data SLAs deliver a range of benefits:

  • They set expectations around the supply of data assets, including KPIs around quality, reliability and timeliness
  • They reduce conflict and disagreement through clear terms and expectations
  • They ensure accountability between data producers and consumers
  • They build trust within data ecosystems
  • They bring together departments and create a common language and communication channels around data 

What does a data SLA contain?

Data SLAs are normally short (250-500 word) documents shared internally in an accessible way via corporate intranets or Google Docs. They should cover six key elements:

  • Purpose: Why does this data SLA exist? 
  • Promise: What is being committed to?
  • Measurement: How will performance be measured and by whom?
  • Penalties: What happens if data SLAs are breached?
  • Requirements: What needs to happen for the data SLA to be valid?
  • Signatures: Who is committing to the SLA?
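The six elements above can be captured as a simple structured record so that completeness of the agreement itself can be checked automatically. A minimal sketch in Python; all field values and names are hypothetical:

```python
# A hypothetical data SLA record covering the six key elements.
data_sla = {
    "purpose": "Guarantee reliable daily sales data for the analytics team",
    "promise": "Sales dataset refreshed by 06:00 UTC, >= 99% of fields populated",
    "measurement": "Automated pipeline checks, reviewed weekly by data engineering",
    "penalties": "Breaches logged and escalated to the data platform owner",
    "requirements": "Source systems must deliver raw extracts by 04:00 UTC",
    "signatures": ["Data producer: analytics engineering", "Data consumer: finance"],
}

def is_complete(sla: dict) -> bool:
    """Check that every required SLA element is present and non-empty."""
    required = {"purpose", "promise", "measurement",
                "penalties", "requirements", "signatures"}
    return required <= sla.keys() and all(sla[k] for k in required)

print(is_complete(data_sla))  # True
```

Keeping the agreement in a structured form like this also makes it easy to publish via an intranet and to validate it when it is revisited.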

They are made up of two main parts:

  • Service Level Indicators (SLIs), which identify and set quantifiable metrics for data quality
  • Service Level Objectives (SLOs), which set the agreed targets that these metrics must meet, such as a normal update frequency

SLAs then combine these two parts and set out any consequences if the SLOs are breached.
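The relationship between SLIs, SLOs and the SLA check can be sketched in code. In this illustrative Python example, the freshness metric and the 24-hour threshold are hypothetical choices, not values prescribed by any standard:

```python
from datetime import datetime, timedelta, timezone

# SLI: a quantifiable measurement of the data asset.
def freshness_sli(last_updated: datetime) -> timedelta:
    """Measure how stale the data is right now."""
    return datetime.now(timezone.utc) - last_updated

# SLO: the agreed target that the SLI must meet (hypothetical: daily refresh).
FRESHNESS_SLO = timedelta(hours=24)

# SLA check: combines the SLI with the SLO and flags a breach.
def check_freshness_sla(last_updated: datetime) -> bool:
    """Return True if the freshness SLO is met, False if the SLA is breached."""
    return freshness_sli(last_updated) <= FRESHNESS_SLO

recent = datetime.now(timezone.utc) - timedelta(hours=2)
stale = datetime.now(timezone.utc) - timedelta(days=3)
print(check_freshness_sla(recent))  # True
print(check_freshness_sla(stale))   # False
```

In practice, a breach detected by a check like this would trigger whatever consequences the SLA defines, such as an alert or escalation.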

What do data SLAs commonly cover?

The type of metrics covered include:

  • Data freshness: Measured in agreed terms (minutes/hours/days/weeks) 
  • Data completeness: The expected proportion of populated fields and records
  • Data accuracy: Defining acceptable error rates for data values, including formatting
  • Data availability: Uptime and accessibility for data

What is the difference between a data SLA and a data contract?

While they are connected, data SLAs and data contracts differ in how they support effective data sharing. Often part of data products, data contracts are broader documents agreed between data producers and consumers, outlining what data consumers will receive, including its format and schema, and how it can be used. By contrast, data SLAs set the performance metrics and KPIs around areas such as freshness and quality. Often, a data SLA is included within the data contract itself.
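The nesting described above, where an SLA sits inside a broader data contract, can be illustrated with a small structured example. All dataset names, schema fields and thresholds here are hypothetical:

```python
# Hypothetical data contract embedding a data SLA.
data_contract = {
    "dataset": "sales_daily",
    "format": "parquet",  # what consumers will receive
    "schema": {"order_id": "string", "amount": "float", "date": "date"},
    "allowed_use": "internal reporting only",
    # The SLA's performance commitments are nested inside the contract.
    "sla": {
        "freshness": "updated by 06:00 UTC daily",
        "completeness": ">= 99% of fields populated",
        "availability": "99.9% uptime",
    },
}

print("sla" in data_contract)  # True
```

The contract answers "what data, in what shape, for what use", while the nested SLA answers "how well it must perform".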

What are data SLA best practices?

To ensure success, the process of creating, agreeing and monitoring data SLAs should:

  • Involve all relevant stakeholders, especially different departments
  • Be formally written, agreed and signed by both sides
  • Align with business and compliance objectives
  • Be technically possible with existing technology infrastructures
  • Be clear in the metrics/targets set, and use agreed language and terms to describe them 
  • Ensure accountability for non-compliance and assign key roles and responsibilities
  • Include monitoring and enforcement mechanisms if SLAs are breached
  • Be revisited regularly to ensure they are still relevant and performing in line with expectations