Data Service Level Agreement (SLA)

A data service level agreement (SLA) sets out guaranteed performance levels around quality, reliability and availability for data assets.

What is a data SLA?

A data service level agreement (SLA) sets out guaranteed performance levels around quality, reliability and availability for data assets shared between stakeholders. It ensures that all parties are aligned on data quality expectations and includes ways of measuring and monitoring adherence.

Just as traditional service level agreements provide guarantees for the availability and quality of services, such as software uptime or response times, data SLAs formally and publicly set out what stakeholders should expect from the shared data they access, along with any penalties if those expectations are not met. This approach is vital at a time when data is being created in a more distributed manner and shared more widely. By improving communication and building trust, data SLAs drive greater data usage and value.

Data SLAs can be internal, such as between data producers and consumers, or external, between companies and their customers when they provide data services. They should be written down and formalized to make commitments concrete and clear, using agreed language, such as defined timescales for the regularity of data updates.

Why are data SLAs important?

Data SLAs deliver a range of benefits:

  • They set expectations around the supply of data assets, including KPIs around quality, reliability and timeliness
  • They reduce conflict and disagreement through clear terms and expectations
  • They ensure accountability between data producers and consumers
  • They build trust within data ecosystems
  • They bring together departments and create a common language and communication channels around data 

What does a data SLA contain?

Data SLAs are normally short (250-500 word) documents shared internally in an accessible way via corporate intranets or Google Docs. They should cover six key elements:

  • Purpose: Why does this data SLA exist? 
  • Promise: What is being committed to?
  • Measurement: How will performance be measured and by whom?
  • Penalties: What happens if data SLAs are breached?
  • Requirements: What needs to happen for the data SLA to be valid?
  • Signatures: Who is committing to the SLA?
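
As an illustration, these six elements can be captured in a simple structured document. The sketch below expresses one as a Python dictionary; all field names and values are hypothetical assumptions, not a prescribed format.

```python
# Hypothetical data SLA document, expressed as a Python dict.
# Every field name and value here is an illustrative assumption.
data_sla = {
    "purpose": "Guarantee reliable delivery of the sales dataset to the BI team",
    "promise": "Daily refresh by 06:00 UTC with at least 99% field completeness",
    "measurement": {
        "how": "Automated checks run by the data platform after each load",
        "by_whom": "Data engineering team",
    },
    "penalties": "Breaches are logged, escalated to the data owner and reviewed monthly",
    "requirements": "Source systems must deliver raw data by 04:00 UTC",
    "signatures": ["Data producer: Sales Ops", "Data consumer: BI team"],
}
```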

They are made up of two main parts:

  • Service Level Indicators (SLIs), which identify and set quantifiable metrics for data quality
  • Service Level Objectives (SLOs), which set the agreed targets these metrics must meet, such as a normal update frequency

SLAs then combine these two parts and set out any consequences if the SLOs are breached.
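
To make this relationship concrete, here is a minimal Python sketch: an SLI measures how fresh the data is, an SLO sets the target that measurement must meet, and the SLA layer applies a consequence on breach. The 24-hour target, the sample timestamp and the escalation step are assumptions chosen for illustration.

```python
from datetime import datetime, timedelta, timezone

# SLO: the agreed target (a 24-hour freshness window, assumed for this example).
FRESHNESS_SLO = timedelta(hours=24)

def freshness_sli(last_updated: datetime) -> timedelta:
    """SLI: the quantifiable measurement - how old the data currently is."""
    return datetime.now(timezone.utc) - last_updated

def check_sla(last_updated: datetime) -> None:
    """SLA: compare the SLI against the SLO and trigger a consequence on breach."""
    age = freshness_sli(last_updated)
    if age > FRESHNESS_SLO:
        # The consequence is illustrative; a real SLA might open an
        # incident ticket or notify the data producer.
        print(f"SLA breached: data is {age} old, SLO is {FRESHNESS_SLO}")
    else:
        print(f"SLA met: data is {age} old")

check_sla(last_updated=datetime(2024, 1, 1, tzinfo=timezone.utc))
```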

What do data SLAs commonly cover?

The metrics covered typically include:

  • Data freshness: How up to date data must be, measured in agreed units (minutes, hours, days or weeks)
  • Data completeness: How complete data should be, such as the proportion of required fields that must be populated
  • Data accuracy: Defining acceptable error rates for data values, including formatting
  • Data availability: Uptime and accessibility targets for data
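
As a sketch of how such metrics might be measured in practice, the Python snippet below computes completeness and accuracy for a handful of records. The sample data, the required field and the naive format rule are all assumptions, not a standard method.

```python
# Hypothetical records; "email" is the required field we measure against.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "not-an-email"},
]

# Completeness: share of records where the required field is populated.
completeness = sum(1 for r in records if r["email"]) / len(records)

# Accuracy: share of populated values that match an agreed format rule
# (a simple "@" check here, purely for illustration).
populated = [r for r in records if r["email"]]
accuracy = sum(1 for r in populated if "@" in r["email"]) / len(populated)

print(f"completeness={completeness:.0%}, accuracy={accuracy:.0%}")
```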

What is the difference between a data SLA and a data contract?

While they are connected, data SLAs and data contracts differ in how they support effective data sharing. Often part of data products, data contracts are broader documents agreed between data producers and consumers, outlining what data consumers will receive, including its format and schema, and how it can be used. By contrast, data SLAs set the performance metrics and KPIs around areas such as freshness and quality. Often, a data SLA is included within the data contract itself.
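
To illustrate that nesting, a data contract might embed the SLA as one section alongside schema and usage terms. This Python sketch shows a hypothetical shape; the dataset, fields and commitments are assumptions, not a standard format.

```python
# Hypothetical data contract with an embedded SLA section.
data_contract = {
    "dataset": "sales_orders",
    "schema": {"order_id": "string", "amount": "decimal", "created_at": "timestamp"},
    "format": "parquet",
    "allowed_use": "Internal reporting only",
    "sla": {  # the performance commitments live inside the broader contract
        "freshness": "updated daily by 06:00 UTC",
        "completeness": ">= 99% of required fields populated",
        "availability": "99.9% uptime",
    },
}
```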

What are data SLA best practices?

To ensure success, the process of creating, agreeing and monitoring data SLAs should:

  • Involve all relevant stakeholders, especially different departments
  • Be formally written, agreed and signed by both sides
  • Align with business and compliance objectives
  • Be technically possible with existing technology infrastructures
  • Be clear in the metrics/targets set, and use agreed language and terms to describe them 
  • Ensure accountability for non-compliance and assign key roles and responsibilities
  • Include mechanisms for monitoring compliance and enforcing penalties if SLAs are breached
  • Be revisited regularly to ensure they are still relevant and performing in line with expectations