
Linked Open Data (LOD)

Linked Open Data (LOD) refers to information that is both accessible to everyone and structured in a way that machines can interpret. Linked Open Data is based on the Linked Data model developed by computer scientist, physicist, and World Wide Web inventor Tim Berners-Lee. This model provides a structured approach to interconnecting data openly and accessibly on the web.

Linked Open Data is part of the vision of a data-centric Web 3.0. It consists of raw data that adheres to shared standards, making it easier to use, share, and interoperate.

This Web of Data takes shape through the model proposed by the W3C (World Wide Web Consortium, the main international standards organization for the World Wide Web): the Semantic Web, a common framework that allows data to be shared and reused across application, organization, and community boundaries.

Differences between Open Data and Linked Open Data

It is important to distinguish between Open Data and Linked Open Data.

Indeed, data can be open without being linked, and vice versa.

Open Data

Open Data refers to data that is freely accessible without legal (such as copyright), technical, or financial restrictions. This promotes transparency, innovation, and data-driven decision-making, as the data can be used, shared, and modified by anyone outside the organization that originally created it.

Linked Open Data

Linked Open Data differs due to its semantic interconnections. It is structured in a way that is understandable by machines, thus facilitating integration and automatic analysis.

Advantages of Linked Open Data

  • Efficient use of resources: Linked Open Data enables individuals and organizations to benefit from data that has already been collected, processed, and made available. This encourages collaboration and reuse of existing resources.
  • Improvement of information quality: Linked Open Data encourages standardization of metadata and data formats, making the data more reliable and credible.
  • Creation of added value: By directly connecting to other data, Linked Open Data allows users to discover, use, and reuse information in innovative ways.
  • Identification of information gaps: Linked Open Data allows errors in the data to be highlighted and corrected.
  • Enhancement of transparency: Linked Open Data is accessible to everyone.

Technical principles of Linked Open Data

Linked Open Data is based on several fundamental technical principles:

Availability without excessive restriction

Linked Open Data may be subject to licenses, but these licenses must be open and promote free reuse of the data. Licenses such as Creative Commons provide this legal framework while encouraging collaboration and data reuse.

RDF Model

To ensure their interoperability and understanding by automated systems, data must be based on the RDF (Resource Description Framework) model.

Developed by W3C, RDF is used to formally describe web resources and their metadata.

This model uses subject-predicate-object triples to represent data in a structured manner and enable machine processing.
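The triple structure can be sketched in plain Python. The predicate URIs below come from real vocabularies (Dublin Core, FOAF), but the example.org resources are hypothetical; a production system would use an RDF library and a serialization such as Turtle rather than tuples:

```python
# A minimal, illustrative triple store: each fact is a
# (subject, predicate, object) tuple, with URIs as plain strings.
# The example.org identifiers are hypothetical.
triples = [
    ("http://example.org/book/123",
     "http://purl.org/dc/terms/title", "Weaving the Web"),
    ("http://example.org/book/123",
     "http://purl.org/dc/terms/creator", "http://example.org/person/tbl"),
    ("http://example.org/person/tbl",
     "http://xmlns.com/foaf/0.1/name", "Tim Berners-Lee"),
]

def objects(subject: str, predicate: str) -> list:
    """Return all objects matching a subject-predicate pair."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Following a link: the object of one triple is the subject of another.
creator = objects("http://example.org/book/123",
                  "http://purl.org/dc/terms/creator")[0]
print(objects(creator, "http://xmlns.com/foaf/0.1/name"))  # ['Tim Berners-Lee']
```

Because the creator is itself a URI, a machine can hop from the book to the person it links to, which is exactly what makes the data "linked".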

Uniform Resource Identifier (URI)

Each data resource must have a unique and permanent URI online, which simplifies its identification and access.

A URI is a global identifier that distinguishes real-world, abstract, or digital items through unique names. URIs are essential for linking data together and enabling exploration across the web.

Standard HTTP protocol

Data must be made available online following the standard HTTP protocol (Hypertext Transfer Protocol), ensuring its availability and retrieval by users and applications.
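In practice, Linked Data servers commonly support HTTP content negotiation: the same URI can return an HTML page to a browser or an RDF serialization to a machine, depending on the Accept header. The endpoint below is hypothetical, and the sketch only builds the request without sending it:

```python
from urllib.request import Request

# Hypothetical endpoint; a Linked Data server would return an RDF
# serialization (here Turtle) when asked for one via content negotiation.
req = Request(
    "https://data.example.org/resource/book/9780062515872",
    headers={"Accept": "text/turtle"},  # ask for Turtle rather than HTML
)

# The request is constructed but not sent; urlopen(req) would fetch it.
print(req.get_header("Accept"))  # text/turtle
```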

Steps for publishing Linked Open Data

Linked Open Data can be implemented in 7 steps:

1. License selection

It is crucial to define the rights to use the published data by specifying the data owner and the conditions for reuse.

2. Collection of Linked Open Data

This step involves the rigorous collection of relevant data to be published, ensuring that it is complete, reliable, and compliant with established quality standards. This may include identifying data sources and validating the accuracy of the collected data.

3. Attribution of URIs

Assigning URIs (Uniform Resource Identifiers) is essential for describing resources and the links between them. This step requires choosing a domain name that will remain stable over time, as well as the identifier that will describe each resource. This can be an existing identifier, such as a book's ISBN.
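Minting a URI from a stable domain and an existing identifier can be sketched as follows; the base domain and the URI pattern are hypothetical assumptions, not a prescribed scheme:

```python
def mint_uri(base: str, resource_type: str, identifier: str) -> str:
    """Build a stable URI from a base domain, a resource type,
    and an existing identifier such as a book's ISBN."""
    return f"{base}/{resource_type}/{identifier}"

# Hypothetical base domain; in practice it must remain stable over time,
# since changing it would break every link pointing at the data.
BASE = "https://data.example.org"
print(mint_uri(BASE, "book", "9780062515872"))
# https://data.example.org/book/9780062515872
```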

4. Analysis of Linked Open Data

This step involves editorializing and normalizing Linked Open Data. It requires close collaboration with data producers or experts and involves a thorough analysis of the data, including format, quantity, quality, etc.

5. Enrichment of Linked Open Data

Data enrichment involves adding information that gives meaning to the original data. This can include translations, definitions, or contextual information.
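Enrichment can be illustrated by attaching language-tagged labels to an existing resource. The `rdfs:label` predicate is a real RDF Schema term, while the resource and values are hypothetical:

```python
# Enrichment: attach human-readable, language-tagged labels to a resource.
# The example.org resource and titles are hypothetical.
RDFS_LABEL = "http://www.w3.org/2000/01/rdf-schema#label"
resource = "http://example.org/book/123"
enrichment = [
    (resource, RDFS_LABEL, ("Weaving the Web", "en")),
    (resource, RDFS_LABEL, ("Tisser la Toile", "fr")),  # French translation
]

def labels(triples: list, subject: str, lang: str) -> list:
    """Return the labels of a subject in a given language."""
    return [v for s, p, (v, l) in triples if s == subject and l == lang]

print(labels(enrichment, resource, "fr"))  # ['Tisser la Toile']
```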

6. Modeling of Linked Open Data

This step involves organizing Linked Open Data into a semantic model to ensure interoperability.
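One common modeling task is mapping local field names to terms from shared vocabularies, so that other datasets using the same terms become interoperable. The Dublin Core term URIs are real; the mapping and record are hypothetical examples:

```python
# Modeling: map local field names to shared vocabulary terms (Dublin Core).
# The local field names and the record are hypothetical.
mapping = {
    "title": "http://purl.org/dc/terms/title",
    "author": "http://purl.org/dc/terms/creator",
    "published": "http://purl.org/dc/terms/issued",
}

record = {"title": "Weaving the Web",
          "author": "Tim Berners-Lee",
          "published": "1999"}

# Turn the flat record into triples about a (hypothetical) resource URI.
subject = "http://example.org/book/123"
triples = [(subject, mapping[field], value) for field, value in record.items()]
print(len(triples))  # 3
```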

7. Publication of Linked Open Data

Before publication, it is recommended to check the quality of the dataset: for example, whether it links to other datasets, whether a usage license is provided, and whether the provenance of the metadata is clearly established.
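These pre-publication checks could be automated along the following lines; the metadata field names are hypothetical, not a standard schema:

```python
def publication_checks(dataset: dict) -> list:
    """Return a list of quality issues to resolve before publishing.
    The metadata keys used below are hypothetical."""
    issues = []
    if not dataset.get("license"):
        issues.append("no usage license declared")
    if not dataset.get("links_to"):
        issues.append("not linked to any other dataset")
    if not dataset.get("provenance"):
        issues.append("metadata provenance not established")
    return issues

# A draft dataset with a license and provenance, but no links yet.
draft = {"license": "CC-BY-4.0",
         "links_to": [],
         "provenance": "Example City open data team"}
print(publication_checks(draft))  # ['not linked to any other dataset']
```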
