Using shape expressions (ShEx) to share RDF data models and to guide curation with rigorous validation

Katherine Thornton, Harold Solbrig, Gregory S. Stupp, Jose Emilio Labra Gayo, Daniel Mietchen, Eric Prud’hommeaux, Andra Waagmeester

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Scopus citations


We discuss Shape Expressions (ShEx), a concise, formal, modeling and validation language for RDF structures. For instance, a Shape Expression could prescribe that subjects in a given RDF graph that fall into the shape “Paper” are expected to have a section called “Abstract”, and any ShEx implementation can confirm whether that is indeed the case for all such subjects within a given graph or subgraph. There are currently five actively maintained ShEx implementations. We discuss how we use the JavaScript, Scala and Python implementations in RDF data validation workflows in distinct, applied contexts. We present examples of how ShEx can be used to model and validate data from two different sources, the domain-specific Fast Healthcare Interoperability Resources (FHIR) and the domain-generic Wikidata knowledge base, which is the linked database built and maintained by the Wikimedia Foundation as a sister project to Wikipedia. Example projects that are using Wikidata as a data curation platform are presented as well, along with ways in which they are using ShEx for modeling and validation. When reusing RDF graphs created by others, it is important to know how the data is represented. Current practices of using human-readable descriptions or ontologies to communicate data structures often lack sufficient precision for data consumers to quickly and easily understand data representation details. We provide concrete examples of how we use ShEx as a constraint and validation language that allows humans and machines to communicate unambiguously about data assets. We use ShEx to exchange and understand data models of different origins, and to express a shared model of a resource’s footprint in a Linked Data source. We also use ShEx to agilely develop data models, test them against sample data, and revise or refine them. The expressivity of ShEx allows us to catch disagreement, inconsistencies, or errors efficiently, both at the time of input, and through batch inspections. 
ShEx addresses the need of the Semantic Web community to ensure data quality for RDF graphs. It is currently being used in the development of FHIR/RDF. The language is sufficiently expressive to capture constraints in FHIR, and the intuitive syntax helps people quickly grasp the range of conformant documents. The FHIR publication workflow tests its example documents against the ShEx schemas, catching non-conformant data before they reach the public. ShEx is also currently used in Wikidata projects such as Gene Wiki and WikiCite to develop quality-control pipelines that maintain data integrity and incorporate or harmonize differences in data across different parts of the pipelines.
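To make the "Paper" example above concrete, here is a minimal ShEx schema sketch. The `ex:` namespace and the property names are invented for illustration; they are not taken from the paper, FHIR, or Wikidata:

```shex
PREFIX ex:  <http://example.org/>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

# Any node tested against ex:Paper must have exactly one title,
# exactly one abstract, and one or more authors given as IRIs.
ex:Paper {
  ex:title    xsd:string ;   # exactly one title string
  ex:abstract xsd:string ;   # exactly one abstract section
  ex:author   IRI +          # at least one author IRI
}
```

Any of the implementations discussed in the paper (for example the JavaScript, Scala, or Python validators) can take such a schema, an RDF graph, and a set of focus nodes, and report for each node whether it conforms to the `ex:Paper` shape.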

Original language: English (US)
Title of host publication: The Semantic Web - 16th International Conference, ESWC 2019, Proceedings
Editors: Pascal Hitzler, Miriam Fernández, Alasdair J.G. Gray, Karl Hammar, Krzysztof Janowicz, Armin Haller, Amrapali Zaveri, Vanessa Lopez
Publisher: Springer Verlag
Number of pages: 15
ISBN (Print): 9783030213473
State: Published - 2019
Event: 16th Extended Semantic Web Conference, ESWC 2019 - Portorož, Slovenia
Duration: Jun 2, 2019 - Jun 6, 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11503 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 16th Extended Semantic Web Conference, ESWC 2019


Keywords

  • Digital preservation wd:Q632897
  • FHIR wd:Q19597236
  • RDF wd:Q54872
  • ShEx wd:Q29377880
  • Wikidata wd:Q2013

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

