Design and deploy a set of universal plugins for data sharing, monetization and trading platforms that enable actors in common European data spaces to collaboratively negotiate, improve and enforce data-sharing contracts automatically. The plugins provide dynamic fair-pricing mechanisms, implement energy-efficient data exchange, ensure privacy, confidentiality and compliance with legislation, and adhere to ethical and responsibility guidelines.
The general objective of the KnowledgeSpaces project is to provide theoretical and technological advances for the creation and exploitation of knowledge spaces, using them in three pilots on the domains of cities, public administrations and science. The specific objectives are as follows:
- Governance and Architecture of knowledge spaces, including legal and ethical compliance by design
- Scalable methods, algorithms and tools for rich, ethical/legal-by-design KG construction from very large tabular and tree-based data sources
- Scalable methods, algorithms and tools for KG construction and hybrid KG-NLP exploitation from multilingual textual sources
- KG secure storage and exploitation, with explainability and compliance validation
- Creation of knowledge spaces in three domains:
- In the domain of a city, focusing on how the city is addressing Sustainable Development Goals
- For public administration, enabling innovative “legal tech” applications supporting practitioners as well as other services of public interest
- In a scientific domain, focused on nuclear energy and biomedicine
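The tabular-to-KG construction named in the objectives can be illustrated with a minimal, hypothetical sketch (pure Python, no RDF library): each CSV row is mapped to RDF triples. The namespace and the column-to-predicate mapping here are our illustrative assumptions, not the project's actual mapping rules.

```python
import csv, io

# Hypothetical example namespace; real deployments would use project vocabularies.
EX = "http://example.org/"

def rows_to_triples(csv_text, subject_col, predicate_map):
    """Map each CSV row to (subject, predicate, object) RDF triples."""
    triples = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        subj = EX + row[subject_col]
        for col, pred in predicate_map.items():
            if row.get(col):
                triples.append((subj, EX + pred, row[col]))
    return triples

data = "city,population\nBologna,390000\n"
triples = rows_to_triples(data, "city", {"population": "hasPopulation"})
# → [("http://example.org/Bologna", "http://example.org/hasPopulation", "390000")]
```

In practice this mapping step is declarative (e.g. expressed in a mapping language rather than hand-written code), but the row-to-triples principle is the same.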
SolarChem aims to develop “Digital Solar Chemistry” technologies based on the integration of Artificial Intelligence into the design of highly effective and selective bio-based hybrid photoelectrodes and solar reactors that transform Earth-abundant resources and waste feedstocks into fuels and chemicals. The specific objectives are as follows:
- To develop AI-robotized photoelectrode manufacturing strategies
- To design and optimize heterogeneous photoelectro-enzyme-based catalysts
- To design and construct a solar photoelectrochemical device in which to carry out chemical reactions
- To generate an Open Science framework that represents, as a Knowledge Graph, solar chemistry processes commonly reported as text in the scientific literature, together with data from real photoelectrocatalytic experiments carried out in wet labs
- To apply Machine Learning algorithms over the experiments represented in the KG in order to optimize the chemical reactions, guide the interactive decision-making procedure, and design new experiments and schedules by actively learning from previously acquired data
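The active-learning objective above can be illustrated with a toy sketch. The selection heuristic here, picking the candidate experiment whose parameter settings are farthest from anything already tried (a common uncertainty proxy), is our illustrative assumption, not the project's actual method.

```python
# Toy active-learning step: choose the most "informative" next experiment.
# Parameter settings are points in a normalized experiment-parameter space.

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def next_experiment(done, candidates):
    """Pick the candidate whose nearest already-run experiment is farthest away."""
    return max(candidates,
               key=lambda c: min(distance(c, d) for d in done))

done = [(0.1, 0.2), (0.9, 0.8)]          # settings already tried in the wet lab
candidates = [(0.5, 0.5), (0.15, 0.25), (0.95, 0.75)]
print(next_experiment(done, candidates))  # → (0.5, 0.5)
```

A real system would score candidates with a surrogate model trained on the KG-stored experiment outcomes, but the loop structure (score candidates, run the most informative, add the result, repeat) is the same.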
In the ATTENTION! project, we analyse the largest trade database of imports and exports available globally, together with extensive Web content and metadata. We develop machine learning models to understand and detect patterns of illicit trade activity and to expose the perpetrators and their support systems.
The mission and objectives of Polifonia can be summarized as follows:
- To increase large-scale interoperability between heterogeneous musical heritage (MH) resources (at data level and at Web scale) and to reduce the effort required for integrating MH resources
- To encode MH knowledge that is hidden in texts or orally transmitted in order to support its preservation and protection
- To discover and make explicit common features of music objects and their link to tangible heritage
- To support music classification
- To enhance music comprehension as well as the understanding of its identity, history and socio-cultural impact
- To facilitate the discovery, indexing, querying, and searching of interlinked MH sources on the Web
- To facilitate the management of large collections through automated music classification
- To facilitate the study of MH knowledge and the reuse of results by enabling visual, interactive exploration of interlinked collections
- To enhance the way we perceive, experience and access music
- To facilitate the creation of a MH knowledge ecosystem
- To support economically sustainable and effective promotion of European MH
- To increase the economic and social impact of MH knowledge
Semantics- and data-driven citizen curation of cultural heritage: we provide technologies that help communities to create, share and reflect over their own interpretations of cultural heritage. Citizens can use our tools to share their opinions and engage with a diverse range of perspectives.
Efficient Explainable Learning on Knowledge Graphs (ENEXA) is a European project developing human-centered explainable machine learning approaches for real world knowledge graphs.
ENEXA devises new machine learning approaches that maintain formal guarantees pertaining to completeness and correctness while exploiting different representations of knowledge graphs (formal logics, embeddings and tensors) in a concurrent fashion. With our new methods, we plan to achieve significant advances in the scalability of machine learning, especially on knowledge graphs. A key innovation of ENEXA lies in its approach to explainability: we focus on devising human-centred explainability techniques based on the concept of co-construction.
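One concrete example of the embedding representations such approaches exploit is TransE-style triple scoring, where a triple (h, r, t) is plausible if the head vector translated by the relation vector lands near the tail vector. The entities and vectors below are invented for illustration and do not come from ENEXA.

```python
# TransE scoring sketch: score(h, r, t) = -||h + r - t||_1 (higher = more plausible).

def transe_score(h, r, t):
    """Negative L1 distance of (h + r) from t."""
    return -sum(abs(h[i] + r[i] - t[i]) for i in range(len(h)))

# Tiny hand-made 2-dimensional embeddings, chosen so the true triple scores well.
emb = {
    "Paris":     [0.9, 0.1],
    "Berlin":    [0.2, 0.3],
    "France":    [1.0, 0.5],
    "capitalOf": [0.1, 0.4],
}

good = transe_score(emb["Paris"], emb["capitalOf"], emb["France"])
bad = transe_score(emb["Berlin"], emb["capitalOf"], emb["France"])
# the true triple (Paris, capitalOf, France) scores higher than the corrupted one
```

Combining such sub-symbolic scores with formal-logic representations of the same graph, concurrently and with guarantees, is precisely the hard part the project targets.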
Advancements in digital transformation have enriched the interactive experience of cultural heritage. The adoption of digital technologies has benefited the development of virtual museum tours, the preservation of cultural assets, and digitised archives, enabling broader and more engaging access to cultural heritage. Despite these critical advances, structural deficiencies remain: lack of accessibility for all, limited opportunities for all members of society to participate in cultural and creative industries on an equal basis, and limited interoperable digital repositories for archiving accessible multisensory cultural assets and related archival structures. With one billion people experiencing some form of disability, and one-fifth of these experiencing significant disabilities, access and opportunities to engage with cultural assets are not equally available to all.
The MuseIT project aims to co-design, develop, and co-evaluate a multisensory, user-centred platform for enriched engagement with cultural assets, with inclusion and equal opportunity for all as core principles. The MuseIT innovation is rooted in multisensory representations of cultural heritage that extend beyond the visual and auditory senses. Participatory co-design activities will be integral, enabling users with disabilities to be involved in decisions and designs that will affect their lives. The enriched experiences of cultural assets created by MuseIT will not be limited to this group and are likely to enhance the level of participation and enjoyment for all people. Impacts are likely to include the democratization of cultural asset experiences and tangible growth in the creative and cultural industries.
The KATY project pursues the following objectives:
- Linking omics and clinical data in a Knowledge Graph
- Providing a predictive system to clinicians for AI-based treatment recommendations, supporting them in selecting the treatment best suited to each patient
- Setting up a proof-of-concept application of AI models and knowledge graphs in the context of a clinical pilot in renal cancer
- Reducing the burden of disease for renal cancer patients by applying existing treatments in a more targeted way
- Enhancing the overall diagnostic capacity for complex diseases by using AI-based models to predict patient response to targeted therapies and to identify molecular evidence supporting these predictions
DataCloud delivers a toolbox of new languages, methods, infrastructures, and prototypes for discovering, simulating, deploying, and adapting Big Data pipelines on heterogeneous and untrusted resources. DataCloud separates the design from the run-time aspects of Big Data pipeline deployment, empowering domain experts to take an active part in their definitions.
Its aim is to lower the technological entry barriers for the incorporation of Big Data pipelines in organizations’ business processes and make them accessible to a wider set of stakeholders regardless of the hardware infrastructure. DataCloud validates its plan through a strong selection of complementary business cases offered by SMEs and a large company targeting higher mobile business revenues in smart marketing campaigns, reduced production costs of sport events, trustworthy eHealth patient data management, and reduced time to production and better analytics in Industry 4.0 manufacturing.
OntoCommons lays the foundation for interoperable, harmonised and standardised data documentation through ontologies, facilitating data sharing and boosting data-driven innovation, in order to bring about a true Digital Single Market and new business models for European industry, exploit the opportunities of digitalisation, and address sustainability challenges. This will be achieved by developing the Ontology Commons EcoSystem (OCES), a set of ontologies and tools that follows specific standardisation rules, and by providing a sustainable approach that makes data FAIR (Findable, Accessible, Interoperable and Reusable). Moreover, the OCES implements practical and user-friendly mechanisms of intra- and cross-domain interoperability.
Designed to meet the needs of semantic web and linked data environments, VocBench development has also been driven by feedback gathered from a community of users composed of public organizations, companies and independent users looking for open-source solutions for maintaining their ontologies, thesauri, code lists and authority resources.
VocBench (or, simply, VB) business and data access layers are realized by Semantic Turkey, an open-source platform for Knowledge Acquisition and Management realized by the ART Research Group at the University of Rome Tor Vergata.
VocBench offers a powerful editing environment, with facilities for management of OWL ontologies, SKOS/SKOS-XL thesauri, OntoLex lexicons and any sort of RDF dataset. It aims to set new standards for flexibility, openness and expressive power as a free and open source RDF modelling platform.
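As a minimal illustration of the kind of SKOS data such an editor manages, the sketch below hand-serializes one concept with language-tagged preferred labels to Turtle. The concept URI and labels are invented for the example; VocBench itself provides this through its editing UI and the Semantic Turkey backend, not through code like this.

```python
# Hand-rolled Turtle serialization of a single SKOS concept (illustration only).

def skos_concept(uri, pref_labels):
    """Render a skos:Concept with language-tagged skos:prefLabel values."""
    labels = " ;\n  ".join(
        f'skos:prefLabel "{text}"@{lang}' for lang, text in pref_labels)
    return (f"@prefix skos: <http://www.w3.org/2004/02/skos/core#> .\n"
            f"<{uri}> a skos:Concept ;\n  {labels} .")

ttl = skos_concept("http://example.org/concept/water",
                   [("en", "water"), ("it", "acqua")])
print(ttl)
```

Multilingual labelling of this kind is exactly what thesaurus maintainers use such tools for; SKOS-XL additionally reifies each label as a resource so it can carry its own metadata.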
COGITO aims to materialise the digitalisation benefits for the construction industry by harmonising Digital Twins with the Building Information Model and building a digital Construction 4.0 tool-box to unleash the untapped potential in productivity improvement and increased safety.
- To deliver a Construction Digital Twin platform that fuses as-designed BIM information with as-is BIM information and live big-data streams from IoT sensors into a live digital representation of the construction site, enabling real-time, remote visual inspection of ongoing works by construction stakeholders
- To alleviate construction project cost and time over-runs through the delivery and validation of vertical digital tools for Quality Control and Workflow Management, based on a Digital Twin platform that facilitates immediate detection of discrepancies or deviations on the construction site
- To effectively reduce construction site accidents through the development and showcasing of digital tools for Health & Safety management.
- To demonstrate the construction digital twin on actual construction sites, quantifying the benefits, evaluating acceptance by the construction labour force, and obtaining feedback for improvements
- To research, design and promote for standardization data exchange formats that facilitate the interoperability of construction digital twins with future evolutions of the Building Information Model
- To promote the adoption of the COGITO solution through intense dissemination and knowledge transfer of the project outcomes toward the targeted stakeholders, reaching out to audiences within and beyond the EU