
Our technological outcomes


On this page, you can find descriptions of and links to the main interactive technologies we developed.
Explore our multisensory WebUI, dive into immersive VR exhibitions, and experience AI-driven emotional soundscapes with MuseMeUp.
Browse inclusive creative resources in our Repository, access open-source tools on our GitHub, and discover more innovative platforms designed to make culture accessible, engaging, and interactive for all.

WebUI

MuseIT web UI is a multi-sensory, multilingual web gallery where cultural heritage comes alive through image, text, sound, and touch. From a clean landing page, users can dive into a curated exhibition of roughly 900 artefacts and explore each piece with layered representations designed for accessibility and discovery. For many items, visuals translate into notes or haptics, turning browsing into an embodied experience.

The interface adapts to users: a preference panel adjusts haptic intensity and audio levels, recommendations suggest related works via the MuseIT Knowledge Graph, and an optional camera-driven feature composes a personalized soundtrack from users’ emotions as they navigate the collection. Screen-reader compatibility pairs with WCAG 2.2–tested controls: zoom, high contrast, increased letter spacing, dyslexia mode, hidden images, and saturation and cursor options.
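A preference panel like the one described can be modelled as a small per-user settings object. The sketch below is purely illustrative; the field names and value ranges are assumptions, not the WebUI's actual data model.

```python
from dataclasses import dataclass

@dataclass
class AccessPrefs:
    """Hypothetical per-user accessibility preferences (illustrative names)."""
    haptic_intensity: float = 0.5   # 0.0 (off) to 1.0 (maximum)
    audio_level: float = 0.7        # playback volume, 0.0 to 1.0
    high_contrast: bool = False
    dyslexia_mode: bool = False
    letter_spacing: float = 1.0     # multiplier over the default spacing
    hide_images: bool = False

    def clamped(self) -> "AccessPrefs":
        # Keep continuous settings inside their valid ranges before applying them.
        return AccessPrefs(
            haptic_intensity=min(max(self.haptic_intensity, 0.0), 1.0),
            audio_level=min(max(self.audio_level, 0.0), 1.0),
            high_contrast=self.high_contrast,
            dyslexia_mode=self.dyslexia_mode,
            letter_spacing=min(max(self.letter_spacing, 1.0), 2.0),
            hide_images=self.hide_images,
        )

# An out-of-range slider value is clamped rather than rejected.
prefs = AccessPrefs(haptic_intensity=1.4, dyslexia_mode=True).clamped()
```

Clamping rather than rejecting invalid values keeps the panel forgiving: a user dragging a slider never sees an error, only the nearest valid setting.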

Virtual Reality Exhibition (in WebUI)

The Virtual Reality (VR) exhibition is a fully immersive, multi-sensory cultural heritage experience integrating visual, auditory, musical, and haptic modalities. The exhibition plan of the Virtual Museum consists of four thematic units: Landscape, History, Religion, and People. In a fifth and special thematic area, five paintings with interactive audio and haptic feedback further enrich inclusion, cultural engagement, and accessibility. During VR sessions, participants wear non-invasive sensors recording brain activity, skin conductance, and heart rate. The signals are analyzed with machine-learning methods to gain insights into how visitors emotionally connect with cultural assets. The Virtual Museum serves as an experimental space for multimodal, multi-sensory research, making inclusion and immersion in digital cultural experiences a reality.
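Before machine-learning methods can relate physiological signals to emotional engagement, per-session recordings are typically reduced to summary features. The sketch below shows this general idea for two of the signals mentioned (heart rate and skin conductance); the feature names and the crude arousal proxy are assumptions for illustration, not MuseIT's actual pipeline.

```python
import statistics

def session_features(heart_rate_bpm, skin_conductance_us):
    """Summarize a VR session's physiological recordings into simple
    features that a downstream classifier could consume.
    All names here are illustrative placeholders."""
    return {
        "hr_mean": statistics.fmean(heart_rate_bpm),
        "hr_std": statistics.stdev(heart_rate_bpm),   # variability across the session
        "scl_mean": statistics.fmean(skin_conductance_us),
        # Crude arousal proxy: did skin conductance rise over the session?
        "scl_trend": skin_conductance_us[-1] - skin_conductance_us[0],
    }

# Toy readings: heart rate in beats per minute, skin conductance in microsiemens.
features = session_features([72, 75, 80, 78], [2.1, 2.3, 2.6, 3.0])
```

A classifier trained on such features (together with EEG-derived ones) could then be compared across the museum's thematic units to see which content visitors engage with most strongly.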

MuseMeUp (in WebUI)

MuseMeUp is an innovative application developed by Catalink LTD. It starts when the user engages with a cultural artefact (for example, a painting or video). During this interaction, the user's facial expressions and heart activity are recorded through a camera and a chest-strap sensor. AI models analyze this data in real time to detect the user's mood and stress levels. This data is then enriched with context from the Cultural Heritage Knowledge Graph, which adds information such as keywords, historical period, and themes based on what is being explored. Finally, a piece of music tailored to the user's emotional state and experience is synthesized and played.
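The final step, turning a detected emotional state plus Knowledge Graph context into synthesis settings, could be sketched as a simple mapping. Everything below is an assumption for illustration: the valence/arousal representation, the function name, and the parameter fields are not MuseMeUp's real API.

```python
def compose_parameters(valence: float, arousal: float, kg_context: dict) -> dict:
    """Hypothetical mapping from a detected emotional state (valence and
    arousal, each in [-1, 1]) and Knowledge Graph context to music-synthesis
    parameters. Illustrative only."""
    mode = "major" if valence >= 0 else "minor"       # positive mood -> major key
    tempo_bpm = round(80 + 60 * max(arousal, 0.0))    # calmer state -> slower music
    return {
        "mode": mode,
        "tempo_bpm": tempo_bpm,
        "themes": kg_context.get("themes", []),       # e.g. tags from the artefact
    }

params = compose_parameters(valence=0.4, arousal=0.6,
                            kg_context={"themes": ["landscape", "romanticism"]})
```

The point of routing the Knowledge Graph context through the mapping is that two users with the same mood still hear different music when viewing different artefacts.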

Inclusive Repository

This digital resource offers materials centered on inclusion in the area of creative arts and disability. It includes a collection of primary source materials, such as film, photography, and more, that document examples of good practice and art. The library also features a scientific section, along with a toolkit presenting best practices, case studies, methods, and technologies. All of these resources are available in multiple formats. You are welcome to explore and share these materials and the artwork gathered from a variety of sources and from the MuseIT project.

Music Co-creation Technologies

The co-creation platform provides low-latency communication, uses multisensory data to predict states of mind and body during co-creative experiences, communicates those states remotely through multiple modalities, and improves the accessibility of creative technologies.

These technologies were validated as key enablers for transforming the performing arts.

HaptiVerse

HaptiVerse is a haptic pattern creation toolkit, available in offline and online versions. It contains HaptiDesigner, which is used to create haptic templates and patterns; HaptiMux, which serves as a routing centre and an API for all resources; and HaptiTec, which acts as a repository for storing haptic patterns. HaptiVerse was designed from the ground up to allow long-range transmission of haptic data, to be compatible with nearly all haptic paradigms and accessible hardware, and to cater to different levels of expertise.
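A haptic pattern stored in a repository and transmitted over long distances is naturally a device-agnostic, serializable sequence of frames. The sketch below shows one plausible shape for such a pattern; the schema is an assumption for illustration, not HaptiVerse's actual format.

```python
import json

# Illustrative pattern format: each frame sets per-actuator intensities
# (0-255) for a duration in milliseconds. Not HaptiVerse's real schema.
pattern = {
    "name": "heartbeat",
    "frames": [
        {"duration_ms": 120, "actuators": {"0": 200, "1": 0}},  # strong pulse
        {"duration_ms": 80,  "actuators": {"0": 0,   "1": 0}},  # pause
        {"duration_ms": 120, "actuators": {"0": 160, "1": 0}},  # softer pulse
        {"duration_ms": 600, "actuators": {"0": 0,   "1": 0}},  # rest
    ],
}

def total_duration_ms(p: dict) -> int:
    """Length of one pattern cycle, useful when looping playback."""
    return sum(frame["duration_ms"] for frame in p["frames"])

payload = json.dumps(pattern)       # JSON keeps the pattern portable across devices
cycle = total_duration_ms(pattern)
```

Keeping intensity and timing separate from any particular actuator layout is what lets one stored pattern play back on very different haptic hardware.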

GitHub

MuseIT’s GitHub hosts the software deliverables, as well as the JSON-based datasets we developed and enriched during the course of the project. The programming language depends on the repository, but most use Python.


Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Research Executive Agency. Neither the European Union nor the granting authority can be held responsible for them.
