Autonomous Materials

Interface to a Technical System, Multi-Sensory Interaction and for the Generation of Emotion, Safety and Trust


The transdisciplinary project “Autonomous Materials: Interface to a Technical System, Multi-Sensory Interaction and for the Generation of Emotion, Safety and Trust” deals with the production of shape-changing, dynamic and smart materials for intuitive, multi-sensory interaction, with a focus on functional surfaces and textiles in a spatial context. A key novelty is the cooperative, transdisciplinary development of a holistic composite of material and human-machine interface (HMI), combining expertise from material design, physical HMI and Fused Granulate Fabrication (FGF) 3D printing for the development and large-scale production of smart, textile-based materials for bodily interaction with materials. Over the course of the project, geometries, topologies and large-volume smart material systems with varying levels of integrated interaction, dynamics and function will be developed for different applications and modes of operation, and their transfer options prepared in cooperation with the participating experts. A study-based evaluation will verify and quantify the results in order to design transformative materials and surfaces that serve as haptic interfaces and foster an emotional connection between the user and the underlying product or technology.

Researcher
Virginia Binsch
Jan Stackfleth

Period
01.01.24 – 31.12.27

Project Funding
European Union & the State of Saxony-Anhalt


Image: Speculative future materialities. Image Credits: Min Nguyen using Midjourney

The goal is to develop advanced materials that facilitate multi-sensory interaction and evoke emotional responses, especially within the realm of autonomous technologies and Physical Human-Machine Interfaces (PHMI). Using state-of-the-art Fused Granulate Fabrication (FGF) 3D printing technology, dynamic materials will be created that not only respond to environmental stimuli and human input but also enable enriched, emotional engagement. These materials are designed with built-in intelligence, achieved through their internal structure and the integration of smart properties, which allows them to adapt dynamically and interact with their surroundings.
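To make the idea of a material that senses and answers bodily input more concrete, the following minimal Python sketch simulates a single sense-decide-act cycle for a hypothetical shape-changing textile cell. All function names and the simple mapping from touch pressure to curvature are illustrative assumptions and not part of the project's actual implementation.

# Minimal, hypothetical sketch of a sense-decide-act loop for a
# shape-changing textile cell. All names are invented for illustration.
import random
import time


def read_touch_pressure() -> float:
    """Stand-in for a touch/pressure sensor embedded in the textile (0..1)."""
    return random.random()


def set_cell_curvature(target: float) -> None:
    """Stand-in for driving an actuator that curls or flattens the cell."""
    print(f"curvature set to {target:.2f}")


def control_step(pressure: float, gain: float = 0.8) -> float:
    """Map sensed human input to a material response.

    A light touch yields a gentle curl, firm pressure a stronger one,
    which is one simple way a surface could 'answer' bodily interaction.
    """
    return max(0.0, min(1.0, gain * pressure))


if __name__ == "__main__":
    for _ in range(5):                      # a few cycles instead of an endless loop
        pressure = read_touch_pressure()    # sense: environmental / human input
        curvature = control_step(pressure)  # decide: map input to shape change
        set_cell_curvature(curvature)       # act: deform the textile cell
        time.sleep(0.1)

In the project itself, this kind of responsiveness is meant to be embedded in the material's internal structure and printed geometry rather than implemented as external software.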



