The 2nd Workshop on

NeuroDesign in Human-Robot Interaction

The making of engaging HRI technology your brain can’t resist.

Full-day HYBRID workshop as part of the IEEE International Conference on Robotics and Automation (ICRA 2024)

May 17, 2024

08:30 - 17:30 (JST)


Our Mission

Bring Human-Robot Interaction Research
to the Real World with
Brain-Centered Experience

Statement

Problem

  • Robots that interact with humans today do not understand personalized intent, attention, specific needs, or emotions well enough to serve people appropriately in different contexts.
  • Interactive robot behaviors are not designed to match human intuition, user expectations, or social norms, so people have a hard time interpreting a robot's objectives.
  • Humans and robots lack the mutual understanding needed to perform coordinated, co-adaptive joint actions in close-contact, close-proximity, or tele-operated settings.
  • Human-machine interfaces are not ergonomically designed, and the AI algorithms or embodied mechanical intelligence behind them lack the cognitive smoothness people need; as a result, humans and machines cannot communicate naturally and effortlessly.
  • Academic HRI research struggles to enter consumer markets and make an immediate practical impact.

Solution

  • Reinventing human-robot interaction by working with scientists, entrepreneurs, and end-users to define, prove, innovate, and scale need-driven, value-based HRI innovations.
  • Placing brain-centered experience at the center of design to compensate, augment, and facilitate people, our societies, and our harmony with mother nature, and to improve quality of life everywhere.
  • Investigating the characteristics of human behavioral psychology in user experiments, and building on that foundation to explore and implement fundamental neuroscience principles of sensing, cognition, planning/decision making, and motor control when tuning HRI dynamics and designing bio-compatible human-robot interaction devices, algorithms, theories, and strategies.
  • Mixing multi-disciplinary expertise, diverse mindsets, and market-driven research funding to shorten the time-to-market of traditional lab research throughout the entire innovation process.

Approach

  • Optimize Cognitive Load of HRI
  • Emotion-Aware Interactions
  • Intuitive Human-Machine Channels
  • AI-Enhanced Brain-Centered Experience
  • Design the Innovation Pipeline through the Lens of Entrepreneurship

Invited Speakers

Giulio Sandini

Professor, Department Head
Robotics, Brain and Cognitive Sciences
Italian Institute of Technology (IIT)

Arash Ajoudani

Tenured Senior Scientist, Director
Human-Robot Interfaces and Interaction (HRI²) laboratory
Italian Institute of Technology (IIT)

Giacinto Barresi

Researcher & ENACT Project Coordinator
Rehab Technologies Lab
Italian Institute of Technology (IIT)

Misha Sra

Assistant Professor
Department of Computer Science
UC Santa Barbara (UCSB)

Yukie Nagai

Project Professor
International Research Center for Neurointelligence
University of Tokyo

Sean Montgomery

Founder, EmotiBit
Founder and Director of Engineering
Connected Future Labs

Joana Cerejo

Senior Experience Designer
Manufacturing Intelligence division
Hexagon

Dani Clode

Designer & Senior Technical Specialist
MRC Cognition and Brain Sciences Unit
University of Cambridge

Midori Sugaya

Professor
Department of Computer Science and Engineering
Shibaura Institute of Technology

Yueh-Hsuan Weng

Associate Professor
Institute of Advanced Study | ROBOLAW.ASIA
Kyushu University

Heni Ben Amor

Associate Professor, Head
ASU Interactive Robotics Laboratory
Arizona State University (ASU)

Hasti Seifi

Assistant Professor
CS, Computing and Augmented Intelligence
Arizona State University (ASU)

All Stakeholders

Involved in the Discussion

Event Schedule

08:30

Set Up

Welcome and greetings to all participants
08:45

Opening Remarks

Organizers
09:00

The Machinery of Anthropomorphic Cognition and Interaction

Giulio Sandini, Italian Institute of Technology, Italy
09:30

Predictive Brain as a Key for Social Cognition

Yukie Nagai, University of Tokyo, Japan
09:55

Coffee Break (Poster and NeuroDesign Showcase Demo)

10:25

Learning, Control, and Interfaces for Situational Awareness in HRI

Arash Ajoudani, Italian Institute of Technology, Italy
10:50

Language, Vision, Motion: Human-Robot Collaborative Multimodal Communication

Heni Ben Amor, Arizona State University, USA
11:15

Enhancing Human-Centered Robotics through Neuroergonomic Design

Giacinto Barresi, Italian Institute of Technology, Italy
11:40

Designing for the User Experience of Touching Robots

Hasti Seifi, Arizona State University, USA
12:05

Ethical Design and Standardization for Robot Governance

Yueh-Hsuan Weng, Kyushu University, Japan
12:30

Lunch Break (Poster and NeuroDesign Showcase Demo)

13:30

Neuro Emotion Estimation for Robots and Edge AI platform for Nursing Care

Midori Sugaya, Shibaura Institute of Technology, Japan
13:55

Human Augmentation

Misha Sra, UC Santa Barbara, USA
14:20

Assistive, Augmentative and Adaptive: Considerations for Designing the Future Body

Dani Clode, University of Cambridge, UK
14:45

Bridging Futures: Leveraging Anticipatory Design to Enhance and Streamline HRI

Joana Cerejo, Hexagon, Portugal
15:10

Coffee Break (Poster and NeuroDesign Showcase Demo)

15:30

NeuroDesign EXPO Competition

Competition teams
16:30

Panel Discussion

All experts
17:00

Award Ceremony and Networking

All participants

NeuroDesign EXPO & Competition

NeuroDesign in HRI Student Showcase Competition

[Meetings] [CFP] (3rd Call) Call for Submission: NeuroDesign EXPO in HRI Student Showcase Competition at ICRA 2024 (USD 1,800 in cash prizes)



=============================== Call for Submission ===============================

NeuroDesign EXPO in HRI Student Showcase Competition (USD 1,800 in cash prizes)
2nd Workshop on NeuroDesign in Human-Robot Interaction: The making of engaging HRI technology your brain can’t resist

IEEE International Conference on Robotics and Automation (ICRA 2024)
Yokohama, Japan
May 17, 2024
https://neurodesign-in-hri.webflow.io/


Dear Colleagues,

Are you working on a human-interactive robot project that already has prototypes or some initial research findings? Perhaps you're pondering how to evolve these into more human-centric, real-world applications with a seamless, intuitive "brain"-centered experience, aiming to connect robots more deeply with our bodies, minds, and souls.

Worry not! We're excited to announce the NeuroDesign EXPO in HRI Showcase Competition at ICRA 2024, in conjunction with the NeuroDesign in HRI workshop (https://neurodesign-in-hri.webflow.io/). This event will feature a panel of world-renowned experts from diverse fields such as Human-Robot Interaction (HRI), Artificial Intelligence (AI), Cognitive Neuroscience, Social and Behavioral Psychology, Art & Design, RoboEthics, and the Startup community. These professionals will offer live, integrated-perspective feedback and recommendations to help refine your projects into more impactful research and commercial products.

We invite students at the BSc, MSc, and PhD levels to submit their projects. Submissions can fall into, but are not limited to, the following categories:

Affective Computing
Social and Service Robot
Industrial Collaborative Robot
Wearable Robot/ Device
Brain-Machine/Computer Interface
Haptics and Tele-operation
Soft Robotics
VR/AR & Metaverse
Cyborg and Bionic System
Healthcare Robotics
Exoskeleton and Rehabilitation Robot
LLM and Foundation Model for Human Robot Interaction
Brain Dynamics and Psychology for Cognitive and Physical HRI (cHRI/pHRI)
Human-Drone/AutoVehicle Interaction
Assistive Technology
Intelligence Augmentation and Human 2.0 Technologies
Supernumerary Limbs
Biometric Information Processing
Pervasive-Ubiquitous/Spatial Computing
Smart Home/Internet of Things (IoTs)
Edge-Fog-Cloud Computing
Speech/Gesture Recognition or Image/Audio Processing
Big Data, AI & Machine Learning for Multimodal Interaction
Smart Tutoring/Chatbot System
RoboEthics
RoboFashion, Clothing and Robot Skin
Diversity, Equity & Inclusion (DEI) for HRI Technologies
 
 
Participation Procedure:
Our selection process is on a rolling basis, and we aim to choose 10 projects for the final on-stage pitch presentation. We especially encourage those who have already submitted their work as posters or papers to ICRA 2024 or have publications elsewhere to participate in our event. This competition offers a fantastic opportunity to increase the visibility of your research globally.

Submission Options:

Submissions can be made in one of three formats:
1. A concise 100-word abstract and a 1-2 minute video, offering a brief yet engaging overview.
2. A 100-word abstract accompanied by 5 detailed slides for a short but thorough presentation.
3. A 2-page extended abstract (Please follow IEEE ICRA format:
https://ras.papercept.net/conferences/support/word.php), for a more in-depth submission.
*4. Participants are also welcome to submit using any combination of the above formats.


Final Round and Presentation:

The selected 10 projects will each have a 5-minute pitch presentation on stage during the final round. Alternatively, you may submit a polished, pre-recorded 5-minute video presentation, which we will play on stage.

Exhibition and Virtual Participation:

All submitted projects will receive a dedicated booth for poster and prototype demonstrations.
The event is designed to be "Hybrid" to ensure that everyone has the opportunity to participate, regardless of their ability to travel to Japan.

Awards:

We are thrilled to present two distinguished award categories at the competition. The "Best Innovation in HRI NeuroDesign Award" will be given to 3 outstanding projects that exemplify groundbreaking innovation within Human-Robot Interaction NeuroDesign; its First Prize is USD 1,000 and its Second Prize is USD 800, to be used toward prototyping. Additionally, the "Most Popular Project in HRI NeuroDesign Award" will be given to 2 projects that capture the hearts of our workshop attendees and audience, determined through a popular vote; its First Prize is a full fee waiver for a submission to the journal Frontiers in Robotics and AI. Winners in both categories will receive certificates acknowledging their achievements.

Timeline:

- Submission Deadline: Entries must be submitted by May 1, 2024. Please note that our selection process is rolling, so early submissions are encouraged.
- Announcement of Final Project Teams: The teams selected for the final round will be announced on May 3, 2024.
- Competition Date: The competition will take place on May 17, 2024, when finalists will present their projects to the panel and attendees.

Submission Website:
https://forms.gle/fQMxJtkXb8JEU2WR6

If you have any questions, please don't hesitate to reach out. We're looking forward to seeing you at ICRA 2024!

Best regards,

Organizing Committee
2nd Workshop on NeuroDesign in Human-Robot Interaction: The making of engaging HRI technology your brain can’t resist
https://neurodesign-in-hri.webflow.io/

Register or submit a contribution!
Pre-Register (free)

In recent years, there has been a rapid rise of innovation in robotics around the globe. This has been driven largely by labor shortages in dirty, dull, dangerous, and repetitive jobs in areas such as manufacturing, agriculture, the food industry, infrastructure construction, and autonomous vehicles, where robots can provide faster, more precise, safer, and more reliable task performance, working long hours without breaks, compared to their human counterparts. Over the past few years, the worldwide pandemic pushed demand even further toward using robots in place of frontline healthcare workers, nurses, and physicians to avoid bodily contact and to mitigate dangerous virus infection and transmission. All of these examples contributed to vast innovation in robot automation, which excels when a robot can work alone, without human intervention, separated from our living environment and with little worry about harming people nearby. However, when robots come into our homes and hospitals, or into environments that require tight human-robot interaction (HRI), safety issues and uncertain human factors make the presumed technical assumptions falter, and the development processes and business models fail. Representative examples can be found in the recent shutdowns of several well-known startups, including Rethink Robotics, Jibo, and Anki, all of which were developing forefront human-robot interaction solutions.

We surmise that HRI innovation and the commercialization of HRI products present unique challenges that are typically not encountered in other industries. With the constantly increasing demand for HRI technologies to compensate, augment, and empower human capabilities, we need to seriously address the fundamental flaws in how HRI technologies are developed, and translate traditional “ivory tower” lab research into real-world applications and consumer products more fluently. In this competition, we intend to find the best projects that exemplify NeuroDesign innovation principles in identifying, inventing, and implementing HRI technologies, providing practical guidance for quickly bringing Human-Robot Interaction lab research to bear on real-world problems.

Team 1: Observational Error Related Negativity for Trust Evaluation in Human Swarm Interaction
Joseph P. Distefano (University of Buffalo)

This study marks the first exploration of Observation Error-Related Negativity (oERN) as a pivotal indicator of human trust in the paradigm of human-swarm teaming, while also examining the impact of individual differences between experts and novices. In this Institutional Review Board (IRB)-approved experiment, human operators' physiological information is recorded while they take a supervisory control role interacting with multiple swarms of robotic agents that are either compliant or non-compliant. The analysis of event-related potentials during non-compliant actions revealed distinct oERN and error positivity (Pe) components localized within the frontal cortex.

Extended Abstract | Short Video | Poster

Team 2: Novel Intuitive BCI Paradigms for Decoding Manipulation Intention - Imagined Interaction with Robots
Matthys Du Toit (University of Bath)

Human-robot interfaces lack intuitive designs, especially BCIs that rely on single-body-part activation for motor imagery. This research proposes a novel approach: decoding manipulation intent directly from imagined interaction with robotic arms. EEG signals were recorded from 10 subjects performing motor execution, visual perception, motor imagery, and imagery-during-perception tasks while interacting with a 6-DoF robotic arm. State-of-the-art classification models achieved average accuracies of 89% (motor execution), 94.9% (visual perception), and 73.2% (motor imagery), with a best motor-imagery classification accuracy of 83.2%, demonstrating the feasibility of decoding manipulation intent from imagined interaction. The research invites more intuitive BCI designs through improved human-robot interface paradigms.

Short Video | Poster

Team 3: Multimodal Emotion Recognition for Human-Robot Interaction
Farshad Safavi (University of Maryland, Baltimore County)

Our project is a multimodal emotion recognition system that enhances human-robot interaction by controlling a robotic arm through detected emotions, using facial expressions and EEG signals. Our prototype adjusts the robotic arm's speed based on emotions detected from facial expressions and EEG signals. Two experiments demonstrate our approach: one shows the arm's response to facial cues, speeding up when it detects happiness and slowing down for negative emotions such as anger. The other video illustrates control via EEG, adjusting speed based on the user's relaxation level. Our goal is to integrate emotion recognition into robotic applications, developing emotionally aware robots.

Intro Slides | Short Video | Poster
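
For readers curious how the emotion-to-speed mapping described above might look in code, here is a minimal, hypothetical Python sketch; the emotion labels, scale factors, and confidence threshold are illustrative assumptions, not the team's actual implementation.

```python
# Hypothetical sketch: scale a robotic arm's speed by the detected emotion.
# Labels, factors, and the confidence threshold are made up for illustration.

EMOTION_SPEED_SCALE = {
    "happy": 1.25,    # speed up on positive affect
    "neutral": 1.00,
    "sad": 0.75,
    "angry": 0.50,    # slow down strongly on negative affect
}

def arm_speed(base_speed, emotion, confidence, min_confidence=0.6):
    """Return the commanded speed, falling back to the base speed when the
    emotion classifier is not confident enough."""
    if confidence < min_confidence:
        return base_speed
    return base_speed * EMOTION_SPEED_SCALE.get(emotion, 1.0)

# Example: an "angry" face detected with 0.9 confidence halves a 0.2 m/s motion.
print(arm_speed(0.2, "angry", 0.9))  # -> 0.1
```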

Team 4: Learning Hand Gestures using Synergies in a Humanoid Robot
Parthan Olikkal (University of Maryland, Baltimore County)

Hand gestures, integral to human communication, hold potential for optimizing human-robot collaboration. Researchers have explored replicating human hand control through synergies. This work proposes a novel method: extracting kinematic synergies from hand gestures via a single RGB camera. Real-time gestures are captured through MediaPipe and converted to joint velocities. Applying dimensionality reduction yields kinematic synergies, which can be used to reconstruct gestures. Applied to the humanoid robot Mitra, results demonstrate efficient gesture control with minimal synergies. This approach surpasses contemporary methods, offering promise for near-natural human-robot collaboration. Its implications extend to robotics and prosthetics, enhancing interaction and functionality.

Intro Slides | Short Video | Poster
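
As a rough illustration of the synergy-extraction step described above, the following Python sketch differentiates a (here simulated) stream of 21 hand landmarks, such as MediaPipe Hands would produce, into joint velocities and extracts a few kinematic synergies with PCA; the array shapes and number of components are assumptions, not the authors' settings.

```python
# Hypothetical sketch of kinematic synergy extraction from hand-landmark data.
import numpy as np
from sklearn.decomposition import PCA

fps = 30
T = 300
landmarks = np.random.rand(T, 21, 3)            # stand-in for MediaPipe Hands output

positions = landmarks.reshape(T, -1)            # flatten to (T, 63) joint coordinates
velocities = np.diff(positions, axis=0) * fps   # finite-difference joint velocities

pca = PCA(n_components=4)                       # keep a handful of kinematic synergies
weights = pca.fit_transform(velocities)         # per-frame synergy activations
synergies = pca.components_                     # (4, 63) synergy vectors

reconstructed = pca.inverse_transform(weights)  # gesture reconstruction from synergies
print("explained variance:", pca.explained_variance_ratio_.sum())
```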

Team 5: A Wearable, Multi-Channel, Parameter-Adjustable Functional Electrical Stimulation System for Controlling Individual Finger Movements
Zeyu Cai (University of Bath)

As the survival rate of patients with stroke and spinal cord injuries rises, post-operative movement dysfunction has become a growing concern. Among such impairments, hand dysfunction seriously impairs patients' quality of life and ability to care for themselves. Recently, many studies have demonstrated the effectiveness of functional electrical stimulation (FES) in the rehabilitation of upper-limb motor function, and FES has been shown to be more effective than traditional treatment methods. However, existing FES studies for the hand place the electrodes on the forearm, which does not allow full control of the individual movements of single fingers. In this study, an electrode glove was designed to place the electrodes on the hand itself, making this possible. Furthermore, existing FES systems are large, whereas the FES system developed in this study is more lightweight and can be made wearable. In summary, this study aimed to develop a novel, wearable functional electrical stimulation system for the hand that can adjust the stimulation parameters and, with an electrode glove, control the individual movements of single fingers, providing a personalized rehabilitation approach.

Extended Abstract | Intro Slides | Poster
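
To make the idea of per-finger, parameter-adjustable stimulation concrete, here is a small hypothetical Python sketch of a channel configuration; the field names, units, and default values are illustrative only and not the authors' firmware interface.

```python
# Hypothetical per-finger FES channel configuration (illustrative values only).
from dataclasses import dataclass

@dataclass
class FingerChannel:
    finger: str          # which finger this electrode pair drives
    frequency_hz: float  # stimulation pulse frequency
    pulse_width_us: int  # pulse width in microseconds
    amplitude_ma: float  # stimulation current in milliamps

    def scaled(self, intensity: float) -> "FingerChannel":
        """Return a copy with the amplitude scaled for a personalised session."""
        return FingerChannel(self.finger, self.frequency_hz,
                             self.pulse_width_us, self.amplitude_ma * intensity)

# Example: an index-finger flexion programme at 60% of the calibrated amplitude.
index = FingerChannel("index", frequency_hz=35.0, pulse_width_us=300, amplitude_ma=12.0)
print(index.scaled(0.6))
```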

Team 6: Enhancing MI-BCI Training with Human-Robot Interaction Through Competitive Music-Based Games
Alessio Palatella (University of Padova)

Motor Imagery Brain-machine Interfaces (MI-BMIs) interpret users' motor imagination to control devices, bypassing traditional output channels like muscles. However, MI-BMIs' proficiency demands significant time and effort, especially for novices. To address this, we propose a novel MI-BMI training method using Human-Robot Interaction via rhythmic, music-based video games and NAO robots. Our experimental setup involves a rhythm game connected to a real NAO robot via a BMI. EEG signals are processed using a CNN-based decoder. Despite data limitations, our approach demonstrates promising control capabilities, highlighting the potential of combining MI-BMIs with robotics for intuitive human-robot interaction and enhanced user experience.

Extended Abstract | Intro Slides | Short Video | Poster

Team 7: Enhancing Synergy - The Transformative Power of AR Feedback in Human-Robot Collaboration
Akhil Ajikumar (Northeastern University)

In this paper, we introduce a novel Augmented Reality system to enable intuitive and effective communication between humans and collaborative robots in a shared workspace. Using multimodal interaction data such as gaze, speech, and hand gestures, captured through a head-mounted AR device, we explore the system's impact on task efficiency, communication clarity, and user trust in the robot. We validated the system in an experiment based on a gearbox assembly task, which showed a significant preference among users for the gaze and speech modalities and revealed notable improvements in task completion time, reduced errors, and increased user trust. These findings show the potential of AR systems to enhance the experience of human-robot teamwork by providing immersive, real-time feedback and intuitive communication interfaces.

Extended Abstract | Poster

Team 8: EEG Movement Detection for Robotic Arm Control
Daniele Lozzi (University of L'Aquila)

This research introduces a novel approach to building an online BCI dedicated to the classification of motor execution, which importantly considers both active movements and essential resting phases in order to determine when a person is inactive. It then explores the Deep Learning architecture best suited to motor-execution classification of EEG signals. This architecture will be useful for controlling an external robotic arm for people with severe motor disabilities.

Extended Abstract | Poster
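
As a toy illustration of the movement-versus-rest decoding step, the sketch below classifies simulated EEG epochs with log-variance features and a linear classifier; a real pipeline would band-pass filter and epoch recorded EEG, and the team's deep-learning architecture is not reproduced here.

```python
# Hypothetical movement-vs-rest EEG classification on simulated data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 8, 250
X = rng.standard_normal((n_epochs, n_channels, n_times))  # simulated EEG epochs
y = rng.integers(0, 2, n_epochs)                          # 0 = rest, 1 = movement
X[y == 1] *= 1.5                                          # movement epochs get larger variance

features = np.log(X.var(axis=2))      # per-channel log-variance (band-power proxy)
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print("movement-vs-rest accuracy:", scores.mean())
```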

Team 9: EEG and HRV Based Emotion Estimation Robot for Elderly Interaction
Yuri Nakagawa (Shibaura Institute of Technology)

The increasing demand for emotional care robots in nursing homes aims to enhance the quality of life of the elderly by estimating their emotions and providing mental support. Because of the limited physical condition of elderly people, traditional methods of emotion estimation pose challenges; we therefore explore physiological signals as a viable alternative. This study introduces an innovative emotion estimation method based on Electroencephalogram and Heart Rate Variability, implemented in a care robot. We detail an experiment in which this robot interacted with three elderly individuals in a nursing-home setting. The observed physiological changes during these interactions suggest that the elderly participants experienced positive emotions.

Intro Slides | Poster
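
For a flavour of how HRV and EEG indices can be combined into a coarse emotion estimate, here is a hypothetical Python sketch; the pNN50 computation is a standard HRV index, but the EEG arousal score and the valence/arousal quadrant mapping are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: combine an HRV index and an EEG index into an emotion quadrant.
import numpy as np

def pnn50(rr_intervals_ms: np.ndarray) -> float:
    """Fraction of successive RR-interval differences exceeding 50 ms (standard HRV index)."""
    diffs = np.abs(np.diff(rr_intervals_ms))
    return float(np.mean(diffs > 50.0))

def emotion_quadrant(eeg_arousal: float, hrv_valence: float) -> str:
    """Map normalised arousal/valence scores (0..1) to a coarse emotion quadrant."""
    if eeg_arousal >= 0.5:
        return "excited / happy" if hrv_valence >= 0.5 else "stressed / angry"
    return "relaxed / content" if hrv_valence >= 0.5 else "bored / sad"

rr = np.array([812, 845, 790, 880, 860, 905, 870], dtype=float)  # simulated RR intervals (ms)
print(emotion_quadrant(eeg_arousal=0.3, hrv_valence=pnn50(rr)))
```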

Organizers

Dr. Ker-Jiun Wang

Bioengineering
University of Pittsburgh

Dr. Zhi-Hong Mao

ECE & Bioengineering
University of Pittsburgh

Dr. Midori Sugaya

CSE
Shibaura Institute of Technology

Dr. Maryam Alimardani

CS and Artificial Intelligence
Tilburg University

Dr. Ramana Vinjamuri

CSEE
University of Maryland, Baltimore County

Local Arrangement Chairs    

Dr. Feng Chen

Postdoc, Dolylab
Shibaura Institute of Technology

Yuri Nakagawa

PhD Student, Dolylab
Shibaura Institute of Technology

HRI and Neuroscience at Scale

Innovation is hard. Core innovation is rooted in scientific discovery that requires further technical de-risking and the search for profitable, sustainable solutions that meet target needs. NeuroDesign in HRI, in pursuit of a better “brain-centered experience”, provides the glue connecting technologies and end-users: changing people's behavior so that they accept new technology, and finding the real use cases where that technology applies. Through this workshop, we hope HRI researchers and neuroscientists can bring their groundbreaking research to the real world with greater impact. The world needs science at scale.
Keep up with the latest news: join our email list.
Come participate online in our creative workshop, with inspiring talks, thought-provoking discussions, lots of fun interactions, and networking.


May 17, 2024
08:30 - 17:30

2024 IEEE International Conference on Robotics and Automation (ICRA 2024)