==Project Outline==
 
*Research Period: 2012.6 ~ 2017.5
 
*Funded by the Ministry of Trade, Industry and Energy (Grant No: 10041629)
 
*PI: [[JongSuk Choi | JongSuk Choi, Ph.D.]]

*Members: [[Sung-Kee Park | Sung-Kee Park, Ph.D.]], [[Yoonseob Lim | Yoonseob Lim, Ph.D.]], [[Sang-Seok Yun | Sang-Seok Yun, Ph.D.]], [[Junghoon Kim | Junghoon Kim, M.S.]], [[Hoang Minh Do | Hoang Minh Do (UST)]], [[Donghui Song | Donghui Song (UST)]], [[Hyeonuk Bhin | Hyeonuk Bhin (UST)]], [[Gyeore Lee | Gyeore Lee (UST)]]
  
 
==Introduction and Research Targets==
 
1. This project aims to implement technologies for the identification (WHO), behavior recognition (WHAT), and localization (WHERE) of humans, based on a sensor network fusion framework.
 
<br>2. In this project we develop a robot-assisted management system for promptly coping with abnormal events in classroom environments:
 
  - Reliably detect the occurrence of human-caused emergency situations via audio-visual perception modules
  - Make an urgent SMS transmission to notify a remote user that an emergency event has occurred, and relay spot information to them
  - Perform an immediate reaction to the event by using robot navigation and interaction technologies on behalf of the remote user (a minimal sketch of this detect-notify-react loop follows below)
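To make the intended flow concrete, below is a minimal sketch of the detect-notify-react loop in Python. All names here (the event fields, the SMS gateway, the robot interface, the supervisor number) are hypothetical stand-ins for illustration, not the project's actual software.

<syntaxhighlight lang="python">
# Hypothetical sketch of the detect -> notify -> react loop described above.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class EmergencyEvent:
    kind: str                      # WHAT: e.g. "fall" or "fight"
    location: Tuple[float, float]  # WHERE: map coordinates of the spot
    person_id: str                 # WHO: from face recognition / ID tracking


class SmsGateway:
    """Stand-in for a real SMS service; here it just prints the message."""
    def send(self, number: str, text: str) -> None:
        print(f"SMS to {number}: {text}")


class Robot:
    """Stand-in for the robot's navigation and interaction stack."""
    def navigate_to(self, location: Tuple[float, float]) -> None:
        print(f"navigating to {location}")

    def engage(self, person_id: str) -> None:
        print(f"interacting with {person_id}")


def handle_emergency(event: EmergencyEvent, sms: SmsGateway, robot: Robot,
                     supervisor: str = "+82-10-0000-0000") -> None:
    # 1. Urgent SMS relaying spot information to the remote user.
    sms.send(supervisor,
             f"[ALERT] {event.kind} involving {event.person_id} at {event.location}")
    # 2. Immediate on-site reaction on the remote user's behalf.
    robot.navigate_to(event.location)
    robot.engage(event.person_id)


handle_emergency(EmergencyEvent("fall", (3.0, 1.5), "student_07"),
                 SmsGateway(), Robot())
</syntaxhighlight>

In the real system the detection step is driven by the audio-visual perception modules, and the reaction step by the robot navigation and interaction technologies listed under "Developing Core Technologies" below.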
 
::[[File:Simonpic_overview.png|800px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/> <br/>
 
  
 
==Developing Core Technologies==
 
DETECTION
*WHERE: Human detection and localization
*WHO: Face recognition and ID tracking
*WHAT: Recognition of individual and group behavior
*3W data association on the perception sensor network (see the sketch after this list)
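As a rough illustration of what 3W data association can look like, the sketch below fuses WHO and WHAT observations onto WHERE tracks by nearest-neighbour gating on position. The function name, data layout, and the 0.8 m gate are assumptions made for illustration, not the project's actual algorithm.

<syntaxhighlight lang="python">
# Hypothetical sketch: attach WHO/WHAT labels to WHERE tracks by distance.
import math


def associate_3w(where_tracks, who_obs, what_obs, gate=0.8):
    """where_tracks: {track_id: (x, y)} from the human detector/localizer.
    who_obs, what_obs: lists of (label, (x, y)) from the WHO / WHAT modules.
    gate: maximum distance in metres for an observation to join a track."""
    fused = {tid: {"pos": pos, "who": None, "what": None}
             for tid, pos in where_tracks.items()}
    for key, observations in (("who", who_obs), ("what", what_obs)):
        for label, obs_pos in observations:
            # assign the label to the nearest track, if it is close enough
            tid, dist = min(
                ((t, math.dist(obs_pos, p)) for t, p in where_tracks.items()),
                key=lambda pair: pair[1],
                default=(None, float("inf")),
            )
            if tid is not None and dist < gate:
                fused[tid][key] = label
    return fused


tracks = {1: (2.0, 3.0), 2: (5.0, 1.0)}
print(associate_3w(tracks,
                   who_obs=[("alice", (2.1, 3.2))],
                   what_obs=[("raising_hand", (4.8, 1.1))]))
</syntaxhighlight>

A deployed sensor network would likely use probabilistic assignment across cameras and microphones rather than a single hard distance gate, but the association idea is the same.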
 
AUTOMATIC SURVEILLANCE
*Automatic message transmission for human-caused emergencies
*Analysis of student attitude
*Remote monitoring via Web technologies and stream server

ROBOT REACTION
*Human-friendly robot behavior
*Gaze control and robot navigation
*Human following with recovery mechanism (a minimal sketch of the recovery logic follows below)
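The recovery mechanism in the last bullet can be pictured as a small state machine: follow while the person is tracked, and search around the last known position when the track is lost, instead of stopping outright. The sketch below is a hypothetical minimal version, not the project's controller.

<syntaxhighlight lang="python">
# Hypothetical sketch of human following with a recovery state.
FOLLOW, RECOVER = "FOLLOW", "RECOVER"


class HumanFollower:
    def __init__(self):
        self.state = FOLLOW
        self.last_seen = None  # last position where the target was tracked

    def step(self, target_pos):
        """target_pos: the tracked person's (x, y), or None if the track is lost."""
        if target_pos is not None:
            # target visible: (re)enter FOLLOW and remember where it was
            self.state = FOLLOW
            self.last_seen = target_pos
            return ("move_toward", target_pos)
        # target lost: fall back to the recovery behaviour
        self.state = RECOVER
        if self.last_seen is not None:
            return ("search_around", self.last_seen)
        return ("rotate_in_place", None)


follower = HumanFollower()
print(follower.step((1.0, 2.0)))  # ('move_toward', (1.0, 2.0))
print(follower.step(None))        # ('search_around', (1.0, 2.0)) -- recovery
</syntaxhighlight>
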
==Results==
* [https://youtu.be/pt8mgxO08c0-_I Remote Handling for Emergency based on 3W (WHO/WHAT/WHERE) Information]
::[[File:SimonPiC_Remote handling for emergency on 3w data.png|600px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/> <br/>

* [https://youtu.be/3sjWdq6P_hY_I Final 3W (WHERE-WHO-WHAT) Integration]
::[[File:(SimonPiC)_3W_final.png|600px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/> <br/> <br/> <br/>

* [https://youtu.be/pHfZKAX6FWU_I Audio-Visual Integration (WHAT: gssl+ssc)]
::[[File:(SimonPiC) Audio-Visual_Integration.png|600px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/> <br/>

* [https://youtu.be/MNRCwFDDbXI-_I WHERE-WHO Fusion]
::[[File:(SimonPiC)_WHERE_WHO_Fusion.jpg|300px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/>

* [https://youtu.be/tVGpAoH13dI-_I HRI2015 Leg Detection]
::[[File:(SimonPiC3) Human Leg Detection.jpg|300px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/>

* [https://youtu.be/kyu7psC4IG0-_I 3D Simulation of 3W Fusion]
::[[File:(SimonPiC2)_3W FUSION Simulation.jpg|300px|left]] <br/> <br/> <br/> <br/> <br/> <br/><br/> <br/> <br/> <br/> <br/>
