Yukie Nagai
Title · Cited by · Year
A constructive model for the development of joint attention
Y Nagai, K Hosoda, A Morita, M Asada
Connection Science 15 (4), 211-229, 2003
Cited by: 212 · Year: 2003
Learning for joint attention helped by functional development
Y Nagai, M Asada, K Hosoda
Advanced Robotics 20 (10), 1165-1181, 2006
Cited by: 90 · Year: 2006
Computational analysis of motionese toward scaffolding robot action learning
Y Nagai, KJ Rohlfing
IEEE Transactions on Autonomous Mental Development 1 (1), 44-54, 2009
Cited by: 73 · Year: 2009
People modify their tutoring behavior in robot-directed interaction for action learning
AL Vollmer, KS Lohan, K Fischer, Y Nagai, K Pitsch, J Fritsch, KJ Rohlfing, ...
2009 IEEE 8th International Conference on Development and Learning, 1-6, 2009
Cited by: 66 · Year: 2009
Staged development of robot skills: Behavior formation, affordance learning and imitation with motionese
E Ugur, Y Nagai, E Sahin, E Oztop
IEEE Transactions on Autonomous Mental Development 7 (2), 119-139, 2015
Cited by: 59 · Year: 2015
From bottom-up visual attention to robot action learning
Y Nagai
2009 IEEE 8th International Conference on Development and Learning, 1-6, 2009
Cited by: 53 · Year: 2009
Emergence of mirror neuron system: Immature vision leads to self-other correspondence
Y Nagai, Y Kawai, M Asada
2011 IEEE International Conference on Development and Learning (ICDL) 2, 1-6, 2011
Cited by: 51 · Year: 2011
Can motionese tell infants and robots “what to imitate”?
Y Nagai, KJ Rohlfing
Proceedings of the 4th International Symposium on Imitation in Animals and …, 2007
Cited by: 47 · Year: 2007
Initiative in robot assistance during collaborative task execution
J Baraglia, M Cakmak, Y Nagai, R Rao, M Asada
The Eleventh ACM/IEEE International Conference on Human Robot Interaction, 67-74, 2016
Cited by: 42 · Year: 2016
Toward designing a robot that learns actions from parental demonstrations
Y Nagai, C Muhl, KJ Rohlfing
2008 IEEE international conference on robotics and automation, 3545-3550, 2008
Cited by: 35 · Year: 2008
Does disturbance discourage people from communicating with a robot?
C Muhl, Y Nagai
RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human …, 2007
Cited by: 30 · Year: 2007
Learning to comprehend deictic gestures in robots and human infants
Y Nagai
ROMAN 2005. IEEE International Workshop on Robot and Human Interactive …, 2005
Cited by: 30 · Year: 2005
The role of motion information in learning human-robot joint attention
Y Nagai
Proceedings of the 2005 IEEE International Conference on Robotics and …, 2005
Cited by: 29 · Year: 2005
Predictive learning of sensorimotor information as a key for cognitive development
Y Nagai, M Asada
Proc. of the IROS 2015 Workshop on Sensorimotor Contingencies for Robotics 25, 2015
Cited by: 28 · Year: 2015
Developmental learning model for joint attention
Y Nagai, M Asada, K Hosoda
IEEE/RSJ International Conference on Intelligent Robots and Systems 1, 932-937, 2002
Cited by: 28 · Year: 2002
Parental scaffolding as a bootstrapping mechanism for learning grasp affordances and imitation skills
E Ugur, Y Nagai, H Celikkanat, E Oztop
Robotica 33 (5), 1163-1180, 2015
Cited by: 24 · Year: 2015
Joint attention development in infant-like robot based on head movement imitation
Y Nagai
Proc. Third Int. Symposium on Imitation in Animals and Artifacts (AISB05), 87-96, 2005
Cited by: 24 · Year: 2005
How does an infant acquire the ability of joint attention?: A Constructive Approach
Y Nagai, K Hosoda, M Asada
Lund University Cognitive Studies 101, 91-98, 2003
Cited by: 24 · Year: 2003
Infant's action skill dynamically modulates parental action demonstration in the dyadic interaction
H Fukuyama, S Qin, Y Kanakogi, Y Nagai, M Asada, ...
Developmental science 18 (6), 1006-1013, 2015
Cited by: 22 · Year: 2015
A developmental approach accelerates learning of joint attention
Y Nagai, M Asada, K Hosoda
Proceedings 2nd International Conference on Development and Learning. ICDL …, 2002
Cited by: 22 · Year: 2002