Catie Cuan
Verified email at stanford.edu - Homepage
Title
Cited by
Year
Choreographic and somatic approaches for the development of expressive robotic systems
A LaViers, C Cuan, C Maguire, K Bradley, K Brooks Mata, A Nilles, I Vidrin, ...
Arts 7 (2), 11, 2018
Cited by 28, 2018
CURTAIN and Time to Compile: A Demonstration of an Experimental Testbed for Human-Robot Interaction
C Cuan, I Pakrasi, E Berl, A LaViers
2018 27th IEEE International Symposium on Robot and Human Interactive …, 2018
Cited by 17, 2018
Time to Compile: An Interactive Art Installation
C Cuan, I Pakrasi, A LaViers
16th Biennial Symposium on Arts & Technology 51, 19, 2018
Cited by 17*, 2018
Time to compile: A performance installation as human-robot interaction study examining self-evaluation and perceived control
C Cuan, E Berl, A LaViers
Paladyn, Journal of Behavioral Robotics 10 (1), 267-285, 2019
Cited by 14, 2019
Dances with Robots: Choreographing, Correcting, and Performing with Moving Machines
C Cuan
TDR 65 (1), 124-143, 2021
Cited by 6, 2021
Perception of control in artificial and human systems: A study of embodied performance interactions
C Cuan, I Pakrasi, A LaViers
Social Robotics: 10th International Conference, ICSR 2018, Qingdao, China …, 2018
Cited by 6, 2018
Output: Choreographed and Reconfigured Human and Industrial Robot Bodies across Artistic Modalities
C Cuan
Frontiers in Robotics and AI 7, 576790, 2021
Cited by 5, 2021
Measuring human perceptions of expressivity in natural and artificial systems through the live performance piece Time to compile
C Cuan, E Berl, A LaViers
Paladyn, Journal of Behavioral Robotics 10 (1), 364-379, 2019
Cited by 4, 2019
Dancing Droids: An Expressive Layer for Mobile Robots Developed Within Choreographic Practice
I Pakrasi, N Chakraborty, C Cuan, E Berl, W Rizvi, A LaViers
International Conference on Social Robotics, 410-420, 2018
Cited by 4, 2018
Gesture2Path: Imitation Learning for Gesture-aware Navigation
C Cuan, E Lee, E Fisher, A Francis, L Takayama, T Zhang, A Toshev, ...
arXiv preprint arXiv:2209.09375, 2022
Cited by 2, 2022
OUTPUT: Vestiges of Human and Robot Bodies
C Cuan
Proceedings of the 7th International Conference on Movement and Computing, 1-2, 2020
Cited by 2, 2020
OUTPUT: Translating Robot and Human Movers Across Platforms in a Sequentially Improvised Performance
C Cuan, E Pearlman, A McWilliams
Cited by 2*
Music Mode: Transforming Robot Movement into Music Increases Likability and Perceived Intelligence
C Cuan, E Fisher, A Okamura, T Engbersen
arXiv preprint arXiv:2306.02632, 2023
Cited by 1, 2023
Stories About the Future: Initial Results Exploring How Co-movement with Robots Affects Perceptions About Robot Capability
C Cuan, J Hoffswell, A LaViers
Proceedings of the 7th International Conference on Movement and Computing, 1-8, 2020
Cited by 1, 2020
Leveraging Haptic Feedback to Improve Data Quality and Quantity for Deep Imitation Learning Models
C Cuan, A Okamura, M Khansari
IEEE Transactions on Haptics, 2024
2024
Interactive Multi-Robot Flocking with Gesture Responsiveness and Musical Accompaniment
C Cuan, K Jeffrey, K Kleiven, A Li, E Fisher, M Harrison, B Holson, ...
arXiv preprint arXiv:2404.00442, 2024
2024
Robot navigation in dependence on gesture(s) of human(s) in environment with robot
C Cuan, TW Lee, AG Francis Jr, A Toshev, S Pirk
US Patent App. 18/240,124, 2024
2024
Still Exhausted: Introduction
C Cuan, D Eacho, S Skybetter
TDR 68 (1), 10-18, 2024
2024
Robot Choreography, Choreorobotics, and Humanist Technology: A Conversation between Dr. Madeline Gannon and Dr. Ken Goldberg, mediated by Dr. Catie Cuan
C Cuan
Choreomata, 340-354, 2023
2023
Sally Banes and Mary Overlie
J Roach, C Martin, C Cuan, D Goldman, R Chavkin, R Schechner, ...
2021
Articles 1–20