7. Bibliography

All URLs were last visited on January 8, 2021. Media that is already embedded on the other pages is not listed here.

  1. Myron W. Krueger. 1977. “Responsive Environments.” In: Proceedings of the June 13–16, 1977, National Computer Conference (AFIPS ’77). Dallas, Texas.
  2. Universal Everything: Super You AR app: https://universaleverything.com/projects/super-you
  3. Entry for Very Nervous System on Medienkunstnetz: http://www.medienkunstnetz.de/werke/very-nervous-system/
  4. David Rokeby. 1990. “The Harmonics of Interaction.” Musicworks 46: Sound and Movement. Online article: http://www.davidrokeby.com/harm.html
  5. softVNS Max/MSP modules by David Rokeby: http://davidrokeby.com/softVNS.html
  6. Official Just Dance product site: https://www.ubisoft.com/de-de/game/just-dance-2020/
  7. Nagual Sounds website: http://www.nagualsounds.com/
  8. Katharina Vogt, David Pirrò, Ingo Kobenz, Robert Höldrich, Gerhard Eckel. 2009. “PhysioSonic - Evaluated Movement Sonification as Auditory Feedback in Physiotherapy.” In: Auditory Display. CMMR 2009, ICAD 2009. Lecture Notes in Computer Science, vol. 5954. Springer, Berlin, Heidelberg.
  9. Motion Composer V3 leaflet: http://motioncomposer.de/wp-content/uploads/2020/08/ENG_MC3.0_2020_4.pdf
  10. Official PoseNet repository: https://github.com/tensorflow/tfjs-models/tree/master/posenet
  11. Microsoft Azure Kinect product page: https://azure.microsoft.com/en-us/services/kinect-dk/
  12. OpenPose project repository: https://github.com/CMU-Perceptual-Computing-Lab/openpose
  13. Dushyant Mehta, Oleksandr Sotnychenko, Franziska Mueller, Weipeng Xu, Mohamed Elgharib, Pascal Fua, Hans-Peter Seidel, Helge Rhodin, Gerard Pons-Moll, Christian Theobalt. 2020. “XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera.” ACM Transactions on Graphics, Vol. 39, No. 4, Article 1. Project website: https://gvv.mpi-inf.mpg.de/projects/XNect/
  14. Apple. 2019. “Bringing People into AR.” WWDC 2019. Video: https://developer.apple.com/videos/play/wwdc2019/607/
  15. Rudolf von Laban. 1974. The Language of Movement: A Guidebook to Choreutics. Plays Incorporated.
  16. Jörg Rett, Jorge Dias. 2007. “Human Robot Interaction Based on Bayesian Analysis of Human Movements.” Conference paper. Coimbra, Portugal. DOI: 10.1007/978-3-540-77002-2_45
  17. Jing He. 2014. “Paint with Dance: An Exploration in Creating Abstract Animation with Movement.” Master’s thesis. Parsons School of Design, New York.
  18. Antonio Camurri, Gualtiero Volpe, Stefano Piana, Maurizio Mancini, Radoslaw Niewiadomski, Nicola Ferrari, Corrado Canepa. 2016. “The Dancer in the Eye: Towards a Multi-Layered Computational Framework of Qualities in Movement.” In: MOCO ’16: Proceedings of the 3rd International Symposium on Movement and Computing, July 2016, Article No. 6, pp. 1–7. https://doi.org/10.1145/2948910.2948927
  19. ARSkeleton demo repository by user “iamfine”: https://github.com/iamfine/ARSkeleton
  20. openFrameworks project website: https://openframeworks.cc/
  21. AudioKit project website: https://audiokit.io/
  22. Curtis Roads. 2015. Composing Electronic Music: A New Aesthetic. Oxford University Press, New York.
  23. Aniruddh D. Patel, John R. Iversen. 2014. “The Evolutionary Neuroscience of Musical Beat Perception: The Action Simulation for Auditory Prediction (ASAP) Hypothesis.” Frontiers in Systems Neuroscience, Vol. 8, Article 57. DOI: 10.3389/fnsys.2014.00057
  24. Douglas Cooper. 1995. “Very Nervous System.” Wired Magazine. Online version: https://www.wired.com/1995/03/rokeby/
  25. Luke Dahl. 2014. “Triggering Sounds From Discrete Air Gestures: What Movement Feature Has the Best Timing?” In: NIME ’14, June 30 – July 3, 2014, Goldsmiths, University of London, UK.
  26. Charles E. Clauser, John T. McConville, J. W. Young. 1969. Weight, Volume, and Center of Mass of Segments of the Human Body. Aerospace Medical Research Labs. https://ntrs.nasa.gov/citations/19700027497
  27. Marc-André Weibezahn. 2020. “Structuring Multi-Layered Musical Feedback for Digital Bodily Interaction: Two Approaches to Multi-layered Interactive Musical Feedback Systems.” In: ICMI ’20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, October 2020, pp. 446–448. https://doi.org/10.1145/3395035.3425970