Coenesthesia: An Aesthetic of Healing through Hybrid Reality Art ©

An excerpt from my thesis

3.2      Creating a Sense of Body Ownership Over Virtual Body Parts.

First, I want to clarify my use of the term ownership. I do not write here in socio-economic terms of ownership. Ownership in this context comes from the language used in neuro-prosthetics research, where it describes what is experienced as part of the makeup of one's physical body.

Just as amenable objects occupy the realm of 'me and not me' extension, somewhere between the subjective and the objective, so too do prosthetic limbs. I look to the field of cognitive neuro-prosthetics to establish techniques for creating a sense of body ownership over virtual representations of body parts. This field builds "models of self-consciousness in order to project them onto artificial limbs, avatars and robots" (Blanke, 2012). I look at a few contributions to this field that seem relevant to my mission.

Hugh Herr, a renowned neuro-prosthetics researcher who revolutionized prosthetic limbs after losing his own legs in a climbing accident at the age of 17, says, "Technology has the power to heal, to rehabilitate and to even extend human experience and capability" (Herr, 2015).

In 1998 Botvinick and Cohen established the 'rubber arm illusion'. This is an experiment in which one arm of the participant is hidden from view and a rubber arm is positioned where the participant's real arm would typically be. The experimenter then simultaneously brushes both the real arm of the participant and the rubber arm for a length of time. This synchronization creates a sense of body ownership in the participant for the rubber arm, and when it is threatened with pain, they jump to protect it. This experiment shows how sight, touch and proprioception combine to create a convincing feeling of body ownership, one of the foundations of self-consciousness (Botvinick & Cohen, 1998). In 2007, Frank H. Durgin et al. expanded on the rubber arm experiment[1]. They replaced the brush with a laser and discuss the results in terms of multisensory integration[2]. Vision, touch and synchronisation are all key elements.

Experiments along similar lines established the mirror therapy method (Ramachandran, 1994) to relieve pain in phantom limbs. Using a mirror to recreate a mental image of the missing limb was found to alleviate pain in the phantom limb and, over time, to remap the mental image of that limb in the brain of the amputee, with a lasting effect on the pain relief. With the limb missing, confusion develops in the signals sent between the limb and the brain, forming in some patients a mental image of a cramped, clenched hand. When a mirror is used to substitute an image of the opposite arm, patients feel instant pain relief on seeing the hand stretched and relaxed. The mental image of the limb is key to the experience of it.

Heautoscopy is a phenomenon in which patients have a sensation of being reduplicated and of existing at two or even more locations. It has been found that self-identification with two virtual bodies was stronger during synchronous stroking (Heydrich, 2013). This suggests that offering more than one version of each organ in the virtual space I am creating will allow the user to identify with more variations of organ representation.

To make effective use of sound in the research-creation VR project, its incorporation is informed by the findings of Noel et al., who consider acoustic stimuli in the mix of synchronous stroking during the rubber arm illusion: "the distance at which acoustic stimuli are presented may alter the balance between self- and non-self, biases" (Noel et al., 2017).

Insights gained from this research include the following: light, when employed in synchrony with a physical sensation, can contribute to a sense of body ownership and of heautoscopy, actively engaging motor neurons; and the position of acoustic stimuli is also of key importance to the mix. In Coenesthesia I create a synchronous link between the immersant's real organs and the virtual representations of the body organs and virtual objects through synchronous biometric data.
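The synchrony principle underlying this link can be sketched in code. The following is a minimal illustrative sketch, not the actual Coenesthesia implementation: it assumes a stream of detected heartbeat timestamps from a hypothetical biometric sensor and derives a scale factor that makes a virtual organ swell in time with each real beat, so that the visual pulse and the felt pulse stay synchronous. The function name and parameters are my own illustration.

```python
import math

def organ_scale(t, beat_times, pulse_duration=0.25, amplitude=0.1):
    """Scale factor for a virtual organ at time t (seconds), pulsing
    in synchrony with detected heartbeat timestamps (hypothetical sensor data)."""
    # Find the most recent detected beat at or before time t.
    last_beat = None
    for b in beat_times:
        if b <= t:
            last_beat = b
        else:
            break
    # Between pulses (or before the first beat) the organ rests at scale 1.0.
    if last_beat is None or t - last_beat > pulse_duration:
        return 1.0
    # Half-sine envelope: the organ swells and relaxes over pulse_duration,
    # mirroring the felt heartbeat in the visual channel.
    phase = (t - last_beat) / pulse_duration
    return 1.0 + amplitude * math.sin(math.pi * phase)

# Example: beats detected at 0.0 s and 0.8 s (75 bpm).
beats = [0.0, 0.8]
print(organ_scale(0.125, beats))  # mid-pulse: organ swollen to 1.1
print(organ_scale(0.5, beats))    # between pulses: organ at rest, 1.0
```

In a real-time engine, this scale factor would be sampled every frame, so that any latency between the sensed beat and the rendered swell stays below the threshold at which synchrony, and with it the sense of ownership, breaks down.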


[1] Two experiments involving a total of 220 subjects are reported. The experiments document that "stroking" a false hand with the bright beam of light from a laser pointer can produce tactile and thermal sensations when the hand can be seen as one's own. Overall, 66% of subjects reported somatic sensations from the light. Felt hand location was recalibrated toward the location of the false hand for those subjects who felt the light. Moreover, the proprioceptive recalibration from the laser experience was comparable to that produced by actual coordinated brushing of the false hand and of the unseen real hand after 2 min of stimulation. The illusion may be experienced on one's real hand as well. (Durgin et al., 2007)


[2] We interpret this touch-from-light illusion in terms of a multisensory-integration theory wherein perceptual signals of high certainty from one sense modality can produce perceptual consequences that influence the experience of a second modality (De Gelder & Bertelson, 2003; Driver & Spence, 2000; Ernst & Banks, 2002; Gibson, 1966; McGurk & MacDonald, 1976; Shimojo & Shams, 2001). For example, an insect crawling on the skin would not normally produce tactile sensations if the mechanical disturbances to the skin are below sensory threshold. Once the insect is seen, however, a vivid experience of tactile sensations could arise from the combination of the visual localization evidence with sensory noise from the tactile sensors (see Durgin, 2002). In the present case, sensory integration depends on the ease with which a rubber hand can be incorporated into the body schema or body image (Head et al., 1920; Schilder, 1938).