Regular Articles

Vol. 15, No. 1, pp. 26–30, Jan. 2017. https://doi.org/10.53829/ntr201701ra1

Directional Remapping in Tactile Motion Perception

Shinobu Kuroki

Abstract

This article introduces tactile illusions in which adaptation to one directional motion alters the direction of the following motion. We used this illusion to explore how the brain calculates tactile motion directions, which is an essential issue in creating tactile navigation systems.

Keywords: tactile, motion, illusion

1. Introduction

Motion perception is a function essential to everyday life. We have a purpose-built mechanism in our brain to detect motion, which is beautifully demonstrated by the well-known waterfall illusion: after prolonged observation of a waterfall, an illusory upward motion can be seen in the static cliff behind it (Fig. 1). This phenomenon is referred to as the motion aftereffect (MAE) [e.g., 1, 2].


Fig. 1. Waterfall illusion. After prolonged observation of a waterfall (top panel), an illusory upward motion can be seen in a static cliff (bottom panel).

Motion perception holds a prominent position not only in vision but also in touch. The motion between the fingertips and an object serves important functions, such as preventing a glass from slipping out of our hand and enabling us to identify the surface texture of fabrics. However, the mechanism of motion perception in touch has not been fully revealed. In this article, I introduce our group's recent studies on tactile motion perception using the MAE. How tactile motion is processed in our brain is an essential issue in neuroscience, and, at the same time, understanding it can contribute to the development of future user-friendly information technologies.

2. MAE in touch

The MAE has long been used to investigate the visual mechanisms of information processing in the brain. It provides evidence that a certain mechanism exists in the brain without the need for imaging or neurophysiological measurement. The waterfall illusion, the most famous MAE, is explained as follows. It is widely known that we have neurons that respond selectively to particular motion directions, and the perceived motion direction can be estimated from the balance of activity between neurons tuned to each direction. When we watch a waterfall, the downward-motion neurons adapt and become fatigued, so their activity becomes weaker than that of the upward-motion neurons. When we watch a static field, all of the directional neurons normally fire at roughly the same level; after adaptation to the waterfall's downward motion, however, this balance is violated, and the downward-motion neurons can no longer fire at the same level as the other directional neurons. In the waterfall illusion, this results in illusory upward motion of the static field, that is, the cliff. Note that the occurrence of this illusion is itself evidence for the existence of downward-motion neurons.
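
To make this account concrete, the following is a minimal numerical sketch of the opponent-coding explanation. The two-channel model, the gain values, and the readout rule are illustrative assumptions chosen for exposition, not the actual model tested in the studies cited here.

```python
# Minimal opponent-coding sketch of the motion aftereffect (MAE).
# Two direction-selective channels ("up" and "down") respond to a stimulus,
# and perceived direction is read out from the difference of their activity.
# All gains and response values below are illustrative assumptions.

def channel_responses(stimulus, gain_up=1.0, gain_down=1.0):
    """Return (up, down) channel activity; a static stimulus drives both
    channels only at their common baseline level."""
    baseline = 0.5
    up = gain_up * (baseline + (0.5 if stimulus == "up" else 0.0))
    down = gain_down * (baseline + (0.5 if stimulus == "down" else 0.0))
    return up, down

def perceived_direction(up, down, threshold=1e-6):
    """Decode direction from the activity balance of the two channels."""
    diff = up - down
    if abs(diff) < threshold:
        return "no motion"
    return "up" if diff > 0 else "down"

# Before adaptation, a static test drives both channels equally: no motion.
print(perceived_direction(*channel_responses("static")))  # -> no motion

# Prolonged exposure to downward motion (the waterfall) fatigues the
# downward channel, modeled here as a reduced gain. The same static test
# now yields illusory upward motion: the waterfall illusion.
print(perceived_direction(*channel_responses("static", gain_down=0.6)))  # -> up
```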

Compared with the visual MAE, which has been observed repeatedly and robustly [3], the tactile MAE has received far less attention, and its very occurrence was long a matter of debate [4, 5]. A breakthrough came from Watanabe et al., who introduced a new adaptation method that induces a tactile MAE reproducibly [5]. The essence of their method is that they employed motion stimuli with an ambiguous direction as test stimuli instead of static stimuli such as a cliff. If participants touch static test stimuli after adapting to one-directional motion, as in the conventional procedure, they must first judge whether the test stimuli are moving at all and only then report the perceived direction of the motion. If, on the other hand, participants touch dynamic test stimuli, they report the perceived direction directly on the basis of their ambiguous percept. Only in the latter case can a robust MAE (illusory motion in the direction opposite to the adapted motion) be observed in touch.
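
As a note on how such an aftereffect is quantified, a common psychophysical measure is the proportion of trials on which a directionally ambiguous test is reported as moving in a given direction, before versus after adaptation. The sketch below illustrates this bookkeeping with made-up response data; it is not the analysis code from [5].

```python
# Tally of direction reports for directionally ambiguous test stimuli,
# before and after adapting to palm-ward motion. The response lists are
# made-up data, shown only to make the measure concrete.
reports_before = ["nail", "palm", "nail", "palm", "palm", "nail", "nail", "palm"]
reports_after = ["nail", "nail", "nail", "palm", "nail", "nail", "nail", "nail"]

def p_nail(reports):
    """Proportion of trials reported as moving toward the nail."""
    return reports.count("nail") / len(reports)

# For a truly ambiguous test, p_nail sits near 0.5 before adaptation.
# A shift toward 1.0 after palm-ward adaptation is the signature of the MAE.
print(f"before adaptation: {p_nail(reports_before):.2f}")  # 0.50
print(f"after adaptation:  {p_nail(reports_after):.2f}")   # 0.88
```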

Watanabe et al. used pin-shaped vibrators to generate a sensation of apparent motion on the finger pad. The pins were vibrated vertically (pushed onto and pulled off the skin surface) at a frequency of 30 Hz by vibration generators (511-A, EMIC Inc., Kyoto, Japan). When three pins were driven sequentially, participants perceived apparent motion on the finger pad. For example, when the nail-side pin stuck out first, then the middle pin, and finally the palm-side pin (Fig. 2(a)), participants perceived motion toward the palm on the finger pad. With such stimuli, after participants adapted to one direction, say motion toward the palm, for tens of seconds (Fig. 2(b), adaptation phase), they tended to report directionally ambiguous test motion as motion toward the nail (Fig. 2(c), MAE phase).
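
The drive signal itself can be pictured as three 30-Hz bursts with staggered onsets, one per pin. The sketch below generates such waveforms; the burst duration and onset asynchrony are assumed values chosen for illustration, not the parameters used in [5].

```python
import numpy as np

# Sketch of a three-pin apparent-motion drive: each pin vibrates at 30 Hz,
# and the pins start in sequence (nail-side -> middle -> palm-side), which
# is perceived as motion toward the palm. The burst duration and stimulus
# onset asynchrony (SOA) below are assumed values, not those used in [5].
FS = 2000        # waveform sample rate (Hz)
FREQ = 30        # pin vibration frequency (Hz)
BURST = 0.20     # vibration burst per pin (s), assumed
SOA = 0.10       # onset delay between successive pins (s), assumed

t = np.arange(0.0, 2 * SOA + BURST, 1.0 / FS)

def pin_waveform(onset):
    """30-Hz sinusoidal burst starting at `onset`, zero elsewhere."""
    active = (t >= onset) & (t < onset + BURST)
    return np.where(active, np.sin(2 * np.pi * FREQ * (t - onset)), 0.0)

# Pins indexed nail-side (0) to palm-side (2); reversing the onset order
# would produce apparent motion toward the nail instead.
waves = [pin_waveform(i * SOA) for i in range(3)]
```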


Fig. 2. Experimental setup in [5] and schematic diagram of the tactile MAE. After prolonged perception of motion toward the palm (adaptation panel), a directionally ambiguous test motion is perceived as illusory motion toward the nail (MAE panel).

Watanabe et al. conducted tactile adaptation experiments under several conditions and successfully demonstrated the MAE for straight motion within the finger pad, straight motion from the finger base to the tip, and circular motion within the finger pad. These results suggest that the MAE can serve as a useful psychophysical tool for probing the neural mechanisms of tactile motion processing, just as it has for visual motion processing. The finding also supports the possibility that tactile motion processing shares neural circuits with visual motion processing, which is good news for scientists, since visual mechanisms are much better understood than tactile ones. In addition, the tactile MAE enables us to address a tactile-specific issue: how tactile motion is processed in conjunction with kinesthetic information such as finger position and posture. I discuss this point in the next section.

3. MAE across fingers

In touch, direction is a nontrivial issue. Input on the skin is encoded by the sensors underneath it, so, as far as the sensors are concerned, motion on the fingertip is simply directed, say, from the palm side to the nail side. Yet we usually perceive tactile motion in an environmental coordinate frame, as upward or rightward. This means that the brain must transform the motion direction from the skin coordinate into the remapped environmental coordinate while taking body posture into account. The same issue arises in vision and audition with the retina- or ear-centered coordinate versus the environmental coordinate. Touch, however, involves more layered remapping processes, spanning skin-, hand-, arm-, and body-centered coordinates, because we can drastically change our body posture at many of these stages. Remapping in touch must therefore pose a difficult problem for the brain.
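
Formally, the remapping for a single skin patch is just a rotation of the skin-frame motion vector by the patch's current orientation in space. The following minimal 2D sketch makes this explicit; describing finger orientation as a single angle in the environmental frame is an assumption made for illustration.

```python
import numpy as np

# Minimal 2D sketch of remapping a skin-frame motion direction into the
# environmental frame. Finger orientation is encoded as a single angle in
# environmental coordinates, an assumption made for illustration.
def to_environmental(skin_vec, finger_angle_rad):
    """Rotate a motion vector from the skin frame into the environment frame."""
    c, s = np.cos(finger_angle_rad), np.sin(finger_angle_rad)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ np.asarray(skin_vec, dtype=float)

# A palm-to-nail motion on the skin ([0, 1] in the skin frame) points upward
# in the environment when the finger points straight up ...
print(np.round(to_environmental([0, 1], 0.0), 3))        # -> [0. 1.] (upward)

# ... but leftward once the finger is rotated 90 degrees counterclockwise,
# even though nothing changed on the skin.
print(np.round(to_environmental([0, 1], np.pi / 2), 3))  # -> [-1. 0.] (leftward)
```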

What happens if we change our posture after adaptation? Will the MAE occur according to the skin coordinate or the environmental coordinate? An MAE that follows the skin coordinate would mean that tactile directional neurons exist at the first stage of cortical processing, where cutaneous signals encoding skin input and kinesthetic signals encoding posture information are still represented independently. An MAE that follows the environmental coordinate would instead suggest that such neurons sit after the integration of cutaneous and kinesthetic signals. To test these possibilities, we examined whether finger posture modulates the direction of the tactile MAE induced by apparent inter-finger motion between the index and middle fingers [6].

In the experiment, we introduced a conflict between the skin coordinate and the environmental coordinate. Adaptation motion stimuli were presented on the index and middle fingers while they were crossed, so that middle-to-index finger motion was perceived as rightward motion and index-to-middle finger motion as leftward motion. Test motion stimuli were then presented on the same two fingers uncrossed. If the MAE follows the skin coordinate, middle-to-index adaptation should induce an index-to-middle aftereffect; with the fingers uncrossed, that is, rightward adaptation should induce a rightward aftereffect. If the MAE instead follows the environmental coordinate, rightward adaptation should induce a leftward aftereffect, opposite to the skin-coordinate prediction.
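
The logic of this design can be written out explicitly. The sketch below enumerates the predicted aftereffect under each hypothesis for the crossed-adaptation/uncrossed-test posture; the way posture is encoded (as the environmental direction of middle-to-index motion) is an assumption made for illustration.

```python
# Predicted aftereffect direction under the two hypotheses for the
# crossed-adaptation / uncrossed-test design. Posture is encoded as the
# environmental direction of middle-to-index finger motion (right hand,
# palm down): rightward when the fingers are crossed, leftward when they
# are uncrossed. This encoding is an assumption made for illustration.
MID_TO_INDEX = {"crossed": "right", "uncrossed": "left"}

def flip(direction):
    """The aftereffect runs opposite to the adapted direction."""
    return {"left": "right", "right": "left"}[direction]

# Adaptation: middle-to-index motion with crossed fingers, felt as rightward.
adapt_env = MID_TO_INDEX["crossed"]          # 'right'

# Skin-coordinate hypothesis: the aftereffect opposes the adapted *skin*
# direction, so the test feels like index-to-middle motion, which with
# uncrossed fingers is rightward in the environment.
skin_pred = flip(MID_TO_INDEX["uncrossed"])  # 'right'

# Environmental-coordinate hypothesis: the aftereffect opposes the adapted
# *environmental* direction, so the test feels leftward.
env_pred = flip(adapt_env)                   # 'left'

print(f"skin-coordinate prediction:  {skin_pred}")  # right
print(f"environmental prediction:    {env_pred}")   # left
```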

We found that the direction of the tactile MAE was determined by the environmental coordinate (Fig. 3). When participants adapted to rightward motion (i.e., motion in the middle-to-index finger direction with the fingers crossed), they tended to report direction-ambiguous test stimuli as leftward motion; in skin coordinates, the aftereffect thus ran in the same middle-to-index direction as the adaptation, which is only possible if the environmental coordinate defines the illusion. We also found that the MAE disappeared when the index and middle fingers were vertically aligned during adaptation, so that the adaptation motion was vertical while the test motion was horizontal. These results suggest that the direction of tactile motion is defined after skin input and posture information are integrated. In addition, we found no MAE when the adaptation motion was presented on the left hand and the test motion on the right hand. This result suggests that the direction of tactile motion is likely defined separately for each side of the body, and that the effect involves tactile-specific processing rather than high-level supra-modal motion processing. In summary, this study provides a novel behavioral method for probing how tactile motion is remapped from skin space into environmental/perceptual space.


Fig. 3. Schematic diagram of the MAE across fingers observed in our study [6]. When participants adapted to one-directional motion for a while with the finger postures shown in the top images, they tended to report direction-ambiguous test stimuli as the directional or non-directional motions shown in the bottom images.

4. Future research prospects

Our research group is working to clarify the psychophysical mechanisms for visual, auditory, and tactile inputs. Understanding these mechanisms in the brain is also essential for developing user interfaces. For example, our group has developed a tactile navigation system called Buru-Navi (Fig. 4) [7]. Navigation through touch is a promising application for the future: it can provide direction information in an intuitive way, as if one's hand were being pulled, and, most importantly, the information can be obtained without interfering with visual or auditory input.


Fig. 4. The Buru-Navi3 force display device [7], which generates a sensation of being pulled or pushed by exploiting the characteristics of human perception.

At the same time, as mentioned above, direction in touch is non-unique; it has many possible definitions. The direction is first mapped in the skin coordinate and then remapped into hand, arm, and body coordinates by taking body posture into account. The question, then, is to which coordinate a navigation system should refer when it presents a direction on the fingertip. Our results suggest that the direction in the hand/environmental coordinate is a better choice than that in the skin/finger coordinate, since the brain calculates the direction on a fingertip after taking finger posture into account. Other questions remain, such as whether the direction is calculated in a hand-centered or body-centered coordinate, and whether the same MAE can be observed (i.e., whether the same mechanism operates) when motion is presented with the palm facing up. By gaining a more thorough understanding of information processing in the brain, we hope to contribute to developing unique, reliable, and user-friendly devices in the future.

References

[1] A. Wohlgemuth, “On the After-effect of Seen Movement,” British Journal of Psychology (Monograph Supplement), Vol. 1, pp. 1–117, 1911.
[2] G. Mather, F. A. J. Verstraten, and S. M. Anstis, “The Motion Aftereffect: a Modern Perspective,” MIT Press, Cambridge, MA, USA, 1998.
[3] S. Nishida, “Advancement of Motion Psychophysics: Review 2001–2010,” Journal of Vision, Vol. 11, No. 5, Article 11, 2011. doi: 10.1167/11.5.11.
[4] P. J. Planetta and P. Servos, “The Tactile Motion Aftereffect Revisited,” Somatosensory and Motor Research, Vol. 25, No. 2, pp. 93–99, 2008.
[5] J. Watanabe, S. Hayashi, H. Kajimoto, S. Tachi, and S. Nishida, “Tactile Motion Aftereffects Produced by Appropriate Presentation for Mechanoreceptors,” Experimental Brain Research, Vol. 180, No. 3, pp. 577–582, 2007.
[6] S. Kuroki, J. Watanabe, K. Mabuchi, S. Tachi, and S. Nishida, “Directional Remapping in Tactile Inter-finger Apparent Motion: a Motion Aftereffect Study,” Experimental Brain Research, Vol. 216, No. 2, pp. 311–320, 2012.
[7] T. Amemiya, S. Takamuku, S. Ito, and H. Gomi, “Buru-Navi3 Gives You a Feeling of Being Pulled,” NTT Technical Review, Vol. 12, No. 11, 2014.
https://www.ntt-review.jp/archive/ntttechnical.php?contents=ntr201411fa4.html
Shinobu Kuroki
Research Scientist, Sensory Representation Research Group, Human Information Science Laboratory, NTT Communication Science Laboratories.
She received a Ph.D. in information science and technology from the University of Tokyo in 2011. Her research is focused on human tactile processing, particularly frequency perception, time perception, and motion perception.
