Preprint

Continuous Vibration-Driven Virtual Tactile Motion Perception Across Fingertips

Submitted: 14 July 2025; Posted: 18 July 2025
Abstract
Motion perception is a fundamental function of the tactile system, essential for object exploration and manipulation. While human studies have largely focused on discrete or pulsed stimuli with staggered onsets, many natural tactile signals are continuous and rhythmically patterned. Here, I investigate whether phase differences between simultaneously presented, continuous amplitude-modulated vibrations can induce the perception of motion across fingertips. Participants reliably perceived motion direction at modulation frequencies up to 1 Hz, with discrimination performance systematically dependent on the phase lag between vibrations. Critically, trial-level confidence reports revealed the lowest certainty for anti-phase (180°) conditions, consistent with stimulus ambiguity as predicted by the mathematical framework presented here. I propose two candidate computational mechanisms for tactile motion processing. The first is a conventional cross-correlation computation over the envelopes; the second is a probabilistic model based on the uncertain detection of temporal reference points (e.g., envelope peaks) within threshold-defined windows. This model, despite having only a single parameter (an uncertainty width determined by an amplitude discrimination threshold), accounts for both the non-linear shape and the asymmetries of the observed psychometric functions. These results demonstrate that the human tactile system can extract directional information from distributed phase-coded signals in the absence of spatial displacement, revealing a motion perception mechanism that parallels arthropod systems but potentially arises from distinct perceptual constraints.

1. Introduction

Motion is a fundamental quality of sensory input. In vision, despite diverse evolutionary trajectories across different species, from insects and cephalopods to vertebrates, visual systems have converged on fundamentally similar mechanisms of motion processing [1,2]. However, motion is not exclusive to vision; it is also a hallmark of the tactile sensory system, with considerable behavioural relevance for both animals and humans. Everyday interactions such as object manipulation and haptic exploration involve relative motion between the skin and surfaces [3]. For example, discerning roughness and smoothness, identifying material properties (e.g., metal vs. wood) or recognising object shapes requires dynamic contact through palpation and movement. Reading Braille depends on lateral movement of the fingertips to interpret sequences of raised dots. Tactile motion processing underpins fine motor control and precise object manipulation [4,5]. Yet the perceptual and computational mechanisms underlying tactile motion remain incompletely understood.
Two principal sources of information for tactile motion have been identified in the literature [5,6]. The first relies on the sequential activation of mechanoreceptors at different skin locations as an object or stimulus moves across the skin – such as when an insect crawls along the arm. This includes apparent tactile motion, typically studied experimentally using discrete, pulsed stimuli delivered in succession to separate skin sites [7,8,9,10,11,12,13,14]. The second source involves skin deformation cues, particularly shear and stretch, which arise during sliding contact or friction. These deformations can convey directional information [15], potentially through recruitment of distinct afferent populations, such as slowly adapting type II (SA2) units, which are sensitive to stretch and contribute to motion direction perception [16].
At the level of the periphery, slowly adapting type I (SA1) afferents convey high-resolution spatial information about contact location [4,17,18]. The spatiotemporal pattern of their population activity is thought to encode motion direction and speed with acuity comparable to human perceptual performance [18]. Rapidly adapting type I (RA1) afferents may also contribute to motion encoding via spatiotemporal patterns, though with lower spatial precision due to their larger receptive fields [18]. SA2 afferents, previously mentioned, respond to skin stretch and support motion direction perception through their tuning to deformation patterns [15,16].
Here, I investigate a third and less explored mechanism for tactile motion: the perception of motion across fingers induced by asynchronous, continuous streams of vibrotactile input delivered to two fingertips simultaneously. Unlike prior studies that typically rely on either (a) sequential, pulsed stimulations delivered to multiple discrete skin sites [7,9,11,14,19], or (b) physical sliding stimuli that engage friction-induced skin deformation [5,6], this paradigm involves two spatially fixed but temporally dynamic inputs. The stimulation does not involve skin movement or high spatial acuity, but rather evokes motion percepts through temporal phase differences between inputs to two fingerpads.
Fingertips are among the most densely innervated tactile regions of the mammalian and human body [20], and are primary organs for active exploration [3]. This work uses continuous amplitude-modulated vibrations known to predominantly recruit rapidly adapting type II (RA2) afferents – i.e., Pacinian corpuscles – which are sensitive to high-frequency vibratory energy and exhibit large receptive fields [21,22,23]. Unlike SA1 and RA1 afferents, RA2 units are less sensitive to fine spatial features but can detect remote vibratory events across skin and even bone. Notably, the stimuli here are delivered over the entire fingertip pad, eliminating reliance on fine spatial localisation and instead leveraging temporal synchrony or asynchrony across digits. This approach is analogous in principle to mechanisms found in certain arthropods, such as chelicerates, which detect and localise remote vibratory sources using their paired appendages [24]. Similarly, humans may infer the motion direction or location of a remote source by comparing asynchronous vibratory input across fingerpads [25] – effectively extending tactile spatial perception beyond the point of contact.
In this study, I first formalise the physical basis for detecting the location and direction of a remotely moving vibration source, and how such stimuli can be simulated through asynchronous amplitude-modulated input to two fingertips. I then present a series of psychophysical experiments characterising human perceptual performance in detecting the direction of such inferred motion, revealing a novel mode of tactile motion perception that operates independently of spatial acuity or physical surface movement.

2. Materials and Methods

Vibrotactile stimulation is a versatile method for conveying spatiotemporal information through the skin and has been widely employed in both fundamental research and haptic technology applications [26]. Arrays of tactors delivering temporally staggered pulses have been used to generate apparent motion across body surfaces such as between hands, along the arm, or across the back, simulating a moving tactile stimulus without physical displacement [8,12,19,26,27]. While such paradigms rely on discrete bursts or pulses with differences in stimulus onset timing across spatially distinct sites to evoke motion percepts, the current study employs continuous amplitude-modulated waveforms with controlled phase offsets, enabling investigation of motion perception from continuous, distributed, phase-coded signals in the absence of distinct onsets or spatial displacement.
To model the stimuli generated by a remotely vibrating source, such as a mobile phone on a table, we consider a point source emitting a sinusoidal carrier wave $A_0 \sin(2\pi f_c t)$, where $A_0$ and $f_c$ denote the amplitude and carrier frequency of vibration, respectively. As this vibration propagates through the substrate – e.g., the table – approximately as a plane wave, it undergoes attenuation due to dissipation, scattering, and absorption. This attenuation typically follows an exponential decay with distance:
$A(d) = A_0 \exp(-\gamma d),$
where γ is the attenuation coefficient (dependent on medium properties and frequency), and d is the distance from the source.
When the source moves relative to a fixed point, the amplitude at that point changes with time due to the variation in distance. These changes are proportional to the radial component of the source’s motion. In the special case of periodic movement at a frequency $f \ll f_c$, the received signal envelope at a fixed remote point with time-varying distance $d(t)$ is itself periodic, and the received signal can be written as:
$A(t + t_0) = A_0 \exp\!\left(-\gamma\, d(t)\right) \sin\!\left(2\pi (f_c + \Delta f)\, t\right),$
where $t_0$ is the delay due to propagation over distance $d(t)$, and $\Delta f$ is the Doppler shift caused by the source’s motion. Both parameters $t_0$ and $\Delta f$ depend on the wave propagation speed in the medium, which is determined by its stiffness and density (e.g., ∼5790 m s⁻¹ in stainless steel, and ∼3960 m s⁻¹ in hardwood).
For distances on the order of a meter or less, $t_0$ corresponds to sub-millisecond delays (on the order of hundreds of microseconds or less), well below biologically plausible detection thresholds. Moreover, assuming slow motion of the source relative to the wave propagation speed, and $f \ll f_c$, the Doppler shift $\Delta f$ is negligible. Henceforth, I assume $t_0 \approx 0$ and $\Delta f \approx 0$.
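As a rough numerical check of these approximations (a minimal sketch with assumed, illustrative values rather than measured parameters), consider a source one metre from the touch point on a hardwood substrate:

```python
# Illustrative check that propagation delay and Doppler shift are negligible.
# All values below are assumptions for the example, not measured parameters.
c = 3960.0    # wave propagation speed in hardwood (m/s), as quoted above
d = 1.0       # source-to-touch-point distance (m)
v = 0.1       # assumed radial speed of the source (m/s)
f_c = 100.0   # carrier frequency used in the experiments (Hz)

t0 = d / c                 # propagation delay (s)
delta_f = f_c * v / c      # first-order Doppler shift (Hz)

print(f"t0 ~ {t0 * 1e6:.0f} microseconds")   # ~253 us, well below tactile thresholds
print(f"delta_f ~ {delta_f * 1e3:.1f} mHz")  # ~2.5 mHz, negligible relative to f_c
```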

2.1. Motion Direction Estimation via Two Touch Points

When a sensor (e.g., a fingertip) is placed at a fixed point (hereafter a ‘touch point’), the direction of source movement along the radial axis (toward or away from the touch point) can be inferred from temporal changes in the vibration envelope. However, a single touch point provides no information about the tangential component of the motion. To recover trajectory information, at least two touch points positioned at distinct spatial locations are required (Figure 1).
Let $T_1$ and $T_2$ denote two such points. The envelopes of vibration received at these two locations can, in principle, be used to infer the moment-by-moment position of a moving source in 2D. However, the reconstruction is ambiguous: any trajectory and its mirror reflection across the line connecting the two touch points (the touch-point axis) yield identical vibration patterns. This is illustrated in Figure 1A.
In 3D, the ambiguity increases: all trajectories that are rotationally symmetric around the touch-point axis produce indistinguishable vibration profiles at the two points. That is, any trajectory that can be rotated about this axis into another remains perceptually equivalent at the touch points, leading to an infinite number of trajectories that create an identical vibration pattern at the two touch points $T_1$ and $T_2$.

2.1.1. In-Phase Vibrations

As discussed, a periodic source movement with frequency f results in a periodic amplitude modulation at each touch point. If the envelopes at $T_1$ and $T_2$ vary together over time – i.e., they are monotonic transformations of one another under a strictly increasing odd function – they are said to be ‘in phase’. For example, consider a trajectory confined to a plane perpendicular to the touch-point axis (Figure 1B). The closest and farthest positions on the trajectory to $T_1$ are identical to those to $T_2$. Let $h(t)$ denote the instantaneous orthogonal distance from the touch-point axis to the source trajectory at any moment t, and let $r_i$ be the perpendicular distance from touch point $T_i$ to the plane of the trajectory. The amplitude at each touch point is given by:
$A_i(t) = A_0 \exp\!\left(-\gamma \sqrt{r_i^2 + h^2(t)}\right),$
and the derivative:
$\frac{d}{dt} A_i(t) = -\gamma A_0 \frac{h(t)}{\sqrt{r_i^2 + h^2(t)}} \exp\!\left(-\gamma \sqrt{r_i^2 + h^2(t)}\right) \frac{dh}{dt}.$
This shows that the envelope at each touch point changes in the same direction (increasing or decreasing together), confirming they are in phase. Additionally, one can show:
$A_2(t) = A_0 \exp\!\left(-\sqrt{\gamma^2\left(r_2^2 - r_1^2\right) + \ln^2\!\frac{A_1}{A_0}}\right),$
which is a strictly increasing function of $A_1$, again confirming phase alignment.

2.1.2. Anti-Phase Vibrations

Two waveforms are anti-phase if their envelopes exhibit a phase difference of $\pi$, such that when one increases, the other decreases. This occurs when the envelopes are related through a negatively proportional transformation under a strictly increasing odd function. In this case, the perceived direction of motion alternates across each half-cycle, creating a bouncing or bidirectional trajectory. Perceptually, such out-of-phase vibration patterns may give rise to bistable motion perception, wherein the ambiguous temporal dynamics support two competing interpretations, each corresponding to motion in one of the two opposite directions, which may alternate spontaneously over time. Examples of out-of-phase configurations are shown in Figure 1D and E.

2.2. Circular Motion of a Vibrating Source

Consider a point source moving along a circular trajectory with radius $r_0$ and constant tangential velocity v (Figure 1C). The frequency of motion is
$f = \frac{v}{2\pi r_0}.$
Let T be a touch point located at polar coordinates $(r, \varphi)$ relative to the centre of the circular path O. The received vibration at time t is given by:
$A(t) = A_0 \exp\!\left(-\gamma\sqrt{r^2 + r_0^2 - 2 r r_0 \cos(2\pi f t - \varphi)}\right) \sin(2\pi f_c t),$
where $A_0$ is the source amplitude, $f_c$ is the carrier frequency, and $\gamma$ is the attenuation coefficient of the medium. The envelope of the received vibration is the time-varying function:
$A_0 \exp\!\left(-\gamma\sqrt{r^2 + r_0^2 - 2 r r_0 \cos(2\pi f t - \varphi)}\right),$
which oscillates at frequency f, between a minimum of $A_0 \exp\!\left(-\gamma (r + r_0)\right)$ and a maximum of $A_0 \exp\!\left(-\gamma (r - r_0)\right)$.
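As a minimal numerical sketch (Python, with arbitrary assumed parameter values), the circular-trajectory envelope above can be evaluated and checked against these bounds:

```python
import numpy as np

# Assumed, illustrative parameters (not the experimental values)
A0, gamma = 1.0, 2.0         # source amplitude and attenuation coefficient
r, r0, phi = 1.5, 1.0, 0.0   # touch-point radius, trajectory radius, azimuth
f = 0.5                      # motion frequency (Hz)

t = np.linspace(0, 2 / f, 2000)   # two motion cycles
dist = np.sqrt(r**2 + r0**2 - 2 * r * r0 * np.cos(2 * np.pi * f * t - phi))
envelope = A0 * np.exp(-gamma * dist)

# The envelope oscillates at f between the analytic bounds stated above
assert np.isclose(envelope.min(), A0 * np.exp(-gamma * (r + r0)), atol=1e-3)
assert np.isclose(envelope.max(), A0 * np.exp(-gamma * (r - r0)), atol=1e-3)
```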

Extension to 3D:

In three dimensions, the equation for the received signal becomes:
$A(t) = A_0 \exp\!\left(-\gamma\sqrt{r^2 + r_0^2 - 2 r r_0 \cos\alpha \cos(2\pi f t - \varphi)}\right) \sin(2\pi f_c t),$
where $\alpha$ is the elevation of the touch point T relative to the trajectory plane P, and $\frac{\pi}{2} - \alpha$ is the corresponding inclination angle in spherical coordinates (Figure 1C).

Phase Differences from Geometry

Let P denote the plane of the circular trajectory, centred at O. Consider two touch points, $T_1$ and $T_2$, with projections $T'_1$ and $T'_2$ onto plane P. Without loss of generality, let the spherical coordinates (inclination, azimuth, radial distance) of the two touch points be $(\theta_1, \varphi, r_1)$ and $(\theta_2, 2\pi - \varphi, r_2)$, respectively. According to Eq. 9, the received vibrations at the two touch points are:
$A_1(t) = A_0 \exp\!\left(-\gamma\sqrt{r_1^2 + r_0^2 - 2 r_1 r_0 \sin\theta_1 \cos(2\pi f t - \varphi)}\right) \sin(2\pi f_c t),$
$A_2(t) = A_0 \exp\!\left(-\gamma\sqrt{r_2^2 + r_0^2 - 2 r_2 r_0 \sin\theta_2 \cos(2\pi f t + \varphi)}\right) \sin(2\pi f_c t).$
Thus, the envelope phase difference between the two points is $\Delta\varphi = 2\varphi$.
Assume that the projected points and the centre of the circular path O lie on a straight line (Figure 1D). Then, if O lies between $T'_1$ and $T'_2$ (i.e., $\varphi = \frac{\pi}{2}$), the envelopes of the vibrations are anti-phase (Figure 1D and E). Conversely, if O lies outside the segment connecting $T'_1$ and $T'_2$ – i.e., $\varphi = \pi$ – the envelopes are in phase.
For simplicity, hereafter, we focus on the 2D symmetric case where the centre of the circular trajectory lies on the perpendicular bisector of the segment connecting the two touch points $T_1$ and $T_2$, such that $r_1 = r_2$. In this configuration, the two touch points are equidistant from the centre, resulting in vibration envelopes with equal amplitude range. This condition facilitates visualisation and analysis of in-phase and out-of-phase conditions in a two-dimensional geometry.

2.3. Experimental Procedure

Three psychophysical experiments were conducted to investigate vibrotactile motion direction discrimination using a common two-alternative forced-choice (2-AFC) discrete-trial paradigm. In each experiment, participants reported the perceived direction of vibrotactile motion (left vs. right) generated by two amplitude-modulated vibrations delivered simultaneously to the index and middle fingertips of the right hand (see details below). All experimental procedures were approved by the Monash University Human Research Ethics Committee (MUHREC) and conducted in accordance with approved guidelines.

2.3.1. Participants

A total of 26 participants (12 female, age range: 19–34, 1 left-handed) took part across the three experiments. All were undergraduate or graduate students at Monash University. Each experiment involved distinct participant groups. All participants provided written informed consent prior to the experiment.

2.3.2. Vibro-Tactile Stimulation

In all experiments, vibrotactile stimuli were delivered simultaneously to the index and middle fingertips of the right hand using two miniature solenoid transducers (PMT-20N12AL04-04, Tymphany HK Ltd; 4 Ω, 1 W, 20 mm diameter) mounted 5 cm apart on a vibration-isolated pad. Stimuli were generated in MATLAB (MathWorks Inc.) at a sampling rate of either 48 kHz or 192 kHz, and output through a Creative Sound Blaster Audigy Fx 5.1 sound card (model SB1570). The peak-to-peak amplitude of the output waveform was set to 1.98 V. The shape and curvature of the transducer matched the size and contour of adult fingertips [28]. The base (carrier) frequency of the vibrations was $f_c = 100$ Hz. Although this frequency is within the audible range, we verified during pilot testing that the stimuli were not audible to participants and could only be perceived via touch. Amplitude modulation (AM) was applied to generate low-frequency envelopes, with each trial containing 3 modulation cycles. The modulation amplitude was set well above detection threshold, and pilot testing confirmed that even halving the amplitude had negligible effects on performance in the motion discrimination task.
In all experiments, sinusoidal envelopes were used due to their mathematical and physical properties. Sinusoids are fundamental in Fourier decomposition and are the only waveforms that preserve their shape under summation with others of the same frequency. Sinusoidal modulation mimics natural oscillatory signals (e.g., wind, light, and sound waves) and implies motion with varying velocity, similar to pendular or spring-mass dynamics.
For a given phase difference $\Delta\varphi$, the two sinusoidally modulated vibrations were defined as:
$A_1(t) = \frac{A_0}{2}\left[1 + \sin\!\left(2\pi f t + \frac{\Delta\varphi}{2}\right)\right] \sin(2\pi f_c t), \qquad A_2(t) = \frac{A_0}{2}\left[1 + \sin\!\left(2\pi f t - \frac{\Delta\varphi}{2}\right)\right] \sin(2\pi f_c t).$
To avoid any response bias or cues about motion direction arising from differences in initial envelope amplitude, vibration onset was set to one of the two isoamplitude points where the envelopes were identical. For non-zero $\Delta\varphi$, these occur at $t_0 = \frac{1}{4f}$ and $\frac{3}{4f}$, with corresponding envelope amplitudes of $\frac{A_0}{2}\left(1 + \cos\frac{\Delta\varphi}{2}\right)$ and $\frac{A_0}{2}\left(1 - \cos\frac{\Delta\varphi}{2}\right)$, respectively (Figure 2C). Since cosine is an even function, the envelope magnitudes are identical for $+\Delta\varphi/2$ and $-\Delta\varphi/2$. At each of these onset points, the envelopes have opposite slopes – one rising and the other falling – corresponding to opposite directions of motion along the circular trajectory. On each trial, one of these two onset points was selected at random with equal probability, ensuring that the initial envelope phase provided no reliable cue about motion direction.
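The stimuli themselves were generated in MATLAB; the following Python sketch is only an illustrative reconstruction of Eq. 11 and the random isoamplitude onset (the function name and default values are assumptions, not the original code):

```python
import numpy as np

def make_am_pair(delta_phi_deg, f=0.5, fc=100.0, fs=48_000, cycles=3, A0=1.0,
                 rng=np.random.default_rng()):
    """Two AM vibrations (Eq. 11) with an envelope phase difference delta_phi,
    starting at one of the two isoamplitude points chosen at random."""
    dphi = np.deg2rad(delta_phi_deg)
    # Isoamplitude onsets at t0 = 1/(4f) or 3/(4f); one is picked per trial
    t0 = rng.choice([1.0 / (4 * f), 3.0 / (4 * f)])
    t = t0 + np.arange(int(cycles / f * fs)) / fs

    env1 = 0.5 * A0 * (1 + np.sin(2 * np.pi * f * t + dphi / 2))
    env2 = 0.5 * A0 * (1 + np.sin(2 * np.pi * f * t - dphi / 2))
    carrier = np.sin(2 * np.pi * fc * t)
    return env1 * carrier, env2 * carrier

# Example: 90 degree phase difference at 0.5 Hz -> three 2 s modulation cycles
x1, x2 = make_am_pair(90)
```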
In Experiment 1, we additionally included stimuli with exponentially decaying envelopes to simulate more realistic, physically plausible patterns of vibration propagation. The envelopes were derived from Eq. 10, using fixed parameters $r_0 = r_1 = r_2 = 1$ and $\gamma = \frac{\sqrt{2}\,\log 20}{2}$:
$A_1(t) = A_0 \exp\!\left(-\log 20 \sqrt{1 - \cos\!\left(2\pi f t + \frac{\Delta\varphi}{2}\right)}\right) \sin(2\pi f_c t), \qquad A_2(t) = A_0 \exp\!\left(-\log 20 \sqrt{1 - \cos\!\left(2\pi f t - \frac{\Delta\varphi}{2}\right)}\right) \sin(2\pi f_c t).$
These stimuli were also initiated at one of the two isoamplitude points selected randomly on each trial, consistent with the sinusoidal condition.

2.4. Motion Direction Discrimination Task

Participants performed a discrete-trial two alternative forced-choice (2-AFC) task to judge the perceived direction of vibrotactile motion. On each trial, two amplitude-modulated vibrations were delivered simultaneously to the index and middle fingertips of the right hand. Participants were instructed to gently rest their fingertips on the transducers without applying force (Figure 2). The two transducers were spaced 5 cm apart on a vibration-isolated pad, arranged such that vibrations from one transducer were not perceptible at the other. Participants rested their arm on the chair armrest with their wrist comfortably supported on a padded surface aligned with the stimulation platform. They were instructed to maintain a stable hand posture throughout each session. All participants reported clear perception of the envelope modulation, and pilot testing confirmed that the stimulus amplitude was well above detection threshold.
The task was self-paced, with all responses made via keyboard. On each trial, a pair of vibrations with a specific envelope phase difference was presented for three cycles (e.g., 6 s at an envelope frequency of 0.5 Hz). Participants reported the perceived motion direction (leftward or rightward) by pressing the corresponding arrow key with their left hand. There was no time limit for responses, and participants could respond at any moment during or after stimulation.
The specific phase differences and envelope modulation frequencies varied across the three experiments. In Experiment 1, we compared sinusoidal and exponential envelopes with phase differences of ±30° to ±150° (in 30° increments) at a fixed envelope frequency of 0.5 Hz. In Experiment 2, sinusoidal envelopes were used with phase differences ranging from −180° to 180° in 30° increments, tested at envelope frequencies of 0.5, 1, and 1.5 Hz. In Experiment 3, sinusoidal envelopes were tested at phase differences of 0°, ±30°, ±60°, ±90°, and 180° at a fixed frequency of 0.5 Hz, with participants additionally providing confidence ratings after each response by pressing a number key from “1” (no confidence) to “5” (absolute certainty). These ratings were linearly scaled to a 0–100% confidence range. Experimental conditions were presented in pseudo-random order across trials, with approximately 30 repetitions per condition.

2.5. Psychometric Modelling

To quantify sensitivity to phase differences at each modulation frequency, we modelled perceptual discrimination performance as a function of phase difference $\Delta\varphi$ using a nonlinear periodic-sigmoid psychometric function:
$P(\Delta\varphi) = \frac{1}{1 + e^{-\kappa \sin\Delta\varphi}},$
where $P(\Delta\varphi)$ denotes the predicted proportion of correct responses at a given phase difference $\Delta\varphi$, and $\kappa$ is a sensitivity parameter reflecting the steepness of the psychometric function. Higher $\kappa$ values indicate greater sensitivity to phase differences. This model captures the periodic geometry of the stimulus, predicting chance-level performance ($P = 0.5$) at 0° and 180°, where the vibrations provide no directional cue. The sinusoidal form, combined with its single-parameter structure, avoids overfitting and reflects the hypothesised mechanism of motion perception based on phase-difference readout across spatially separated tactile sensors. The model was fit to group-averaged accuracy data using nonlinear least-squares regression, and model performance (goodness-of-fit) was assessed using the coefficient of determination ($R^2$).
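A minimal sketch of this fitting step (Python with SciPy nonlinear least squares; the accuracy values below are placeholders, not the experimental data) could look as follows:

```python
import numpy as np
from scipy.optimize import curve_fit

def periodic_sigmoid(delta_phi_rad, kappa):
    """Periodic-sigmoid psychometric function: P(correct) vs phase difference."""
    return 1.0 / (1.0 + np.exp(-kappa * np.sin(delta_phi_rad)))

# Placeholder group-averaged accuracies at the tested phase differences
phase_deg = np.array([30, 60, 90, 120, 150])
accuracy = np.array([0.71, 0.78, 0.80, 0.74, 0.65])   # illustrative values only

popt, _ = curve_fit(periodic_sigmoid, np.deg2rad(phase_deg), accuracy, p0=[1.0])
kappa_hat = popt[0]

pred = periodic_sigmoid(np.deg2rad(phase_deg), kappa_hat)
ss_res = np.sum((accuracy - pred) ** 2)
ss_tot = np.sum((accuracy - np.mean(accuracy)) ** 2)
r_squared = 1 - ss_res / ss_tot   # goodness-of-fit (coefficient of determination)
```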

2.6. Probabilistic Model of Temporal Reference Detection and Cycle Disambiguation

Consider two AM vibrations with phase difference $\Delta\varphi$, which corresponds to a temporal lag d between their envelopes:
$d = \frac{\Delta\varphi}{2\pi f},$
where f is the envelope modulation frequency. Let $t_1$ and $t_3$ denote the perceived moments of two consecutive salient reference points (e.g., peaks) of vibration 1, and let $t_2$ denote the corresponding reference point of vibration 2 that occurs between $t_1$ and $t_3$. These reference points are extracted from the envelopes of the amplitude-modulated vibrations. Hereafter, we focus on peak features, but the same logic applies to other amplitude landmarks (e.g., troughs or zero-crossings). Due to sensory noise and perceptual limits, each detected peak is assumed to lie within a temporal uncertainty window around the true peak time. For each reference point, I model the perceived time as being uniformly distributed within a window of width w centred at the true peak. This uncertainty window depends on the perceptual threshold with which the envelope is extracted. For instance, assuming a sinusoidal envelope in Eq. 11, the time intervals where the envelope deviates from the peak amplitude by less than a threshold value $\delta$ correspond to durations satisfying $|A_i(t) - A_0| \le \delta$, which implies:
$t \in \left[\frac{1}{2\pi f}\arcsin\!\left(1 - \frac{2\delta}{A_0}\right) - \frac{d}{2},\; \frac{1}{2f} - \frac{1}{2\pi f}\arcsin\!\left(1 - \frac{2\delta}{A_0}\right) - \frac{d}{2}\right],$
so that the total uncertainty window is:
$w = \frac{1}{2f} - \frac{1}{\pi f}\arcsin\!\left(1 - \frac{2\delta}{A_0}\right).$
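For intuition, a brief numerical illustration of the lag d and window w (Python; the threshold ratio and frequency below are assumed values, not fitted parameters):

```python
import numpy as np

f = 0.5             # envelope modulation frequency (Hz)
delta_ratio = 0.25  # assumed threshold as a fraction of peak amplitude, delta/A0
dphi = np.deg2rad(30)

d = dphi / (2 * np.pi * f)                                      # envelope lag (s)
w = 1 / (2 * f) - np.arcsin(1 - 2 * delta_ratio) / (np.pi * f)  # uncertainty window (s)

print(f"lag d ~ {d * 1000:.0f} ms, window w ~ {w * 1000:.0f} ms")
# Here d ~ 167 ms while w ~ 667 ms: the window is wide relative to the lag,
# so temporal-order errors (Section 2.6.1) remain likely at this phase difference.
```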
Since the vibrations are periodic and have identical envelope shapes (with vibration 2 being a phase-shifted version of vibration 1), the reference point of vibration 2 is shifted by the lag d. Similarly, $t_3$ occurs one cycle after $t_1$, i.e., with an offset of T, where $T = \frac{1}{f}$ is the envelope modulation period. Without loss of generality, we assume $d > 0$, align the reference points relative to zero, and define their distributions as $t_1 \sim U[0, w]$, $t_2 \sim U[d, d + w]$ and $t_3 \sim U[T, T + w]$. Correct perception of motion direction depends on both (1) the reliability of judging the temporal order of salient reference points (e.g., peaks) and (2) disambiguation of within-cycle versus across-cycle intervals.
Based on these distributions, two forms of perceptual inference are required to judge motion direction: first, the temporal order judgement, i.e., determining whether the peak of vibration 2 occurs after the peak of vibration 1 ($t_2 > t_1$); second, the inter-peak interval discrimination, i.e., comparing whether the interval between $t_1$ and $t_2$ is shorter than the interval between $t_2$ and $t_3$, i.e., testing whether $t_2 - t_1 < t_3 - t_2$.
The sections that follow formalise these probabilities and derive an analytical expression for the overall probability of a correct motion direction judgement.

2.6.1. Temporal Order Judgement

The first source of error arises from uncertainty in judging the temporal order of peaks. If the temporal lag d is smaller than the uncertainty window w, the perceived ordering of $t_1$ and $t_2$ may be incorrect. The probability that the peak of envelope 1 is perceived before that of envelope 2 (i.e., a correct temporal order judgement) is given by the integral of the joint distribution of $t_1$ and $t_2$ over the region $t_2 - t_1 > 0$:
$P(t_2 - t_1 > 0) = \iint_{t_2 - t_1 > 0} \frac{dt_1\, dt_2}{w^2}.$
As $t_1$ and $t_2$ are mutually independent with uniform distributions, $t_2 - t_1$ is distributed triangularly over the interval $[d - w, d + w]$, yielding:
$P(t_1 < t_2) = \begin{cases} 1 - \frac{1}{2}\left(1 - \frac{d}{w}\right)^2 & \text{for } d \le w,\\ 1 & \text{for } d > w.\end{cases}$
The case d > w guarantees correct order due to non-overlapping supports.

2.6.2. Across-Cycle Ambiguity and Inter-Peak Interval Discrimination

As d increases and the uncertainty window extends into the next modulation cycle, another form of error emerges: when the uncertainty around $t_2$ extends beyond the halfway point of the modulation period T, the perceived peak of envelope 2 may fall closer in time to the next peak of envelope 1 (denoted $t_3 \in [T, T + w]$) than to the original one at $t_1 \in [0, w]$. This may result in an incorrect interval comparison, i.e., $t_2 - t_1 > t_3 - t_2$, and thus a misjudged motion direction. Based on the mutually independent uniform distributions of $t_1$, $t_2$ and $t_3$, the probability density function of $V = t_3 - 2t_2 + t_1$ is a piece-wise quadratic function of the form:
$f_V(v + T - 2d) = \begin{cases} \frac{1}{2w}\left(1 - \frac{v^2}{2w^2}\right) & \text{for } |v| \le w,\\ \frac{\left(2w - |v|\right)^2}{4w^3} & \text{for } w < |v| \le 2w,\\ 0 & \text{for } 2w < |v|.\end{cases}$
The probability of avoiding this across-cycle confusion is:
$P(V = t_3 - 2t_2 + t_1 > 0) = \int_0^{T + 2(w - d)} f_V(v)\, dv.$
This probability falls below 1 when $d + w > T/2$.

2.6.3. Joint Probability of Correct Motion Perception

Correct motion perception requires both correct temporal order identification of peaks and correct across-cycle inter-peak interval discrimination. These two conditions are not independent, and their joint probability must be calculated conditionally. Let $\Delta = t_2 - t_1$. The joint probability of a correct response is:
$P_{\text{correct}} = P(\Delta > 0) \cdot P(t_3 + t_1 - 2t_2 > 0 \mid \Delta > 0).$
Using the law of total probability over the distribution of $\Delta$, the second term can be rewritten as:
$P(t_3 + t_1 - 2t_2 > 0 \mid \Delta > 0) = \int_0^{d+w} P(t_3 - t_1 > 2x \mid \Delta = x)\, f_{\Delta \mid \Delta > 0}(x)\, dx,$
where $f_{\Delta \mid \Delta > 0}(x)$ is the conditional probability density function of $\Delta = t_2 - t_1$ given $\Delta > 0$, defined as:
$f_{\Delta \mid \Delta > 0}(x) = \frac{f_\Delta(x)}{P(\Delta > 0)} \quad \text{for } x > 0,$
with $f_\Delta$ denoting the triangular probability density function of $\Delta$ over $[d - w, d + w]$, and the normalisation constant $P(\Delta > 0)$ as derived in Eq. 15. Thus, the joint probability becomes:
$P_{\text{correct}} = \int_0^{d+w} P(t_3 - t_1 > 2x \mid \Delta = x)\, \frac{1}{w}\left(1 - \frac{|x - d|}{w}\right) dx.$
The conditional probability $P(t_3 - t_1 > 2x \mid \Delta = x)$ can be expressed as:
$P(t_3 - t_1 > 2x \mid \Delta = x) = \int_{2x}^{T + w} f_{t_3 - t_1 \mid \Delta = x}(y)\, dy,$
where $f_{t_3 - t_1 \mid \Delta = x}$ is the distribution of the difference between $t_3 \sim U[T, T + w]$ and $t_1$, given $\Delta = x$. The conditional distribution is $t_1 \mid (\Delta = x) \sim U[a(x), b(x)]$, where:
$a(x) = \max(0,\, d - x), \qquad b(x) = \min(w,\, d + w - x),$
so that the support length is $w(x) = b(x) - a(x) = w - |d - x|$, at most w. This leads to a trapezoidal distribution for $t_3 - t_1 \mid \Delta = x$, from which we derive the cumulative probability:
$P(t_3 - t_1 > 2x \mid \Delta = x) = \begin{cases} 1 & x \le \frac{T - b(x)}{2},\\ 1 - \frac{\left(2x + b(x) - T\right)^2}{2w\left(w - |d - x|\right)} & \frac{T - b(x)}{2} < x \le \frac{T - a(x)}{2},\\ \frac{w - |d - x|}{2w} + \frac{T + w - b(x) - 2x}{w} & \frac{T - a(x)}{2} < x \le \frac{T - b(x) + w}{2},\\ \frac{\left(T + w - a(x) - 2x\right)^2}{2w\left(w - |d - x|\right)} & \frac{T - b(x) + w}{2} < x \le \frac{T - a(x) + w}{2},\\ 0 & \frac{T - a(x) + w}{2} < x.\end{cases}$
The total probability of a correct decision is obtained by substituting this into Eq. 18. The full expression combines a triangular distribution for $\Delta$, a trapezoidal distribution for $t_3 - t_1$, and a conditional integration over all valid $\Delta \in [0, d + w]$. Though based on simple assumptions, this model predicts a non-linear psychometric curve that captures key features observed in our data, including asymmetries in performance (e.g., better performance at 30° than at 150° phase lags in Experiment 2).
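As a cross-check of the closed-form expressions, the same prediction can be approximated by direct Monte Carlo simulation of the three uniformly distributed reference times; the sketch below is a minimal illustration (the window w and other values are assumed for demonstration):

```python
import numpy as np

def p_correct_mc(delta_phi_deg, f=0.5, w=0.6, n=200_000,
                 rng=np.random.default_rng(0)):
    """Monte Carlo estimate of P(correct) under the reference-point model:
    t1 ~ U[0, w], t2 ~ U[d, d + w], t3 ~ U[T, T + w]."""
    T = 1.0 / f
    d = np.deg2rad(delta_phi_deg) / (2 * np.pi * f)   # envelope lag
    t1 = rng.uniform(0.0, w, n)
    t2 = rng.uniform(d, d + w, n)
    t3 = rng.uniform(T, T + w, n)
    # Correct judgement requires the right temporal order (t2 after t1) and
    # the within-cycle interval to be the shorter one (t2 - t1 < t3 - t2).
    correct = (t2 > t1) & (t2 - t1 < t3 - t2)
    return correct.mean()

for phase in (30, 90, 150, 180):
    print(phase, round(p_correct_mc(phase), 3))
```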

3. Results and Discussion

3.1. Experiment 1: Direction Discrimination Using Sinusoidal vs. Exponential Envelopes

To examine whether tactile motion perception can arise from simple envelope phase differences alone, I first tested whether participants could discriminate the direction of motion from two simultaneous vibrations with either sinusoidal or naturalistic – i.e., exponential – amplitude-modulated (AM) envelopes. Both envelope types simulated a virtual motion trajectory via systematic phase differences across two fingertips. Participants performed a 2-AFC motion direction discrimination task at an envelope frequency of 0.5 Hz. On average across subjects (n = 8), direction discrimination accuracy was 85.5% ± 4.2% SEM for exponential envelopes and 80.2% ± 5.1% SEM for sinusoidal envelopes (Figure 3). While exponential envelopes yielded slightly higher accuracy, by 5.3% ± 1.5% SEM – possibly due to their closer resemblance to naturalistic wave propagation – participants still showed robust performance with sinusoidal envelopes. This demonstrates that the tactile system can extract directional information purely from sinusoidal phase offsets, despite their more abstract physical basis.

3.2. Experiment 2: Upper Frequency Limit for Tactile Motion Discrimination Lies Below 1.5 Hz

To quantify discrimination performance at each envelope frequency, we fit a sigmoid psychometric model with a sensitivity parameter $\kappa$ to the proportion of correct responses as a function of phase difference (see Methods). At 0.5 Hz, the model fit was robust (coefficient of determination $R^2 = 0.56$) with a relatively high sensitivity parameter $\kappa = 1.29$, predicting a maximum accuracy of 78.4%. At 1 Hz, performance declined moderately ($R^2 = 0.82$, $\kappa = 0.78$), with a predicted maximum accuracy of 68.5% correct responses. At 1.5 Hz, performance approached chance level ($R^2 = 0.36$, $\kappa = 0.17$) with a predicted maximum of just 54.4% correct (Figure 4), indicating that the upper temporal limit for perceiving the direction of tactile motion lies below 1.5 Hz.
These findings contrast with those of Kuroki et al. [25], who examined human sensitivity to AM vibrotactile stimuli up to 20 Hz in a synchronisation–asynchronisation detection task. They reported detection limits at much higher modulation frequencies, indicating that participants could reliably detect synchrony or asynchrony in AM signals at frequencies nearly ten times higher than those supporting motion direction discrimination in our study. Moreover, they reported an inverse relationship between modulation frequency and detection threshold, with higher frequencies yielding better synchrony detection. This discrepancy underscores a critical distinction: while humans are capable of detecting synchrony in high-frequency AM signals, perceiving directional motion from inter-finger phase differences relies on much lower envelope frequencies. These differences point to potentially distinct neural mechanisms supporting temporal coincidence detection versus motion perception in the tactile domain.

3.3. Experiment 3: Cognitive and Metacognitive Signatures of Tactile Motion Perception

Building on Experiments 1 and 2, which established that tactile motion perception depends on the phase difference between fingertip vibrations, Experiment 3 introduced confidence ratings and examined behavioural signatures of perceptual ambiguity. By focusing on phase conditions with minimal directional information (e.g., 0° and 180°), Experiment 3 aimed to characterise how motion uncertainty is reflected in decision confidence, reaction times, and potential choice biases.

Ambiguity at 0° and 180° revealed by choice distribution

To assess whether phase differences between fingertip vibrations generate a reliable perception of motion direction, I examined participants’ choices across the range of phase offsets. For directional phase differences (e.g., ±30°, ±60° and ±90°), performance accuracy captures the extent to which participants reported motion direction consistent with the sign of the phase difference (see Figure 5A). However, at 0° and 180°, the vibrations were either perfectly in-phase or anti-phase across the two fingertips, resulting in symmetric temporal envelopes with no consistent directional cue. As such, “correct” or “incorrect” responses are undefined for these two conditions. I therefore analysed these conditions in terms of choice likelihood – specifically, the proportion of “leftward” responses (Figure 5B).
At 0°, participants selected “left” on 52.2% of trials (SEM = 4.7%), not significantly different from chance (t(11) = 0.48, p = 0.64), consistent with perceptual ambiguity. At 180°, however, participants showed a subtle but reliable leftward bias (mean = 56.8%, SEM = 2.1%), which was significantly above chance (t(11) = 3.24, p = 0.008). This bias suggests that at 180°, even in the absence of reliable directional cues, early envelope asymmetries or internal decision biases may influence motion judgements.

3.3.1. Slower Responses Reflect Ambiguity in Motion Signal

Response time (RT) provides a behavioural index of the strength of sensory evidence. Here, I analysed how RT changed as a function of phase difference to assess how tactile motion signals support directional judgements under varying degrees of ambiguity. As shown in Figure 5C, RTs followed a U-shaped pattern: responses were slower for the ambiguous conditions (0° and 180°) and faster for intermediate phase differences. To statistically assess this pattern, a linear mixed-effects model (with random intercepts per subject) was fitted to trial-level RTs. Trials with excessively long RTs (>15 s) were excluded, and the remaining RTs were z-scored within subject to account for individual baseline differences. The model included phase difference and its square as fixed effects. It revealed a statistically significant quadratic effect ($\beta = 0.0016$, p < 1e-11), consistent with a U-shaped pattern of RTs across phase differences, indicating that participants responded more slowly at both 0° and 180° phase differences, and more quickly for intermediate values (30°–90°), as shown in Figure 5C. This trend is consistent with the interpretation that extreme phase differences (0°, 180°) produce ambiguous or conflicting motion cues, requiring longer processing times. Indeed, the average RTs at 0° and 180° were nearly identical (5.32 ± 0.44 s and 5.32 ± 0.48 s, respectively), and both were higher than for other phase differences.
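A sketch of this analysis in Python (using statsmodels; the file and column names are hypothetical) is given below:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical trial-level data: one row per trial with columns
# 'subject', 'phase' (phase difference in degrees) and 'rt' (seconds)
df = pd.read_csv("trial_data.csv")

df = df[df["rt"] <= 15]                                  # drop excessively long RTs
df["rt_z"] = df.groupby("subject")["rt"].transform(
    lambda x: (x - x.mean()) / x.std())                  # z-score within subject
df["phase_sq"] = df["phase"] ** 2

# Linear mixed-effects model: phase and phase^2 as fixed effects,
# random intercept per subject
model = smf.mixedlm("rt_z ~ phase + phase_sq", data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```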

3.3.2. Confidence Ratings Track Motion Signal Strength

Confidence ratings reflect participants’ subjective assessment of perceptual certainty, providing a metacognitive index of how strongly they perceived the motion signals on each trial. A linear mixed-effects model with subject-wise random intercepts revealed a robust quadratic relationship between confidence and phase difference. Confidence was lowest at the extreme phase difference values (0° and 180°) and highest at intermediate phase differences (30°–90°), mirroring the pattern observed in reaction times (Figure 5D) and consistent with weaker or more ambiguous motion signals at the extremes. This trend was reflected in a significant negative quadratic term ($\beta = -0.0021$, p < 1e-44) and a significant positive linear term ($\beta = 0.36$, p < 1e-35). These results indicate that participants were more confident when phase differences provided stronger directional cues, and less confident when the motion signal was more ambiguous.
In particular, average confidence at 180° was 49.4% ± 6.8% (mean ± SEM across participants), lower than in all other conditions, including 0° (52.8% ± 3.0%), supporting the interpretation that anti-phase stimulation elicits especially uncertain percepts.

3.4. Phase Differences, Not Amplitude Differences, Drive Tactile Motion Perception

The behavioural ambiguity observed at 180° phase difference – reflected in a subtle directional bias, low confidence, and slow responses – raises a critical question: What stimulus features underlie tactile motion perception? Two plausible mechanisms are: (i) motion perception based on the phase difference between two AM signals (i.e., relative temporal shifts in their envelopes), and (ii) perception based on moment-by-moment amplitude differences between the signals.
The present stimulus design enables these alternatives to be dissociated. While ±180° phase differences produce the largest instantaneous amplitude differences between fingertips, they contain no consistent directional information, as the +180° and -180° stimuli are physically identical and indistinguishable. If motion perception were driven by amplitude differences alone, one would expect robust and consistent directional judgements under these conditions – contrary to the observed choice likelihood patterns.
Moreover, an amplitude-based account might predict high confidence on individual trials (despite random direction reports across trials), assuming a salient motion signal. Yet confidence ratings at 180° were the lowest across all phase differences, mirroring the slower responses typically associated with perceptual uncertainty. Together, these findings support a mechanism in which phase differences between signals, not momentary amplitude (or “energy”) differences, drive tactile motion perception.

3.5. Potential Underlying Neural Computations

As in vision, tactile motion perception may rely on multiple neural computations [5,29]. Here, I outline two candidate mechanisms that could support the perception of motion based on phase differences in vibrotactile signals. These mechanisms differ in whether they rely on measures of similarity between temporal patterns or on the relative timing of specific features (e.g., peaks or troughs) in the tactile signals. Below, we briefly discuss each and assess their neural plausibility.

3.5.1. Temporal Cross-Correlation Mechanisms

A plausible computational mechanism underlying tactile motion perception is based on temporal cross-correlation of the continuous tactile sensory inputs received at the two fingers. In this scenario, the brain compares the envelopes of each vibration over a certain temporal window to estimate their relative lag (phase difference), similar to Reichardt detectors [1,30]; the inferred lag then determines the perceived direction of motion. Neural mechanisms for such temporal cross-correlation have been widely studied in other sensory systems. For example, in the auditory system, interaural time differences are computed via temporally sensitive circuits in the medial superior olive, involving coincidence detection mechanisms [31]. In the electrosensory system of weakly electric fish, neurons perform delay-sensitive comparisons between signals from different electroreceptors to extract motion or phase differences of prey [32]. While the mammalian tactile system may not contain dedicated delay lines, some neurons in somatosensory cortex (particularly S1 and S2) exhibit phase-locked responses to frequency modulations [33,34], carrying information about the temporal patterns of sensory inputs. Additionally, cross-digit integration occurs at multiple levels, including primary and secondary somatosensory cortices, where receptive fields often span multiple fingers [35,36]. Such distributed, temporally sensitive representations could support correlation-based decoding of phase relationships. The observed sensitivity to small phase differences (e.g., 30°) in the present study is consistent with this type of integration. Thus, a biologically plausible hypothesis is that populations of neurons in somatosensory cortex, or possibly parietal areas, integrate envelope information and compare their temporal alignment. Population-level decoding of such temporal relationships could underlie the perceptual sensitivity to direction based on phase difference, as observed in our experiments. Whether these computations occur via explicit cross-correlation at the neural level, or are approximated by population-level pooling across temporal patterns, remains to be clarified.
Importantly, these computations are not limited to biological intuition but are also grounded in formal estimation theory. Under assumptions of linearity and Gaussian noise, cross-correlation, least-squares, and maximum likelihood methods yield equivalent estimates for time delay between signals [37]. These mechanisms are sensitive to the overall similarity and alignment of time-varying signals, rather than to discrete features such as peaks or zero-crossings. As such, they can operate continuously and flexibly, and do not depend on precise extraction of singular time points, potentially making them robust to noise.
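As an illustration of this class of computation (a sketch only, not a claim about the neural implementation), the lag between two AM vibrations can be estimated by cross-correlating their demodulated envelopes; parameter values and function names here are assumptions:

```python
import numpy as np

def estimate_envelope_lag(x1, x2, fs, f_env, f_carrier=100.0):
    """Estimate the envelope lag (s) between two AM signals via cross-correlation
    and convert it to an envelope phase difference (degrees)."""
    # Crude envelope extraction: rectify and smooth over one carrier period
    win = max(1, int(fs / f_carrier))
    kernel = np.ones(win) / win
    e1 = np.convolve(np.abs(x1), kernel, mode="same")
    e2 = np.convolve(np.abs(x2), kernel, mode="same")
    e1, e2 = e1 - e1.mean(), e2 - e2.mean()

    xcorr = np.correlate(e1, e2, mode="full")
    lag_samples = np.argmax(xcorr) - (len(e2) - 1)   # positive: envelope 1 lags envelope 2
    lag_s = lag_samples / fs
    # How the sign maps onto leftward/rightward motion depends on the stimulus convention
    phase_deg = 360.0 * f_env * lag_s
    return lag_s, phase_deg

# Example (with the illustrative make_am_pair sketch from Section 2.3.2):
# x1, x2 = make_am_pair(90, fs=48_000)
# estimate_envelope_lag(x1, x2, fs=48_000, f_env=0.5)
```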

3.5.2. A Probabilistic Model from Envelope Landmarks

A second mechanism is that the tactile system detects specific temporal landmarks in the envelope of each vibration – such as peaks, troughs, or other salient features – and infers motion direction based on the temporal order or timing of these events relative to each other. This process is inherently susceptible to sensory variability (noise) and perceptual uncertainty, especially when the modulated envelope changes gradually or when features are close in time.
To formalise this temporal uncertainty, I propose a simple threshold-based model in which a temporal reference point (e.g., a peak) is detected when the change in the envelope exceeds a certain slope or amplitude threshold. Changes below this threshold are not perceived as distinct events. For instance, under this assumption, any portion of the envelope around the true peak whose amplitude lies within the threshold margin is perceptually indistinguishable from the true peak. This introduces variability in the perceived timing of features or leads to missed detections, particularly when the modulation depth is shallow or the envelope is slowly varying.
Such detection uncertainty can lead to errors in temporal order judgements. For instance, two peaks occurring close together in time might be perceived in the wrong order, or the tactile system might match a peak from one vibration to the wrong cycle of the other, especially under large phase differences. These errors impair the brain’s ability to reliably infer motion direction. Importantly, this minimal model – based on a fixed amplitude detection threshold with uniform temporal variability and no complex decoding – produces non-trivial psychometric predictions. As illustrated in Figure 6, the model generates an asymmetric curve of predicted proportion correct as a function of phase difference: performance rises steeply near small positive phase offsets (e.g., 30°), but declines more gradually beyond 90°, reflecting cross-cycle misalignments and detection failures (e.g., at 150°). This asymmetry is evident in our experimental data, particularly in Experiment 2, where performance at a 30° phase difference is higher than at 150°, despite the physical symmetry of the stimuli.
While this model was implemented based on peak detection, the same logic applies to other types of envelope features, including troughs or points of inflection. The key principle is that temporal reference points are perceived only if they exceed a salience threshold, and that perceptual errors emerge from variability in the timing or detectability of these points. This model captures the dual sources of perceptual error: (1) local ambiguity in temporal order when the temporal reference points are too close, and (2) misattribution across cycles when phase differences approach 180°.
Critically, this model explains why perceptual performance deteriorates at large phase differences despite increased amplitude contrast: the temporal lag between reference points (e.g. peaks) is closer to T / 2 , increasing the probability that reference points from one vibration are misattributed to a different cycle of the other. These findings suggest that under threshold-limited temporal resolution, tactile motion perception involves a delicate balance between fine temporal discrimination and the global temporal structure of the stimulus.
Together, these results suggest that tactile motion perception across fingers could be shaped by both temporally global (e.g., cross-correlation) and local (event-based) temporal processing mechanisms, each with distinct neural constraints and noise profiles.

4. Conclusions

This study investigated how the tactile system extracts spatial information about object motion from temporally structured vibrations delivered to two fingertips. Across three experiments, I delivered pairs of amplitude-modulated vibrations – each comprising a 100 Hz carrier modulated by a low-frequency sinusoidal envelope – to simulate continuous tactile motion. By systematically varying the phase difference between the two envelopes, I quantified how inter-fingertip phase offsets influence perceived motion direction, response latency, and confidence.
Our findings confirmed that the direction of perceived motion is determined by the phase difference between the two vibrations and not by their absolute frequency or amplitude. Experiment 1 showed that sinusoidal envelope vibrations reliably elicited robust directional motion percepts, comparable to those evoked by more naturalistic patterns (e.g., exponential decay). Notably, [12] found that gradually ramped vibrotactile stimuli produced stronger and smoother motion percepts than abrupt onsets, consistent with the use of continuous amplitude-modulated vibrations in the present study to simulate naturalistic motion cues. Experiment 2 established that the upper frequency limit for reliable tactile motion discrimination lies below 1.5 Hz, nearly tenfold lower than the frequency limits reported in earlier studies using similar stimuli [25]. Experiment 3 revealed systematic changes in confidence and reaction time with phase difference, with ambiguous conditions (0° and 180°) producing slower responses and lower confidence ratings. Importantly, the 180° condition, despite producing the largest moment-by-moment amplitude differences between fingers, did not yield a consistent percept of direction, suggesting that motion perception depends on phase differences, not amplitude disparity.
Together, these results provide new insight into the computational basis of tactile motion perception. They support a mechanism in which tactile motion perception arises from the relative phase differences between temporally structured signals across skin locations, rather than from instantaneous amplitude differences or ‘energy shifts’. Unlike prior studies of tactile synchrony detection, the present paradigm required spatial trajectory inference across inputs, revealing that phase-based temporal integration, rather than amplitude contrast, underpins tactile motion perception. While Kuroki et al. [25] demonstrated that humans can detect temporal asynchrony in AM tactile stimuli at higher modulation frequencies (up to 20 Hz), indicative of sensitivity to temporal structure, their task probed asynchrony detection, not motion inference. Drawing parallels to the visual system, they proposed that tactile perception may rely on both “phase-shift” and “energy-shift” mechanisms, analogous to first- and second-order motion processing in vision.
Notably, Kuroki et al. reported peak detection performance at a 180° phase difference. Yet in the current study, the same phase difference produced ambiguous motion percepts, reflected in lower confidence, slower responses, and inconsistent choices. This discrepancy likely reflects task-specific neural computations for synchrony detection and motion perception. Synchrony detection may rely on local temporal contrast or energy cues at single skin locations, whereas tactile motion perception requires spatial comparison and temporal integration across fingertips. The present results suggest that phase-based readout, rather than local amplitude difference, is central to tactile motion perception.
This dissociation highlights that motion perception depends on the integration of temporal phase relationships across space and time. As in the visual system, where distinct pathways support multiple forms of motion processing, the tactile system may also engage parallel mechanisms for temporal analysis. Phase-based computations appear specifically tuned for inferring motion trajectories, distinguishing them from those supporting synchrony detection. These findings reveal how the tactile system transforms temporally structured input into spatial motion percepts, and how the brain selectively engages distinct temporal codes based on perceptual goals.
Here, I proposed two complementary models of tactile motion perception: one based on global cross-correlation of vibration envelopes, and another relying on local temporal comparisons between salient features such as envelope peaks. While the cross-correlation model captures overall waveform similarity, the feature-based model formalises direction perception as a probabilistic judgement derived from the uncertain detection of temporal landmarks within amplitude-defined windows. Notably, both models are applicable to conventional apparent motion paradigms, where discrete or pulsed stimuli with staggered onsets simulate movement. Although the inter-peak intervals in our stimuli (e.g., 167 ms for 30° and 500 ms for 90° at 0.5 Hz) exceed classical tactile temporal order judgement thresholds [38], participants nonetheless exhibited robust directional performance and systematic confidence patterns. Notably, performance at a 30° phase lag aligns with previously reported temporal order judgement thresholds (∼100 ms, [38]), despite differences in stimulus type and parameters, suggesting that reliable direction perception can emerge without discrete onsets or overt spatial displacement. While supramodal attentional tracking could, in principle, support such judgements – e.g., by tracking salient events across time and space irrespective of sensory modality – our model provides a tactile-specific alternative. It attributes direction perception to probabilistic comparisons between uncertain temporal landmarks (e.g., envelope peaks), detected within amplitude-defined integration windows. This framework captures the non-linearity in psychometric curves, including both the reliable direction perception at shorter phase lags and the ambiguity at 180°, without invoking higher-level amodal mechanisms or cross-modal attentional strategies. Instead, it reflects constraints intrinsic to tactile processing, where perceptual uncertainty in temporal feature extraction shapes directional judgements.
Central to the perception of the vibration-induced motion studied here is the brain’s ability to track dynamic changes in the envelopes of tactile signals and extract directional information from their relative timing. This sensory strategy has analogues across species: arachnids, for example, detect prey using complex vibration patterns transmitted through webs or substrates, relying on finely tuned mechanosensory systems that evolved independently from vertebrate touch [24]. In mammalian glabrous skin, Meissner’s and Pacinian corpuscles are specialised for detecting vibration [39,40,41,42,43], with Pacinian corpuscles implicated in encoding vibrotactile pitch in both mice and humans [44,45,46,47]. My previous work demonstrated that rodents can discriminate vibrations based on both amplitude and frequency using their whiskers [34,48,49]. Neurons in primary somatosensory cortex integrate these features in a way that supports vibrotactile perception. The present study builds on these principles, showing that temporal features – specifically phase relationships – can be exploited to generate robust perceptions of tactile motion across fingertips. This supports the idea that tactile systems, across species and sensor types, flexibly encode both spectral and temporal properties of mechanical stimuli to extract high-level perceptual content.
A major challenge in studying somatosensation is the ability to deliver tactile stimuli with precise control and reproducibility. In freely moving animals, variations in posture, movement, and skin contact can significantly affect the quality and consistency of tactile stimulation, introducing variability in the sensory input and complicating the interpretation of neural responses. In humans, the elastic properties of the skin can lead to trial-by-trial differences in receptor activation due to subtle changes in pressure, tension, or contact geometry [3]. These inconsistencies may engage different mechanoreceptor subtypes, potentially altering the percept and confounding behavioural measurements. To address these limitations, I developed a vibrotactile stimulation paradigm in which the perception of motion direction is determined not by low-level features of the individual vibrations – such as absolute amplitude or frequency – but by the phase relationship between them. This design enables consistent control over the critical perceptual variable (phase difference), even when some variability in contact conditions is unavoidable. As such, it offers a robust and generalisable framework for investigating tactile motion perception and related decision-making processes.
Finally, this paradigm provides a powerful tool for probing tactile decision-making and perceptual inference under controlled temporal structure, paralleling the role of random-dot motion in visual neuroscience. By dissociating low-level vibration attributes from high-level motion perception, it offers a flexible approach for linking somatosensory encoding with computational models of evidence accumulation and perceptual categorisation, in both human and animal research. By revealing how the brain transforms temporally structured input into coherent motion percepts across the skin, this work contributes to a deeper understanding of somatosensory processing and lays the groundwork for future research in touch-based interfaces, neuroprosthetics, and tactile cognition.

Author Contributions

Conceptualization, M.A.; methodology, M.A.; software, M.A.; validation, M.A.; formal analysis, M.A.; investigation, M.A.; resources, M.A.; data curation, M.A.; writing—original draft preparation, M.A.; writing—review and editing, M.A.; visualization, M.A.; project administration, M.A.; funding acquisition, M.A. The author has read and agreed to the published version of the manuscript.

Funding

M.A. was funded by the Australian Research Council (ARC) DECRA fellowship number DE200101468.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Monash University Human Research Ethics Committee (MUHREC) (Project ID 27649; date of approval: 08/02/2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

The author thanks Mohammad Razmjoo and Erfan Rezaei for their assistance with data collection in Experiment 1.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SEM	Standard Error of the Mean
AM	Amplitude Modulation
RA1	Rapidly Adapting type I
RA2	Rapidly Adapting type II
SA1	Slowly Adapting type I
SA2	Slowly Adapting type II
2-AFC	Two-Alternative Forced-Choice
MDPI	Multidisciplinary Digital Publishing Institute

References

1. Clifford, C.W.; Ibbotson, M.R.; Langley, K. An adaptive Reichardt detector model of motion adaptation in insects and mammals. Visual Neuroscience 1997, 14, 741–749.
2. Adibi, M.; Zoccolan, D.; Clifford, C. Editorial: Sensory Adaptation. Frontiers in Systems Neuroscience 2021, 15, 809000.
3. Johansson, R.S.; Flanagan, J.R. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nature Reviews Neuroscience 2009, 10, 345–359.
4. Johnson, K. Neural Basis of Haptic Perception. 2002. https://onlinelibrary.wiley.com/doi/pdf/10.1002/0471214426.pas0113
5. Pei, Y.C.; Bensmaia, S.J. The neural basis of tactile motion perception. Journal of Neurophysiology 2014, 112, 3023–3032.
6. Olausson, H.; Norrsell, U. Observations on human tactile directional sensibility. The Journal of Physiology 1993, 464, 545–559.
7. Sherrick, C.E.; Rogers, R. Apparent haptic movement. Perception & Psychophysics 1966, 1, 175–180.
8. Kirman, J.H. Tactile apparent movement: The effects of interstimulus onset interval and stimulus duration. Perception & Psychophysics 1974, 15, 1–6.
9. Gardner, E.P.; Palmer, C.I. Simulation of motion on the skin. I. Receptive fields and temporal frequency coding by cutaneous mechanoreceptors of OPTACON pulses delivered to the hand. Journal of Neurophysiology 1989, 62, 1410–1436.
10. Gardner, E.P.; Palmer, C.I. Simulation of motion on the skin. II. Cutaneous mechanoreceptor coding of the width and texture of bar patterns displaced across the OPTACON. Journal of Neurophysiology 1989, 62, 1437–1460.
11. Killebrew, J.H.; Bensmaia, S.J.; Dammann, J.F.; Denchev, P.; Hsiao, S.S.; Craig, J.C.; Johnson, K.O. A dense array stimulator to generate arbitrary spatio-temporal tactile stimuli. Journal of Neuroscience Methods 2007, 161, 62–74.
12. Zhao, S.; Israr, A.; Klatzky, R. Intermanual apparent tactile motion on handheld tablets. In Proceedings of the 2015 IEEE World Haptics Conference (WHC), 2015; pp. 241–247.
13. Seizova-Cajic, T.; Ludvigsson, S.; Sourander, B.; Popov, M.; Taylor, J.L. Scrambling the skin: A psychophysical study of adaptation to scrambled tactile apparent motion. PLoS ONE 2020, 15, e0227462.
14. Kuroki, S.; Nishida, S. Motion direction discrimination with tactile random-dot kinematograms. i-Perception 2021, 12, 20416695211004620.
15. Abdouni, A.; Vargiolu, R.; Zahouani, H. Impact of finger biophysical properties on touch gestures and tactile perception: Aging and gender effects. Scientific Reports 2018, 8, 1–13.
16. Olausson, H.; Wessberg, J.; Kakuda, N. Tactile directional sensibility: peripheral neural mechanisms in man. Brain Research 2000, 866, 178–187.
17. Phillips, J.R.; Johnson, K.O. Tactile spatial resolution. II. Neural representation of bars, edges, and gratings in monkey primary afferents. Journal of Neurophysiology 1981, 46, 1192–1203.
18. Friedman, R.M.; Khalsa, P.S.; Greenquist, K.W.; LaMotte, R.H. Neural coding of the location and direction of a moving object by a spatially distributed population of mechanoreceptors. Journal of Neuroscience 2002, 22, 9556–9566.
19. Kwon, J.; Park, S.; Sakamoto, M.; Mito, K. The effects of vibratory frequency and temporal interval on tactile apparent motion. IEEE Transactions on Haptics 2021, 14, 675–679.
20. Gardner, E.P.; Martin, J.H. Coding of sensory information. In Principles of Neural Science; McGraw-Hill: New York, 2000; pp. 411–429.
21. Mountcastle, V.B.; Talbot, W.H.; Darian-Smith, I.; Kornhuber, H.H. Neural basis of the sense of flutter-vibration. Science 1967, 155, 597–600.
22. Talbot, W.H.; Darian-Smith, I.; Kornhuber, H.H.; Mountcastle, V.B. The sense of flutter-vibration: Comparison of the human capacity with response patterns of mechanoreceptive afferents from the monkey hand. Journal of Neurophysiology 1968, 31, 301.
23. Johansson, R.S.; Vallbo, A.B. Detection of tactile stimuli. Thresholds of afferent units related to psychophysical thresholds in the human hand. The Journal of Physiology 1979, 297, 405.
24. Strauß, J.; Stritih-Peljhan, N. Vibration detection in arthropods: Signal transfer, biomechanics and sensory adaptations. Arthropod Structure & Development 2022, 68, 101167.
25. Kuroki, S.; Watanabe, J.; Nishida, S. Neural timing signal for precise tactile timing judgments. Journal of Neurophysiology 2016, 115, 1620–1629.
26. Jones, L.A. Tactile communication systems: Optimizing the display of information. Progress in Brain Research 2011, 192, 113–128.
27. Severgnini, F.M.; Martinez, J.S.; Tan, H.Z.; Reed, C.M. Snake Effect: A Novel Haptic Illusion. IEEE Transactions on Haptics 2021, 14, 907–913.
28. Young, E.M.; Gueorguiev, D.; Kuchenbecker, K.J.; Pacchierotti, C. Compensating for fingertip size to render tactile cues more accurately. IEEE Transactions on Haptics 2020, 13, 144–151.
29. Pack, C.C.; Bensmaia, S.J. Seeing and feeling motion: canonical computations in vision and touch. PLoS Biology 2015, 13, e1002271.
30. Reichardt, W. Autocorrelation, a principle for the evaluation of sensory information by the central nervous system. In Sensory Communication; Rosenblith, W., Ed.; M.I.T. Press: Cambridge, 1961; chapter 17, pp. 303–317.
31. Franken, T.P.; Bremen, P.; Joris, P.X. Coincidence detection in the medial superior olive: mechanistic implications of an analysis of input spiking patterns. Frontiers in Neural Circuits 2014, 8, 42.
32. Rose, G.; Heiligenberg, W. Structure and function of electrosensory neurons in the torus semicircularis of Eigenmannia: morphological correlates of phase and amplitude sensitivity. The Journal of Neuroscience 1985, 5, 2269.
33. Harvey, M.A.; Saal, H.P.; Dammann III, J.F.; Bensmaia, S.J. Multiplexing stimulus information through rate and temporal codes in primate somatosensory cortex. PLoS Biology 2013, 11, e1001558.
34. Adibi, M. Whisker-mediated touch system in rodents: from neuron to behavior. Frontiers in Systems Neuroscience 2019, 13, 40.
35. Reed, J.L.; Qi, H.X.; Zhou, Z.; Bernard, M.R.; Burish, M.J.; Bonds, A.; Kaas, J.H. Response properties of neurons in primary somatosensory cortex of owl monkeys reflect widespread spatiotemporal integration. Journal of Neurophysiology 2010, 103, 2139–2157.
36. Thakur, P.H.; Fitzgerald, P.J.; Lane, J.W.; Hsiao, S.S. Receptive field properties of the macaque second somatosensory cortex: nonlinear mechanisms underlying the representation of orientation within a finger pad. Journal of Neuroscience 2006, 26, 13567–13575.
37. Ljung, L. System Identification. In Signal Analysis and Prediction; Procházka, A.; Uhlíř, J.; Rayner, P.W.J.; Kingsbury, N.G., Eds.; Birkhäuser Boston: Boston, MA, 1998; pp. 163–173.
38. Craig, J.C.; Baihua, X. Temporal order and tactile patterns. Perception & Psychophysics 1990, 47, 22–34.
39. Mountcastle, V.; LaMotte, R.; Carli, G. Detection thresholds for stimuli in humans and monkeys: Comparison with threshold events in mechanoreceptive afferent nerve fibers innervating the monkey hand. Journal of Neurophysiology 1972, 35, 122–136.
40. Freeman, A.W.; Johnson, K.O. Cutaneous mechanoreceptors in macaque monkey: temporal discharge patterns evoked by vibration, and a receptor model. The Journal of Physiology 1982, 323, 21–41.
41. Johansson, R.S.; Landström, U.; Lundström, R. Responses of mechanoreceptive afferent units in the glabrous skin of the human hand to sinusoidal skin displacements. Brain Research 1982, 244, 17–25.
42. Bell, J.; Bolanowski, S.; Holmes, M.H. The structure and function of Pacinian corpuscles: a review. Progress in Neurobiology 1994, 42, 79–128.
43. Zimmerman, A.; Bai, L.; Ginty, D.D. The gentle touch receptors of mammalian skin. Science 2014, 346, 950–954.
44. Roy, E.A.; Hollins, M. A ratio code for vibrotactile pitch. Somatosensory & Motor Research 1998, 15, 134–145.
45. Tranchant, P.; Shiell, M.M.; Giordano, M.; Nadeau, A.; Peretz, I.; Zatorre, R.J. Feeling the beat: Bouncing synchronization to vibrotactile music in hearing and early deaf people. Frontiers in Neuroscience 2017, 11, 507.
46. Prsa, M.; Morandell, K.; Cuenu, G.; Huber, D. Feature-selective encoding of substrate vibrations in the forelimb somatosensory cortex. Nature 2019, 567, 384–388.
47. Prsa, M.; Kilicel, D.; Nourizonoz, A.; Lee, K.S.; Huber, D. A common computational principle for vibrotactile pitch perception in mouse and human. Nature Communications 2021, 12, 1–8.
48. Adibi, M.; Arabzadeh, E. A comparison of neuronal and behavioral detection and discrimination performances in rat whisker system. Journal of Neurophysiology 2011, 105, 356.
49. Adibi, M.; Diamond, M.; Arabzadeh, E. Behavioral study of whisker-mediated vibration sensation in rats. Proceedings of the National Academy of Sciences 2012, 109, 971–976.
Figure 1. Detecting the motion of a remote vibrating source through patterns of vibrations sensed at two touch points. A, T1 and T2 denote the two touch points. The trajectory on plane P′ is the rotation of the trajectory on plane P around the touch axis T1T2. The grey closed curve shows the mirror of the trajectory with respect to the touch axis T1T2. B, For any arbitrary trajectory on the plane P, when the touch axis T1T2 is orthogonal to P, the vibrations from source S received at T1 and T2 are in-phase. r1 and r2 represent the distances of T1 and T2 from P, respectively. d1 and d2 denote the distances from the source S to T1 and T2, respectively, and vary as S moves along the trajectory. C, A circular trajectory with radius r0, centred at O. T′ denotes the projection of touch point T onto plane P. h, r and d denote the distances from T to P, O and S, respectively. α is the angle between OT and OT′. D, d1 and d2 denote the distances from the source S to T1 and T2, respectively, and vary as S moves along the trajectory. r1 and r2 are the distances from O to T1 and T2, respectively. E, An example of anti-phase vibrations, when the projection of the axis T1T2 (dashed line) onto the trajectory plane P passes through O. F, The two-dimensional geometry. All conventions as in D.
Figure 2. Motion direction discrimination task. A, The index and middle fingers of the right hand were stimulated using a pair of solenoid transducers (upper panel, B), which delivered amplitude-modulated vibrations (lower panel, B). C, On each trial, the envelopes of the two vibrations had a phase difference Δφ. The vibrations began at one of two points where their envelope amplitudes were equal (indicated by dashed lines).
Figure 3. Experiment 1: Naturalistic vs. sinusoidal vibration envelopes. A, Schematic representation of naturalistic (exponential) and sinusoidal vibrations, along with their envelopes (thick curves). For illustration purposes, a 20 Hz carrier frequency is shown; the actual carrier frequency used in the experiments was 100 Hz. B, Motion direction discrimination accuracy, shown as the proportion of correct trials for exponential and sinusoidal vibrations. Bars represent the average across subjects, with error bars indicating the standard error of the mean (SEM). Data points represent individual participants (n = 8).
Figure 4. Experiment 2: Effect of envelope frequency on tactile motion perception. Motion direction discrimination performance as a function of phase difference, shown separately for each envelope frequency (indicated by colour). Data points represent across-subject averages (n = 6), with error bars indicating the standard error of the mean (SEM). Curves represent psychometric fits for each frequency condition.
Figure 5. Experiment 3: Cognitive and metacognitive measures of tactile motion. A, Discrimination accuracy, measured as the proportion of correct responses, averaged across subjects (n = 12). For 0° and 180° phase differences, responses were pseudo-randomly labelled as correct or incorrect. B, Psychometric curves (choice likelihood) showing the proportion of “left” responses as a function of phase difference, averaged across subjects. C, Median reaction time (interval between stimulus onset and response), averaged across subjects, plotted as a function of phase difference. D, Average confidence ratings vs. phase differences. All error bars represent the standard error of the mean (SEM).
Figure 6. Predicted direction discrimination performance from the probabilistic feature-based model. Model-predicted proportion of correct motion direction discrimination as a function of phase difference. Each trace corresponds to a different amplitude detection threshold (indicated by colour), expressed as a proportion of the peak envelope amplitude (from 0.1 to 0.9 in increments of 0.1).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.