1 Introduction

With technological progress, interaction with computing input devices has shifted from indirect input, in which the motion of a device is translated into the motion of a pointer on a screen, to direct input, enabled by touch technologies entering the market. When direct input devices are used, the space immediately surrounding the body, called peripersonal space, becomes crucial. As input devices are operated mainly by hand, this paper studies cognitive processes with regard to hand presence.

1.1 The Effects of Proximal Hands

Recent studies suggest that the location of the hands influences perceptual processes. This effect may be explained by the fact that objects near the body, and especially near the hands, are candidates for manipulation in potential actions and are therefore perceived differently. Research on the perception of stimuli near the hands found that the direction and strength of the effect depend mainly on task context, resulting in a positive effect in some tasks and a negative one in others. With regard to visual working memory, for example, a positive effect was found: more elements could be kept in mind when the hands were near the stimulus. Another positive effect exists when processing visual stimuli while searching for a specific target. The results of five experiments administered by Reed et al. [1] showed that subjects detected target items appearing near the hand in a covert attention task more quickly than targets away from the hand. This effect persisted when the hands were not visible and only proprioceptive information about their location was available. Regarding tasks involving specific cognitive mechanisms, placing the hands near a stimulus can lead to a preference for focusing on details [2], a delay in attentional disengagement [3] and an enhancement of cognitive control mechanisms [4]. The effects of hand presence have also been analyzed in lesion studies with patients suffering damage to specific brain areas. In this context, Schendel and Robertson [5] examined the post-stroke vision loss of a hemianopia patient in a detection task experiment comparing three conditions. In the first condition, the patient’s left arm lay on his lap; in the second, the left arm was positioned in the left visual field and stimuli were presented within reach of the arm; in the third, stimuli were presented out of reach of the arm. The results showed that the severity of the visual deficit was reduced when the patient’s arm was placed in the “blind” field in the second condition, even though the patient was unable to “see” the stimulus.

In summary, study results demonstrate preferential processing of visual objects near the hands. Apart from the facilitative effects just described, engaging with stimuli more fully can also be detrimental in tasks requiring the processing of words and sentences. For example, Davoli et al. [6] had subjects judge the sensibleness of sentences and found that subjects were slower and less effective when their hands were near a visual display than when their hands were on the lap. In line with this result, Le Bigot and Jaschinski [7] found more errors in a letter detection task, in which subjects read pseudo text, when the hands were near a computer screen than when the hands were further away from the screen on a desk.

Besides the distance between stimulus presentation and hand, the way tasks are processed influences the strength of the effect. Most studies have analyzed settings in which the hands were in a task-congruent position. Brown et al. [8], however, compared a setting in which the palm of the hand was rotated toward the display with one in which the hand was rotated away from it. The authors found a stronger effect of hand proximity when the hand was in the task-congruent position, namely with the palm facing the display. In another study, Festman et al. [9] examined the effect of hand proximity with moving hands. In grasping movements and pointing tasks, a meaningful movement is directed toward the stimulus that needs to be attended to. The authors therefore compared this movement direction with a movement in the opposite direction. Their results showed a stronger effect for movements toward the stimulus, and this held even when the hands were not visible but hidden under a table. Reed et al. [10] added a tool, specifically a rake, to their study setting. In the first condition the tool was placed in a meaningful position corresponding to its original context of use, whereas in the second condition it was placed in an incongruent position. This study again showed a stronger effect when the tool was in a task-congruent position. Together, these studies show that the effect produced by the proximity of the hand originates not only from the presence of the hand or the tool, but also from their functional relation to the task context.

Evidence for the effect comes from Rizzolatti et al. [11] and their contribution to the bimodal neuron hypothesis. These researchers studied monkeys with lesions in specific brain regions and found partially separate neural circuits for different distances around the body. Monkeys with unilateral frontal lobe lesions were unable to detect stimuli within reaching distance, and monkeys with unilateral parietal lobe lesions failed to attend to stimuli beyond reaching distance. Furthermore, these neural circuits contain bimodal neurons that respond both to tactile information and to visual stimuli in the space immediately surrounding the body, called peripersonal space. As the hand moves, the receptive fields of these neurons move with it [12]. Since the effect of proximal hands is stronger for task-congruent hand positions and movements, it is caused not solely by a difference in attentional engagement with stimuli in peripersonal space, but also by a change in the process of object perception [13].

1.2 Peripersonal Space of the Elderly

Past studies of the influence of age on movement tasks have found that differences in hand motion are attributable to differences in the perception of peripersonal space rather than to deficits in motor skills [14]. Building on studies by Tipper et al. [15], which found that in younger subjects the position of an object is automatically encoded in reference to the hand while executing grasping movements, Bloesch et al. [14] showed that in older subjects the reference frame is not tied to the hand but to the body as a whole. Using reach-and-point actions, such as the movement needed to dial a phone number, Bloesch et al. found that distractor objects placed along the movement path slowed performance more than distractors outside the path in younger subjects but not in older subjects. Instead, older subjects’ performance was slowed when a distractor was placed near their bodies. The authors therefore concluded that young subjects adopt an action-centered reference frame, while older subjects use a body-centered reference frame. A key finding from this work is that older people have more problems performing actions in peripersonal space and that these shortfalls are not caused solely by impairments in motor skills.

1.3 Eye-Tracking Metrics

The cognitive processes associated with visual search are covert and cannot be observed directly. However, eye movements can be analyzed using eye-tracking measures. The basic idea behind analyzing eye movements is that cognitive processes can be inferred from gaze behavior. One way to assess visual focus is through the duration of the eyes’ fixations on the stimulus in question: longer fixation durations indicate greater difficulty in extracting information from a display [16].
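To make the metric concrete, the following minimal Python sketch shows how mean fixation durations per subject and hand condition could be aggregated from a table of already-detected fixation events. The table layout and the column names ("subject", "condition", "duration_ms") are illustrative assumptions, not the data format used in this study.

```python
# Minimal sketch: aggregating fixation durations per subject and hand condition.
# The input is assumed to be a table of already-detected fixation events with
# the columns "subject", "condition" and "duration_ms" (illustrative names).
import pandas as pd

def mean_fixation_durations(fixations: pd.DataFrame) -> pd.DataFrame:
    """Return the mean fixation duration (in ms) per subject and condition."""
    return (fixations
            .groupby(["subject", "condition"], as_index=False)["duration_ms"]
            .mean()
            .rename(columns={"duration_ms": "mean_fixation_ms"}))

# Example with made-up values:
events = pd.DataFrame({
    "subject":     [1, 1, 1, 2, 2, 2],
    "condition":   ["screen", "table", "lap"] * 2,
    "duration_ms": [310.0, 285.0, 290.0, 270.0, 265.0, 280.0],
})
print(mean_fixation_durations(events))
```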

Concerning age differences in mean fixation durations, most studies report higher mean fixation durations for older subjects. When analyzing navigational behavior on web pages, for example, Fukuda and Bubb [17] found that subjects aged between 62 and 74 years had longer fixation durations than younger participants aged between 17 and 29 years. Moreover, Hill et al. [18] investigated computer expertise in internet use among older subjects (70–93 years) and found that older novices had significantly higher mean fixation durations than older experts. However, a few studies report no age differences in fixation durations. For example, Veiel et al. [19] investigated age differences in the perception of visual stimuli and found no age difference in fixation durations, in line with the results of Maltz and Shinar [20], who studied visual performance while driving.

1.4 The Present Study

In order to examine age-related declines in human-computer interaction, this study focuses on cognitive processes in a visual search task, as searching for icons or functions on a screen is an essential part of interacting with software. Three distances of the hands relative to the screen were studied as the independent variable: hands placed on the screen, hands placed on the table and hands placed on the lap. Fixation durations were analyzed as the dependent variable. As there are age-related declines in the perception of peripersonal space, the results are examined in an age-differentiated manner. Based on prior work that found longer search times [3, 21] and slower attentional disengagement [3] when the hands are near a stimulus, it is hypothesized that fixation durations are longer when the hands are placed at the screen.

2 Method

2.1 Participants

Altogether, 69 right-handed subjects with normal or corrected-to-normal vision participated in the study. Their age ranged from 20 to 60 years (mean = 34.67 years, SD = 12.83 years).

2.2 Apparatus

The experiment was conducted on a 17-inch LCD monitor. Eye movements were recorded during the task using the SMI Eye Tracking Glasses 2.0. Following Rayner [22], the minimum duration for an eye movement to count as a fixation was set to 50 ms.
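As an illustration of how such a minimum-duration criterion can be applied, the sketch below implements a simple dispersion-threshold (I-DT) fixation detector with the 50 ms minimum mentioned above. The SMI software reports fixations directly, so this is only a conceptual sketch; the dispersion threshold and the sample format are assumptions.

```python
# Illustrative dispersion-threshold (I-DT) fixation detection with a 50 ms
# minimum duration. The dispersion limit (in pixels) and the sample format
# are assumptions; the eye-tracking software used in the study reports
# fixations itself.
from typing import Dict, List, Tuple

def detect_fixations(samples: List[Tuple[float, float, float]],
                     max_dispersion: float = 30.0,
                     min_duration_ms: float = 50.0) -> List[Dict[str, float]]:
    """samples: time-ordered (timestamp_ms, x, y) gaze points."""
    fixations = []
    start = 0
    while start < len(samples):
        end = start
        xs, ys = [samples[start][1]], [samples[start][2]]
        # Grow the window while the points stay within the dispersion limit.
        while end + 1 < len(samples):
            xs.append(samples[end + 1][1])
            ys.append(samples[end + 1][2])
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                xs.pop()
                ys.pop()
                break
            end += 1
        duration = samples[end][0] - samples[start][0]
        if duration >= min_duration_ms:
            fixations.append({"start_ms": samples[start][0],
                              "duration_ms": duration,
                              "x": sum(xs) / len(xs),
                              "y": sum(ys) / len(ys)})
            start = end + 1
        else:
            start += 1
    return fixations
```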

2.3 Procedure and Task

Subjects were seated at a desk in front of a computer at a viewing distance of 500 mm from the screen. The three hand conditions are shown in Fig. 1. In each condition, a computer mouse located under the subject’s right hand served as the input device. In the first condition, both hands were placed at the sides of the display and the arms were supported by an elbow rest. In the second condition, the hands were placed on the table in front of the participant, and in the third condition, they were placed on a wooden slat resting on the subject’s lap. The horizontal distance between the hands was kept constant across all conditions.

Fig. 1. Visualization of the study conditions: hands on the screen (left), hands on the table (center) and hands on the lap (right).

The search display consisted of a matrix of 48 rectangles containing different alphanumeric characters. Characters that look alike in upper and lower case were presented only once. Each of the 48 characters served as the search target once per hand position, in random order, to ensure that every part of the display was included in the task, resulting in 3 × 48 trials of the visual search task. In every trial, the character to be searched for was presented first. A blank screen was then shown as a masking stimulus for three seconds, followed by the search matrix, in which the alphanumeric characters were arranged randomly on every trial. At the beginning, task instructions were given and five practice trials were carried out per condition. After locating the search stimulus, participants clicked the mouse with the right index finger.
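As a sketch of this trial structure, the following Python fragment generates the 3 × 48 trials: each of the 48 characters serves as the target once per hand condition in random order, and the 48-cell matrix is reshuffled on every trial. The concrete character set used here (digits, upper-case letters and twelve lower-case letters whose shapes differ from their upper-case forms) is an assumption, since the paper only states that 48 alphanumeric characters were used.

```python
# Sketch of the trial structure described above. The character set is an
# assumption; the paper only states that 48 alphanumeric characters were used
# and that visually identical upper-/lower-case letters appeared once.
import random
import string

LOWERCASE_DISTINCT = "abdefghijlmn"  # 12 lower-case letters that differ in shape from upper case (assumed)
CHARACTERS = list(string.digits + string.ascii_uppercase + LOWERCASE_DISTINCT)
assert len(CHARACTERS) == 48
CONDITIONS = ["screen", "table", "lap"]

def build_trials(seed: int = 0) -> list:
    rng = random.Random(seed)
    trials = []
    for condition in CONDITIONS:
        targets = CHARACTERS[:]
        rng.shuffle(targets)                  # each character is the target once per condition
        for target in targets:
            matrix = CHARACTERS[:]
            rng.shuffle(matrix)               # new random arrangement on every trial
            trials.append({"condition": condition,
                           "target": target,
                           "mask_duration_s": 3,
                           "matrix": matrix})  # 48 cells, e.g. laid out as a 6 x 8 grid
    return trials

trials = build_trials()
assert len(trials) == 3 * 48
```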

3 Results

Tests of normality revealed that the fixation durations for the three hand positions were not normally distributed. However, according to the findings of Lumley et al. [23] and Norman [24], ANOVA is robust to violations of normality for sample sizes larger than n = 30. Therefore, a repeated-measures analysis of variance was used to test the overall effect of hand position. When Mauchly’s test of sphericity was significant, within-subject effects were analyzed using Greenhouse-Geisser corrected values. The level of significance was set to α = 0.05. Effect sizes were categorized according to the guidelines of Cohen and Cohen [25] as small (≥ .01), medium (≥ .06) and large (≥ .14).
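For illustration, the analysis just described could be carried out along the following lines with scipy and pingouin. The column names are assumptions about how the per-subject condition means might be organised; this is a sketch of the reported procedure, not the authors’ actual analysis script.

```python
# Hedged sketch of the reported analysis using scipy and pingouin.
# Assumes a long-format DataFrame with one mean fixation duration per subject
# and hand position ("subject", "hand_position", "mean_fixation_ms").
import pingouin as pg
from scipy import stats

def analyse_fixations(df):
    # Normality check per hand position (Shapiro-Wilk).
    for position, values in df.groupby("hand_position")["mean_fixation_ms"]:
        w, p = stats.shapiro(values)
        print(f"{position}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

    # One-way repeated-measures ANOVA. correction=True reports the
    # Greenhouse-Geisser corrected p-value (used when Mauchly's test is
    # significant), and effsize="np2" yields partial eta squared.
    aov = pg.rm_anova(data=df, dv="mean_fixation_ms",
                      within="hand_position", subject="subject",
                      correction=True, detailed=True, effsize="np2")
    return aov
```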

3.1 Fixation Durations

A one-way repeated-measures ANOVA was conducted to compare the effect of hand position on fixation durations. Overall, the effect of hand position was not significant (F(2,126) = 2.52, p = .084; \( \eta_{p}^{2} = .038 \)). However, after controlling for age by including it as a covariate, the results revealed significant differences between the three hand positions with respect to fixation durations (F(2,124) = 4.17, p = .018; \( \eta_{p}^{2} = .063 \)) and no significant interaction effect (F(2,124) = 2.66, p = .074; \( \eta_{p}^{2} = .041 \)). To examine the effect visually, error bar graphs were created for two age groups (Fig. 2).
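One way to include age as a covariate in such a within-subject design, shown here purely as an illustration and not necessarily the procedure behind the reported F values, is a linear mixed model with a random intercept per subject, testing hand position, mean-centred age and their interaction (column names are again assumptions).

```python
# Illustrative covariate analysis: a linear mixed model with a random intercept
# per subject, testing hand position, mean-centred age and their interaction.
# This is a sketch of one possible approach, not necessarily the authors' exact
# procedure; column names are assumptions.
import statsmodels.formula.api as smf

def fit_age_covariate_model(df):
    data = df.copy()
    data["age_c"] = data["age"] - data["age"].mean()   # mean-centre the covariate
    model = smf.mixedlm("mean_fixation_ms ~ C(hand_position) * age_c",
                        data=data, groups=data["subject"])
    result = model.fit()
    print(result.summary())
    return result
```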

Fig. 2. Mean fixation durations (in ms) for the young group aged between 20 and 39 years (n = 41) and the old group aged between 40 and 60 years (n = 23).

4 Discussion

Prior work has documented effects of hand proximity on perceptual processes. Abrams et al. [3], for example, showed that subjects shift their attention between items more slowly when their hands are near a display, resulting in longer search times. However, these effects have only been studied with simple search tasks that did not take eye-tracking measures or age differences into account. In this study, we tested the effect of three hand positions on fixation durations in a visual search task.

Overall, the repeated-measures ANOVA showed no significant effect for the general model. However, after age was added as a covariate, the results showed a significant effect of hand position. In the younger age group, fixation durations were shorter when the hands were at the screen than when they were on the table or on the lap. For the older age group, this pattern was reversed: fixation durations were longer with the hands at the screen than in the two other conditions. The interaction effect was not significant, but with p = .074 there is a tendency toward an interaction between hand position and age.

This study therefore indicates that eye-tracking measures differ depending on the position of the hands and that these differences in fixation durations vary with age. However, some limitations are worth noting. Although we found statistically significant results, we were not able to compare the age groups directly because of their unequal sizes. As the eye tracker we used does not work for people wearing glasses, we had difficulty recruiting older subjects for our study. Another limitation is that prior studies found longer search times when the hands are positioned near a stimulus [21], which is not supported by the eye-tracking measures for the young age group in our study. One explanation may be that although search times are longer, the search process may be less demanding, resulting in shorter fixation durations for nearby hands in the young age group. Therefore, further analysis of age differences is needed, as well as an analysis of the number of fixations for the different hand positions. Furthermore, pupil dilation will be studied with regard to hand position, as pupil diameter can be used as a measure of task difficulty.

Based on the results of this study, the effect evoked by proximal hands should be analyzed in a second study in a different task context. While in this study the effect was examined in a directed search task in which the target item was defined in advance, the follow-up study should use a non-directed search task: two parts of a display (one left and one right) should be compared for differences between items while the effects generated by the hands are studied. In line with prior studies [3, 21], it is hypothesized that search times are longer when the hands are placed at the screen and that the effect of proximal hands is stronger for the right side of the screen in right-handed people, as there may be more task interference from the dominant hand.