
1 Introduction

The electric wheelchair is an indispensable tool for people with disabilities, and it is especially valuable for those with impaired legs. It is not, however, always easy to use for people with impaired hands. Most electric wheelchairs are designed to be operated through a joystick, which requires hand operation; therefore, alternative operation methods must be provided for hand-impaired users. One such improvement of the electric wheelchair is intelligence, whose purpose is to realize safer and friendlier driving assistance by offering intuitive operations for driving the wheelchair.

Eye tracking is one of the leading candidates for hands-free operation [1, 2]. In traditional eye-tracking operation, users have to keep their eyes focused on the traveling direction. When driving an electric wheelchair, the user has to fix his or her eyes on the destination or goal of the wheelchair, which is not only unintuitive but also tiresome. In this paper we propose a user interface that allows users to intuitively control their electric wheelchairs through eye tracking. Our electric wheelchair carries a laptop PC, which displays the sight in front of the wheelchair captured by a web camera. The user can observe the sight as if he or she were looking through a window.

The user interface for driving is based on AR (Augmented Reality). AR is a technique for expanding reality by superimposing additional information onto what humans perceive in the real world. In order to provide feedback so that the user can confirm the destination of the electric wheelchair, the front sight displayed on the PC includes an oval shape as a destination sign. The highlighted oval indicates the spot on the floor the user looks at, as if he or she were playing a searchlight over the floor. This feature lets the user confirm that the wheelchair is being driven in the right direction. Thus, iterating AR-based specification of the destination and movement toward it allows the electric wheelchair to be driven intuitively without using hands or legs.

2 Related Work

Roughly two approaches have been proposed for making electric wheelchairs intelligent. The first approach is autonomous movement, for example by providing map information or recognizing the surrounding environment through several sensors, so that the electric wheelchair can autonomously avoid obstacles and travel along a wall without troublesome operations. This approach is effective for preventing accidents caused by the user's mistakes in judgment or operation errors. The second approach is improvement of the user interface, introducing an alternative method that replaces the joystick, the conventional input interface. This approach is effective for users such as hand-impaired people.

2.1 Autonomous Traveling

Many intelligent electric wheelchairs have been developed to realize autonomous traveling by recognizing the surrounding environment using multiple sensors [3,4,5]. They have infrared sensors and ultrasonic sensors as distance sensors, and touch sensors as contact sensors. Once these sensors recognize obstacles, the electric wheelchairs calculate suitable roundabout routes. Furthermore, some systems drive a wheelchair toward a destination the user visualizes, detected through electroencephalography [6]. In these driving systems, users specify the destination instead of the traveling direction; this manner of specification reduces the burden on the user. Our user interface also aims at reducing the user's burden in the same way.

2.2 User Interface Improvements

Some control methods utilize facial expressions, recognizing gestures of the user's face with a web camera [7] or the inclination of the user's face [8]. Also, some electric wheelchairs are equipped with an interface that selects and manipulates buttons on a GUI (Graphical User Interface) shown on a laptop PC screen with gaze [9, 10]. In addition, electric wheelchairs with a voice input interface have been developed [11]. Our proposal mainly improves the user interface, but it also includes autonomous driving features in two aspects.

3 Device Overview

3.1 Concept of Our Electric Wheelchair

Figure 1 shows the concept of our electric wheelchair. The user drives the wheelchair with a laptop PC on his or her knees. As shown in the figure, the laptop PC has an eye tracking sensor and a web camera. The web camera captures the sight in front of the electric wheelchair and displays it on the screen of the PC. The eye tracking sensor is mounted at the hinge of the laptop so that it faces the user and gives the coordinates of the viewpoint on the PC screen. In order to track the user's eyes correctly, the display of the PC is inclined so that the display and the user's line of sight cross perpendicularly.

Fig. 1. Overview of our electric wheelchair

3.2 AR Based User Interface

The sight in front of the user, which is the floor in most cases, is displayed on the screen of the PC. The user can see the sight on the screen as if he or she were looking at it through a physical window. On the screen, the user can see the spot where the line of sight reaches the floor, as shown in Fig. 2. It looks as if a small oval area of the floor were lit up by a searchlight. Since the spot lies on the line of sight, the user feels as if light were emitted from his or her eyes to the floor, as shown by the dotted line in Fig. 1.

Fig. 2. Image on the display

The destination spot moves along with the line of sight; however, once the user has determined the destination, the spot no longer needs to move, so we provide a means to fix it. For example, the user can fix the spot through a trigger such as blinking. In our current prototype, we tentatively use a specific key on the PC to fix the destination spot. When the destination spot is fixed, the wheelchair moves forward until the destination spot comes under its wheels, as shown on the right side of Fig. 1.

3.3 Operations of the Electric Wheelchair

The procedure for driving our electric wheelchair with our user interface can be summarized as follows:

  1. Move the oval on the screen with the line of sight to the coordinates of the user's destination.

  2. When the oval matches the destination, the user blinks one eye for 0.5 s or more.

  3. A signal is sent from the PC to the electric wheelchair, and the electric wheelchair moves to the destination.

  4. Once the electric wheelchair reaches the destination, return to Operation 1. The user can reach the final destination by repeating this set of operations. If the user wants to stop halfway, he or she blinks one eye.

Basically, the only operations the user is responsible for are determining the destination with gaze and blinking one eye to start and stop.

Notice that eye tracking is not performed while the wheelchair is moving. In general, continuous eye tracking imposes a burden on users. With our user interface, on the other hand, the user does not need to keep staring at a point to operate the wheelchair; the interface liberates users from the concentration that eye tracking demands. Once the destination is decided, the user can look at anything during the movement of the wheelchair.

It is rare for users to directly specify the final destination, because the display of the laptop PC is not large enough. Also, there may be obstacles on the path to the destination. Thus, we have designed our user interface to let the user set intermediate destinations repeatedly while approaching the final destination.
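To make the operation cycle concrete, the following Python sketch outlines the loop described in this section; the tracker, converter, and controller objects and their methods are hypothetical wrappers for the eye tracker, the screen-to-floor conversion of Sect. 4.3, and the RRC link, not the actual API of our implementation.

```python
import time

BLINK_HOLD_SEC = 0.5  # one eye closed for 0.5 s or more triggers the command


def interaction_loop(tracker, converter, controller, poll_interval=0.05):
    """Sketch of the operation cycle in Sect. 3.3 (hypothetical interfaces)."""
    while True:
        # Operation 1: the oval follows the line of sight on the screen.
        gaze_x, gaze_y = tracker.gaze_point()

        # Operation 2: a one-eye blink of 0.5 s or more fixes the destination.
        if tracker.one_eye_blink(min_duration=BLINK_HOLD_SEC):
            theta, d = converter.turn_and_distance(gaze_x, gaze_y)

            # Operation 3: the PC sends the turn and forward commands.
            controller.turn(theta)
            controller.forward(d)

            # Eye tracking is not used while moving; only a one-eye blink
            # is watched for so that the user can stop halfway.
            while controller.is_moving():
                if tracker.one_eye_blink(min_duration=BLINK_HOLD_SEC):
                    controller.stop()
                    break
                time.sleep(poll_interval)
        # Operation 4: the loop repeats, with intermediate destinations,
        # until the final destination is reached.
```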

4 System Overview

4.1 Overall System Configuration

Figure 3 is an overview of our system. The image obtained through the web camera is displayed on the screen, and the depth value obtained by the depth sensor is sent to the user program on the host PC. The ellipse is displayed at the coordinates of the line of sight on the screen, which are obtained by the eye tracker connected to the PC. Once a gesture (blink) is recognized by the eye tracker, a signal for moving to the specified coordinates is sent from the user program on the host PC to the RRC of the electric wheelchair. The RRC is a device that adapts the communication signals between the electric wheelchair and the host PC.

Fig. 3. System configuration diagram

4.2 State Transition of Electric Wheelchair

Figure 4 is a state transition diagram of the electric wheelchair with our user interface. Basically, while our interface is in use, the system is in Run mode, and the state then transits to the output mode of (vi) in Fig. 4. In this state, the host PC sends a signal containing an output command to the RRC. Table 1 describes the output mode and the output command.

Fig. 4. State transition diagram of electric wheelchair

Table 1. Output mode and output command description
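For illustration only, the Python fragment below shows how an output command of the kind listed in Table 1 could be sent from the host PC to the RRC over a serial link using pyserial; the port name, baud rate, and two-byte frame layout are placeholders we introduce here, since the actual RRC protocol is not reproduced in this excerpt.

```python
import serial  # pyserial


def send_output_command(port: str, mode: int, command: int) -> None:
    """Send one output-mode frame to the RRC (illustrative sketch only).

    `mode` and `command` stand in for the output mode and output command
    of Table 1; the real frame format of the RRC is not shown here.
    """
    frame = bytes([mode, command])
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(frame)


# Hypothetical usage: request the output mode of state (vi).
# send_output_command("COM3", mode=0x06, command=0x01)
```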

4.3 Driving Controls

Figure 5 shows the flows of data and commands in our system. In order to make the electric wheelchair move to the destination on the floor, two values are required: the angle \(\theta \) of the destination with respect to the front direction of the wheelchair, and the distance d from the wheelchair to the destination on the floor (DoF). Therefore, our system calculates them from the coordinates of the destination on the screen (DoS), which are given by the eye tracker as the viewpoint on the screen, and depth information, which is given by the depth sensor.

Fig. 5. Overview of control system

Once the user's gesture is recognized, the turning direction, i.e. a left turn or a right turn, is first determined. The direction can easily be determined from DoS. The turning angle \(\theta \) can be calculated as \(\theta = \arctan (Gx / \text{depth})\), as shown in Fig. 6(b), where Gx is the x-coordinate of the destination in the global coordinates obtained from the depth sensor, and depth is the depth value. Likewise, the distance from the web camera (WtoD) to DoF can be calculated from Gx and depth as \(\text{WtoD} = \sqrt{Gx^2+\text{depth}^2}\).

Once WtoD is obtained, assuming that the height from the floor to the web camera is fixed at \(height = 0.85\) [m], as shown in Fig. 6(a), the distance to the destination on the floor can be calculated as \(d = \sqrt{\text{WtoD}^2 - height^2} = \sqrt{\text{WtoD}^2 - 0.7225}\).

Finally, the PC sends a command for turning by angle \(\theta \), followed by a command for going straight for distance d, to the electric wheelchair.

Fig. 6. (a) Destination distance from the web camera; (b) Angle to the destination
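A minimal Python sketch of this calculation is shown below; the camera height of 0.85 [m] follows Fig. 6(a), while the function name, the sign convention (positive \(\theta \) for one turning direction), and the example values are our own.

```python
import math

CAMERA_HEIGHT_M = 0.85  # height from the floor to the web camera (Fig. 6(a))


def destination_on_floor(gx: float, depth: float):
    """Return the turning angle [rad] and floor distance [m] to the destination.

    `gx`    : x-coordinate of the destination in global coordinates [m]
    `depth` : depth value of the destination from the depth sensor [m]
    """
    theta = math.atan2(gx, depth)                     # theta = arctan(Gx / depth), Fig. 6(b)
    wtod = math.hypot(gx, depth)                      # WtoD = sqrt(Gx^2 + depth^2)
    d = math.sqrt(wtod ** 2 - CAMERA_HEIGHT_M ** 2)   # d = sqrt(WtoD^2 - 0.7225), Fig. 6(a)
    return theta, d


# Example: a spot 0.5 m to the side and 2.0 m ahead of the depth sensor.
theta, d = destination_on_floor(gx=0.5, depth=2.0)
print(f"turn {math.degrees(theta):.1f} deg, then go straight {d:.2f} m")
```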

5 System Implementation

As shown in Fig. 7, we have implemented our user interface on a real electric wheelchair. The main components are as follows:

Fig. 7. Implementation of our user interface

  • A laptop PC with the proposed user interface

This is the core of the entire intelligent wheelchair control system. All the data are processed on this laptop PC, which sends all the control signals to the electric wheelchair driving system (RRC).

  • Web camera with depth sensor

This is mounted on a rack at the front of the electric wheelchair. The depth sensor is used to measure the distance to the destination.

  • Eye tracker

This is mounted on the hinge of the laptop PC. It detects the coordinates of the user's line of sight and the gesture.

  • RRC (driving system)

This supplies power to the motors of the wheelchair based on the control signals sent by the PC.

Figure 8(a) shows a screenshot of the destination on the floor with the overlaid oval. Figure 8(b) is a screenshot of the destination specified by the oval. Notice that the size of the oval varies according to the distance; this feature gives users an intuitive sense similar to physical perspective.

Fig. 8. (a) Destination set on the floor and the oval; (b) Matching the destination with the coordinates of the line of sight

Fig. 9. (a) Immediately after the gesture (command transmission); (b) After arrival at the destination (Color figure online)

Fig. 10. Measuring the accuracy of arrival at the destination

Figure 9(a) shows the screen when one eye is closed, the destination is fixed, and the signal is transmitted to the electric wheelchair. At this moment, the color of the oval turns red, and the color is kept until the electric wheelchair reaches the destination or stops. This color representation of the oval contributes to intuitive operation. Figure 9(b) shows the screen after arriving at the destination. As mentioned above, when one eye is closed while traveling, the wheelchair stops and the system returns to the state of Fig. 8.
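As a rough illustration of how such an overlay can be rendered, the OpenCV sketch below draws the oval at the gaze coordinates on the camera frame and switches it to red once the destination is fixed, as in Fig. 9; the base size, the non-fixed color, and the choice to shrink the axes inversely with the depth value (mimicking the perspective effect mentioned above) are assumptions of this sketch rather than the exact parameters of our implementation.

```python
import cv2


def draw_destination_oval(frame, gaze_xy, depth_m, fixed=False):
    """Overlay the destination oval on a BGR camera frame (illustrative sketch).

    `gaze_xy` : (x, y) gaze coordinates on the image, from the eye tracker
    `depth_m` : depth of the gazed spot in metres, from the depth sensor
    `fixed`   : True once the destination has been fixed (oval turns red)
    """
    base_axes = (60, 30)                    # assumed half-axes in pixels at 1 m
    scale = 1.0 / max(depth_m, 0.1)         # assumption: farther spots get a smaller oval
    axes = (max(int(base_axes[0] * scale), 4),
            max(int(base_axes[1] * scale), 2))
    color = (0, 0, 255) if fixed else (0, 255, 255)  # red when fixed (Fig. 9), yellow otherwise
    center = (int(gaze_xy[0]), int(gaze_xy[1]))
    cv2.ellipse(frame, center, axes, 0, 0, 360, color, thickness=2)
    return frame
```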

6 Experimental Results

In order to demonstrate the effectiveness of our AR-based user interface, we have conducted experiments measuring the accuracy of arriving at the destination, the ability to trace a cranked route, and the number of operations required.

  1. Error in reaching the destination

    As shown in Fig. 10, for cases where the direction is set to a right turn or a left turn and the distance is set to 1.2 [m], 2.4 [m], or 3.6 [m], we measured the difference between the position of the destination and the position of the electric wheelchair after arrival, five times for each case. Table 2 shows the results. As shown in the table, the farther the destination is, the larger the error becomes.

  2. Tracing a cranked route and the number of operations

    Figure 11 shows the cranked route we employed for the electric wheelchair to pass through, and how many times the user had to perform the operations. The dotted line is the path along which the electric wheelchair passed. Although we conducted the experiment several times, the number of operations needed to reach the final destination was roughly four.

Table 2. Differences between the positions of the destination and the wheelchair actually reached.
Fig. 11. Cranked road and traveling route of WC

7 Conclusion and Future Work

Some electric wheelchairs controlled with gaze have already been developed, but most of them lack intuitiveness or impose a burden by requiring the user to gaze in the traveling direction during operation. Therefore, we have proposed and developed an ergonomic user interface for controlling an electric wheelchair. In this interface, the user designates the destination instead of the direction, which reduces the user's fatigue during operation. In addition, the gaze-based approach enables handicapped people to operate the electric wheelchair easily and intuitively through AR technology. However, the experimental results in Table 2 show that specifying far positions tends to cause certain errors. We have also observed that complex routes such as a crank tend to require many operations. In future work, we plan to combine our approach with a traditional obstacle avoidance algorithm in order to enhance autonomous driving. This should allow the user to reach the destination while avoiding obstacles in an unexpected environment, and should reduce the number of operations. Also, since the operating range of the oval is restricted to the screen of the PC, the use of a wearable device would make turning operations easier and would further decrease the number of operations.