Most of us are so attached to our smartphones that being away from them even for a short while makes us uncomfortable. We use them all the time – in college, at work, when idle and even when walking! Staring at one's phone while walking, though, carries the highest risk, because it means taking our eyes off the road – and off the nearest obstacle.

As with everything else, there's an app for that too – or at least one researcher claims there is. Juan David Hincapié-Ramos, a postdoctoral researcher at the University of Manitoba’s human-computer interaction lab in Winnipeg, Canada, is reportedly working on an app, aptly titled CrashAlert.

The app's description says that it “augments mobile devices with a depth camera”, which offers distance and visual cues about obstacles in the user's path. The description adds, “In a realistic environment outside the lab, CrashAlert users improve their handling of potential collisions, dodging and slowing down for simple ones while lifting their head in more complex situations. Qualitative results outline the value of extending users’ peripheral alertness in eyes-busy mobile interaction through non-intrusive depth cues, as used in CrashAlert. We present the design features of our system and lessons learned from our evaluation.”

For now, the CrashAlert prototype runs on an Acer A100 7-inch tablet computer (running Android 2.3.3), a laptop computer and a Microsoft Kinect. The laptop is carried in a backpack together with a 12-volt battery that powers the Kinect in this mobile setting. The setup works as follows: the laptop receives images from the Kinect via USB, processes and transforms them, and sends them to the tablet via Bluetooth.
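The report doesn't publish the prototype's code, but the laptop's "processes and transforms" step is easy to picture. Here is a minimal sketch in C# (the language the prototype is written in, as noted below), assuming a 16-bit depth frame in millimetres and assuming the goal is a compact nearest-obstacle summary. The class and method names are invented for illustration; the real system sends processed images rather than this reduced form.

```csharp
using System;

class DepthBandSummarizer
{
    // Collapse a 16-bit Kinect-style depth frame (width x height, in
    // millimetres) into one nearest-distance value per horizontal bucket.
    // A zero reading means "no data" on the Kinect and is skipped.
    public static ushort[] SummarizeDepthFrame(ushort[] depth, int width, int height, int buckets)
    {
        var band = new ushort[buckets];
        for (int b = 0; b < buckets; b++) band[b] = ushort.MaxValue;

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                ushort d = depth[y * width + x];
                if (d == 0) continue;             // no reading at this pixel
                int bucket = x * buckets / width; // map image column to band cell
                if (d < band[bucket]) band[bucket] = d;
            }
        }
        return band;
    }

    static void Main()
    {
        // Synthetic 320x240 frame: background 4 m away, one obstacle
        // at 900 mm on the right-hand side.
        int w = 320, h = 240;
        var frame = new ushort[w * h];
        for (int i = 0; i < frame.Length; i++) frame[i] = 4000;
        for (int y = 0; y < h; y++)
            for (int x = 250; x < 300; x++)
                frame[y * w + x] = 900;

        Console.WriteLine(string.Join(" ", SummarizeDepthFrame(frame, w, h, 16)));
        // The right-hand band cells now read 900: an obstacle closing in.
    }
}
```

Whatever the exact reduction used, the design pressure is the same: a full Kinect depth frame is far too large to push over Bluetooth many times a second, so the laptop must boil it down before transmission.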

A detailed report on the app explains that the tablet receives images at approximately 10-11 frames per second. The application is written in C#.NET. It interfaces with the Kinect, processes the images with OpenCV and sends them over Bluetooth.
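The report doesn't say how those images are framed on the wire. Any stream transport like Bluetooth needs some framing so the receiver knows where one image ends and the next begins; a common minimal scheme – an assumption here, not CrashAlert's documented protocol – is a 4-byte length prefix:

```csharp
using System;
using System.IO;

static class FrameProtocol
{
    // Hypothetical framing for the Bluetooth link: each message is a
    // 4-byte length followed by the payload. This is a standard
    // technique, not CrashAlert's documented wire format.
    public static void WriteFrame(Stream s, byte[] payload)
    {
        s.Write(BitConverter.GetBytes(payload.Length), 0, 4);
        s.Write(payload, 0, payload.Length);
    }

    public static byte[] ReadFrame(Stream s)
    {
        int length = BitConverter.ToInt32(ReadExactly(s, 4), 0);
        return ReadExactly(s, length);
    }

    // Stream.Read may return fewer bytes than requested; loop until done.
    static byte[] ReadExactly(Stream s, int count)
    {
        var buf = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int n = s.Read(buf, offset, count - offset);
            if (n <= 0) throw new EndOfStreamException();
            offset += n;
        }
        return buf;
    }

    static void Main()
    {
        // Round-trip demo over an in-memory stream standing in for Bluetooth.
        var ms = new MemoryStream();
        WriteFrame(ms, new byte[] { 1, 2, 3 });
        ms.Position = 0;
        Console.WriteLine(ReadFrame(ms).Length); // prints 3
    }
}
```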

To see if the CrashAlert app really worked, the researchers carried out an experiment. They recruited eight university students to use the app. To create a real-world-like scenario, the participants were asked to play a whack-a-mole game while walking through the university cafeteria. Each trial session involved walking past the nearby bookstore and around the entire food court. Participants had to walk as normally as they could while playing the game, hitting as many moles as possible along the way. “We ensured that participants would face at least four collisions during each trial. This was achieved by asking an ‘actor,’ unknown to the participant, to provoke potential collisions. The ‘actor’ would do one of the following: cut the participants’ path orthogonally, would stop right in front of them, would come toward them at a fast pace, or would walk beside them but then immediately swerve in their lane,” the report explains.

Throughout each session, the participants' behaviour during any potential collision was recorded.

The researchers conclude in their report that the app shows the user the distance and position of potential obstacles. This information is displayed on a “minimal footprint ambient band” located at the top of the device’s display.
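How distances map onto that band isn't spelled out in the article. As a rough sketch – with thresholds that are illustrative guesses, not values from the paper – each band cell's distance could be turned into an urgency cue, so that near obstacles are drawn prominently and far-away ones leave the band empty:

```csharp
using System;

class AmbientBand
{
    enum Cue { None, Caution, Danger }

    // Map one distance reading (millimetres) to an alert level for the
    // band. The 1.5 m / 3 m thresholds are guesses for illustration,
    // not values taken from the CrashAlert paper.
    static Cue CueFor(ushort mm)
    {
        if (mm < 1500) return Cue.Danger;   // e.g. render the cell red
        if (mm < 3000) return Cue.Caution;  // e.g. render the cell yellow
        return Cue.None;                    // leave the cell empty
    }

    static void Main()
    {
        ushort[] band = { 4000, 4000, 900, 900, 2200, 4000 };
        foreach (var d in band)
            Console.Write(CueFor(d) + " ");
        // Output: None None Danger Danger Caution None
    }
}
```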

They found that the participants used “simpler corrective actions early on in their path” when they spotted an obstacle. Participants, the researchers say, felt safer with the system.

“This improvement came with no negative impact on performance, showing that even minimal environment information outside the user’s periphery can provide for safer usage of mobiles while walking,” the report concludes.
