

Navigate photos on a smartphone without touching the screen and without voice control

Gesture-based Computing


While this concept still uses the hands, it does not involve touching the screen, and it should be considerably more efficient.

The idea is to create an armband that measures the electrical (electromyographic) signals travelling through the wrist, i.e. the signals the nervous system sends to drive the muscles in the fingers. By analysing the nature of these signals, the armband can infer how the fingers are moving at any given moment and send corresponding instructions to the smartphone.

For example, you could hover a finger above the smartphone and move it in circles, much like operating the click wheel on an iPod Classic. The armband detects the motion, signals the phone, and the phone starts stepping through the images in your photo album.
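Once the armband has decoded finger motion, recognising the click-wheel gesture reduces to classifying the direction of a roughly circular trajectory. A minimal sketch, assuming the armband already yields a stream of (x, y) fingertip samples in screen coordinates (that EMG decoding step is the hard part, and is not shown here):

```python
def circle_direction(points):
    """Classify a fingertip trajectory as clockwise or counter-clockwise
    using the signed area of the polygon it traces (shoelace formula).

    points: list of (x, y) samples decoded from the armband's EMG readings.
    Returns "next" for a clockwise circle, "previous" for counter-clockwise,
    or None if the loop is too small to be a deliberate gesture.
    """
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    area /= 2.0
    if abs(area) < 1e-3:   # tiny loop: likely jitter, not a gesture
        return None
    # Screen coordinates have y growing downward, so a motion that looks
    # clockwise on screen yields a positive signed area here.
    return "next" if area > 0 else "previous"
```

The thresholds and the next/previous mapping are illustrative; a real implementation would tune them per user.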

FP Scanner


The Google Pixel already uses its rear fingerprint scanner for basic navigation. This could be extended to allow left/right or up/down navigation through the photo roll.
It could be further extended to zoom in and out of images with a tap-and-hold or double-tap-and-hold action, respectively.
This is obviously limited to phones that have a fingerprint scanner.
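Android does expose fingerprint-scanner swipes to accessibility services (via a fingerprint gesture controller), so the navigation half of this idea is close to feasible today. Below is a platform-agnostic sketch of the dispatch logic only, with illustrative gesture and action names; the tap-and-hold zoom events are hypothetical, since current scanners do not report them:

```python
# Illustrative gesture names; on Android the swipes would correspond to
# FingerprintGestureController constants, while the hold-based zoom
# gestures are hypothetical extensions proposed in the text above.
GESTURE_ACTIONS = {
    "SWIPE_LEFT": "next_photo",
    "SWIPE_RIGHT": "previous_photo",
    "TAP_HOLD": "zoom_in",
    "DOUBLE_TAP_HOLD": "zoom_out",
}

def handle_fp_gesture(gesture, gallery):
    """Dispatch a fingerprint-scanner gesture to the photo gallery.

    Returns the action name that was invoked, or None for an
    unrecognised gesture.
    """
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        getattr(gallery, action)()  # e.g. gallery.next_photo()
    return action
```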

Volume Buttons


If you have your smartphone in your hands right now, look at it. Take note of your hand position and how easily your finger reaches the volume buttons.

When navigating photos, you can go forward or backward. Volume buttons suit this problem well because almost every smartphone has them, and there are exactly two of them: one button goes forward and the other goes backward.

One difficulty would be changing the volume, since the buttons would now be multipurpose. As soon as the photos app opens, the buttons would switch from volume control to photo-navigation mode, and as soon as you play a video they would switch back to controlling volume.
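The mode switch described above is a small piece of state driven by foreground events. A sketch, with illustrative event and action names (mapping "up" to forward is an arbitrary choice, since the text leaves it open):

```python
class VolumeButtonRouter:
    """Route volume-button presses based on what is in the foreground.

    While the photos app is open, the buttons step through the gallery;
    as soon as a video starts playing, they revert to controlling volume.
    """
    def __init__(self):
        self.photo_mode = False

    def on_photos_opened(self):
        self.photo_mode = True

    def on_video_started(self):
        self.photo_mode = False

    def press(self, button):
        """button is "up" or "down"; returns the action to perform."""
        if self.photo_mode:
            return "next_photo" if button == "up" else "previous_photo"
        return "volume_up" if button == "up" else "volume_down"
```

On Android this routing would live in the gallery app's key-event handling; the class above only shows the decision logic.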

Eye Tracking and Eye Gestures


This method may be complex, but it is effective if you want to scroll the gallery hands-free: track the user's eye movements, so that when the eyes move from left to right, the gallery images move from left to right, and vice versa. This could be done using the phone's front-facing camera. If plain eye tracking proves unreliable, "eye gestures" could be used instead: for example, blink once to move to the next image, twice to move back.
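The blink-gesture variant is easy to prototype once a blink detector exists. A sketch of the grouping logic, assuming an upstream detector (the genuinely hard part) emits blink timestamps in seconds; the threshold is illustrative, and a real system would also need to filter out natural, involuntary blinks:

```python
def blinks_to_commands(timestamps, max_gap=0.4):
    """Turn a stream of blink times (seconds) into navigation commands.

    Blinks closer together than max_gap form one gesture: a single
    blink advances the gallery, a double blink goes back.
    """
    commands, group = [], []
    for t in timestamps:
        if group and t - group[-1] > max_gap:
            commands.append("next" if len(group) == 1 else "previous")
            group = []
        group.append(t)
    if group:
        commands.append("next" if len(group) == 1 else "previous")
    return commands
```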



We could use the IMU of the phone to capture changes in its orientation.

Tilting the phone to the right and back again would bring up the next photo.

This has the advantage of being usable with one hand and being cheap in terms of computation.
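The "tilt and return" motion can be detected with a small state machine over the roll angle reported by the IMU. A sketch with illustrative thresholds that would need tuning on a real device (the left-tilt-for-previous mapping is an assumed mirror of the gesture in the text):

```python
def detect_tilt_gestures(roll_samples, threshold=25.0, neutral=10.0):
    """Detect 'tilt and return' gestures from a stream of roll angles
    in degrees (positive = tilted right).

    Tilting past +threshold and coming back below neutral emits "next";
    the mirrored motion to the left emits "previous".
    """
    events, armed = [], None
    for roll in roll_samples:
        if armed is None:
            if roll > threshold:
                armed = "next"
            elif roll < -threshold:
                armed = "previous"
        elif abs(roll) < neutral:   # returned to roughly level: fire
            events.append(armed)
            armed = None
    return events
```

Requiring the return to neutral before firing is what distinguishes a deliberate gesture from simply holding the phone at an angle.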

Tap Cam


An alternative version of SwipeCam. The user could simply tap near the microphone to go forward, or double-tap to go backwards. This would require only very simple audio processing and might leave a little less room for error, although it would lack the actual swipe effect.
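The audio processing really is simple: find amplitude spikes, then decide single versus double tap by how close together they are. A sketch over a normalized amplitude envelope, with illustrative (untuned) thresholds:

```python
def taps_to_commands(amplitudes, sample_rate=100, threshold=0.6,
                     double_tap_window=0.35):
    """Turn a microphone amplitude envelope into tap commands.

    amplitudes: envelope samples normalized to [0, 1].
    A spike crossing threshold from below is one tap; two taps within
    double_tap_window seconds mean "previous", a lone tap means "next".
    """
    # Tap times: upward threshold crossings.
    taps = [i / sample_rate
            for i, (prev, cur) in enumerate(
                zip([0.0] + list(amplitudes), amplitudes))
            if prev <= threshold < cur]
    commands, i = [], 0
    while i < len(taps):
        if i + 1 < len(taps) and taps[i + 1] - taps[i] <= double_tap_window:
            commands.append("previous")
            i += 2
        else:
            commands.append("next")
            i += 1
    return commands
```

In practice the envelope would come from the phone's microphone stream, and the threshold would have to adapt to background noise.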



SwipeCam


You could use the front-facing camera to navigate through photos. The camera could be activated without displaying its preview on screen, and swipes calculated from the feed using image recognition. Only an up/down or a left/right swipe would need to be registered. This method would work best when the device is in landscape mode, and it would keep the hands completely free of the screen.
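The recognition can be rudimentary: track where the bright blob of the hand sits horizontally in consecutive frames and report which way it moved. A toy stand-in for real optical flow on the front-camera feed, with frames as plain lists of brightness rows and illustrative thresholds:

```python
def swipe_from_frames(frame_a, frame_b, min_shift=2.0):
    """Infer a horizontal hand swipe from two grayscale frames.

    Frames are lists of rows of brightness values in [0, 1]. We track
    the horizontal centroid of bright pixels (a hand against a darker
    background) and report which way it moved between the frames.
    """
    def centroid_x(frame, cutoff=0.5):
        total = weighted = 0.0
        for row in frame:
            for x, v in enumerate(row):
                if v > cutoff:
                    total += v
                    weighted += v * x
        return weighted / total if total else None

    ax, bx = centroid_x(frame_a), centroid_x(frame_b)
    if ax is None or bx is None or abs(bx - ax) < min_shift:
        return None   # no clear hand motion between the frames
    return "swipe_right" if bx > ax else "swipe_left"
```

A production version would use proper optical flow over the camera stream rather than a two-frame centroid, but the decision structure would be the same.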
