Without a doubt, touchscreen technology has taken interaction with our devices to a whole new level. Israel's Efrat Barit and Saar Shai, however, believe that the functionality of touch-enabled devices could be greatly enhanced with the development of their Ringbow concept. Worn on the index finger, the ring-like device can be programmed to add extra capabilities to existing actions, activate entirely new touch options, or liberate the user's hands from the surface of the display for Kinect-like, spatial control over touchscreen device operation.
The evaluation prototype of the Ringbow concept is currently being issued to a limited number of application developers to test the kind of functionality and ergonomics destined for the production model. It communicates with a touchscreen device via Bluetooth and presently works only with the Android platform, but will be opened up for development on all touch-based platforms. It also looks a good deal chunkier than the product renderings on the company’s website, but that’s probably to be expected at this stage in the process.
On the side of the device, within easy reach of the thumb, is a programmable 5-way directional control button. This can be used to add contextual variations to existing touchscreen interactions, toggle between menus or different interface displays, manipulate virtual interface elements and their properties or off-screen visuals, and much more.
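Since Ringbow's developer API has not been published, the context-dependent behavior described above can only be sketched hypothetically. The snippet below illustrates one plausible shape for it: the five inputs (up, down, left, right, push) are bound to different callbacks depending on the app's current context, so the same physical input triggers different actions in a menu than in a game. All names are illustrative assumptions, not the actual SDK.

```python
# Hypothetical sketch: dispatching the 5-way control's events to
# context-dependent actions. Nothing here reflects Ringbow's real API.

RINGBOW_EVENTS = ("up", "down", "left", "right", "push")

class RingbowMapper:
    """Maps the 5-way control's events to per-context callbacks."""

    def __init__(self):
        self._bindings = {}  # (context, event) -> callback

    def bind(self, context, event, callback):
        if event not in RINGBOW_EVENTS:
            raise ValueError(f"unknown event: {event}")
        self._bindings[(context, event)] = callback

    def dispatch(self, context, event):
        """Run the callback bound for this context/event pair, if any."""
        callback = self._bindings.get((context, event))
        return callback() if callback else None

# The same "push" input does different things in different contexts.
mapper = RingbowMapper()
mapper.bind("menu", "push", lambda: "select item")
mapper.bind("game", "push", lambda: "reload weapon")
```

In this sketch, an unbound event simply does nothing, which matches the idea that developers opt in to whichever Ringbow inputs their app wants to use.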
For example, the up, down, left, right and push inputs could be programmed to correspond to visual elements in a game or application, or to add specific functionality to certain actions. Touching a point on the screen might open a browser, while touching the same point with the Ringbow activated might open the browser at a favorite site or simultaneously launch a widget or app. Gamers needn't stretch for a weapon's magazine reload key, as the Ringbow could take care of that.
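The browser example amounts to using the Ringbow as a modifier key for touch: the same tap does one thing on its own and something richer with the ring engaged. A minimal sketch, assuming a hypothetical `handle_touch` handler and a made-up favorite-site URL:

```python
# Illustrative only: treating the Ringbow like a modifier that
# changes what a plain touch does. Names and URL are assumptions.

FAVORITE_SITE = "https://example.com"  # hypothetical user preference

def handle_touch(target, ringbow_pressed=False):
    """Return the action for a touch, optionally modified by the Ringbow."""
    if target == "browser_icon":
        if ringbow_pressed:
            # Ring engaged: skip the default landing page entirely.
            return f"open browser at {FAVORITE_SITE}"
        return "open browser"
    return "no action"
```

So `handle_touch("browser_icon")` yields the plain action, while the same touch with `ringbow_pressed=True` jumps straight to the favorite site, mirroring the behavior described above.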
Because the connection is wireless, multi-touch input could also be married with spatial, gesture-based commands to improve performance and functionality, and certain interactions needn't require the user to touch the screen at all.