robertogreco : accelerometers (3)

Apple invents natural tap-based gesture input for nudging onscreen objects, selecting text
"A patent granted to Apple on Tuesday reveals a novel mode of mobile device gesture input that turns taps detected on non-touchscreen surfaces, like the side of an iPhone, into granular on-screen controls.

As published by the U.S. Patent and Trademark Office, Apple's patent No. 9,086,738 for "Fine-tuning an operation based on tapping" describes a solution to a problem many iPhone and iPad owners face when attempting to conduct highly granular user interface manipulations on multitouch displays.

As Apple notes, touchscreens excel in operations requiring only coarse granularity, such as swipes and taps, but are oftentimes unsuitable for performing fine adjustments. For example, picking out a specific character in a line of text is difficult on a touch interface because the mechanism relies on an input object with a relatively large contact area (a user's finger).

Apple's iOS features a virtual magnification loupe as a workaround for accurate UI asset selection, but the method is not as precise as a traditional computer mouse. Instead of looking for an answer in multitouch screen technology, Apple's patent makes use of motion sensors available throughout its iOS device lineup.

In one embodiment, a user is able to move an onscreen object left or right with extreme precision, perhaps nudged a pixel at a time, by lightly tapping on the side of an iPhone. Tap gestures on non-touchscreen portions of a device are picked up by an accelerometer or gyroscope and processed naturally, meaning inputs are represented onscreen in an equal and opposite direction. For example, a light tap on the right side of an iPhone would move an object to the left, while a tap on the left would send the object to the right.

The patent also accounts for varying input magnitudes. Stronger taps move objects greater distances, for example.

Another embodiment detailing text selection notes users can easily extend or contract an active boundary through suitable tapping procedures. Lighter taps would move the cursor one character at a time, while more prominent taps jump entire words or lines. The idea can be extended to any number of selection or virtual object manipulation operations, as seen in the above illustration relating to a spreadsheet application.

Apple also covers taps in other directions, for example from the top and bottom of a device, as well as input involving more than one finger and other UI variations.

It is unclear if Apple intends to incorporate the tap-based fine tuning mechanism into its iOS platform anytime soon. However, the company is slowly extending device usability beyond the years-old multitouch interface by augmenting its devices with new forms of input like Force Touch, which is rumored to make the jump from Apple Watch to iPhone this year.

Apple's patent for fine UI manipulation through tap gestures was first filed for in January 2013 and credits Maxim Tsudik as its inventor."
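The mapping the patent describes can be sketched in a few lines: a tap detected on one edge of the device nudges the onscreen object in the opposite direction, and stronger taps cover more distance. The following Python sketch is purely illustrative; the function name, magnitude thresholds, and pixel distances are invented assumptions, not Apple's actual API or patent claims.

```python
def nudge_from_tap(side: str, magnitude: float) -> int:
    """Return a signed horizontal displacement in pixels for one tap.

    side: "left" or "right" -- which edge of the device was tapped.
    magnitude: peak accelerometer reading for the tap (arbitrary units;
    the thresholds below are illustrative guesses).
    """
    # Inputs are "equal and opposite": a tap on the right side
    # moves the object to the left, and vice versa.
    direction = 1 if side == "left" else -1
    # Varying input magnitudes: light taps nudge a pixel at a time,
    # stronger taps move the object greater distances.
    if magnitude < 0.5:
        distance = 1
    elif magnitude < 1.5:
        distance = 5
    else:
        distance = 20
    return direction * distance

# A light tap on the right edge nudges the object one pixel to the left.
print(nudge_from_tap("right", 0.2))  # -1
# A hard tap on the left edge sends it 20 pixels to the right.
print(nudge_from_tap("left", 2.0))   # 20
```

The same shape extends to the text-selection embodiment: substitute "one character" and "one word/line" for the pixel distances.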
2015  via:tealtan  interaction  apple  patents  technology  nudging  interface  gestures  touch  motion  ios  accelerometers  gyroscopes 
july 2015 by robertogreco
Codify – iPad
"Codify for iPad lets you create games and simulations — or just about any visual idea you have. Turn your thoughts into interactive creations that make use of iPad features like Multi-Touch and the accelerometer.

We think Codify is the most beautiful code editor you'll use, and it's easy. Codify is designed to let you touch your code. Want to change a number? Just tap and drag it. How about a color, or an image? Tapping will bring up visual editors that let you choose exactly what you want.

Codify is built on the Lua programming language. A simple, elegant language that doesn't rely too much on symbols — a perfect match for iPad."
ipad  programming  ios  development  gamedev  multitouch  codify  applications  via:kottke  interactivity  accelerometers  touch 
october 2011 by robertogreco
Enigma Gadgets:NameSpace
"Here is I. M. Chip Blue, the fifth in my series of Enigma Gadgets. Like the others, it's based on the Arduino microcontroller and uses the Quadravox QV300 speech module. The QV300 is programmed from the factory to speak 240 common technical terms including units of measure, numbers and colors. I. M. Chip Blue also contains a Memsic 2125 accelerometer. I have programmed the device to speak nonsensical sentences based on a set of rules. The rules vary depending on the way the device is oriented."
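The gadget's logic (orientation-dependent rules that assemble nonsense sentences from a fixed vocabulary) can be simulated in a few lines. This Python sketch is a stand-in for the actual Arduino firmware: the word lists, orientation names, and rule templates are invented for illustration and do not reflect the QV300's real 240-word set.

```python
import random

# Small illustrative vocabulary, standing in for the QV300's
# factory-programmed technical terms, numbers, and colors.
UNITS = ["volts", "ohms", "hertz"]
NUMBERS = ["zero", "seven", "forty"]
COLORS = ["blue", "red", "green"]

# Each orientation selects a different sentence rule: a sequence of
# word lists to draw from, one word per list.
RULES = {
    "flat":   [NUMBERS, UNITS],
    "tilted": [COLORS, NUMBERS, UNITS],
    "upside": [COLORS, COLORS, NUMBERS],
}

def speak(orientation: str, rng: random.Random) -> str:
    """Build one nonsense sentence using the rule for this orientation."""
    template = RULES[orientation]
    return " ".join(rng.choice(words) for words in template)

rng = random.Random(42)
print(speak("tilted", rng))  # e.g. three words: a color, a number, a unit
```

On the real hardware, the orientation key would come from reading the Memsic 2125's two axes rather than being passed in directly.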
craighickman  arduino  microcontrollers  fictionalsmartboxes  accelerometers  numbers  colors  voice  nonsense 
november 2010 by robertogreco
