Monday, May 24, 2010
In this video about Android 2.2, the first new feature shown is faster rendering speed, something close to our hearts at TAT. Most of the time it is hard to sell better performance in a YouTube video, since the framerate of the video is typically way below the framerate of the device in real life. Here, though, they show two examples with an increasing number of graphical objects, and color the entire screen red whenever the framerate drops below a certain threshold. Very persuasive. Although I would probably set the threshold at 60 fps to ensure a solid gaming experience.
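The red-screen trick is simple to sketch. A minimal version (my own illustration, not Google's actual code) just checks each frame's duration against the budget for the target framerate:

```python
FPS_THRESHOLD = 60.0  # my preferred bar for a solid gaming experience
MAX_FRAME_TIME = 1.0 / FPS_THRESHOLD  # ~16.7 ms per frame

def frame_ok(frame_time_s):
    """Return False (i.e. paint the screen red) when a frame took too long."""
    return frame_time_s <= MAX_FRAME_TIME

# Simulated frame times: a smooth frame and a dropped one.
print(frame_ok(1 / 120))  # True  -- 120 fps, fine
print(frame_ok(1 / 30))   # False -- 30 fps, flag it
```

In a real render loop you would feed this the measured time between frame callbacks rather than hand-picked values.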
Posted by Staffan Lincoln at 7:06 PM
Monday, May 17, 2010
I kind of like the use of pen and fingers on the same device. By using them both at the same time you can get a simple mode switch, similar to right-clicking with your mouse. But writing on a thick glass surface, where the display sits a few millimeters under the surface, is not like writing on paper. It's more imprecise, like drawing with a really thick crayon. The distance makes it hard to predict exactly where you get ink when you set down your pen. I think you can see the effect of this in the graceless handwriting in the demo.
When there is a layer of glass between the pen and the pixels, you get a mismatch that depends on your viewing angle. I tried the Wacom Cintiq and it has this problem. If only you could have a camera on the device, that could track your face and calibrate based on the angle, then you could solve this problem. Or, just have an incredibly thin glass.
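To get a feel for how big this mismatch is, here is a rough back-of-the-envelope sketch (my own numbers and simplifications, including Snell's law for the refraction at the glass surface):

```python
import math

def parallax_offset_mm(glass_thickness_mm, viewing_angle_deg, n_glass=1.5):
    """Approximate lateral offset between the pen tip and the perceived
    ink position, for a pixel sitting under a layer of glass.

    The ray from the pixel refracts at the glass surface, so off-axis
    viewing shifts the apparent position sideways by roughly
    thickness * tan(angle inside the glass).
    """
    theta_air = math.radians(viewing_angle_deg)
    theta_glass = math.asin(math.sin(theta_air) / n_glass)
    return glass_thickness_mm * math.tan(theta_glass)

# Looking straight down there is no offset at all:
print(parallax_offset_mm(2.0, 0.0))
# 2 mm of glass viewed 45 degrees off-axis gives an offset around 1 mm --
# easily enough to miss a line when writing:
print(round(parallax_offset_mm(2.0, 45.0), 2))
```

Which is why a face-tracking camera that knows your viewing angle could, in principle, correct for it.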
Posted by Staffan Lincoln at 8:24 PM
Thursday, May 13, 2010
Wednesday, May 12, 2010
I think this is very exciting. Just as you have adaptive music in games that heightens the mood, so too could you have music in your book that changes depending on what you're reading right now. Or you could even have a song book that played the accompaniment at the speed you're reading. Or maybe that would just be weird.
It would be nice to experiment with dimming text that you've already read. Distracting, or awesome? Only testing it will tell me for sure.
Counting the reading speed per word could help find passages in a text that are hard to read. This would be awesome for Wikipedia, as it could automatically flag passages for revision.
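A minimal sketch of that flagging idea, assuming the eye-tracker can already tell us how long each passage took to read (the function name, the sample data, and the 3-words-per-second threshold are all made up for illustration):

```python
def flag_slow_passages(passages, threshold_wps=3.0):
    """passages: list of (text, word_count, reading_seconds) tuples.

    Return the passages read slower than the threshold, in words per
    second -- candidates for a "please revise" flag.
    """
    return [text for text, words, secs in passages
            if secs > 0 and words / secs < threshold_wps]

sample = [
    ("an easy sentence", 20, 5.0),   # 4.0 wps -- fine
    ("a dense passage", 20, 10.0),   # 2.0 wps -- flagged
]
print(flag_slow_passages(sample))  # ['a dense passage']
```

The real difficulty, of course, is getting reliable per-passage timings out of the eye-tracker in the first place.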
Exciting times ahead, when built-in cameras will be high-res and high-speed, and computing power will be endless. Then eye-tracking will be cheap and reliable. This is within reach in a few years.
Posted by Staffan Lincoln at 7:38 PM