Unless you are an avid/regular blog reader, it is unlikely that you have heard of BlindType… yet. So does it concern you? In all likelihood yes, it does, but by no means in a bad way. I know plenty of people who only buy BlackBerry phones over iPhones or other touch screen devices because they have a “proper keyboard”, or to be more precise a hardware keyboard. BlindType has taken an interesting and probably more successful approach to this problem. However, before jumping straight in and showing it to you, it is probably best to explain a bit more about tactile feedback first.

Indeed, even I as a touchscreen user find it difficult to type on a touchscreen without having to focus on the on-screen keyboard the whole time. The main reason for this is that a touchscreen keyboard gives no tactile feedback (actually it gives a really tiny bit, because of the edges of the screen and the shape of the back of the phone, but I digress – either way it is insignificant compared to a hardware keyboard). Tactile feedback enables your brain to work out where your hands or thumbs are on a keyboard automatically, without you having to focus specifically on the keys you are pressing. Basically, your brain will automatically count the keys as you slide your thumb over them, or use the points of pressure on your thumbs to work out a reference point; the exact process will differ according to the device being used and the particular person’s way of remembering things. The only thing most phones have now is a small vibration so you know when you have pressed a key, which is not really that helpful when typing without looking constantly.

In simple terms, your brain holds an image of what your keyboard looks like, shape-wise, and probably a pretty good idea of where all those keys sit in space. What tactile feedback does is give you a reference point on that visualised keyboard in your mind, so you know where your fingers and thumbs are and how far to move them to touch a key. Learning to touch type is basically taking this knowledge and training you to keep your hands regimented in their positioning on the keyboard. Your brain does the rest for you… in a sense (wheeeey! Sorry, I couldn’t help it).

OK, so how can one fix this? Well, the obvious answer would be to introduce tactile feedback. That, however, isn’t easy to do on a touch screen – it’s not going to function like a touch screen if it isn’t flat, is it? I suppose ideally you’d want a morphing screen which could change shape according to what it was doing and “simulate” real tactile feedback. This isn’t going to happen, not anytime soon anyway. Some of you may have heard of Swype, one technology which has tried to deal with this issue. By letting you just drag across the keyboard in one continuous stroke, Swype means you need far fewer reference points from feedback. There is still no tactile feedback, but you don’t need to look as much because you aren’t lifting and placing your finger so often. Once you have a position on the keyboard you can make your drags to the respective keys, in order, reasonably accurately with only a few glances. Swype is fantastic (it was noticed and bought by Samsung for some number of million dollars – I think it was eight, but I’m not sure (citation please)) and allows for some seriously speedy typing, however it still doesn’t really get rid of the problem of touchscreens not having tactile feedback.
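To make that a bit more concrete, here is a toy sketch (in Python) of the general idea behind gesture typing. This is absolutely not Swype’s actual, proprietary algorithm – the little dictionary, the traced key sequence and the matching rule below are all made up purely for illustration – but it shows how one continuous drag over the keys can still be turned back into a word.

```python
# Toy sketch of the idea behind gesture typing (NOT Swype's real algorithm):
# the finger drags across the keys in one continuous stroke, and the decoder
# looks for dictionary words whose letters appear, in order, somewhere
# within the traced key sequence.

DICTIONARY = {"hello", "help", "hold", "world", "word"}  # hypothetical word list


def is_subsequence(word: str, trace: str) -> bool:
    """True if every letter of `word` occurs in `trace`, in order."""
    it = iter(trace)
    return all(ch in it for ch in word)


def decode_gesture(trace: str) -> list:
    """Return dictionary words compatible with the traced key sequence,
    longest (most specific) candidates first."""
    candidates = [w for w in DICTIONARY if is_subsequence(w, trace)]
    return sorted(candidates, key=len, reverse=True)


# Keys a finger might slide over when dragging H -> E -> L -> L -> O:
print(decode_gesture("hgfreedfghjkllo"))  # -> ['hello']
```

A real gesture keyboard would of course weight candidates by path shape and word frequency rather than simple letter order, but the point stands: one sloppy drag plus a dictionary beats hunting for every individual key.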

OK, I think it’s time to introduce BlindType and why it’s so exciting to me as a user, as well as a neuroscientist. So at the moment we have this idea that our brain holds an image of the keyboard we are going to type on, we look or feel for a point on the keyboard to tell our brain where we are on its “map”, and then type using both pieces of information, right? OK, so who says that we, or our brain, need to work out where the keyboard is? This, I imagine, is the thought that spawned BlindType. If both our brain and the computer know what a keyboard layout looks like – which they do – why can’t the computer work out the size, shape and position of the keyboard we are planning on using? Well, the answer is that it can, which is pretty awesome if you think about it. That means you can type anywhere on your phone screen, in any orientation, and as long as you keep (roughly) the same shaped keyboard in your head, the computer can work out the shape of the keyboard for you and, in turn, what you typed. What’s even better is that this doesn’t require you to even look at your phone, which means texting while walking on touchscreen keyboards could finally be possible! The video below shows this in action best – later on it shows the “keyboard shapes” it is detecting, just to show the technology working.
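Again, just to make the idea concrete: BlindType haven’t published how their system actually works, so everything in the sketch below – the toy dictionary, the approximate key layout, the fitting method – is my own invented illustration of the principle. For each candidate word, fit the position and size of an imagined QWERTY keyboard to the recorded taps, and keep the word whose keyboard best explains them.

```python
# Toy illustration of the *principle* behind adaptive, "invisible" keyboards
# (NOT BlindType's actual, unpublished algorithm): instead of forcing taps
# onto a fixed on-screen keyboard, fit the keyboard to the taps. For each
# candidate word we find the translation and uniform scale that best map the
# ideal QWERTY key centres onto the touch points, and keep the word with the
# smallest fitting error.

# Approximate QWERTY key centres on a unit grid (with row offsets).
KEY_POS = {}
for row, (letters, x0) in enumerate([("qwertyuiop", 0.0),
                                     ("asdfghjkl", 0.25),
                                     ("zxcvbnm", 0.75)]):
    for col, ch in enumerate(letters):
        KEY_POS[ch] = (x0 + col, float(row))

DICTIONARY = ["hello", "hold", "jelly", "world"]   # hypothetical word list


def fit_error(word, taps):
    """Least-squares error after fitting a translation + uniform scale
    from the word's ideal key centres to the recorded tap positions."""
    ideal = [KEY_POS[ch] for ch in word]
    n = len(taps)
    # Centre both point sets (this removes the translation).
    mi = (sum(x for x, _ in ideal) / n, sum(y for _, y in ideal) / n)
    mt = (sum(x for x, _ in taps) / n, sum(y for _, y in taps) / n)
    ic = [(x - mi[0], y - mi[1]) for x, y in ideal]
    tc = [(x - mt[0], y - mt[1]) for x, y in taps]
    # Optimal uniform scale (least squares), then the residual error.
    denom = sum(x * x + y * y for x, y in ic) or 1.0
    s = sum(ix * tx + iy * ty for (ix, iy), (tx, ty) in zip(ic, tc)) / denom
    return sum((tx - s * ix) ** 2 + (ty - s * iy) ** 2
               for (ix, iy), (tx, ty) in zip(ic, tc))


def decode(taps):
    """Pick the word whose imagined keyboard best explains the taps."""
    candidates = [w for w in DICTIONARY if len(w) == len(taps)]
    return min(candidates, key=lambda w: fit_error(w, taps))


# Five taps roughly shaped like "hello", but shifted across the screen
# and at half the size of the reference layout:
taps = [(12.7, 5.5), (11.0, 5.0), (14.1, 5.6), (14.2, 5.4), (14.0, 5.1)]
print(decode(taps))  # -> 'hello'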

So how did I find out about this? Well, the exciting thing is that Google bought them on the 1st of October 2010, so hopefully it will find its way to Android – or indeed to all touchscreen phones, tablets and all-in-one PCs with touch screens – as soon as possible!

The project was originally put together by Kostas Eleftheriou, a computer science graduate from Warwick University, and Panos Petropoulos, who has a degree in marketing from Deree College. I think both of them deserve quite the pat on the back – although Google has already kind of done that! Either way, I’m looking forward to easier typing on touchscreen devices – what do you think?

Find out more about BlindType by going to their site here, or by reading their FAQ here.