En route to Stanford University recently, I had to undergo fingerprinting at San Francisco airport, on one of those greasy little touchscreens the customs people use. I’ve been thinking about such screens a lot of late, especially in their role as data fusion tools, uniting information from the modalities of touch and vision.
Touch a part of this cameraphone screen, for example, and it can be made to focus in on that location.
Today’s invention is a touchscreen which unites these functions, providing conditional access to a subset of available applications. Press a particular location on the screen and it determines your identity before allowing you to use the functionality of the ‘button’ you are contacting (or not).
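A minimal sketch of the conditional-access logic, in Python with purely hypothetical names (real fingerprint capture and matching is stood in for by a simple lookup), might look like this:

```python
# Sketch only: each on-screen 'button' carries its own list of permitted users,
# and the fingerprint read at the touched location decides whether its action runs.

from dataclasses import dataclass, field


@dataclass
class Button:
    name: str                      # e.g. "email", "banking"
    action: callable               # function to run if access is granted
    allowed_users: set = field(default_factory=set)


class SecureTouchscreen:
    def __init__(self, buttons, enrolled_prints):
        self.buttons = buttons                  # screen region -> Button
        self.enrolled_prints = enrolled_prints  # fingerprint template -> user id

    def identify(self, fingerprint):
        """Stand-in for real fingerprint matching at the touched location."""
        return self.enrolled_prints.get(fingerprint)

    def on_touch(self, region, fingerprint):
        button = self.buttons[region]
        user = self.identify(fingerprint)
        if user in button.allowed_users:
            return button.action()
        return f"Access to '{button.name}' denied."


# Usage: both users may open email; only 'bob' may open banking.
screen = SecureTouchscreen(
    buttons={
        "top-left": Button("email", lambda: "Inbox opened.", {"alice", "bob"}),
        "top-right": Button("banking", lambda: "Accounts shown.", {"bob"}),
    },
    enrolled_prints={"print-A": "alice", "print-B": "bob"},
)

print(screen.on_touch("top-left", "print-A"))   # Inbox opened.
print(screen.on_touch("top-right", "print-A"))  # Access to 'banking' denied.
```

The point of the per-button permission set is that identification happens at the moment of contact, so the same screen can expose different functionality to different fingers without a separate login step.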