How do they program a touch screen?


I definitely take this technology for granted, but on occasion I find myself blown away by how they do this – especially in applications where you can zoom in and it still works flawlessly. It’s almost as though every pixel is programmed in an ever-changing screen.

I’m very un-tech-savvy, so I hope I’m posting in the right place.


4 Answers

Anonymous 0 Comments

There are two main things to understand.

(1) The touch sensor is completely separate from the screen. Usually the touch sensitivity comes from a thin conductive layer on the top surface of the glass, while the display is an LCD or OLED panel behind the glass.

(2) All the touch sensor does is report to the system where you touch.

When you tap the glass, the sensor works out where you touched based on which parts of the glass have lost a little bit of electric charge, and turns that into a set of coordinates. The system will regularly “poll” the touch sensor – essentially, it asks “Hey, touch sensor, my man! You been touched anywhere since I asked a few milliseconds ago?” And the touch sensor responds “Yeah, I’ve got a touch event for you. I was touched about 3cm from the left edge and 8cm from the top edge.” We might write that as a simple coordinate pair, with the distance from the left edge as the first number (the x coordinate) and the distance from the top edge as the second (the y coordinate), making this touch (3,8).
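If it helps, here’s that polling conversation as a tiny Python sketch. The sensor and the taps are faked – `fake_touch_events` and `poll_touch_sensor` are made-up names standing in for whatever the real hardware driver provides – but the back-and-forth is the same idea:

```python
# Pretend taps, in cm from the top-left corner of the glass.
fake_touch_events = [(3, 8), (1, 2)]

def poll_touch_sensor():
    """Return the next (x, y) touch since the last poll, or None."""
    return fake_touch_events.pop(0) if fake_touch_events else None

def handle_touch(x, y):
    print(f"Touched {x}cm from the left, {y}cm from the top -> ({x},{y})")

# The system's side of the conversation: ask over and over ("polling").
while fake_touch_events:
    touch = poll_touch_sensor()
    if touch is not None:          # "Yeah, I've got a touch event for you."
        handle_touch(*touch)
```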

The system then takes that information and makes it available to the programs running on it. So let’s say your program has a button. The program is constantly checking to see if there’s any touch data, and if it finds some, it checks the coordinates of the touch against everything it has on screen. Let’s say it’s told the screen to draw a big square button that goes from (2,7) to (4,9). The touch is at (3,8) – 3 is between 2 and 4, and 8 is between 7 and 9, so the tap is inside our button’s “bounding box” and the program considers that to be tapping (pressing) the button. Notice that all the button pressing happens inside the program – it decided where the button was, and it decided that the tap was inside the button’s borders. All the touch sensor did was give the coordinates of the tap.
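As a rough sketch, that bounding-box check might look like this in Python (the `inside` helper and the numbers are just illustrative):

```python
def inside(tap, box):
    """Is tap (x, y) inside box ((left, top), (right, bottom))?"""
    (x, y), ((left, top), (right, bottom)) = tap, box
    return left <= x <= right and top <= y <= bottom

button = ((2, 7), (4, 9))   # the square button from (2,7) to (4,9)
tap = (3, 8)                # the touch reported by the sensor

if inside(tap, button):
    print("Button pressed!")  # 2 <= 3 <= 4 and 7 <= 8 <= 9, so yes
```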

“Zooming in” is also handled by the program. What’s shown on the screen is entirely up to the program. If it wants to let you zoom in, it just makes everything bigger: “Okay, the button now goes from (1,6) to (5,10).” It then tells the screen to draw a button stretching between those coordinates instead of the old ones, and performs the same check of seeing whether the tap coordinates provided by the touch sensor are inside that new, bigger bounding box.
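And a sketch of the zoom step, with a made-up `zoom_box` helper: the program grows the button’s box around its center, then runs the exact same inside-the-box check as before.

```python
def zoom_box(box, factor):
    """Scale a box ((left, top), (right, bottom)) around its center."""
    (left, top), (right, bottom) = box
    cx, cy = (left + right) / 2, (top + bottom) / 2
    half_w = (right - left) / 2 * factor
    half_h = (bottom - top) / 2 * factor
    return ((cx - half_w, cy - half_h), (cx + half_w, cy + half_h))

button = ((2, 7), (4, 9))
zoomed = zoom_box(button, 2)   # -> ((1.0, 6.0), (5.0, 10.0)), as above

tap = (4.5, 9.5)               # outside the old box, inside the new one
(left, top), (right, bottom) = zoomed
print(left <= tap[0] <= right and top <= tap[1] <= bottom)  # True
```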

The touch sensor and the display are both parts of the screen, but neither really cares what’s going on inside the program. One reports the coordinates of taps; the other draws whatever shapes it’s told to, the same way they always do, chugging along happily. The magic all happens inside the program.

And as a final wrap-up: the “system” is the physical computer the touchscreen is a part of – the circuitry everything is plugged into. It sends electrical signals here and there to tell the different parts to do their thing.
