How do they program a touch screen?


I definitely take this technology for granted, but on occasion I find myself blown away by how they do this. Especially in some applications where you can zoom in and it still works flawlessly. It’s almost as though every pixel is programmed in an ever changing screen.

I’m not very tech-savvy, so I hope I’m posting in the right place.


4 Answers

Anonymous 0 Comments

A touch screen itself is not programmed. It is an input/output device.

The software of the device sends signals to the screen indicating which pixels to illuminate.

The touch part of the device works by measuring something called capacitance to establish where on the screen you are touching.

Anonymous 0 Comments

The conducting layer is connected to a low voltage so that, for a short time, a tiny electric current flows across the screen, leaving it with a small electric charge. When your finger touches the screen, some of that small electric charge flows onto your finger at that location. The program then knows the location of your finger; the programmer will have coded a template in the background that triggers a function based on where you touch the screen.
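That "template" idea can be sketched in a few lines of Python (the region coordinates and function names here are made up for illustration): the program keeps a map of screen regions to functions, and dispatches each touch to whichever region contains it.

```python
# Hypothetical handlers the programmer wired up in advance.
def open_camera():
    return "camera opened"

def go_home():
    return "went home"

# The "template": each region (left, top, right, bottom) maps to a function.
TEMPLATE = {
    (0, 0, 50, 50): open_camera,
    (0, 150, 100, 200): go_home,
}

def handle_touch(x, y):
    # Find the region containing the touch and run its function.
    for (left, top, right, bottom), action in TEMPLATE.items():
        if left <= x <= right and top <= y <= bottom:
            return action()
    return None  # the touch landed outside every template region

print(handle_touch(10, 20))  # lands inside the camera region
```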

Anonymous 0 Comments

First: A touch screen’s touch capabilities are *completely* unrelated to its capabilities as a display, and vice versa. A touch screen is basically a big touch pad, like the one on a laptop, except transparent and glued on top of a screen. In the rare cases where it isn’t glued, you can often replace one or the other when it breaks without needing to replace both.

Once you grasp that the “touch” and the “screen” are completely separate things, it becomes easier to see that your touches aren’t **directly** influencing the pixels on the screen at all. Instead, the computer that both are connected to reads the touch input, interprets it as some kind of command, and changes the pixels it tells the screen to display.

Anonymous 0 Comments

There are two main things to understand.

(1) The touch sensor is completely separate from the screen. Usually the touch sensitivity comes from a thin conductive layer on the top surface of the glass, while the display is an LED panel behind the glass.

(2) All the touch sensor does is report to the system where you touch.

When you tap the glass, the sensor works out where you touched based on which parts of the glass have lost a little bit of electric charge, and turns that into a set of coordinates. The system will occasionally “poll” the touch sensor – essentially, it asks, “Hey, touch sensor, my man! You been touched anywhere since I asked a few microseconds ago?” And the touch sensor responds, “Yeah, I’ve got a touch event for you. I was touched about 3cm from the left edge and 8cm from the top edge.” We might convert that into a simpler coordinate, with the number of cm from the left as the first number (the x coordinate) and the number of cm from the top as the second (the y coordinate), making this touch (3,8).
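Here’s a rough sketch of that polling conversation in Python. The `TouchSensor` class here is a stand-in, not a real driver API – real sensors sit behind hardware interrupts or OS input subsystems – but the question-and-answer shape is the same.

```python
from collections import deque

class TouchSensor:
    """Stand-in for the touch hardware: queues up (x, y) touch events."""

    def __init__(self):
        self.events = deque()

    def poll(self):
        # The system's question: "been touched since I last asked?"
        # Answer with the oldest unreported event, or None.
        return self.events.popleft() if self.events else None

sensor = TouchSensor()
sensor.events.append((3, 8))  # finger lands 3cm from left, 8cm from top

event = sensor.poll()
print(event)          # (3, 8) -> the tap's (x, y) coordinate
print(sensor.poll())  # None -> no new touches since the last poll
```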

The system then takes that information and makes it available to the programs running on it. So let’s say your program has a button. The program is constantly checking to see whether there’s any touch data, and when it finds some, it checks the coordinates of the touch against all its features. Let’s say it has told the screen to draw a big square button that goes from (2,7) to (4,9). The touch is at (3,8) – 3 is between 2 and 4, and 8 is between 7 and 9, so the tap falls inside the button’s “bounding box” and the program considers that a press of the button. Notice that all the button pressing happens inside the program – the program decided where the button was, and the program decided that the tap was inside the button’s borders. All the screen did was report the coordinates of the tap.
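The bounding-box check above is tiny in code. A minimal sketch, using the same (2,7)-to-(4,9) button and (3,8) tap from the example:

```python
def inside(tap, box):
    # A tap is inside a box if its x is between left and right
    # and its y is between top and bottom.
    x, y = tap
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

button = (2, 7, 4, 9)  # a button from (2,7) to (4,9)
tap = (3, 8)           # the coordinates the touch sensor reported

print(inside(tap, button))  # True -> the program treats this as a press
```

Note that the touch sensor contributed nothing here but the pair (3, 8); the box and the comparison both live entirely in the program.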

“Zooming in” is also handled by the program. What’s shown on the screen is entirely up to the program, so if it wants to let you zoom in, it just makes everything bigger: “Okay, the button is now from (1,6) to (5,10).” It tells the screen to draw a button stretching between those new coordinates instead of the old ones, and then performs the same check to see whether the tap coordinates provided by the touch sensor are inside that new, bigger bounding box.
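To make that concrete: zooming is just arithmetic on the button’s box, and the hit test itself never changes. This sketch doubles the button around the point (3,8), turning the (2,7)-(4,9) button into the (1,6)-(5,10) one from the example (a simplified model – real apps scale whole layouts, not one box):

```python
def zoom_box(box, factor, center):
    # Scale a (left, top, right, bottom) box around a center point.
    cx, cy = center
    left, top, right, bottom = box
    return (cx + (left - cx) * factor,
            cy + (top - cy) * factor,
            cx + (right - cx) * factor,
            cy + (bottom - cy) * factor)

button = (2, 7, 4, 9)
zoomed = zoom_box(button, 2, center=(3, 8))
print(zoomed)  # (1, 6, 5, 10) -> same button, twice as big
```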

The touch sensor and the display – the two parts of the screen – really don’t care what’s going on inside the program. They just report the coordinates of taps and draw the shapes they’re told to, respectively, the same way they always do, chugging along happily. The magic all happens inside the program.

And to wrap up: the system is the physical circuit everything is plugged into – the physical computer the touchscreen is a part of. It occasionally sends electrical pulses here and there to signal the different parts to do their thing.