February 9 (Beijing time) — The touch panel has been a major driving force in the evolution of mobile phones into smartphones. Since Apple introduced the iPhone in 2007, many people have begun using their fingers to browse websites that once had to be accessed from a computer.
The evolution of input devices deserves much of the credit. Once the conventional mobile-phone keypad gave way to a touch panel, browsing a web page on a portable terminal's small screen was no longer difficult, and zooming and scrolling were greatly simplified. Abandoning the resistive-film panels popular at the time in favor of projected capacitive touch panels, which support swiping, pinch zooming, and similar gestures, was a decisive factor in the popularization of smartphones.
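As an illustration of why multi-touch mattered, a pinch-zoom gesture reduces to comparing the distance between two simultaneously tracked finger positions, something a projected capacitive panel can report but a single-touch resistive panel cannot. The following is a minimal sketch; the function names and coordinate format are assumptions, not any particular platform's API.

```python
# Hypothetical sketch: deriving a pinch-zoom scale factor from two
# simultaneously tracked touch points. Names and data shapes are
# illustrative only.
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) touch coordinates."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor: ratio of current finger spread to initial spread."""
    return distance(*current_touches) / distance(*start_touches)

# Fingers start 100 px apart and spread to 250 px: zoom in 2.5x.
scale = pinch_scale([(100, 100), (200, 100)], [(50, 100), (300, 100)])
print(scale)  # 2.5
```

A scale factor above 1 means the fingers are spreading (zoom in); below 1, pinching together (zoom out).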
However, today's touch panels are not a panacea. And just as the smartphone market itself was born, new applications may emerge in the future as touch panels continue to evolve. The author recently had the opportunity to interview Professor Keisuke Inami of Keio University, an expert in interactive technology that links human behavior with the movement of machines, and asked him to look ahead to the future of the touch panel. The interview is presented below in question-and-answer form.
- How do you think touch panels will evolve in the future?
Today's touch panels are tools that people use to instruct and operate machines. In the future, they may evolve into interfaces for genuine human-machine interaction. By then, all the objects around you will act as touch panels.
For example, the sofa in the living room becomes a touch panel. When the owner lies down, the sofa detects the movement and dims the room's lighting, or turns the lights warm white to create a relaxing environment. And what if the chairs and tables in the study became touch panels? When the owner holds the same posture for too long, the table and chair can remind him to change position. I think that in the future, touch-panel functions will help furniture develop intelligently.
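The sofa scenario above can be sketched in a few lines, assuming the sofa embeds a grid of pressure sensors: a large contact area suggests a lying posture, which then selects a lighting mode. The threshold values and sensor layout here are pure assumptions for illustration, not from the interview.

```python
# Illustrative sketch: inferring posture from a pressure-sensor grid
# embedded in a sofa, then choosing a lighting mode. All thresholds
# and the sensor layout are assumed.

def active_cells(grid, threshold=0.2):
    """Count sensor cells whose normalized pressure exceeds a threshold."""
    return sum(1 for row in grid for value in row if value > threshold)

def lighting_mode(grid):
    """A large contact area suggests lying down -> dim, warm light."""
    coverage = active_cells(grid) / (len(grid) * len(grid[0]))
    return "dim warm white" if coverage > 0.5 else "normal"

sitting = [[0.9, 0.0, 0.0],
           [0.8, 0.1, 0.0],
           [0.0, 0.0, 0.0]]
lying   = [[0.9, 0.8, 0.7],
           [0.6, 0.9, 0.8],
           [0.0, 0.3, 0.4]]
print(lighting_mode(sitting))  # normal
print(lighting_mode(lying))    # dim warm white
```

A real system would of course track posture over time rather than from a single snapshot, but the principle is the same: the furniture reads contact area, and the environment reacts.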
The touch panel is well suited to detecting the movement of the human body, because the information it obtains is closer to the user than that of any other method. In the sofa example above, the sofa acquires information right at the user's body; it could be called a touch panel that takes the entire body as input. Many other usage scenarios can be imagined as well.
For example, a touch panel could be laid on a bed to read skin and respiratory data for health management. A carpet could become a touch panel that identifies the person walking on it and adjusts the room's air-conditioning to his preferences. Or a table could gain touch-panel functions, reading what is being eaten from the dishes on it and judging whether someone is overeating. Of course, that last one may be a bit mischievous (laughs).
What these uses have in common is that the person does not consciously touch anything in order to operate a machine. Rather, the system measures and analyzes the moments when a person naturally touches an object, or when objects touch each other, and the machine then takes action on its own. Precisely because the touch is not conscious but involuntary, the machine can go one step further and provide a "transparent service" that is comfortable, safe, and secure.
- Cameras are also tools for detecting human movement.
Cameras are characterized by their ability to obtain information about human motion from a distance, and they can also read facial expressions and other subtle information. However, there are restrictions on their use. First, cameras rely on optical sensors, so they cannot obtain information in dimly lit environments. There are also questions of social acceptability, such as whether people will tolerate cameras in private places like restrooms and bedrooms.
- Proposals for "non-contact" technologies keep emerging: speech recognition for voice input, gesture operation by moving the arms in space, input based on gaze direction, and so on. Some even argue that "after touch comes non-contact." How do you position these non-contact technologies?
These methods are quite different from touch panels. Because each has its own limitations, it is necessary to combine several of them.
Take speech-recognition operation of a car navigation system as an example. Speech recognition takes language as input, so it has no way to handle an instruction like "I want to go here (pointing at the map)." Ambiguous instructions such as "put that there" cannot be handled either. It therefore becomes increasingly important to combine gesture operation and gaze input so that the modalities complement one another.
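The complementarity described above can be sketched concretely: a spoken command like "put that there" is ambiguous on its own, but gaze (or pointing) samples taken while each deictic word is uttered can fill in the missing referents. The data structures and alignment scheme below are hypothetical, purely to illustrate the fusion idea.

```python
# Hedged sketch of multimodal fusion: deictic words in a spoken command
# are resolved using whatever the user was looking at when each word
# was uttered. One gaze sample per word is assumed for simplicity.

def resolve_command(words, gaze_targets):
    """Replace deictic words with the gaze target recorded at the
    moment the word was spoken; leave other words unchanged."""
    deictic = {"this", "that", "here", "there"}
    return [gaze_targets[i] if w in deictic else w
            for i, w in enumerate(words)]

# Gaze samples aligned with the utterance "put that there":
gaze = [None, "red mug", "top shelf"]
print(resolve_command(["put", "that", "there"], gaze))
# ['put', 'red mug', 'top shelf']
```

Real systems must also handle timing skew between speech and gaze, but the core idea is the same: each modality supplies what the other cannot express.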
When operating with gestures, a user who wants to perform a more complicated operation needs supplementary spatial information. Take a car's steering wheel as an example: the wheel is precisely the object that supplies that positional reference. Holding and turning it, the driver can intuitively feel the rotation angle needed to turn right, turn left, or change lanes, and can drive smoothly. That is a completely different matter from waving two empty hands in the air. The steering wheel can be seen as a passive interface. Moreover, from the standpoint of the human-machine relationship, the wheel also supports the driver's hands; with that support, the driver avoids fatigue.
- I see a problem with the touch panels people now use to operate machines: when using a touch panel, you must constantly watch the display. Is there any solution?
You are right. Touch-panel input is in principle difficult to perform by blind touch, and it also narrows the user's field of vision. A touch panel's strength is that the on-screen user interface, such as the size of the buttons, can be changed freely, but the price is that the user must always look at the input position.
However, I think this can be improved. For example, haptic-feedback technology that vibrates the display when touched has already appeared. Passive improvements are also possible. Take the keyboard: the F and J keys, struck with the index fingers, carry small ridges, and the user can locate all the other keys relative to those two. This tiny positioning cue has an enormous effect. Likewise, simply sticking a small bump at the center of a smartphone or tablet display would let the user know roughly where he is touching. Having even one such cue is a world apart from having none.
S100 I/O is the central process interface for Advant Controller 400 series process controllers. Thanks to built-in cable-marshalling facilities and parallel communication with the host controller, it is the right choice for centralized I/O systems and high-speed applications.

The range of process I/O modules is complete, consisting of general-purpose digital and analog inputs and outputs plus special interfaces for special tasks. These include pulse counting, frequency measuring, positioning, motor speed control, and communication with other controllers. All I/O modules provide simple interfacing, accurate yet fast control, and easy integration of individual loops into a comprehensive plant-wide control and supervision system.

The interface modules connect to the process through screw terminals on connection units normally installed inside, at the back of the cabinet. This solution keeps noise and destructive voltage spikes away from the central electronics and provides a neat and tidy process interface that is easy to maintain.