
Understanding the Android System Input Subsystem and Event Distribution Process

In this article, we delve into the Android system input subsystem to understand how input events are obtained and processed through callback functions.

Introduction

When we write a program with a UI and want to receive input events, we simply implement a callback such as onKeyEvent or onTouchEvent. Input events may come from hardware buttons, the touch screen, or keyboards; the soft keyboard is an independent source of input events as well. So why do these callbacks receive input events at all? How does the system deliver each event to exactly the right program so it can respond? And why can only one window receive touch events at a time? Below, we answer these questions by analyzing the Android input subsystem.

Input Event Forwarding Process

How do physical devices send input data to the kernel?

Physical devices send data to the kernel through device drivers. In the /dev/input/ directory under Linux there are several device files, such as event0, event1, event2... These device files are created by the drivers. They share the same major device number and differ only in their minor device numbers, which indicates that they belong to the same class of devices. Suppose the touch screen corresponds to event0. After the touch screen driver is loaded, it initializes itself, mainly configuring the CPU pins and registering an interrupt handler.

The touch screen is one physical device, while our driver program runs on the CPU, which is a different device. The two are physically connected by traces on the PCB joining the corresponding pins, even though those traces are tiny. The driver initializes the CPU pins that are wired to the touch screen; each pin corresponds to a register, as documented in the CPU's datasheet.

When the touch screen is pressed, the voltage level on one of its pins drops. The connected CPU pin detects the change and an interrupt is triggered; this interrupt was configured earlier by the touch driver. The CPU's interrupt vector table leads to the interrupt handler written in our driver, which reads the touch screen data, i.e. the binary value formed by the connected pins, such as 01011010. At this point, the kernel has obtained the touch screen data.
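To make this concrete, the raw events a driver reports can be observed from user space by reading one of the /dev/input/eventN files. The C++ sketch below is not Android code, just a plain Linux program, and the device path event0 is only an example:

#include <fcntl.h>
#include <linux/input.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // "/dev/input/event0" is only an example node; the touch screen may be a
    // different eventN file on a real device ("adb shell getevent" lists them).
    int fd = open("/dev/input/event0", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }
    input_event ev;
    // Each read returns one kernel input_event: a type (EV_ABS, EV_KEY, EV_SYN),
    // a code (for example ABS_MT_POSITION_X) and a value, plus a timestamp.
    while (read(fd, &ev, sizeof(ev)) == (ssize_t) sizeof(ev)) {
        std::printf("type=%u code=%u value=%d\n",
                    (unsigned) ev.type, (unsigned) ev.code, ev.value);
    }
    close(fd);
    return 0;
}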

[Figure: timing chart of the touch screen chip]

How does the kernel send input data to the user space Android framework?

After the kernel obtains the touch screen data, the data goes through smoothing and filtering. It is still in kernel space, so how does Android get it? Android is in fact a group of processes running on the Linux kernel, which together provide services such as the UI and application installation for users.

When a phone boots, the Linux kernel starts first, and the Android process group is started afterwards; the framework belongs to this process group. The framework contains a service called InputManagerService. Let's see where it is instantiated in the Android source code:

SystemServer.java --------> startOtherServices() -------->

/* Construct InputManagerService */
inputManager = new InputManagerService(context);

/* Pass inputManager to WindowManagerService */
wm = WindowManagerService.main(context, inputManager,
        mFactoryTestMode != FactoryTest.FACTORY_TEST_LOW_LEVEL,
        !mFirstBoot, mOnlyCore);

/* Set the callback for InputManagerService */
inputManager.setWindowManagerCallbacks(wm.getInputMonitor());

/* After full initialization, SystemServer calls start() to run the two threads
   inside InputManager. First look at InputReaderThread, which is the starting
   point of event processing in user space. */
inputManager.start();

 

So InputManagerService is instantiated and started in the SystemServer process. First, let's see what the InputManagerService constructor does.

Through JNI, the constructor creates the native C++ NativeInputManager object. Inside the NativeInputManager constructor:

sp<EventHub> eventHub = new EventHub();

mInputManager = new InputManager(eventHub, this, this);

 

The eventHub object constructor does the following:

1. Create an epoll object; the fd of each input device will be added to it so that one wait can cover all devices

2. Use the inotify mechanism to monitor the /dev/input directory; any change there means an input device has been added or removed and must be handled. The fd representing the inotify watch is also added to the epoll object

3. Create a pipe (a pipe can only be used for communication between parties that share a common ancestor) and add its read end to the epoll object so the reader can be woken up. A small sketch of these three steps follows below.
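As a rough illustration of these three steps, the sketch below sets up the same kind of epoll object, inotify watch on /dev/input, and wake-up pipe using plain Linux calls. It is only a simplified sketch, not the real EventHub code:

#include <sys/epoll.h>
#include <sys/inotify.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // Step 1: the epoll object that the reader thread will sleep on.
    int epollFd = epoll_create1(EPOLL_CLOEXEC);

    // Step 2: an inotify watch on /dev/input, so device add/remove is noticed.
    int inotifyFd = inotify_init1(IN_CLOEXEC);
    inotify_add_watch(inotifyFd, "/dev/input", IN_CREATE | IN_DELETE);

    // Step 3: a pipe whose read end lets other threads wake the reader up.
    int wakePipe[2];
    pipe(wakePipe);

    epoll_event item = {};
    item.events = EPOLLIN;
    item.data.fd = inotifyFd;
    epoll_ctl(epollFd, EPOLL_CTL_ADD, inotifyFd, &item);     // inotify fd into epoll
    item.data.fd = wakePipe[0];
    epoll_ctl(epollFd, EPOLL_CTL_ADD, wakePipe[0], &item);   // pipe read end into epoll
    // Each /dev/input/eventN fd opened later is added to the same epoll set.

    std::printf("epoll=%d inotify=%d wakePipe[read]=%d\n",
                epollFd, inotifyFd, wakePipe[0]);
    return 0;
}

The key point of this design is that a single epoll_wait can then put the reader thread to sleep on all of these fds at once: device data, device hot-plug notifications, and wake-up requests.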

The InputManager object constructor does the following:

1. Create InputDispatcher

2. Create InputReader(eventHub, inputDispatcher); InputDispatcher inherits from InputListenerInterface

3. Create InputReaderThread

4. Create InputDispatcherThread (a rough sketch of how these two threads cooperate follows below)
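Structurally, these two threads form a classic producer/consumer pair: the reader produces events, and the dispatcher sleeps until it is woken up to consume them. The sketch below is only an analogy for that structure; the names mirror the real ones, but this is not the actual InputManager code:

#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>

struct Event { int x, y; };

std::queue<Event> inboundQueue;       // plays the role of mInboundQueue
std::mutex queueLock;
std::condition_variable wake;

void readerThreadLoop() {             // plays the role of InputReaderThread
    for (int i = 0; i < 3; i++) {
        {
            std::lock_guard<std::mutex> guard(queueLock);
            inboundQueue.push({10 * i, 20 * i});
        }
        wake.notify_one();            // wake the dispatcher, like notifyMotion does
    }
}

void dispatcherThreadLoop() {         // plays the role of InputDispatcherThread
    for (int handled = 0; handled < 3; handled++) {
        std::unique_lock<std::mutex> guard(queueLock);
        wake.wait(guard, [] { return !inboundQueue.empty(); });
        Event e = inboundQueue.front();
        inboundQueue.pop();
        guard.unlock();
        std::printf("dispatching event (%d, %d)\n", e.x, e.y);
    }
}

int main() {
    std::thread dispatcher(dispatcherThreadLoop);
    std::thread reader(readerThreadLoop);
    reader.join();
    dispatcher.join();
    return 0;
}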

Recall that in SystemServer.java we finally call inputManager.start() to get InputManagerService running, so let's continue with the start method. It ends up in the native InputManager object, where the two threads created above, InputReaderThread and InputDispatcherThread, are started.

The InputReaderThread loop does two things. The first:

1. Call the getEvents method of the EventHub saved in the constructor to obtain input events. Inside getEvents:

1) Determine whether the input devices still need to be opened. If so, scan the device files in the /dev/input directory and open them. At the same time, check whether there is a virtual keyboard in the device list; if not, create one and add it

2) At this point there are at least two input devices in the system: the touch screen and the virtual keyboard. Because the devices were just opened during this getEvents call, these actions are encapsulated as RawEvent entries: two DEVICE_ADDED events plus a FINISHED_DEVICE_SCAN event. getEvents returns these events and goes no further

3) On the second and subsequent calls to getEvents, it waits for input events and returns the touch events it reads

Now we know how the touch data in kernel space reaches the Android framework in user space: the framework scans the /dev/input directory, opens each device it finds, adds it to the epoll object, and then repeatedly waits for input events and reads the data in a loop (sketched below).
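The waiting half of getEvents can be sketched as follows: block in epoll_wait until one of the fds added earlier becomes readable, then drain input_event structs from it. This assumes the epoll set from the previous sketch and device fds opened with O_NONBLOCK; it is only an illustration of the idea, not the real EventHub::getEvents:

#include <sys/epoll.h>
#include <linux/input.h>
#include <unistd.h>
#include <cstdio>

// Assumes epollFd was set up as in the earlier sketch and the device fds were
// opened with O_NONBLOCK so the inner read loop stops when nothing is queued.
void waitForInput(int epollFd) {
    epoll_event ready[16];
    int n = epoll_wait(epollFd, ready, 16, -1);   // sleep until something happens
    for (int i = 0; i < n; i++) {
        input_event ev;
        // Drain raw events from the fd that became readable; the real getEvents()
        // wraps each of these into a RawEvent entry for the InputReader.
        while (read(ready[i].data.fd, &ev, sizeof(ev)) == (ssize_t) sizeof(ev)) {
            std::printf("fd=%d type=%u code=%u value=%d\n",
                        ready[i].data.fd, (unsigned) ev.type,
                        (unsigned) ev.code, ev.value);
        }
    }
}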

How does the Android framework send input data to the APP process?

The Android framework now has the touch input data, but there are many processes in the system, and many of them want input. How does the framework further process the events and distribute them accurately?

The second thing done in the InputReaderThread loop:

2. Call the processEventsLocked method to process the RawEvent entries returned by the getEvents call above

1) Depending on the type of RawEvent, different methods are called for processing, including:

● ordinary touch events

● device added events

● device removed events

● FINISHED_DEVICE_SCAN

2) For a touch event: call the process method of the InputDevice that corresponds to the event (the InputDevice created earlier). Internally this calls the process method of each InputMapper; one input device can have several mappers, and all of them are traversed and their process methods called. Suppose our device is a touch screen that supports multi-touch: its mapper is MultiTouchInputMapper, so its process method is called.

3) The process method of MultiTouchInputMapper works internally as follows (a simplified sketch is given after this list):

First, each touch event is used to determine the current slot; until an EV_SYN event is received, the events belong to the same slot, and x, y, pressure, and touch_major are processed in turn to initialize the slot's variables;

When ev.type == EV_SYN and ev.code == SYN_MT_REPORT is received, the index of the current slot is advanced by 1 to record the next contact, and the sync function processes the touch event;

Then a series of operations are performed on CurrentCookedPointerData and LastCookedPointerData to derive up, down, or move events. The corresponding events then go through dispatchMotion, which internally calls the InputDispatcher's notifyMotion.

4) For the notifyMotion of InputDispatcher:

● If the InputDispatcher has an inputFilter set, the inputFilter is called first and may consume the events

● If there is no inputFilter, or the inputFilter is not interested in these events, a MotionEntry is constructed, added to mInboundQueue, and the InputDispatcher thread is woken up for processing

5) For the thread processing loop of InputDispatcher:

● Optimize app switch latency: when the app switch timeout expires, dispatching is preempted and the remaining pending events are dropped;

● Dispatch the events:

First, call findTouchedWindowTargetsLocked to find the touched (focused) window, and save it in the inputTargets array;

The InputChannel of the previously registered monitor will also be added to the inputTargets array here;

Then the events are dispatched one by one to the entries in the inputTargets array.
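The simplified sketch referred to in step 3) is shown below. It accumulates per-contact coordinates from EV_ABS events and emits one "frame" when the sync event arrives; the real MultiTouchInputMapper tracks far more state per slot, and on protocol-A devices SYN_MT_REPORT separates the contacts within a frame instead of ABS_MT_SLOT:

#include <linux/input.h>
#include <cstdio>

// One contact's accumulated state; the real mapper slots hold many more fields
// (tool type, touch major/minor, tracking id, and so on).
struct Slot {
    int x = 0;
    int y = 0;
    int pressure = 0;
};

struct MultiTouchAccumulator {
    static const int kMaxSlots = 10;
    Slot slots[kMaxSlots];
    int currentSlot = 0;

    void process(const input_event& ev) {
        if (ev.type == EV_ABS) {
            switch (ev.code) {
                case ABS_MT_SLOT:                 // protocol B: switch to another contact
                    if (ev.value >= 0 && ev.value < kMaxSlots) currentSlot = ev.value;
                    break;
                case ABS_MT_POSITION_X: slots[currentSlot].x = ev.value; break;
                case ABS_MT_POSITION_Y: slots[currentSlot].y = ev.value; break;
                case ABS_MT_PRESSURE:   slots[currentSlot].pressure = ev.value; break;
            }
        } else if (ev.type == EV_SYN && ev.code == SYN_REPORT) {
            sync();                               // one complete frame has arrived
        }
    }

    void sync() {
        // Stand-in for the "cooking" step: the real mapper compares the current and
        // last pointer data, derives down/move/up, and calls notifyMotion.
        std::printf("frame: slot %d at (%d, %d), pressure %d\n",
                    currentSlot, slots[currentSlot].x, slots[currentSlot].y,
                    slots[currentSlot].pressure);
    }
};

int main() {
    MultiTouchAccumulator mapper;
    input_event ev = {};
    ev.type = EV_ABS; ev.code = ABS_MT_POSITION_X; ev.value = 150; mapper.process(ev);
    ev.type = EV_ABS; ev.code = ABS_MT_POSITION_Y; ev.value = 300; mapper.process(ev);
    ev.type = EV_SYN; ev.code = SYN_REPORT;        ev.value = 0;   mapper.process(ev);
    return 0;
}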

How does the APP process send input data to its corresponding activity?

Activity is the basic component of a process. It can be thought of as representing an interface and a collection of views. What does an activity do each time it is started?

1. In fact, it depends on what the ViewRootImpl behind the activity does. In the setView method of ViewRootImpl.java, an InputChannel is instantiated (after checking whether the current window can accept input events), and then the addToDisplay method in Session.java is called to pass it to WindowManagerService. This ultimately calls WindowManagerService's addWindow method, where a pair of InputChannels is created; InputChannel[1] is transferred back into the client's inputChannel. The setView method then creates a WindowInputEventReceiver object around the inputChannel created above.

 

2. The addWindow method in WindowManagerService:

InputChannel[] inputChannels = InputChannel.openInputChannelPair(name);

/* inputChannels[0] is kept on the server side */
win.setInputChannel(inputChannels[0]);

/* inputChannels[1] is returned to ViewRootImpl */
inputChannels[1].transferTo(outInputChannel);

/* Register the channel with InputManagerService */
mInputManager.registerInputChannel(win.mInputChannel, win.mInputWindowHandle);

Now we can understand how events are distributed to the corresponding activity: the work is actually done by the ViewRootImpl behind it.
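Under the hood, such a channel pair is essentially a pair of connected sockets: whatever the server side writes comes out on the client side. The sketch below shows only that idea with a plain Unix socketpair; the real InputChannel/InputTransport code adds its own message format on top, and the "server"/"client" names here are just illustrative:

#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fds[2];
    // fds[0] stands for the server (WindowManagerService / InputDispatcher) end,
    // fds[1] for the client (ViewRootImpl) end.
    if (socketpair(AF_UNIX, SOCK_SEQPACKET, 0, fds) != 0) {
        perror("socketpair");
        return 1;
    }

    const char msg[] = "motion event";
    write(fds[0], msg, sizeof(msg));      // the dispatcher side sends an event

    char buf[64] = {};
    read(fds[1], buf, sizeof(buf));       // the app side receives it
    std::printf("client end received: %s\n", buf);

    close(fds[0]);
    close(fds[1]);
    return 0;
}

This is also why each side only needs a single fd: the app's Looper can simply poll its end of the channel and invoke the WindowInputEventReceiver when an event arrives.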

How does activity send input data to specific views?

The last step is to distribute events to specific views in the activity. It is easy to see why events are distributed to views from ViewRootImpl: the touch coordinates are known there, and so are the position and state of every view, because in order to render correctly the Android graphics framework has already measured each view's size and determined its position. ViewRootImpl distributes the event down through the view hierarchy layer by layer; each view knows whether the touch event falls on itself, and if not, the event is passed on (or discarded) further down.
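A deliberately simplified model of this last step is sketched below. It is not the real View/ViewGroup code; it only shows the idea of hit-testing and recursive dispatch, where a view ignores touches outside its bounds, offers the event to its children first, and handles it itself only if no child consumed it:

#include <cstdio>
#include <vector>

struct View {
    int left, top, width, height;
    std::vector<View*> children;

    View(int l, int t, int w, int h) : left(l), top(t), width(w), height(h) {}
    virtual ~View() = default;

    // The graphics framework has already given every view its position and size,
    // so hit-testing is a simple bounds check.
    bool contains(int x, int y) const {
        return x >= left && x < left + width && y >= top && y < top + height;
    }

    // Leaf views override this to actually consume the touch.
    virtual bool onTouchEvent(int /*x*/, int /*y*/) { return false; }

    // Offer the event to children first; handle it ourselves only if nobody else did.
    bool dispatchTouchEvent(int x, int y) {
        if (!contains(x, y)) return false;          // touch is outside this view: ignore
        for (View* child : children) {
            if (child->dispatchTouchEvent(x, y)) return true;
        }
        return onTouchEvent(x, y);
    }
};

struct Button : View {
    using View::View;
    bool onTouchEvent(int x, int y) override {
        std::printf("button handled touch at (%d, %d)\n", x, y);
        return true;
    }
};

int main() {
    View root(0, 0, 1080, 1920);                    // the root of the view hierarchy
    Button button(100, 100, 200, 80);               // a child somewhere inside it
    root.children.push_back(&button);
    root.dispatchTouchEvent(150, 140);              // lands inside the button: consumed there
    return 0;
}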

Summary

The distribution process of touch events looks complicated, but Android handles it quite elegantly. Analyzing the process helps us implement interesting features, and it also helps when debugging: when the system does not respond to a touch, we can always find the reason by walking through the steps above.

