## 7.2\. Input Devices

Device implementations:

* [C-0-1] MUST include an input mechanism, such as a [touchscreen](#7_2_4_touchScreen_input) or [non-touch navigation](#7_2_2_non-touch_navigation), to navigate between the UI elements.

### 7.2.1\. Keyboard

If device implementations include support for third-party Input Method Editor (IME) applications, they:

* [C-1-1] MUST declare the [`android.software.input_methods`](https://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_INPUT_METHODS) feature flag.
* [C-1-2] MUST fully implement the [Input Management Framework](https://developer.android.com/reference/android/view/inputmethod/InputMethodManager.html).
* [C-1-3] MUST have a preinstalled software keyboard.

Device implementations:

* [C-0-1] MUST NOT include a hardware keyboard that does not match one of the formats specified in [android.content.res.Configuration.keyboard](http://developer.android.com/reference/android/content/res/Configuration.html) (QWERTY or 12-key).
* SHOULD include additional soft keyboard implementations.
* MAY include a hardware keyboard.

### 7.2.2\. Non-touch Navigation

Android includes support for d-pad, trackball, and wheel as mechanisms for non-touch navigation.

Device implementations:

* [C-0-1] MUST report the correct value for [android.content.res.Configuration.navigation](https://developer.android.com/reference/android/content/res/Configuration.html#navigation).

If device implementations lack non-touch navigation, they:

* [C-1-1] MUST provide a reasonable alternative user interface mechanism for the selection and editing of text, compatible with Input Management Engines. The upstream Android open source implementation includes a selection mechanism suitable for use with devices that lack non-touch navigation inputs.

### 7.2.3\. Navigation Keys

The [Home](http://developer.android.com/reference/android/view/KeyEvent.html#KEYCODE_HOME), [Recents](http://developer.android.com/reference/android/view/KeyEvent.html#KEYCODE_APP_SWITCH), and [Back](http://developer.android.com/reference/android/view/KeyEvent.html#KEYCODE_BACK) functions, typically provided via an interaction with a dedicated physical button or a distinct portion of the touch screen, are essential to the Android navigation paradigm and therefore, device implementations:

* [C-0-1] MUST provide a user affordance to launch installed applications that have an activity with the `<intent-filter>` set with `ACTION=MAIN` and `CATEGORY=LAUNCHER` or `CATEGORY=LEANBACK_LAUNCHER` for Television device implementations. The Home function SHOULD be the mechanism for this user affordance.
* SHOULD provide buttons for the Recents and Back functions.

If the Home, Recents, or Back functions are provided, they:

* [C-1-1] MUST be accessible with a single action (e.g. tap, double-click or gesture) when any of them are accessible.
* [C-1-2] MUST provide a clear indication of which single action would trigger each function. Having a visible icon imprinted on the button, showing a software icon on the navigation bar portion of the screen, or walking the user through a guided step-by-step demo flow during the out-of-box setup experience are examples of such an indication.

Device implementations:

* [SR] are STRONGLY RECOMMENDED to not provide the input mechanism for the [Menu function](http://developer.android.com/reference/android/view/KeyEvent.html#KEYCODE_MENU) as it is deprecated in favor of the action bar since Android 4.0.

If device implementations provide the Menu function, they:

* [C-2-1] MUST display the action overflow button whenever the action overflow menu popup is not empty and the action bar is visible.
* [C-2-2] MUST NOT modify the position of the action overflow popup displayed by selecting the overflow button in the action bar, but MAY render the action overflow popup at a modified position on the screen when it is displayed by selecting the Menu function.

If device implementations do not provide the Menu function, for backwards compatibility, they:

* [C-3-1] MUST make the Menu function available to applications when `targetSdkVersion` is less than 10, either by a physical button, a software key, or gestures. This Menu function should be accessible unless hidden together with other navigation functions.

If device implementations provide the [Assist function](http://developer.android.com/reference/android/view/KeyEvent.html#KEYCODE_ASSIST), they:

* [C-4-1] MUST make the Assist function accessible with a single action (e.g. tap, double-click or gesture) when other navigation keys are accessible.
* [SR] STRONGLY RECOMMENDED to use long press on the Home function as this designated interaction.

If device implementations use a distinct portion of the screen to display the navigation keys, they:

* [C-5-1] MUST use a distinct portion of the screen for the navigation keys, not available to applications, and MUST NOT obscure or otherwise interfere with the portion of the screen available to applications.
* [C-5-2] MUST make available a portion of the display to applications that meets the requirements defined in [section 7.1.1](#7_1_1_screen_configuration).
* [C-5-3] MUST honor the flags set by the app through the [`View.setSystemUiVisibility()`](https://developer.android.com/reference/android/view/View.html#setSystemUiVisibility%28int%29) API method, so that this distinct portion of the screen (a.k.a. the navigation bar) is properly hidden away as documented in the SDK.
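As a minimal sketch of the flag handling in C-5-3: the system UI can derive navigation-bar visibility from the bitmask an app passes to `View.setSystemUiVisibility()`. The constant values below match the documented AOSP `android.view.View` constants; the `NavBarPolicy` class itself is a hypothetical helper, not a framework API.

```java
// Sketch: deciding navigation-bar visibility from the flags an app sets via
// View.setSystemUiVisibility(). Constant values mirror android.view.View;
// NavBarPolicy itself is hypothetical.
public class NavBarPolicy {
    public static final int SYSTEM_UI_FLAG_HIDE_NAVIGATION  = 0x00000002;
    public static final int SYSTEM_UI_FLAG_IMMERSIVE        = 0x00000800;
    public static final int SYSTEM_UI_FLAG_IMMERSIVE_STICKY = 0x00001000;

    /** True if the navigation bar should currently be hidden. */
    public static boolean navBarHidden(int visibilityFlags) {
        return (visibilityFlags & SYSTEM_UI_FLAG_HIDE_NAVIGATION) != 0;
    }

    /** True if an edge swipe should reveal the bars transiently rather than clearing the flags. */
    public static boolean barsAreTransient(int visibilityFlags) {
        return navBarHidden(visibilityFlags)
                && (visibilityFlags & (SYSTEM_UI_FLAG_IMMERSIVE
                                       | SYSTEM_UI_FLAG_IMMERSIVE_STICKY)) != 0;
    }
}
```

Without either immersive flag, hiding the navigation bar is only transient in AOSP anyway; the sketch only captures the bit-testing, not the full window-manager policy.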
If the navigation function is provided as an on-screen, gesture-based action:

* [C-6-1] [`WindowInsets#getMandatorySystemGestureInsets()`](https://developer.android.com/reference/android/view/WindowInsets.html#getMandatorySystemGestureInsets()) MUST only be used to report the Home gesture recognition area.
* [C-6-2] Gestures that start within an exclusion rect as provided by the foreground application via [`View#setSystemGestureExclusionRects()`](https://developer.android.com/reference/android/view/View.html#setSystemGestureExclusionRects(java.util.List%3Candroid.graphics.Rect%3E)), but outside of [`WindowInsets#getMandatorySystemGestureInsets()`](https://developer.android.com/reference/android/view/WindowInsets.html#getMandatorySystemGestureInsets()), MUST NOT be intercepted for the navigation function as long as the exclusion rect is allowed within the max exclusion limit as specified in the documentation for [`View#setSystemGestureExclusionRects()`](https://developer.android.com/reference/android/view/View.html#setSystemGestureExclusionRects(java.util.List%3Candroid.graphics.Rect%3E)).
* [C-6-3] MUST send the foreground app a [`MotionEvent.ACTION_CANCEL`](https://developer.android.com/reference/android/view/MotionEvent.html#ACTION_CANCEL) event once touches start being intercepted for a system gesture, if the foreground app was previously sent a [`MotionEvent.ACTION_DOWN`](https://developer.android.com/reference/android/view/MotionEvent.html#ACTION_DOWN) event.
* [C-6-4] MUST provide a user affordance to switch to an on-screen, button-based navigation (for example, in Settings).
* SHOULD provide the Home function as a swipe up from the bottom edge of the current orientation of the screen.
* SHOULD provide the Recents function as a swipe up and hold before release, from the same area as the Home gesture.
* Gestures that start within [`WindowInsets#getMandatorySystemGestureInsets()`](https://developer.android.com/reference/android/view/WindowInsets.html#getMandatorySystemGestureInsets()) SHOULD NOT be affected by exclusion rects provided by the foreground application via [`View#setSystemGestureExclusionRects()`](https://developer.android.com/reference/android/view/View.html#setSystemGestureExclusionRects(java.util.List%3Candroid.graphics.Rect%3E)).

If a navigation function is provided from anywhere on the left and right edges of the current orientation of the screen:

* [C-7-1] The navigation function MUST be Back and provided as a swipe from both left and right edges of the current orientation of the screen.
* [C-7-2] If custom swipeable system panels are provided on the left or right edges, they MUST be placed within the top 1/3rd of the screen with a clear, persistent visual indication that dragging in would invoke the aforementioned panels, and hence not Back. A system panel MAY be configured by a user such that it lands below the top 1/3rd of the screen edge(s) but the system panel MUST NOT use longer than 1/3rd of the edge(s).
* [C-7-3] When the foreground app has either the [`View.SYSTEM_UI_FLAG_IMMERSIVE`](https://developer.android.com/reference/android/view/View.html#SYSTEM_UI_FLAG_IMMERSIVE) or [`View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY`](https://developer.android.com/reference/android/view/View.html#SYSTEM_UI_FLAG_IMMERSIVE_STICKY) flags set, swiping from the edges MUST behave as implemented in AOSP, which is documented in [the SDK](https://developer.android.com/training/system-ui/immersive).
* [C-7-4] When the foreground app has either the [`View.SYSTEM_UI_FLAG_IMMERSIVE`](https://developer.android.com/reference/android/view/View.html#SYSTEM_UI_FLAG_IMMERSIVE) or [`View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY`](https://developer.android.com/reference/android/view/View.html#SYSTEM_UI_FLAG_IMMERSIVE_STICKY) flags set, custom swipeable system panels MUST be hidden until the user brings in the system bars (a.k.a. the navigation and status bars) as implemented in AOSP.

### 7.2.4\. Touchscreen Input

Android includes support for a variety of pointer input systems, such as touchscreens, touch pads, and fake touch input devices. [Touchscreen-based device implementations](https://source.android.com/devices/input/touch-devices) are associated with a display such that the user has the impression of directly manipulating items on screen. Since the user is directly touching the screen, the system does not require any additional affordances to indicate the objects being manipulated.

Device implementations:

* SHOULD have a pointer input system of some kind (either mouse-like or touch).
* SHOULD support fully independently tracked pointers.

If device implementations include a touchscreen (single-touch or better), they:

* [C-1-1] MUST report `TOUCHSCREEN_FINGER` for the [`Configuration.touchscreen`](https://developer.android.com/reference/android/content/res/Configuration.html#touchscreen) API field.
* [C-1-2] MUST report the `android.hardware.touchscreen` and `android.hardware.faketouch` feature flags.

If device implementations include a touchscreen that can track more than a single touch, they:

* [C-2-1] MUST report the appropriate feature flags `android.hardware.touchscreen.multitouch`, `android.hardware.touchscreen.multitouch.distinct`, and `android.hardware.touchscreen.multitouch.jazzhand` corresponding to the type of the specific touchscreen on the device.
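A minimal sketch of how a device might derive the touchscreen feature-flag set above from its panel's capabilities. The flag strings are the real Android feature names; the `TouchFeatureFlags` helper, its parameters, and the assumption that `jazzhand` is only declared on panels that also track pointers distinctly are illustrative, not mandated wording.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: choosing which touchscreen feature flags to report for a panel,
// given how many pointers it tracks and whether tracking is distinct.
// TouchFeatureFlags is a hypothetical helper; the strings are real flags.
public class TouchFeatureFlags {
    public static List<String> flagsFor(int maxTrackedPointers, boolean distinctTracking) {
        List<String> flags = new ArrayList<>();
        if (maxTrackedPointers < 1) {
            return flags; // no touchscreen: none of this family applies
        }
        // Any touchscreen implies both of these (C-1-2).
        flags.add("android.hardware.touchscreen");
        flags.add("android.hardware.faketouch");
        if (maxTrackedPointers >= 2) {
            flags.add("android.hardware.touchscreen.multitouch");
            if (distinctTracking) {
                flags.add("android.hardware.touchscreen.multitouch.distinct");
                if (maxTrackedPointers >= 5) {
                    flags.add("android.hardware.touchscreen.multitouch.jazzhand");
                }
            }
        }
        return flags;
    }
}
```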
If device implementations do not include a touchscreen (and rely on a pointer device only) and meet the fake touch requirements in [section 7.2.5](#7_2_5_fake_touch_input), they:

* [C-3-1] MUST NOT report any feature flag starting with `android.hardware.touchscreen` and MUST report only `android.hardware.faketouch`.

### 7.2.5\. Fake Touch Input

A fake touch interface provides a user input system that approximates a subset of touchscreen capabilities. For example, a mouse or remote control that drives an on-screen cursor approximates touch, but requires the user to first point or focus then click. Numerous input devices like the mouse, trackpad, gyro-based air mouse, gyro-pointer, joystick, and multi-touch trackpad can support fake touch interactions. Android includes the feature constant `android.hardware.faketouch`, which corresponds to a high-fidelity non-touch (pointer-based) input device such as a mouse or trackpad that can adequately emulate touch-based input (including basic gesture support), and indicates that the device supports an emulated subset of touchscreen functionality.

If device implementations do not include a touchscreen but include another pointer input system which they want to make available, they:

* SHOULD declare support for the `android.hardware.faketouch` feature flag.

If device implementations declare support for `android.hardware.faketouch`, they:

* [C-1-1] MUST report the [absolute X and Y screen positions](http://developer.android.com/reference/android/view/MotionEvent.html) of the pointer location and display a visual pointer on the screen.
* [C-1-2] MUST report touch events with the action code that specifies the state change that occurs on the pointer [going down or up on the screen](http://developer.android.com/reference/android/view/MotionEvent.html).
* [C-1-3] MUST support pointer down and up on an object on the screen, which allows users to emulate tap on an object on the screen.
* [C-1-4] MUST support pointer down, pointer up, pointer down then pointer up in the same place on an object on the screen within a time threshold, which allows users to [emulate double tap](http://developer.android.com/reference/android/view/MotionEvent.html) on an object on the screen.
* [C-1-5] MUST support pointer down on an arbitrary point on the screen, pointer move to any other arbitrary point on the screen, followed by a pointer up, which allows users to emulate a touch drag.
* [C-1-6] MUST support pointer down then allow users to quickly move the object to a different position on the screen and then pointer up on the screen, which allows users to fling an object on the screen.
* [C-1-7] MUST report `TOUCHSCREEN_NOTOUCH` for the [`Configuration.touchscreen`](https://developer.android.com/reference/android/content/res/Configuration.html#touchscreen) API field.

If device implementations declare support for `android.hardware.faketouch.multitouch.distinct`, they:

* [C-2-1] MUST declare support for `android.hardware.faketouch`.
* [C-2-2] MUST support distinct tracking of two or more independent pointer inputs.

If device implementations declare support for `android.hardware.faketouch.multitouch.jazzhand`, they:

* [C-3-1] MUST declare support for `android.hardware.faketouch`.
* [C-3-2] MUST support distinct tracking of 5 (tracking a hand of fingers) or more pointer inputs fully independently.

### 7.2.6\. Game Controller Support

#### 7.2.6.1\. Button Mappings

If device implementations declare the `android.hardware.gamepad` feature flag, they:

* [C-1-1] MUST either embed a controller or ship with a separate controller in the box that provides means to input all the events listed in the tables below.
* [C-1-2] MUST be capable of mapping HID events to the associated Android `view.InputEvent` constants as listed in the tables below. The upstream Android implementation includes an implementation for game controllers that satisfies this requirement.
| Button | HID Usage<sup>2</sup> | Android Button |
|--------|-----------------------|----------------|
| A<sup>1</sup> | 0x09 0x0001 | KEYCODE_BUTTON_A (96) |
| B<sup>1</sup> | 0x09 0x0002 | KEYCODE_BUTTON_B (97) |
| X<sup>1</sup> | 0x09 0x0004 | KEYCODE_BUTTON_X (99) |
| Y<sup>1</sup> | 0x09 0x0005 | KEYCODE_BUTTON_Y (100) |
| D-pad up<sup>1</sup><br>D-pad down<sup>1</sup> | 0x01 0x0039<sup>3</sup> | AXIS_HAT_Y<sup>4</sup> |
| D-pad left<sup>1</sup><br>D-pad right<sup>1</sup> | 0x01 0x0039<sup>3</sup> | AXIS_HAT_X<sup>4</sup> |
| Left shoulder button<sup>1</sup> | 0x09 0x0007 | KEYCODE_BUTTON_L1 (102) |
| Right shoulder button<sup>1</sup> | 0x09 0x0008 | KEYCODE_BUTTON_R1 (103) |
| Left stick click<sup>1</sup> | 0x09 0x000E | KEYCODE_BUTTON_THUMBL (106) |
| Right stick click<sup>1</sup> | 0x09 0x000F | KEYCODE_BUTTON_THUMBR (107) |
| Home<sup>1</sup> | 0x0c 0x0223 | KEYCODE_HOME (3) |
| Back<sup>1</sup> | 0x0c 0x0224 | KEYCODE_BACK (4) |

<sup>1</sup> [KeyEvent](http://developer.android.com/reference/android/view/KeyEvent.html)

<sup>2</sup> The above HID usages must be declared within a Game pad CA (0x01 0x0005).

<sup>3</sup> This usage must have a Logical Minimum of 0, a Logical Maximum of 7, a Physical Minimum of 0, a Physical Maximum of 315, Units in Degrees, and a Report Size of 4. The logical value is defined to be the clockwise rotation away from the vertical axis; for example, a logical value of 0 represents no rotation and the up button being pressed, while a logical value of 1 represents a rotation of 45 degrees and both the up and right keys being pressed.

<sup>4</sup> [MotionEvent](http://developer.android.com/reference/android/view/MotionEvent.html)
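The hat-switch encoding in footnote 3 can be sketched as a conversion from the 4-bit logical value to the AXIS_HAT_X/AXIS_HAT_Y values (-1, 0, or 1) that Android reports. The `hatToAxes` helper is hypothetical, and treating out-of-range values as the HID null (centered) state is an assumption taken from the usual HID hat-switch convention, not from this document.

```java
// Sketch: converting the hat-switch logical value (0-7, in 45-degree steps
// clockwise from straight up) into AXIS_HAT_X / AXIS_HAT_Y values.
// hatToAxes is a hypothetical helper, not a framework API.
public class HatSwitch {
    /** Returns {hatX, hatY}; values outside 0-7 are treated as hat released. */
    public static int[] hatToAxes(int logicalValue) {
        if (logicalValue < 0 || logicalValue > 7) {
            return new int[] {0, 0}; // HID null state: centered
        }
        // Index 0 = up, 1 = up+right, 2 = right, ... 7 = up+left.
        int[] x = {  0,  1, 1, 1, 0, -1, -1, -1 };
        int[] y = { -1, -1, 0, 1, 1,  1,  0, -1 };
        return new int[] { x[logicalValue], y[logicalValue] };
    }
}
```

Note that AXIS_HAT_Y is negative for "up", matching the screen-style coordinate convention of Android motion axes.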

| Analog Controls<sup>1</sup> | HID Usage | Android Button |
|-----------------------------|-----------|----------------|
| Left Trigger | 0x02 0x00C5 | AXIS_LTRIGGER |
| Right Trigger | 0x02 0x00C4 | AXIS_RTRIGGER |
| Left Joystick | 0x01 0x0030<br>0x01 0x0031 | AXIS_X<br>AXIS_Y |
| Right Joystick | 0x01 0x0032<br>0x01 0x0035 | AXIS_Z<br>AXIS_RZ |

<sup>1</sup> [MotionEvent](http://developer.android.com/reference/android/view/MotionEvent.html)
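The button table above amounts to a lookup from a HID (usage page, usage ID) pair to an Android keycode. A minimal sketch: the integer keycodes are the documented `KeyEvent` values from the table, while the `HidButtonMap` class and its packed-integer key scheme are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: resolving a HID (usage page, usage ID) pair from the button table
// to its Android keycode. HidButtonMap is a hypothetical helper; the keycode
// integers are the documented KeyEvent values.
public class HidButtonMap {
    private static final Map<Integer, Integer> KEYCODES = new HashMap<>();

    private static void put(int usagePage, int usageId, int keyCode) {
        KEYCODES.put((usagePage << 16) | usageId, keyCode);
    }

    static {
        put(0x09, 0x0001, 96);   // KEYCODE_BUTTON_A
        put(0x09, 0x0002, 97);   // KEYCODE_BUTTON_B
        put(0x09, 0x0004, 99);   // KEYCODE_BUTTON_X
        put(0x09, 0x0005, 100);  // KEYCODE_BUTTON_Y
        put(0x09, 0x0007, 102);  // KEYCODE_BUTTON_L1
        put(0x09, 0x0008, 103);  // KEYCODE_BUTTON_R1
        put(0x09, 0x000E, 106);  // KEYCODE_BUTTON_THUMBL
        put(0x09, 0x000F, 107);  // KEYCODE_BUTTON_THUMBR
        put(0x0c, 0x0223, 3);    // KEYCODE_HOME
        put(0x0c, 0x0224, 4);    // KEYCODE_BACK
    }

    /** Returns the Android keycode, or -1 if the usage is not a mapped button. */
    public static int toKeyCode(int usagePage, int usageId) {
        return KEYCODES.getOrDefault((usagePage << 16) | usageId, -1);
    }
}
```

The D-pad usages are deliberately absent: per footnotes 3 and 4 they surface as AXIS_HAT_X/AXIS_HAT_Y motion events rather than keycodes.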

### 7.2.7\. Remote Control

See [Section 2.3.1](#2_3_1_hardware) for device-specific requirements.