Developer Guidelines
Graphical User Interface (GUI) Design Guidelines for X2
In general, the broad concepts involved in designing an intuitive, easy-to-use mobile app also apply to app development for the X2. However, due to the unique way that users experience your app on the X2’s heads-up display, there are some important differences between designing a GUI for the X2 and designing one for a standard Android touchscreen device.
Some general concepts and themes specific to the X2 include:
All apps developed for the X2 should be set to Landscape as the default orientation.
Use black as the default color for backgrounds to allow the user to see through to the real world (a minimal setup sketch follows this list).
Use large text (>25sp) in bright, contrasting colors for optimal readability.
Reserve the center of the screen for objects that require the user’s continual focus (e.g., 3D models, instructional text, temporary alerts).
Place buttons/menus toward the edges of the screen to help maintain an unobstructed view of the real world directly in front of the user’s line of sight.
Use of fullscreen mode may result in portions of the screen becoming inaccessible to the motion/gesture UI cursor. If your app uses fullscreen mode, it is best to avoid placing clickable content near the very top and bottom edges of the screen.
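The first few guidelines can be applied from a standard Android Activity. The following is a minimal sketch only (the class name and text values are hypothetical, and landscape orientation can also be declared in the manifest with android:screenOrientation="landscape"):

import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.graphics.Color;
import android.os.Bundle;
import android.util.TypedValue;
import android.view.Gravity;
import android.widget.FrameLayout;
import android.widget.TextView;

public class SeeThroughActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Default to landscape, as recommended for all X2 apps.
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

        FrameLayout root = new FrameLayout(this);
        // A black background renders as transparent on the X2's see-through display.
        root.setBackgroundColor(Color.BLACK);

        // Large (>25sp), brightly colored text for readability.
        TextView status = new TextView(this);
        status.setTextSize(TypedValue.COMPLEX_UNIT_SP, 28);
        status.setTextColor(Color.YELLOW);
        status.setText("Ready");
        status.setGravity(Gravity.CENTER);

        root.addView(status);
        setContentView(root);
    }
}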
Using the Audio Input Device
Due to Android OS limitations, the audio input device (i.e., the microphone) can only be utilized by one app/process at a time. Since the Speech Recognizer service is always running, it is always using the audio input device while listening for voice commands. If your app requires the use of the audio input device, then in addition to requesting the device from the Android framework (see the Android SDK), you will also need to stop, and then restart, the Speech Recognizer. This also applies to usage of the Camcorder profile when recording video, since it also utilizes the audio input device. Before requesting the audio input device, use the following code to temporarily stop the Speech Recognizer:
Intent intent = new Intent("com.thirdeyegen.api.voicecommand");
intent.putExtra("instructions", "interrupt");
sendBroadcast(intent);
In order to ensure that the audio input device is completely released, it is recommended that you wait a short time (1-2 seconds) between calling “interrupt” on the Speech Recognizer service and requesting the audio input device from Android. This can be done by using a CountDownTimer, or by calling Thread.sleep() on your thread. Failure to insert a delay can result in your app not receiving audio input after completing the request.
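For illustration, the following sketch sends the “interrupt” broadcast and then uses a CountDownTimer to wait before acquiring the microphone. It assumes it runs inside an Activity (or other Context), and startRecording() is a hypothetical placeholder for your own MediaRecorder/AudioRecord setup:

Intent interrupt = new Intent("com.thirdeyegen.api.voicecommand");
interrupt.putExtra("instructions", "interrupt");
sendBroadcast(interrupt);

// Wait ~1.5 seconds for the Speech Recognizer to release the audio input device.
new CountDownTimer(1500, 1500) {
    @Override
    public void onTick(long millisUntilFinished) {
        // Not needed; we only care about the end of the countdown.
    }

    @Override
    public void onFinish() {
        startRecording(); // hypothetical: request the audio input device here
    }
}.start();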
When finished using the audio input device, use the following code to restart the Speech Recognizer:
Intent intent = new Intent("com.thirdeyegen.api.voicecommand");
intent.putExtra("instructions", "resume");
sendBroadcast(intent);
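For example, if your app captures audio with a MediaRecorder (a sketch, assuming recorder is a MediaRecorder your app has already configured and started), the Speech Recognizer can be restarted as soon as the recorder releases the microphone:

// Release your own use of the audio input device first.
recorder.stop();
recorder.release();

// Then let the Speech Recognizer resume listening for voice commands.
Intent resume = new Intent("com.thirdeyegen.api.voicecommand");
resume.putExtra("instructions", "resume");
sendBroadcast(resume);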
Interacting with the ThirdEye UI Controls Menu Bar
The ThirdEye UI Controls service (Head Motion, Hand Gestures) displays a menu bar containing buttons for various system actions (Home, Zoom In/Out, Volume Up/Down, etc.). By default, the menu bar sits in the top portion of the screen and becomes visible when the cursor hovers over that area. In some cases, you may want to minimize the menu bar programmatically to accommodate the placement of clickable UI objects in the top portion of the screen. To minimize the ThirdEye UI Controls menu bar, use the following code:
Intent menuHideIntent = new Intent("com.thirdeyegen.motionui.menu");
menuHideIntent.putExtra("action", "hide menu");
sendBroadcast(menuHideIntent);
To restore the menu when you no longer require it to be hidden, or when your application closes, use the following code:
Intent menuShowIntent = new Intent("com.thirdeyegen.motionui.menu");
menuShowIntent.putExtra("action", "show menu");
sendBroadcast(menuShowIntent);
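One common pattern (a sketch only, using the same broadcasts shown above) is to tie these calls to your Activity lifecycle, so the menu bar is hidden while your top-edge controls are on screen and restored whenever your app leaves the foreground:

@Override
protected void onResume() {
    super.onResume();
    // Hide the ThirdEye UI Controls menu bar while our UI occupies the top of the screen.
    Intent hide = new Intent("com.thirdeyegen.motionui.menu");
    hide.putExtra("action", "hide menu");
    sendBroadcast(hide);
}

@Override
protected void onPause() {
    // Restore the menu bar whenever the app is paused or closed.
    Intent show = new Intent("com.thirdeyegen.motionui.menu");
    show.putExtra("action", "show menu");
    sendBroadcast(show);
    super.onPause();
}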
Using the Camera
For general camera usage, please refer to the Android SDK Developer Guide. As there is no user-facing camera on the X2, the main center-mounted, forward-facing RGB camera can be accessed using Camera ID “0”.
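As a brief sketch using the standard camera2 API (not a ThirdEye-specific API, and assuming the CAMERA permission has already been granted), the forward-facing camera can be opened by its ID from within an Activity:

CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    // "0" is the X2's main forward-facing RGB camera.
    manager.openCamera("0", new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            // Create your capture session here (see the Android SDK Developer Guide).
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            camera.close();
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            camera.close();
        }
    }, null);
} catch (CameraAccessException e) {
    // The camera could not be accessed (e.g., it is in use by another process).
    e.printStackTrace();
}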
Camera Preview vs Optical See-Through
When using the camera in your Android app, a “Camera Preview” view is required to be present. This requirement ensures that the user can see what is being captured by the camera in real time, which is useful when developing for a standard touchscreen Android device, since the user is not able to see through the display. For devices with a see-through display, including the ThirdEye Gen X2 smart glasses, the user does not need to see the direct preview feed from the camera, since they already have an unobstructed view of the world in front of the camera. In fact, overlaying the direct camera feed onto the display can be disorienting for the user.
For many use cases (code scanning, object detection, SLAM-based experiences, etc.), it is best to maintain a clear view of the real world while utilizing the camera(s) for long periods of time. As detailed previously in this guide, you can use a black background to allow the user to see through the display into the real world. Doing this while simultaneously maintaining the required “Camera Preview” view requires a slightly more complicated approach. The following example is just one approach to achieving optical see-through while utilizing the camera(s).
In this example, we will use a TextureView as the view for the camera preview, with an additional ImageView of the same size containing a black background. When the ImageView is set to be visible, it sits in front of the camera preview, resulting in a black, see-through background for the entire screen. You can then place other GUI elements in front of this view for the user to see and interact with.
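A minimal sketch of this layering, built programmatically inside an Activity’s onCreate() (view names and text are hypothetical; in practice you would likely define the views in an XML layout and connect the TextureView to the camera as described in the Android SDK Developer Guide):

FrameLayout root = new FrameLayout(this);

// 1. Camera preview surface at the back; attach your camera output to it.
TextureView cameraPreview = new TextureView(this);
root.addView(cameraPreview, new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.MATCH_PARENT,
        FrameLayout.LayoutParams.MATCH_PARENT));

// 2. Full-screen black ImageView in front of the preview. On the X2's
//    see-through display, black renders as transparent, so the user sees
//    the real world instead of the camera feed.
ImageView blackOverlay = new ImageView(this);
blackOverlay.setBackgroundColor(Color.BLACK);
root.addView(blackOverlay, new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.MATCH_PARENT,
        FrameLayout.LayoutParams.MATCH_PARENT));

// 3. Place your own GUI elements in front of the overlay so the user can
//    still see and interact with them.
TextView status = new TextView(this);
status.setTextSize(TypedValue.COMPLEX_UNIT_SP, 28);
status.setTextColor(Color.GREEN);
status.setText("Scanning...");
root.addView(status);

setContentView(root);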