Introduction¶
The following examples are not strictly part of the Six15 SDK, and they could change from release to release in a non-backwards-compatible way. They are intended to be included in your application as a starting point for your app. Feel free to modify them to fit your needs.
This page heavily references our GitHub page https://github.com/Six15-Technologies/ST1-Examples.
Ways to use the Display¶
There are many ways to show content on the ST1's display. Each has its own pros and cons, and each of these solutions has corresponding example code on our GitHub page.
Low-effort solutions have more limitations; versatile solutions are more complex.
Here are the solutions, sorted from low effort to high effort.
| Method | Effort | Pro | Con |
|---|---|---|---|
| Screen Mirroring | None | Possible without writing any code using the included apps, or through a single SDK call. Can display any app, even apps without ST1 integration. | Limited to content already on the display. Phone screen must be powered on. User must accept a system dialog every time. |
| Picking Test Drive (OCR) | None/Low | Possible without writing any code using the included apps, or through a single SDK call. Can display any app, even apps without ST1 integration. Generated images are better suited for the HMD than normal screen mirroring. | Limited to content already on the display. Phone screen must be powered on. User must accept a system dialog every time. Works well only with simple text UIs, and only after keyword configuration. |
| Intent Interface | Low | Send simple text content or static images to the HUD with a broadcast intent. No need for complex AIDL Interface integration. Image can persist when your app is in the background. Images are automatically re-sent when the ST1 is re-connected. Can run with the phone's display off. | You must start our foreground service with a … |
| Presentation Mode | None/Medium | Can display content different from what's on the phone's display. Uses Android's standard and well-supported Presentation (i.e. external display) API. Application components, like Fragments or Custom Views, can be easily re-used on the ST1. Can run from a background Service. Android only sends frames when content changes, which can save power. | Phone screen must be powered on. User must start presentation mode in the Six15 Service app, unless the app requests it with the AIDL Interface. |
| AIDL Interface | Medium/High | Can display essentially any type of image, from static images, to mirroring sections of the phone UI, to entirely unique content. Behavior while the phone's screen is off is flexible. Six15 has helper classes and examples which show a wide variety of methods easily portable to your existing Android app. | Requires integrating the Six15 SDK into your app. |
HudIntentInterface¶
The example class `HudIntentInterface` defines static constants and static functions to help when using the Intent Interface from code. It defines helper functions to correctly start and stop the Intent Interface foreground service. It also defines all the actions and extras used by the interface.
For more example Intent Interface screens, see our GitHub, specifically `IntentInterfaceExamplesActivity`.
AIDL Interface¶
The primary method to send images to the HUD is `sendImageToHud(Bitmap)` or `sendBufferToHud(new ByteFrame(byte[]))`, where `byte[]` is an array of JPEG bytes. Using `sendBufferToHud(...)` requires the JPEG to be properly sized and formatted for the HUD. By calling `setAutoResizeImage(true)` and using `sendImageToHud(...)`, the Six15 Service app can automatically convert the Bitmap into a valid format for the HUD.
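To illustrate the kind of conversion involved, the sketch below computes an aspect-preserving fit of a source image into a target resolution. The 640×400 target and the helper class itself are assumptions for illustration only; the real resizing is performed internally by the Six15 Service app.

```java
// Hypothetical sketch of aspect-fit scaling, the kind of computation an
// auto-resize step performs. The 640x400 target resolution is an assumption.
public class AspectFit {
    /** Returns {width, height} of the largest scaled size that fits inside the target. */
    public static int[] fit(int srcW, int srcH, int dstW, int dstH) {
        // Use the smaller of the two scale factors so both dimensions fit.
        double scale = Math.min((double) dstW / srcW, (double) dstH / srcH);
        return new int[]{(int) Math.round(srcW * scale), (int) Math.round(srcH * scale)};
    }

    public static void main(String[] args) {
        // A portrait 1080x1920 screenshot scaled into a 640x400 display.
        int[] size = fit(1080, 1920, 640, 400);
        System.out.println(size[0] + "x" + size[1]);  // prints 225x400
    }
}
```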
Static¶
Static images can easily be included inside your application, typically in res/raw, and sent with just a few lines of code.
```java
IHudService mHmdService;
// ...
InputStream is = getResources().openRawResource(R.raw.test_image);
Bitmap bitmap = BitmapFactory.decodeStream(is);
mHmdService.setAutoResizeImage(true);
mHmdService.sendImageToHud(new ImageFrame(bitmap));
```
Dynamic¶
If your content changes dynamically, you'll need to generate a Bitmap or JPEG on the host device at runtime.
This can be done using a `Canvas`, `PixelCopy`, OpenGL, the Camera, or any other method. This process can be challenging to implement quickly. Six15 provides helper classes which can make generating complex images for the HUD less challenging. Internally these helpers use `sendBufferToHud()`.
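On Android this drawing would be done with `android.graphics.Canvas` onto a `Bitmap`. As a platform-neutral sketch of the same idea, the snippet below renders text into an image and encodes it as JPEG bytes, the kind of payload `sendBufferToHud(new ByteFrame(bytes))` expects. The plain-JDK classes, dimensions, and sample text are illustrative assumptions, not part of the Six15 SDK.

```java
import javax.imageio.ImageIO;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

// Sketch: render dynamic text into an image and encode it as JPEG bytes.
// On Android the equivalent would be Canvas/Bitmap plus Bitmap.compress().
public class HudImageSketch {
    public static byte[] renderTextJpeg(String text, int width, int height) throws IOException {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = image.createGraphics();
        g.setColor(Color.BLACK);           // dark background suits an HMD display
        g.fillRect(0, 0, width, height);
        g.setColor(Color.WHITE);           // light text for contrast
        g.drawString(text, 20, height / 2);
        g.dispose();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", out);  // encode to JPEG bytes
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] jpeg = renderTextJpeg("Pick 3 items from bin A4", 640, 400);
        // A JPEG stream starts with the 0xFF 0xD8 marker.
        System.out.println(jpeg.length > 2 && (jpeg[0] & 0xFF) == 0xFF && (jpeg[1] & 0xFF) == 0xD8);
    }
}
```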
AIDL Interface Methods¶
| Method | Image Source | Phone Screen Off | Supports Foreground Service | Image Mirrored from Phone | Supports Legacy Devices | Key Drawbacks |
|---|---|---|---|---|---|---|
| Raw `sendImageToHud()` or raw `sendBufferToHud()` | Custom | Yes | Yes | No | Yes | Generating images yourself is hard, especially at high frame rates |
| `HudViewMirroringHelper` using `PixelCopy` | Mirrored from View | No | No | Yes | Yes | SurfaceView requires minSdk 26 |
| `HudViewMirroringHelper` using `View.draw()` | Mirrored from View | No | No | Yes | Yes | SurfaceView requires minSdk 26 |
| `HudViewRenderingHelper` | R.layout file | Yes | Yes | No | Yes | No SurfaceView. Doesn't support Views which require being attached to a Window (like ViewPager). |
| `HudSurfaceViewRenderingHelper` useOverlay=false | R.layout file | No | No | No | No | minSdk 26 |
| `HudSurfaceViewRenderingHelper` useOverlay=true | R.layout file | Yes | Yes | No | No | minSdk 26. Requires user to allow "Draw Over Other Apps" |
HudViewMirroringHelper¶
View mirroring allows an application to select a region of its own UI and show it on the HUD. This region is defined by a View/ViewGroup. The View must be attached to a Window, but doesn't necessarily need to be visible.
There are two methods to do this: `View.draw()` and `PixelCopy`. The desired method is chosen using the appropriate static constructor.
When using `PixelCopy`, pixels are copied from the application's Window in the region of the View's `Rect` into a `Bitmap`.
When using the `View.draw()` method, `View.draw(Canvas)` is called on the provided View, causing it to draw, using the provided `Canvas`, into the backing `Bitmap`.
In both methods the `Bitmap` is then formatted into an appropriately sized HUD image and sent to the HUD display.
Both methods detect when the View's contents or position change and automatically trigger a re-draw when needed.
Neither method will mirror content from other Windows, like dialogs or the notification tray, even if they happen to overlay the View.
The choice of `PixelCopy` vs `View.draw()` has a few important distinctions:

- `View.draw()` does not work with SurfaceViews or TextureViews. `PixelCopy` does support SurfaceViews due to special handling within the `HudViewMirroringHelper` class.
- `PixelCopy` is only supported with minSdk 26.
- `View.draw()` only considers the View hierarchy under the supplied View, so overlapping Views or parent views (i.e. the window background) are not rendered at all. `PixelCopy` simply copies pixels, so it would include these things.
- `View.draw()` can optionally attempt to turn dark-colored text into white text. This improves contrast on the HUD and helps when mirroring light-themed applications on the HUD.
HudViewRenderingHelper¶
The view rendering helper allows a `View` to be rendered specifically for the purpose of showing on the HUD. The `View` is not mirrored from a `View` on the host device's UI.
In some sense this rendering method is similar to Presentation mode. However, Presentation mode has a key limitation: it does not work when the phone's screen is off. Screen-off usage is important for those desiring hands-free usage.
Another alternative would be Android's Canvas API. A `Canvas` can be used to draw onto Bitmaps, which can be sent through the AIDL Interface to the HUD. A `Canvas` allows you to render simple shapes, images, and text, but using a `Canvas` directly becomes unmanageable with even moderately complex content.
If background rendering is desired and `Canvas` rendering is too complex, then `HudViewRenderingHelper` may be a good solution. `HudViewRenderingHelper` makes it easy to render a View hierarchy into a Bitmap which can be shown on the HUD. This can even be done from a Service while the display is off.
Not every View is compatible with rendering in this way. This is because:

- `onAttachedToWindow()` is never called, since the View isn't attached to a Window. For example, ViewPager sets internal state based on `onAttachedToWindow()`, so it doesn't work.
- Surfaces, and therefore SurfaceView, TextureView, and related Views, are not supported. (WebView is not a Surface.) Rendering video is difficult without a Surface.
- `invalidate()` and `requestLayout()` are not handled automatically like normal.
That being said, many things do work:

- All standard (Android or AndroidX) ViewGroups
- TextView, ImageView, etc.
- Animations like translation and rotation
`invalidate()` and `requestLayout()` need to be handled manually through `HudViewRenderingHelper.triggerLayout()` and `HudViewRenderingHelper.draw()`, or `HudViewRenderingHelper.setAutoDraw(true)`.
You'll need to either avoid Views which change size dynamically, or manually call `HudViewRenderingHelper.triggerLayout()` when they do. For example, a `TextView` with `android:layout_width="wrap_content"`, or an `ImageView` with `android:adjustViewBounds="true"`, will change size when the text or image changes. When this happens, you'll need to manually call `HudViewRenderingHelper.triggerLayout()`.
If these limitations are an issue, `HudSurfaceViewRenderingHelper` may be an option. It doesn't have these limitations, but has its own downsides.
For more examples on how to use `HudViewRenderingHelper`, see our GitHub, specifically `BackgroundViewRenderingService`.
HudSurfaceViewRenderingHelper¶
`HudViewRenderingHelper` doesn't support rendering SurfaceView. `HudSurfaceViewRenderingHelper` uses a different rendering technique which allows `SurfaceViews` to be supported.
This alternative technique provides the following useful benefits:

- Views are attached to a Window, so `onAttachedToWindow()` is called like normal. (They are obviously not actually visible on the host device's screen.)
- SurfaceView, TextureView, and related Views are supported, along with standard Views and ViewGroups.
- `invalidate()` and `requestLayout()` are handled automatically like normal.

These benefits require accepting one of two possible trade-offs:

- Rendering can only run when your application is in the foreground, or
- The user must allow "Draw Over Other Apps" in the system settings.
A HudSurfaceViewRenderingHelper created with useOverlay=true requires that your application can “Draw Over Other Apps”. Your current “Draw Over Other Apps” status can be read from Android with:
```java
import android.provider.Settings;
// ...
boolean canDrawOver = Settings.canDrawOverlays(context);
```
When created with useOverlay=false no permission is needed, but rendering will stop when your application is no longer visible.
You can prompt the user to accept “Draw Over Other Apps” by sending them to the system settings activity.
```java
import android.provider.Settings;
// ...
private ActivityResultLauncher<Intent> mOverlayPermissionActivityLauncher;
// ...
// In onCreate()
mOverlayPermissionActivityLauncher = registerForActivityResult(
        new ActivityResultContracts.StartActivityForResult(),
        new ActivityResultCallback<ActivityResult>() {
            @Override
            public void onActivityResult(ActivityResult result) {
                if (Settings.canDrawOverlays(context)) {
                    // useOverlay=true can be used.
                } else {
                    // useOverlay=true can NOT be used.
                }
            }
        }
);
// ...
// Open the system settings to request that the user allow "Draw Over Other Apps".
Intent intent = new Intent(
        Settings.ACTION_MANAGE_OVERLAY_PERMISSION,
        Uri.parse("package:" + context.getPackageName())
);
mOverlayPermissionActivityLauncher.launch(intent);
```
A complete example of dealing with this permission can be found on our GitHub, specifically `BackgroundSurfaceViewRenderingActivity`.
For more examples on how to use `HudSurfaceViewRenderingHelper`, see `BackgroundSurfaceViewRenderingService` when using useOverlay=true, or `ForegroundSurfaceViewRenderingActivity` when using useOverlay=false.
AIDL Client App Architecture¶
In an Activity, Presentation, or Fragment, you should bind/unbind to Six15's Service in `onStart()` and `onStop()`. In a Service, or a Fragment with `setRetainInstance(true)`, you should bind/unbind in `onCreate()` and `onDestroy()`.
`HudServiceConnection` is a helper class which binds to the Six15 Service. It also exposes extra conveniences regarding threading, error handling, and JSON parsing.
The abstract classes below can be extended to properly make use of `HudServiceConnection` for their corresponding lifecycle component:

- HudCompatActivity.java
- HudPresentation.java
- HudFragment.java
- HudService.java
- HudRetainedFragment.java

`HudServiceConnection` can also be used standalone. There can be more than one `HudServiceConnection` instance connected at a time, even within the same app. Having more than one `HudServiceConnection` is recommended, since the lifecycle of each connection can be directly tied to its corresponding lifecycle component.
The examples on our GitHub page use `HudServiceConnection` extensively.
If you want to clear the HUD when your app closes, the best time to do so is in `onPause()` while `isFinishing() == true`. This makes sure clearing is complete before another Activity is resumed and might re-draw on the HUD.
```java
@Override
public void onPause() {
    super.onPause();
    // Only clear when we're finishing, not when we leave the app or turn the display off.
    if (mHmdService != null && requireActivity().isFinishing()) {
        try {
            mHmdService.clearHudDisplay();
        } catch (RemoteException e) {
            e.printStackTrace();
        }
    }
}
```
Offline Speech Recognition¶
Selecting the Correct Microphone¶
On Android M (API 23) and above, it's possible to select microphones based on their name. This can be used to specifically target the ST1's microphone.
```java
AudioManager audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
AudioDeviceInfo[] devices = audioManager.getDevices(AudioManager.GET_DEVICES_INPUTS);
// ...
@RequiresApi(api = Build.VERSION_CODES.M)
private static boolean isSix15Mic(@NonNull AudioDeviceInfo device) {
    return Constants.MIC_PRODUCT_NAME.contentEquals(device.getProductName());
}
```
The value of `Constants.MIC_PRODUCT_NAME` is "USB-Audio - SIX15-HUD(HS Mode)".
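The name comparison itself is plain Java. This sketch filters a list of product names the way `isSix15Mic()` filters `AudioDeviceInfo` entries; the non-ST1 device names below are made up for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the name-matching logic behind isSix15Mic(), outside of Android.
public class MicSelection {
    // Matches the value of Constants.MIC_PRODUCT_NAME from the examples.
    static final String MIC_PRODUCT_NAME = "USB-Audio - SIX15-HUD(HS Mode)";

    /** Returns the product names that identify an ST1 microphone. */
    public static List<CharSequence> findSix15Mics(List<CharSequence> productNames) {
        List<CharSequence> matches = new ArrayList<>();
        for (CharSequence name : productNames) {
            // contentEquals() compares character content, which matters because
            // AudioDeviceInfo.getProductName() returns a CharSequence, not a String.
            if (MIC_PRODUCT_NAME.contentEquals(name)) {
                matches.add(name);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<CharSequence> devices = List.of(
                "Built-in Microphone",             // made-up example name
                "USB-Audio - SIX15-HUD(HS Mode)",  // the ST1 microphone
                "Bluetooth Headset");              // made-up example name
        System.out.println(findSix15Mics(devices).size());  // prints 1
    }
}
```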
Vosk - ST1 Integration¶
Using the Vosk library is one way to implement speech recognition. Feel free to use a different library.
Vosk features:

- Good Android integration and examples
- Simple text streaming output
- Fixed or open-ended vocabulary
- Offline
- Multiple language support
- Reasonable size (~50 MB per language)
Vosk Documentation: https://alphacephei.com/vosk/android
Vosk Android Example: https://github.com/alphacep/vosk-android-demo
See `HudSpeechRecognitionHelper` for a complete example of how to use Vosk. The `VoiceToTextFragment` example on our GitHub shows how to use `HudSpeechRecognitionHelper` within your application.
Vosk's Android API doesn't allow selecting a microphone, but it's possible with reflection. We just need to call `recorder.setPreferredDevice(audioDevice)` on the private `AudioRecord` instance variable inside `SpeechService`.
This should be done while voice recognition is off, i.e. before `SpeechService.startListening()` and/or after `SpeechService.cancel()`.
```java
@RequiresApi(api = Build.VERSION_CODES.M)
private static void setPreferredDeviceWithReflection(SpeechService service, AudioDeviceInfo audioDevice) {
    if (audioDevice == null) {
        Log.i(TAG, "No external mic requested");
        return;
    }
    try {
        Field recorderField = SpeechService.class.getDeclaredField("recorder");
        recorderField.setAccessible(true);
        AudioRecord recorder = (AudioRecord) recorderField.get(service);
        if (recorder == null) {
            Log.w(TAG, "Getting recorder with reflection failed");
            return;
        }
        boolean worked = recorder.setPreferredDevice(audioDevice);
        if (!worked) {
            Log.w(TAG, "Unable to request that the mic be used");
        }
    } catch (IllegalAccessException | NoSuchFieldException e) {
        e.printStackTrace();
    }
}
```
Vosk - Controlling an App¶
Voice control allows interaction with an application while keeping both hands free. This means the phone can be put away while your voice controls the ST1.
In this situation, Vosk would likely be running in a Foreground Service.
In the `HudSpeechRecognitionHelper` example, `HudSpeechRecognitionHelper.Callback.onVoiceCommand(String text)` is called whenever a partial phrase is detected.
By using a fixed vocabulary, each word can be mapped to an action with a simple String compare against `text`.