Introduction

The following examples are not strictly part of the Six15 SDK, and may change from release to release in non-backwards-compatible ways. They are intended to be copied into your application as a starting point for your app. Feel free to modify them to fit your needs.

This page heavily references our GitHub page https://github.com/Six15-Technologies/ST1-Examples.

Ways to use the Display

There are many ways to show content on the ST1’s display, each with its own pros and cons. Each of these solutions has corresponding example code on our GitHub page.

How much integration effort is desired?

Low-effort solutions have more limitations, while versatile solutions are more complex.

The solutions below are sorted from low effort to high effort.

Display Methods: Pro/Con

Screen mirroring (Effort: None/Low)

Pros:

  • Possible without writing any code using the included apps, or through a single SDK call.

  • Can display any app, even apps without ST1 integration.

Cons:

  • Limited to content already on the phone’s display.

  • Phone screen must be powered on.

  • User must accept a system dialog every time.

Intent Interface (Effort: Low)

Pros:

  • Send simple text content to the HUD with a broadcast intent.

  • No need for complex AIDL Interface integration.

  • Images can persist even when your app is closed.

  • Images are automatically re-sent when the ST1 is re-connected.

  • Can run with the phone’s display off.

Cons:

  • You must start our foreground service with a startActivity() call, or have the user start it through our app.

Send Bitmaps/JPEGs with the SDK (Effort: Low)

Pros:

  • Can display content different from what’s on the phone’s display.

  • Can run from a background Service.

  • Can run with the phone’s screen off.

Cons:

  • Manually rendering bitmaps can be difficult if they need to be dynamically created.

HudViewMirroringHelper (Effort: Low)

Pros:

  • Mirrors an already visible View/ViewGroup in your view hierarchy.

  • Possible with just a few lines of code.

  • Image can persist when the screen turns off, or when the Activity is in the background.

Cons:

  • Only renders the contents of a View already on the screen.

  • Requires either: 1) minSdkVersion 26 for PixelCopy, or 2) contrasting colors for onDraw.

  • Not all Views are supported (Surfaces and related).

Presentation Mode (Effort: Medium)

Pros:

  • Can display content different from what’s on the phone’s display.

  • Uses Android’s standard and well-supported Presentation (i.e. external display) API.

  • Application components, like Fragments or Custom Views, can be easily shown on the ST1.

  • Can run from a background Service.

  • Android only sends frames when content changes, saving power.

Cons:

  • Phone screen must be powered on.

HudViewRenderingHelper (Effort: Medium)

Pros:

  • Can display content different from what’s on the phone’s display.

  • Uses Android’s standard, flexible, and familiar View hierarchy.

  • Application Custom Views and Layouts can be reused.

  • Can run from a background Service.

  • Can run with the phone’s display off.

Cons:

  • Not all Views are supported (Surfaces and related).

  • Some Views which depend on being attached to a Window, like ViewPager, don’t work.

Static vs Dynamically generated Images

Static:

Static images can easily be included inside your application (typically in res/raw) and sent using the SDK.

IHudService mHmdService; // Bound AIDL interface to Six15's service
...
InputStream is = getResources().openRawResource(R.raw.test_image);
Bitmap bitmap = BitmapFactory.decodeStream(is);
try {
   mHmdService.setAutoResizeImage(true); // Scale the image to fit the HUD.
   mHmdService.sendImageToHud(new ImageFrame(bitmap));
} catch (RemoteException e) {
   e.printStackTrace(); // AIDL calls can fail if the service disconnects.
}

Dynamic:

Dynamic content, such as changing images, text, video, or GIFs, has more options. Bitmaps or JPEGs can be generated on the host device, or you can use methods like screen mirroring, presentation mode, the Intent Interface, HudViewRenderingHelper, etc.

Rendering images with the phone’s display off

On Zebra’s TC series of phones, the 7-contact USB charging cable is not functional when the display is off. This means the ST1 cannot function on TC phones while the phone’s display is off.

This is not the case for other Android devices.

Which methods work with the phone’s display off:

  • Screen Mirroring: No

  • Presentation Mode: No

  • HudViewMirroringHelper: No

  • Intent Interface: Yes

  • Send Bitmaps/JPEGs with the SDK: Yes

  • HudViewRenderingHelper: Yes

HudViewRenderingHelper

Presentation mode works well, but it does not work when the phone’s screen is off. Therefore it can’t be used in every situation, especially those requiring hands-free usage. For simple content, Android’s Canvas API can be used to render Bitmaps, which can then be sent through the AIDL Interface to the HUD.

A Canvas allows you to render simple shapes, images, and text, but this quickly becomes unmanageable with even moderately complex content.

It’s possible to render a View hierarchy into a Bitmap using a Canvas. This can even be done from a Service while the display is off. Six15 wrote the HudViewRenderingHelper class to help with this process. Not every View is compatible with rendering in this way, because:

  • onAttachedToWindow() is never called.

  • Surfaces, and therefore SurfaceView, TextureView, and related Views, are not supported. (WebView is not a Surface)

  • invalidate() and requestLayout() are not handled automatically like normal.

For example, ViewPager sets internal state based on onAttachedToWindow(), so it doesn’t work. Rendering video is difficult without a Surface.

That being said, many things do work.

  • All standard (Android or AndroidX) ViewGroups

  • TextView, ImageView, etc…

  • Animations like translation and rotation.

invalidate() and requestLayout() need to be handled manually through HudViewRenderingHelper.triggerLayout() and HudViewRenderingHelper.draw() or HudViewRenderingHelper.setAutoDraw(true). Dealing with this isn’t normally an issue.
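As a rough sketch of that manual draw cycle (the helper’s construction and frame delivery are assumptions here; only triggerLayout(), draw(), and setAutoDraw() are named on this page):

```java
// Sketch only: how the manual layout/draw cycle might look from a Service.
// The fields below are hypothetical; only triggerLayout(), draw(), and
// setAutoDraw() are documented above.
HudViewRenderingHelper mHelper; // created with the layout/View to render
TextView mStatusText;           // a child of the rendered view hierarchy

void updateHud(String message) {
   mStatusText.setText(message); // invalidate()/requestLayout() are NOT
                                 // handled automatically off-window...
   mHelper.triggerLayout();      // ...so re-layout manually,
   mHelper.draw();               // then draw and send a frame to the HUD.
}

void enableAutomaticDrawing() {
   mHelper.setAutoDraw(true);    // Or let the helper re-draw on its own.
}
```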

The Intent Interface uses HudViewRenderingHelper inside a foreground service to draw its images.

For more examples of how to use HudViewRenderingHelper, see our GitHub:

https://github.com/Six15-Technologies/ST1-Examples/blob/master/examples_test/src/main/java/com/six15/examples_test/view_rendering/BackgroundViewRenderingService.java

HudIntentInterface

The example class HudIntentInterface defines static constants and static functions to help when using the Intent Interface from code. It defines helper functions to correctly start and stop the Intent Interface foreground service. It also defines all the actions and extras used by the interface.
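As an illustration only, sending text might look roughly like the sketch below. The action and extra names are hypothetical placeholders; use the constants actually defined in HudIntentInterface rather than hard-coding strings.

```java
// Sketch: send simple text to the HUD over the Intent Interface.
// ACTION_SEND_TEXT and EXTRA_TEXT are hypothetical stand-ins for the real
// constants defined in the HudIntentInterface example class.
Intent intent = new Intent(HudIntentInterface.ACTION_SEND_TEXT);
intent.putExtra(HudIntentInterface.EXTRA_TEXT, "Hello ST1");
context.sendBroadcast(intent);
```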

For more example Intent Interface screens see our GitHub.

https://github.com/Six15-Technologies/ST1-Examples/blob/master/examples_test/src/main/java/com/six15/examples_test/intent_interface/IntentInterfaceActivity.java

AIDL Client App Architecture

In an Activity, Presentation, or Fragment, you should bind/unbind to Six15’s Service in onStart() and onStop().

In a Service, or Fragment with setRetainInstance(true), you should bind/unbind in onCreate() and onDestroy().

HudServiceConnection is a helper class which binds to the Six15 Service. It also exposes extra conveniences regarding threading, error handling, and JSON parsing.
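The lifecycle rule above can be sketched for an Activity as follows. The connect/disconnect method names are hypothetical stand-ins for HudServiceConnection’s actual bind/unbind calls; only the placement in onStart()/onStop() is the point.

```java
// Sketch: where to bind/unbind to Six15's service from an Activity.
// connectToService()/disconnectFromService() are hypothetical names.
private HudServiceConnection mConnection;

@Override
protected void onStart() {
   super.onStart();
   mConnection.connectToService();      // Bind while the Activity is visible.
}

@Override
protected void onStop() {
   super.onStop();
   mConnection.disconnectFromService(); // Unbind when no longer visible.
}
```

A Service (or a Fragment with setRetainInstance(true)) would make the same pair of calls from onCreate() and onDestroy() instead.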

The Abstract classes below can be extended to properly make use of HudServiceConnection:

  • HudCompatActivity.java

  • HudPresentation.java

  • HudFragment.java

  • HudService.java

  • HudRetainedFragment.java

HudServiceConnection can also be used standalone.

There can be more than one HudServiceConnection instance connected at a time, even within the same app.

The examples on our GitHub page use HudServiceConnection extensively.

If you want to clear the HUD when your app closes, the best time to do so is in onPause() while isFinishing() == true. This ensures clearing completes before another Activity, which might re-draw on the HUD, is resumed.

@Override
public void onPause() {
   super.onPause();
   //Only clear when we're finishing, not when we leave the app or turn the display off.
   if (mHmdService != null && requireActivity().isFinishing()) {
      try {
            mHmdService.clearHudDisplay();
      } catch (RemoteException e) {
            e.printStackTrace();
      }
   }
}

Offline Speech Recognition

Selecting the Correct Microphone

On Android M (API 23) and above, it’s possible to select a microphone based on its name. This can be used to specifically target the ST1’s microphone.

AudioManager audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
AudioDeviceInfo[] devices = audioManager.getDevices(AudioManager.GET_DEVICES_INPUTS);
...
@RequiresApi(api = Build.VERSION_CODES.M)
private static boolean isSix15Mic(@NonNull AudioDeviceInfo device) {
   return Constants.MIC_PRODUCT_NAME.contentEquals(device.getProductName());
}

The value of Constants.MIC_PRODUCT_NAME is “USB-Audio - SIX15-HUD(HS Mode)”.
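Putting the snippet above together with isSix15Mic(), a small helper can scan the available inputs and return the ST1’s microphone if present. This is a sketch; findSix15Mic() is a hypothetical name, but it only uses the calls shown above.

```java
// Sketch: find the ST1's microphone among the phone's audio inputs.
// Relies on the Constants.MIC_PRODUCT_NAME value quoted above.
@RequiresApi(api = Build.VERSION_CODES.M)
@Nullable
private static AudioDeviceInfo findSix15Mic(@NonNull Context context) {
   AudioManager audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
   AudioDeviceInfo[] devices = audioManager.getDevices(AudioManager.GET_DEVICES_INPUTS);
   for (AudioDeviceInfo device : devices) {
      if (isSix15Mic(device)) {
         return device;
      }
   }
   return null; // ST1 microphone not connected.
}
```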

Vosk - ST1 Integration

Using the Vosk library is one way to implement speech recognition. Feel free to use a different library.

Vosk Features:

  • Good Android integration and examples

  • Simple text streaming output

  • Fixed or open-ended vocabulary

  • Offline

  • Multiple language support

  • Reasonable size (~50MB per language)

Vosk Documentation: https://alphacephei.com/vosk/android

Vosk Android Example: https://github.com/alphacep/vosk-android-demo

See the HudSpeechRecognitionHelper for a complete example on how to use Vosk.

The VoiceToTextFragment example on our GitHub shows how to use HudSpeechRecognitionHelper within your application.

Vosk’s Android API doesn’t allow selecting a microphone, but it’s possible with reflection. We just need to call recorder.setPreferredDevice(audioDevice) on the private AudioRecord instance variable inside SpeechService.

This should be done while voice recognition is off, i.e. before SpeechService.startListening() and/or after SpeechService.cancel().

@RequiresApi(api = Build.VERSION_CODES.M)
private static void setPreferredDeviceWithReflection(SpeechService service, AudioDeviceInfo audioDevice) {
   if (audioDevice == null) {
      Log.i(TAG, "No external mic requested");
      return;
   }
   try {
      Field recorderField = SpeechService.class.getDeclaredField("recorder");
      recorderField.setAccessible(true);
      AudioRecord recorder = (AudioRecord) recorderField.get(service);
      if (recorder == null) {
            Log.w(TAG, "Getting recorder with reflection failed");
            return;
      }
      boolean worked = recorder.setPreferredDevice(audioDevice);
      if (!worked) {
            Log.w(TAG, "Unable to request that the mic be used");
      }
   } catch (IllegalAccessException | NoSuchFieldException e) {
      e.printStackTrace();
   }
}

Vosk - Controlling an App

Voice control allows interaction with an application while keeping both hands free. This means the phone can be put away while your voice controls the ST1.

In this situation, Vosk would likely be running in a Foreground Service.

In the HudSpeechRecognitionHelper example, HudSpeechRecognitionHelper.Callback.onVoiceCommand(String text) is called whenever a partial phrase is detected. By using a fixed vocabulary, each word can be mapped to an action with a simple String comparison against text.
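As a sketch of that mapping (the dispatcher class and its vocabulary are hypothetical example content, not part of the SDK), a fixed vocabulary can be dispatched with a plain Map lookup:

```java
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

// Sketch: dispatch recognized words to actions with a simple Map lookup.
// The class name and any registered words are hypothetical examples.
public class VoiceCommandDispatcher {
   private final Map<String, Runnable> mCommands = new HashMap<>();

   // Register one vocabulary word and the action it triggers.
   public void register(String word, Runnable action) {
      mCommands.put(word.toLowerCase(Locale.US), action);
   }

   // Returns true if the recognized text matched a registered command.
   public boolean onVoiceCommand(String text) {
      Runnable action = mCommands.get(text.trim().toLowerCase(Locale.US));
      if (action == null) {
         return false;
      }
      action.run();
      return true;
   }
}
```

A callback like onVoiceCommand(String text) from the example above could forward each partial phrase straight into this dispatcher.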