This Glassware demonstrates a simple implementation of the LiveCard API.
It creates a LiveCard when the app's main activity is first activated (through voice input).
The lifecycle of the LiveCard is tied to that of a background Service (LiveCardDemoLocalService).
It also shows how to add a menu to the LiveCard, which is bound to the main activity.
(Cf. Live Card Menu Demo.)
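The sketch below shows one way such a service could own the card's lifecycle. It is written against the released GDK LiveCard API; the original sample targets an earlier GDK preview, so its exact calls may differ, and names such as LiveCardDemoActivity and R.layout.live_card are illustrative.

import android.app.PendingIntent;
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import android.widget.RemoteViews;

import com.google.android.glass.timeline.LiveCard;

// Sketch of a local service that owns the live card's lifecycle.
// R.layout.live_card and LiveCardDemoActivity are illustrative names.
public class LiveCardDemoLocalService extends Service {
    private static final String CARD_TAG = "LiveCardDemo";

    private LiveCard liveCard;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (liveCard == null) {
            liveCard = new LiveCard(this, CARD_TAG);
            liveCard.setViews(new RemoteViews(getPackageName(), R.layout.live_card));

            // Tapping the published card opens the menu bound to the main activity.
            Intent menuIntent = new Intent(this, LiveCardDemoActivity.class);
            liveCard.setAction(PendingIntent.getActivity(this, 0, menuIntent, 0));

            liveCard.publish(LiveCard.PublishMode.REVEAL);
        }
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        if (liveCard != null && liveCard.isPublished()) {
            liveCard.unpublish();   // remove the card when the service stops
        }
        liveCard = null;
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;   // started (not bound) in this sketch
    }
}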
Blog Post: GDK LiveCard Sample Code.
The purpose of this example is to test the "low frequency rendering" of Live Cards.
It uses an Android TimerTask to update the live card's content every 15 seconds.
It also differs slightly from the previous "LiveCard Demo" in that it sets the "nonSilent" flag to true. To stop the app, tap the LiveCard screen to return to the main Activity screen; tapping once more exits the program (after removing the live card, etc.).
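A minimal sketch of that update loop, assuming the card is refreshed by re-setting its RemoteViews; the helper class, layout ID, view ID, and text shown are illustrative rather than taken from the sample.

import android.content.Context;
import android.widget.RemoteViews;

import com.google.android.glass.timeline.LiveCard;

import java.util.Timer;
import java.util.TimerTask;

// Pushes a fresh RemoteViews to a published LiveCard every 15 seconds (sketch).
class LowFrequencyUpdater {
    private final Timer timer = new Timer();

    void start(final Context context, final LiveCard liveCard,
               final int layoutId, final int textViewId) {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                RemoteViews views = new RemoteViews(context.getPackageName(), layoutId);
                views.setTextViewText(textViewId,
                        "Updated at " + System.currentTimeMillis());
                liveCard.setViews(views);   // re-setting the views refreshes the card
            }
        }, 0L, 15 * 1000L);                 // every 15 seconds, per the description
    }

    void stop() {
        timer.cancel();                     // call when the card is unpublished
    }
}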
Blog Post: Google GDK Playground: Live Card Example 2 - Low Frequency Rendering.
This demo app includes sample code for "high frequency" Live Cards. It uses an Android local service that implements the LiveCardCallback interface to draw on the card's canvas.
The sample app merely fills the background with a solid color that changes randomly over time.
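The per-frame drawing itself can be as simple as the sketch below. How the SurfaceHolder is obtained from the live card depends on the GDK version, so only the rendering step is shown, and the class name is illustrative.

import android.graphics.Canvas;
import android.graphics.Color;
import android.view.SurfaceHolder;

import java.util.Random;

// Draws one frame: a solid, randomly colored background (sketch).
class RandomColorRenderer {
    private final Random random = new Random();

    void drawFrame(SurfaceHolder holder) {
        Canvas canvas = holder.lockCanvas();
        if (canvas == null) {
            return;                          // surface not ready yet
        }
        canvas.drawColor(Color.rgb(random.nextInt(256),
                                   random.nextInt(256),
                                   random.nextInt(256)));
        holder.unlockCanvasAndPost(canvas);  // push the frame to the live card
    }
}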
Blog Post: Google Glass GDK Live Card Surface Rendering Example.
This simple Glass app is based on the sample code found in the GDK doc: Touch Gesture.
It adds some minor changes so that the gesture event is displayed on the screen (either in a separate activity or as a Toast, depending on the value of the Config variable USE_GESTURE_INFO_ACTIVITY).
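The Toast path might look roughly like the sketch below, based on the GDK Touch Gesture documentation the demo follows; the activity name is illustrative, and the real demo additionally switches on USE_GESTURE_INFO_ACTIVITY.

import android.app.Activity;
import android.os.Bundle;
import android.view.MotionEvent;
import android.widget.Toast;

import com.google.android.glass.touchpad.Gesture;
import com.google.android.glass.touchpad.GestureDetector;

// Minimal sketch: show each detected touchpad gesture as a Toast.
public class GestureDemoActivity extends Activity {
    private GestureDetector gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        gestureDetector = new GestureDetector(this)
                .setBaseListener(new GestureDetector.BaseListener() {
                    @Override
                    public boolean onGesture(Gesture gesture) {
                        // Display the gesture name (TAP, SWIPE_RIGHT, ...).
                        Toast.makeText(GestureDemoActivity.this,
                                gesture.name(), Toast.LENGTH_SHORT).show();
                        return true;
                    }
                });
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        // Touchpad events arrive here; forward them to the GDK detector.
        return gestureDetector.onMotionEvent(event);
    }
}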
You can start the app through the following voice input.
OK, Glass. Start Gesture Demo
Blog Post: Glass Development Kit Sample Code - Touch Gesture Detector.
This demo app includes two activities:
- VoiceDemoActivity - This is the main activity, and it demonstrates the voice trigger and voice prompt.
- VoiceDictationActivity - This activity shows how to call the system-provided Speech Recognition API via the intent type RecognizerIntent.ACTION_RECOGNIZE_SPEECH.
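A minimal sketch of that RecognizerIntent call and its result handling is shown below; this is a simplified illustration, not the sample's actual source, and the method and request-code names are illustrative.

import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;

import java.util.List;

// Sketch: launch the system speech recognizer and read back the best result.
public class VoiceDictationActivity extends Activity {
    private static final int SPEECH_REQUEST = 0;

    private void startDictation() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        startActivityForResult(intent, SPEECH_REQUEST);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == SPEECH_REQUEST && resultCode == RESULT_OK && data != null) {
            List<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                String spokenText = results.get(0);   // best recognition hypothesis
                // ... display spokenText on screen ...
            }
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}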
After starting the app via the voice command, OK, Glass. Start Voice Demo, say "dictate" at the voice prompt, "next action", to start the VoiceDictationActivity. It can also be activated with the TAP gesture: tap to start dictation.
Blog Post: Google Glass GDK Sample Code - Voice Input Demo Application.
This is a simple extension of the first Voice Input Demo app. It includes two voice trigger-enabled activities to demonstrate the use of multiple voice trigger commands.
Blog Post: GDK Voice Trigger Sample App.
This first demo app using the Google Glass camera relies on the stock Camera activity to take photos.
The path of the photo/image file (JPG) is passed back to the calling activity, CameraDemoActivity, through the extra param Camera.EXTRA_PICTURE_FILE_PATH.
The image file at the given path is then polled to check whether it is ready. If so, it is decoded into a bitmap and displayed in the ImageView of the live card.
// Decode the JPEG written by the Camera activity.
Bitmap bitmap1 = BitmapFactory.decodeFile(filePath);
// Push the bitmap into the live card's layout through RemoteViews.
remoteViews.setImageViewBitmap(R.id.livecard_image, bitmap1);
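A sketch of the polling step described above, under the assumption that the card is refreshed by re-setting its RemoteViews; the helper class, its parameters, and the 500 ms interval are illustrative, not taken from the sample.

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Handler;
import android.widget.RemoteViews;

import com.google.android.glass.timeline.LiveCard;

import java.io.File;

// Polls until the captured JPEG is fully written, then shows it on the live card.
class PicturePoller {
    private static final long POLL_INTERVAL_MS = 500;   // illustrative interval

    static void pollForPicture(final String filePath,
                               final RemoteViews remoteViews,
                               final LiveCard liveCard,
                               final int imageViewId,
                               final Handler handler) {
        final File pictureFile = new File(filePath);
        handler.post(new Runnable() {
            @Override
            public void run() {
                if (pictureFile.exists() && pictureFile.length() > 0) {
                    Bitmap bitmap = BitmapFactory.decodeFile(filePath);
                    remoteViews.setImageViewBitmap(imageViewId, bitmap);
                    liveCard.setViews(remoteViews);               // refresh the card
                } else {
                    handler.postDelayed(this, POLL_INTERVAL_MS);  // not ready; retry
                }
            }
        });
    }
}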
Blog Posts:
- Google Glass GDK Camera API Sample.
- GDK Camera Demo: How to Display Dynamic Image Content on Live Card Using RemoteViews API.
This demo Glassware uses the Android LocationManager API to display the user's location on the LiveCard.
The location is dynamically updated based on data supplied by a number of location providers (including the "remote" location provider, if available).
A simple (toy) algorithm is used to pick the "best" location information at any given time.
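One possible version of such a toy heuristic is sketched below (the demo's actual algorithm may differ): it simply prefers the most recent, most accurate last-known fix across all enabled providers.

import android.content.Context;
import android.location.Criteria;
import android.location.Location;
import android.location.LocationManager;

import java.util.List;

// Toy "best last known location" lookup across all matching providers (sketch).
class BestLocationFinder {
    static Location getBestLastKnownLocation(Context context) {
        LocationManager manager =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        Criteria criteria = new Criteria();
        criteria.setAccuracy(Criteria.ACCURACY_FINE);

        Location best = null;
        List<String> providers = manager.getProviders(criteria, true /* enabledOnly */);
        for (String provider : providers) {
            Location candidate = manager.getLastKnownLocation(provider);
            if (candidate == null) {
                continue;
            }
            // Toy rule: prefer the most recent fix, breaking ties by accuracy.
            if (best == null
                    || candidate.getTime() > best.getTime()
                    || (candidate.getTime() == best.getTime()
                        && candidate.getAccuracy() < best.getAccuracy())) {
                best = candidate;
            }
        }
        return best;   // may be null if no provider has a fix yet
    }
}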
Blog Post: Glassware GDK Code Example - Location API Demo.
A number of sample Glassware apps illustrating the use of the Android Sensor API are included in the Sensor Demo directory.
The GDK examples in this Timeline Demo directory are mostly related to TimelineManager or other classes, such as Card, that are relevant to the timeline.
Sample Glass apps illustrating the use of general Window/UI-related features are included in the Window Demo directory.