Monday, December 1, 2014

Google Cardboard tutorial:

https://developers.google.com/cardboard/get-started

Files:

https://github.com/googlesamples/cardboard/

Tutorial

This document describes how to use the experimental VR Toolkit to create your own Virtual Reality (VR) experiences.

Android demo app: Treasure Hunt

The code examples in this tutorial are taken from the "Treasure Hunt" Android demo app.
Cardboard is a simple device that unlocks the power of your smartphone as a VR platform. Despite costing around $2, Cardboard can work with your phone to display 3D scenes with binocular rendering, track and react to head movements, and interact with apps through magnet input. The demo app “Treasure Hunt” illustrates these features. In the game, users look around a digital world to find and collect objects as quickly as possible. It’s a basic game, but it demonstrates the core features of Cardboard.

Game features

The "Treasure Hunt" game starts with instructions rendered as 3D text. Users are instructed to pull the magnet when they find an object.
On the phone's screen, the app renders a side-by-side stereo pair; viewed through Cardboard, this appears as a single 3D scene.
When a user centers a cube in the middle of the screen, the cube indicates this by changing its color to yellow. When the user pulls the magnet while the cube is yellow, the user's score increases, and the cube moves to a new location.
The app uses OpenGL ES 2.0 to display objects. It demonstrates some basic features, such as lighting, movement in space, and coloring. It shows how to use the magnet as input, how to check if the user is looking at something, and how to render images by providing a different view for each eye.

Get the VR Toolkit JAR

To use the Cardboard API, download the VR Toolkit .jar file and include it in your project.
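For example, in a Gradle-based Android project you might drop the .jar into your module's libs/ directory and declare it as a file dependency. The path and file name below are illustrative; adjust them to wherever you keep the toolkit jar:

dependencies {
    // Assumes the toolkit jar was copied to <module>/libs/.
    compile files('libs/cardboard.jar')
}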

Edit the manifest

These excerpts from the demo app's manifest illustrate mandatory and recommended settings for a Cardboard app:
<manifest ...
    <uses-permission android:name="android.permission.NFC" />
    <uses-permission android:name="android.permission.VIBRATE" />
    ...
    <uses-sdk android:minSdkVersion="16"/>
    <uses-feature android:glEsVersion="0x00020000" android:required="true" />
    <application
            ...
        <activity
                android:screenOrientation="landscape"
                ...
        </activity>
    </application>
</manifest>
Note the following:
  • <uses-sdk android:minSdkVersion="16"/> indicates that the device must be running API Level 16 (Jelly Bean) or higher.
  • <uses-feature android:glEsVersion="0x00020000" android:required="true" /> indicates that the device must support OpenGL ES 2.0 to run the demo app.
  • android:screenOrientation="landscape" indicates that the activity's required screen orientation is landscape. This is the orientation you must set for VR apps. The view used by the toolkit, CardboardView, only renders in fullscreen landscape (landscape, reverseLandscape, sensorLandscape) modes.
  • Even though the demo app doesn't include it, the setting android:configChanges="orientation|keyboardHidden" is also recommended.
  • android.permission.NFC and android.permission.VIBRATE. The app needs android.permission.NFC to access Cardboard's NFC tag, and android.permission.VIBRATE to make the phone vibrate (which is how the demo app informs the user that something has happened).

Extend CardboardActivity

CardboardActivity is the starting point for coding a Cardboard app. It is the base activity that provides easy integration with Cardboard devices: it exposes events for interacting with Cardboard and handles many of the details commonly required when creating an activity for VR rendering.
Note that CardboardActivity uses sticky immersive mode, in which the system UI is hidden, and the content takes up the whole screen. This is a requirement for a VR app, since CardboardView will only render when the activity is in fullscreen mode. See Using Immersive Full-Screen Mode for more discussion of this feature.
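Although CardboardActivity takes care of this for you, the sketch below shows how an activity typically enters sticky immersive mode on Android 4.4 and later; it is illustrative, not code from the demo app:

// Hide the system UI; swiping from the edge reveals it only transiently.
getWindow().getDecorView().setSystemUiVisibility(
        View.SYSTEM_UI_FLAG_LAYOUT_STABLE
        | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
        | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
        | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
        | View.SYSTEM_UI_FLAG_FULLSCREEN
        | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);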
The demo app's MainActivity extends CardboardActivity. MainActivity implements the following interface:
  • CardboardView.StereoRenderer: Interface for renderers that delegate all stereoscopic rendering details to the view. Implementors should simply render a view as they would normally do using the provided transformation parameters. All stereoscopic rendering and distortion correction details are abstracted from the renderer and managed internally by the view.
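Putting this together, a rough skeleton of the activity looks like the following (method list taken from the StereoRenderer interface; bodies and imports omitted, as elsewhere in this tutorial):

public class MainActivity extends CardboardActivity implements CardboardView.StereoRenderer {

    @Override
    public void onNewFrame(HeadTransform headTransform) { /* per-frame updates */ }

    @Override
    public void onDrawEye(EyeTransform transform) { /* render the scene for one eye */ }

    @Override
    public void onFinishFrame(Viewport viewport) { /* called after both eyes are drawn */ }

    @Override
    public void onSurfaceChanged(int width, int height) { /* handle viewport changes */ }

    @Override
    public void onSurfaceCreated(EGLConfig config) { /* create GL programs and buffers */ }

    @Override
    public void onRendererShutdown() { /* release GL resources */ }
}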

Get the CardboardView

All user interface elements in an Android app are built using views. The VR Toolkit provides its own view, CardboardView, which is a convenience extension of GLSurfaceView that can be used for VR rendering. CardboardView renders content in stereo. You initialize your CardboardView in your activity's onCreate() method:
private float[] mModelCube;
private float[] mCamera;
private float[] mView;
private float[] mHeadView;
private float[] mModelViewProjection;
private float[] mModelView;
private float[] mModelFloor;
private Vibrator mVibrator;
...

/**
 * Sets the view to our CardboardView and initializes the transformation matrices we will use
 * to render our scene.
 * @param savedInstanceState
 */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.common_ui);
    CardboardView cardboardView = (CardboardView) findViewById(R.id.common_paperscope_view);
    // Associate a CardboardView.StereoRenderer with cardboardView.
    cardboardView.setRenderer(this);
    // Associate the cardboardView with this activity. 
    setCardboardView(cardboardView);
    mModelCube = new float[16];
    mCamera = new float[16];
    mView = new float[16];
    mModelViewProjection = new float[16];
    mModelView = new float[16];
    mModelFloor = new float[16];
    mHeadView = new float[16];
    mVibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
...
}

Render the view

Once you get the CardboardView, you associate it with a renderer and then associate the CardboardView with the activity. Cardboard supports two kinds of renderers, but the quickest way to get started is to use CardboardView.StereoRenderer, which is what the demo app uses.
CardboardView.StereoRenderer includes these key methods:
  • onNewFrame(), called each time the app is about to render a new frame.
  • onDrawEye(), called for each eye with different eye parameters.
Implementing these is similar to what you would normally do for an OpenGL application. These methods are discussed in more detail in the following sections.

Implement onNewFrame

Use the onNewFrame() method to encode rendering logic before the individual eyes are rendered. Any per-frame operations not specific to a single view should happen here; this is a good place to update your model. In this snippet, the variable mHeadView receives the head transformation. This value needs to be saved for later use, to tell whether the user is looking at the treasure:
private int mGlProgram;  // Created in onSurfaceCreated(), once a GL context exists.
private int mPositionParam;
private int mNormalParam;
private int mColorParam;
private int mModelViewProjectionParam;
private int mLightPosParam;
private int mModelViewParam;
private int mModelParam;
private int mIsFloorParam;
private float[] mHeadView;
...
/**
 * Prepares OpenGL ES before we draw a frame.
 * @param headTransform The head transformation in the new frame.
 */
@Override
public void onNewFrame(HeadTransform headTransform) {
    GLES20.glUseProgram(mGlProgram);
    mModelViewProjectionParam = GLES20.glGetUniformLocation(mGlProgram, "u_MVP");
    mLightPosParam = GLES20.glGetUniformLocation(mGlProgram, "u_LightPos");
    mModelViewParam = GLES20.glGetUniformLocation(mGlProgram, "u_MVMatrix");
    mModelParam = GLES20.glGetUniformLocation(mGlProgram, "u_Model");
    mIsFloorParam = GLES20.glGetUniformLocation(mGlProgram, "u_IsFloor");
    // Build the Model part of the ModelView matrix.
    Matrix.rotateM(mModelCube, 0, TIME_DELTA, 0.5f, 0.5f, 1.0f);
    // Build the camera matrix and apply it to the ModelView.
    Matrix.setLookAtM(mCamera, 0, 0.0f, 0.0f, CAMERA_Z, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);
    headTransform.getHeadView(mHeadView, 0);
    checkGLError("onReadyToDraw");
}
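The checkGLError() helper called above is not part of the toolkit; the demo app defines its own. A minimal version, assuming a TAG constant for logging, looks like this:

/**
 * Logs and throws a RuntimeException if any OpenGL error was raised since the last check.
 * @param func The name of the call site, used in the log message.
 */
private static void checkGLError(String func) {
    int error;
    while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
        Log.e(TAG, func + ": glError " + error);
        throw new RuntimeException(func + ": glError " + error);
    }
}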

Implement onDrawEye

Implement onDrawEye() to perform per-eye configuration.
This is the meat of the rendering code, and very similar to building a regular OpenGL ES 2.0 application. The following snippet shows how to get the view transformation matrix and the perspective transformation matrix; you need to make sure that you render with low latency. EyeTransform contains the transformation and projection matrices for the eye. This is the sequence of events:
  • The treasure comes into eye space.
  • We apply the projection matrix. This provides the scene rendered for the specified eye.
  • The toolkit applies distortion automatically, to render the final scene.
private int mPositionParam;
private int mNormalParam;
private int mColorParam;
// The light is always positioned just above the user.
private final float[] mLightPosInWorldSpace = new float[] {0.0f, 2.0f, 0.0f, 1.0f};
private final float[] mLightPosInEyeSpace = new float[4];
...
/**
 * Draws a frame for an eye. The transformation for that eye (from the camera) is passed in as
 * a parameter.
 * @param transform The transformations to apply to render this eye.
 */
@Override
public void onDrawEye(EyeTransform transform) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    mPositionParam = GLES20.glGetAttribLocation(mGlProgram, "a_Position");
    mNormalParam = GLES20.glGetAttribLocation(mGlProgram, "a_Normal");
    mColorParam = GLES20.glGetAttribLocation(mGlProgram, "a_Color");
    GLES20.glEnableVertexAttribArray(mPositionParam);
    GLES20.glEnableVertexAttribArray(mNormalParam);
    GLES20.glEnableVertexAttribArray(mColorParam);
    checkGLError("mColorParam");
    // Apply the eye transformation to the camera.
    Matrix.multiplyMM(mView, 0, transform.getEyeView(), 0, mCamera, 0);
    // Set the position of the light
    Matrix.multiplyMV(mLightPosInEyeSpace, 0, mView, 0, mLightPosInWorldSpace, 0);
    GLES20.glUniform3f(mLightPosParam, mLightPosInEyeSpace[0], mLightPosInEyeSpace[1],
            mLightPosInEyeSpace[2]);
    // Build the ModelView and ModelViewProjection matrices
    // for calculating cube position and light.
    Matrix.multiplyMM(mModelView, 0, mView, 0, mModelCube, 0);
    Matrix.multiplyMM(mModelViewProjection, 0, transform.getPerspective(), 0, mModelView, 0);
    drawCube();
    // Set mModelView for the floor, so we draw floor in the correct location
    Matrix.multiplyMM(mModelView, 0, mView, 0, mModelFloor, 0);
    Matrix.multiplyMM(mModelViewProjection, 0, transform.getPerspective(), 0,
        mModelView, 0);
    ...
}
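The drawCube() call above is where the cube's geometry actually reaches the shader. A sketch of what it does, assuming FloatBuffer fields (mCubeVertices, mCubeNormals, mCubeColors) that were filled with the cube data when the surface was created:

public void drawCube() {
    // Tell the shader this object is not the floor.
    GLES20.glUniform1f(mIsFloorParam, 0f);
    // Pass the model, model-view, and model-view-projection matrices for lighting and positioning.
    GLES20.glUniformMatrix4fv(mModelParam, 1, false, mModelCube, 0);
    GLES20.glUniformMatrix4fv(mModelViewParam, 1, false, mModelView, 0);
    GLES20.glUniformMatrix4fv(mModelViewProjectionParam, 1, false, mModelViewProjection, 0);
    // Point the shader's attributes at the cube's vertex data.
    GLES20.glVertexAttribPointer(mPositionParam, 3, GLES20.GL_FLOAT, false, 0, mCubeVertices);
    GLES20.glVertexAttribPointer(mNormalParam, 3, GLES20.GL_FLOAT, false, 0, mCubeNormals);
    GLES20.glVertexAttribPointer(mColorParam, 4, GLES20.GL_FLOAT, false, 0, mCubeColors);
    // A cube is 12 triangles: 36 vertices.
    GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 36);
    checkGLError("drawCube");
}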

Cardboard input hardware

Cardboard includes the following hardware components for interacting with an Android app:
  • A magnet, which acts as a trigger button. When you pull the magnet, the change in the magnetic field is detected by your phone's magnetometer; the demo app uses this to trigger operations. This behavior is provided by the parent activity's (CardboardActivity's) implementation of the MagnetSensor.OnCardboardTriggerListener interface.
  • An NFC tag. When you insert a phone that has the Android demo app installed into the Cardboard device, NFC triggers the launch of the demo app. This behavior is provided by the parent activity's (CardboardActivity's) implementation of the NfcSensor.OnCardboardNfcListener interface.
You can programmatically modify your app's interactions with these components, as described below.

Listen for magnet pulls

To provide custom behavior when the user pulls the magnet, override CardboardActivity.onCardboardTrigger() in your app's activity. In the treasure hunt app, if you found the treasure and pull the magnet, you get to keep the treasure:
private Vibrator mVibrator;
private CardboardOverlayView mOverlayView;
...
/**
 * Increment the score, hide the object, and give feedback if the user pulls the magnet while
 * looking at the object. Otherwise, remind the user what to do.
 */
@Override
public void onCardboardTrigger() {
    if (isLookingAtObject()) {
        mScore++;
        mOverlayView.show3DToast("Found it! Look around for another one.\nScore = " + mScore);
        ...
    } else {
        mOverlayView.show3DToast("Look around to find the object!");
    }
    // Always give user feedback
    mVibrator.vibrate(50);
}

/**
 * Check if the user is looking at the object by calculating where the object is in eye space.
 * @return true if the user is looking at the object.
 */
private boolean isLookingAtObject() {
    float[] initVec = {0, 0, 0, 1.0f};
    float[] objPositionVec = new float[4];

    // Convert object space to camera space. Use the headView from onNewFrame.
    Matrix.multiplyMM(mModelView, 0, mHeadView, 0, mModelCube, 0);
    Matrix.multiplyMV(objPositionVec, 0, mModelView, 0, initVec, 0);

    float pitch = (float)Math.atan2(objPositionVec[1], -objPositionVec[2]);
    float yaw = (float)Math.atan2(objPositionVec[0], -objPositionVec[2]);
    ...
    return (Math.abs(pitch) < PITCH_LIMIT) && (Math.abs(yaw) < YAW_LIMIT);
}
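PITCH_LIMIT and YAW_LIMIT are small angular thresholds, in radians, around the center of the view. The demo app uses values along these lines:

// How far off-center (in radians) the object may be and still count as "looked at".
private static final float YAW_LIMIT = 0.12f;
private static final float PITCH_LIMIT = 0.12f;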

Customize NFC behavior

As described in Extend CardboardActivity, the CardboardActivity base class implements the NfcSensor.OnCardboardNfcListener interface, whose methods are called whenever a phone is inserted into or removed from a Cardboard.
You can override its onInsertedIntoCardboard() and onRemovedFromCardboard() methods to switch VR (binocular) mode on and off. Thus you could make the treasure hunt work without Cardboard too, by disabling VR mode. For example:
// Enable VR mode when the phone is inserted into Cardboard.
@Override
public void onInsertedIntoCardboard() {
    getCardboardView().setVRModeEnabled(true);
}

// Disable VR mode when the phone is removed from Cardboard.
@Override
public void onRemovedFromCardboard() {
    getCardboardView().setVRModeEnabled(false);
}
