Unity Wrapper for RealSense SDK 2.0

Download realsense.unitypackage and go to Assets > Scenes > Start Here to open the home screen.

Overview

The Unity wrapper for RealSense SDK 2.0 allows Unity developers to add streams from RealSense devices to their scenes using provided textures. We aim to provide an easy-to-use prefab which allows device configuration and texture binding via the Unity Inspector, without writing a single line of code. Using this wrapper, Unity developers can get live Color, Depth and Infrared textures. In addition, we provide a simple way to align textures to one another (using Depth) and an example of background segmentation.


Getting Started

Step 1 - Build Dependencies And Open The Unity Project

The Unity wrapper depends on the C# wrapper provided by the RealSense SDK 2.0. At the end of this step, the Unity project will be available in the CMake output folder:

  • Generate the VS solution using cmake (run from librealsense root dir):

    • mkdir build
    • cd build
    • cmake .. -DBUILD_CSHARP_BINDINGS=ON -DBUILD_UNITY_BINDINGS=ON -DBUILD_SHARED_LIBS=ON -DDOTNET_VERSION_LIBRARY=3.5
  • The file 'realsense2.sln' should be created in the 'build' folder. Open it with Visual Studio; the C# examples and library will be available in the solution under 'Wrappers/csharp'.

  • Build the Intel.RealSense project. This step builds both the native library and the .NET wrapper, and copies both DLLs to the Unity project's Plugins folder.

  • If Unity is installed and activated, a Unity package should be created at build/<configuration>/realsense.unitypackage. This file can be imported by the Unity editor. If Unity is not installed or activated, a Unity project folder will still be available at 'build/wrappers/unity'.

    NOTE: Unity requires that managed assemblies (such as the C# wrapper) target .NET Framework 3.5 or lower. The .NET wrapper provides assemblies for multiple targets, one of which is .NET 3.5 (net35).

Step 2 - Open a Unity Scene

The Unity wrapper provides several example scenes to help you get started with RealSense in Unity. Open one of the following scenes under unity/Assets/RealSenseSDK2.0/Scenes (sorted from basic to advanced):

  1. RealSense Textures Depth and Infrared - Basic 2D scene demonstrating how to bind textures to a RealSense device stream. The scene provides 2 live streams and 2 different textures: Depth and Infrared.
  2. RealSense Textures Depth and Color (only cameras with an RGB sensor are supported) - Basic 2D scene demonstrating how to bind textures to a RealSense device stream. The scene provides 2 live streams and 3 different textures: Depth, Color and Color with background segmentation.
  3. RealSense PointCloud Depth - 3D scene demonstrating how to bind a PointCloud prefab to a RealSense device depth stream.
  4. RealSense PointCloud Depth and Color - 3D scene demonstrating how to bind a PointCloud prefab to RealSense device depth and color streams.

Prefabs

RealSenseDevice

The RealSenseDevice provides an encapsulation of a single device connected to the system. The following image displays the RealSenseDevice script:

[Image: RealSenseDevice script in the Inspector]

Process Mode

This option indicates which threading model the user expects to use.

  • Multithread - A separate thread will be used to raise device frames.
  • UnityThread - The Unity thread will be used to raise new frames from the device.

Note that this option affects all other scripts that do any sort of frame processing.
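
To illustrate why this matters, here is a minimal sketch, assuming the RealSenseDevice singleton and the OnNewSample event described in the Configuration section below (exact member names may differ). In Multithread mode the handler runs off the Unity thread, so it should only capture plain data and defer any Unity API work to Update:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class FrameLogger : MonoBehaviour
{
    // Frame numbers queued from the callback thread, drained on the Unity thread.
    private readonly Queue<ulong> frameNumbers = new Queue<ulong>();

    void Start()
    {
        // In Multithread mode this handler runs on a background thread,
        // so it must not touch Unity objects directly.
        RealSenseDevice.Instance.OnNewSample += frame =>
        {
            lock (frameNumbers)
                frameNumbers.Enqueue(frame.Number);
        };
    }

    void Update()
    {
        // Unity thread: safe to call Unity APIs here.
        lock (frameNumbers)
        {
            while (frameNumbers.Count > 0)
                Debug.Log("Frame " + frameNumbers.Dequeue() + " arrived");
        }
    }
}
```

In UnityThread mode the queue is unnecessary, since the event is already raised on the Unity thread.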

Configuration

The device is configured the same way a Pipeline is, i.e. with a Config. To make it available via the Unity Inspector, a Configuration object is exposed for each RealSenseDevice. When the scene starts, the device tries to start streaming the requested configuration (Pipeline.Start(config) is called). Upon starting, the device begins raising frames via its OnNewSample and OnNewSampleSet public events. These frames are raised either from a separate thread or from the Unity thread, depending on the user's choice of Process Mode.
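
For reference, the following sketch shows the equivalent C# wrapper calls the prefab makes for you. The stream parameters are illustrative; you would only write this yourself when bypassing the prefab:

```csharp
using Intel.RealSense;

public static class ManualPipelineExample
{
    public static Pipeline StartStreams()
    {
        // Request the streams to enable, as the Inspector panel does.
        var cfg = new Config();
        cfg.EnableStream(Stream.Depth, 640, 480, Format.Z16, 30);
        cfg.EnableStream(Stream.Color, 640, 480, Format.Rgb8, 30);

        // The RealSenseDevice prefab makes this call when the scene starts.
        var pipeline = new Pipeline();
        pipeline.Start(cfg);
        return pipeline;
    }
}
```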

In addition to stream configuration, the panel also allows users to select a specific device by providing its serial number.

[Image: device configuration panel]

RealSense Device Inspector

Once the device is streaming, the Unity Inspector will show the device's sensors and allow controlling their options via the GUI under the RealSenseDeviceInspector script. Changing these options affects the sensor directly at runtime:

[Image: RealSenseDeviceInspector sensor options]
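
The same options can also be set from code. The sketch below assumes a Sensor obtained from the streaming device and uses the C# wrapper's Options indexer; LaserPower is just an example, and you should verify that an option is supported by your camera model before setting it:

```csharp
using Intel.RealSense;
using UnityEngine;

public static class SensorTuning
{
    // Sets a sensor option at runtime, mirroring what the
    // RealSenseDeviceInspector GUI does when a slider is moved.
    public static void SetLaserPower(Sensor sensor, float power)
    {
        var option = sensor.Options[Option.LaserPower];
        // Clamp to the range the sensor reports (Min/Max as exposed by the
        // wrapper's option object) before applying.
        option.Value = Mathf.Clamp(power, option.Min, option.Max);
    }
}
```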

Texture Streams

Under the RealSenseDevice object in Unity's Hierarchy view, you can find a number of texture streams that hook into the device's frame callback and let the user bind a texture that is updated upon frame arrival.

[Image: texture stream objects in the Hierarchy]

Each texture stream is associated with the RealSenseStreamTexture script, which allows users to bind their textures so that they are updated with each new frame. The following screenshot displays the configuration options available when using a texture stream:

[Image: RealSenseStreamTexture configuration]

  • Source Stream Type - Indicates which of the device's streams the script should take frames from.
  • Texture Format - Indicates which texture should be created from the input frame.
  • Texture Binding - Allows the user to bind textures to the script. Multiple textures can be bound to a single script.
  • Fetch Frames From Device - Toggles whether the script fetches frames from the device itself, or waits for the user to pass frames to it using its OnFrame method (see the sketch after this list).
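
Here is a minimal sketch of the second mode, feeding frames to the script manually. RealSenseStreamTexture and its OnFrame method are named above, but the exact signatures used here (an OnNewSample handler passing a Frame) are assumptions:

```csharp
using Intel.RealSense;
using UnityEngine;

public class ManualFrameFeeder : MonoBehaviour
{
    // Assign in the Inspector; uncheck its "Fetch Frames From Device" toggle.
    public RealSenseStreamTexture target;

    void Start()
    {
        RealSenseDevice.Instance.OnNewSample += frame =>
        {
            // Any custom filtering or processing could happen here before
            // handing the frame to the texture script.
            if (frame.Profile.Stream == Stream.Depth)
                target.OnFrame(frame);
        };
    }
}
```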

The Alignment object is a special case of a texture stream which uses the AlignImages script. This script takes FrameSets from the device, performs image alignment between the frames of the frame set, and passes the result to each of the script's stream textures ('from' and 'to'), which then provide textures that are aligned to one another.

[Image: AlignImages script configuration]

An example usage of this script is to perform background segmentation between depth and color images by turning each colored pixel that is not within the given range into a grayscale pixel. The scene presents this example in the bottom right image labeled "Background Segmentation". Segmentation is performed using the BGSeg shader.

PointCloud

Also under the RealSenseDevice object in Unity's Hierarchy view, you can find the PointCloud object, which provides a 3D point cloud of the depth data in the form of Unity particles (using the Particle System).

The PointCloud object uses the PointCloudGenerator.cs script which allows some user control over the output:

[Image: PointCloudGenerator options]

  • Gradient - A color gradient used to color the particles.
  • Points Size - Controls the size of the particles.
  • Skip Particles - A factor (>= 1) indicating how many points to skip when creating the particles (see the sketch after this list).
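
For intuition, here is a sketch of how such a generator could turn depth points into particles while honoring the three options above. It is illustrative only, not the PointCloudGenerator.cs source; the vertices array is assumed to come from the SDK's point-cloud processing:

```csharp
using UnityEngine;

public class PointCloudSketch : MonoBehaviour
{
    public Gradient gradient;          // Gradient: colors particles by depth
    public float pointsSize = 0.01f;   // Points Size
    public int skipParticles = 2;      // Skip Particles: take every Nth point

    // Rebuilds the particle array from a vertex buffer (one Vector3 per depth pixel).
    public void UpdateParticles(ParticleSystem system, Vector3[] vertices, float maxDepth)
    {
        int capacity = (vertices.Length + skipParticles - 1) / skipParticles;
        var particles = new ParticleSystem.Particle[capacity];
        int count = 0;
        for (int i = 0; i < vertices.Length; i += skipParticles)
        {
            if (vertices[i].z == 0) continue; // zero depth marks an invalid point
            particles[count].position = vertices[i];
            particles[count].startSize = pointsSize;
            particles[count].startColor = gradient.Evaluate(vertices[i].z / maxDepth);
            count++;
        }
        system.SetParticles(particles, count);
    }
}
```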

Images

The Unity wrapper comes with a set of Unity Materials alongside matching Shaders to provide users with textures created from the device's frames.

To get a texture, we provide several Image objects under the Images prefab:

[Image: Image objects under the Images prefab]

The above image shows the ColorizedDepthImage.