## Obtain Data Stream

- 3.2.1 Stream Profile
- 3.2.2 Obtain Video Stream
- 3.2.3 Obtain IMU Data
- 3.2.4 Point Cloud
  - Depth Point Cloud
  - RGBD Point Cloud

### Stream Profile

Method 1: Obtain the stream profile by resolution, frame format, and frame rate.

```c#
Pipeline pipeline = new Pipeline();

// Obtain the color profile by resolution, frame format, and frame rate.
// A width or height of 0 acts as a wildcard.
StreamProfile colorProfile = pipeline.GetStreamProfileList(SensorType.OB_SENSOR_COLOR).GetVideoStreamProfile(0, 0, Format.OB_FORMAT_RGB, 0);

// Obtain the depth profile by resolution, frame format, and frame rate.
// A width or height of 0 acts as a wildcard.
StreamProfile depthProfile = pipeline.GetStreamProfileList(SensorType.OB_SENSOR_DEPTH).GetVideoStreamProfile(0, 0, Format.OB_FORMAT_Y16, 0);
```

Method 2: Obtain the stream profile from a video frame.

```c#
var frames = pipeline.WaitForFrames(100);
var colorFrame = frames?.GetColorFrame();

// Get the color stream profile from the frame
var colorSp = colorFrame?.GetStreamProfile();
```

### Obtain Video Stream

This section describes how to obtain color video frame data. The method for retrieving depth frame data is similar.

```c#
// 1. Create Config and Pipeline objects
Pipeline pipeline = new Pipeline();
Config config = new Config();

// 2. Get the color profile
StreamProfile colorProfile = pipeline.GetStreamProfileList(SensorType.OB_SENSOR_COLOR).GetVideoStreamProfile(0, 0, Format.OB_FORMAT_RGB, 0);

// 3. Enable the color stream and start the pipeline
config.EnableStream(colorProfile);
pipeline.Start(config);

// 4. Wait for a frameset and get the color frame
var frames = pipeline.WaitForFrames(100);
var colorFrame = frames?.GetColorFrame();
```

The following shows how to obtain IR video frame data.

```c#
// 1. Create Config and Pipeline objects
Pipeline pipeline = new Pipeline();
Config config = new Config();

// 2. Retrieve the sensor list from the device object
Device device = pipeline.GetDevice();
SensorList sensorList = device.GetSensorList();

// 3. Enable every IR stream supported by the device
for (uint i = 0, N = sensorList.SensorCount(); i < N; i++)
{
    SensorType sensorType = sensorList.SensorType(i);
    if (sensorType == SensorType.OB_SENSOR_IR ||
        sensorType == SensorType.OB_SENSOR_IR_LEFT ||
        sensorType == SensorType.OB_SENSOR_IR_RIGHT)
    {
        config.EnableVideoStream(sensorType, 0, 0, 30, Format.OB_FORMAT_Y8);
    }
}
pipeline.Start(config);

// 4. Wait for a frameset and get the IR frames
var frames = pipeline.WaitForFrames(100);
var irFrame = frames?.GetIRFrame();
var irLeftFrame = frames?.GetFrame(FrameType.OB_FRAME_IR_LEFT)?.As<IRFrame>();
var irRightFrame = frames?.GetFrame(FrameType.OB_FRAME_IR_RIGHT)?.As<IRFrame>();
```

The Gemini 330 series provides left and right IR sensors; their sensor types are `SensorType.OB_SENSOR_IR_LEFT` and `SensorType.OB_SENSOR_IR_RIGHT`, respectively. The depth sensor type is `SensorType.OB_SENSOR_DEPTH`, and the color sensor type is `SensorType.OB_SENSOR_COLOR`.
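A captured video frame (color, depth, or IR) can then be queried for its resolution and copied into a managed buffer for further processing. The sketch below continues the color example above; the accessor names (`GetWidth()`, `GetHeight()`, `GetDataSize()`, `CopyData()`) are assumptions based on the SDK's C# samples, so verify them against your wrapper version.

```c#
// Sketch: inspect and copy the captured color frame. The accessors used here
// (GetWidth, GetHeight, GetDataSize, CopyData) follow the SDK's C# samples;
// verify them against your wrapper version.
if (colorFrame != null)
{
    uint width = colorFrame.GetWidth();     // resolution in pixels
    uint height = colorFrame.GetHeight();
    uint size = colorFrame.GetDataSize();   // raw buffer size in bytes

    byte[] pixels = new byte[size];
    colorFrame.CopyData(ref pixels);        // copy the RGB data into managed memory

    Console.WriteLine($"Color frame: {width}x{height}, {size} bytes");
}

// Stop the pipeline when streaming is no longer needed
pipeline.Stop();
```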
### Obtain IMU Data

Please refer to the sample `1.stream.imu`.

```c#
Pipeline pipeline = new Pipeline();
Device device = pipeline.GetDevice();

// Set the frame callback function
FrameCallback frameCallback = OnFrame;

// Enable the accel stream
Sensor accelSensor = device.GetSensor(SensorType.OB_SENSOR_ACCEL);
using (StreamProfileList accelProfileList = accelSensor.GetStreamProfileList())
{
    using (StreamProfile accelProfile = accelProfileList.GetProfile(0))
    {
        accelSensor.Start(accelProfile, frameCallback);
    }
}

// Enable the gyro stream
Sensor gyroSensor = device.GetSensor(SensorType.OB_SENSOR_GYRO);
using (StreamProfileList gyroProfileList = gyroSensor.GetStreamProfileList())
{
    using (StreamProfile gyroProfile = gyroProfileList.GetProfile(0))
    {
        gyroSensor.Start(gyroProfile, frameCallback);
    }
}

// Obtain accel and gyro frames through the frame callback function
void OnFrame(Frame frame)
{
    if (frame.GetFrameType() == FrameType.OB_FRAME_ACCEL)
    {
        var accelFrame = frame.As<AccelFrame>();
        // process accelerometer data ...
    }
    else if (frame.GetFrameType() == FrameType.OB_FRAME_GYRO)
    {
        var gyroFrame = frame.As<GyroFrame>();
        // process gyroscope data ...
    }
}
```

### Point Cloud

#### Depth Point Cloud

```c#
Pipeline pipeline = new Pipeline();
Config config = new Config();
config.EnableVideoStream(StreamType.OB_STREAM_DEPTH, 0, 0, 0, Format.OB_FORMAT_Y16);

// Start the stream
pipeline.Start(config);

// Create the point cloud filter
PointCloudFilter pointCloud = new PointCloudFilter();

// Set the output point format
pointCloud.SetCreatePointFormat(Format.OB_FORMAT_POINT);

// Capture one frameset
var frames = pipeline.WaitForFrames(100);

// Get the depth frame
var depthFrame = frames?.GetDepthFrame();

// Apply the point cloud filter
Frame frame = pointCloud.Process(depthFrame);
PointsFrame pointsFrame = frame.As<PointsFrame>();

// Save the point cloud data
if (pointsFrame.GetFormat() == Format.OB_FORMAT_POINT)
{
    SavePointsToPly(pointsFrame, "DepthPoints.ply");
}
```

#### RGBD Point Cloud

```c#
Pipeline pipeline = new Pipeline();
Config config = new Config();
config.EnableVideoStream(StreamType.OB_STREAM_DEPTH, 0, 0, 0, Format.OB_FORMAT_Y16);
config.EnableVideoStream(StreamType.OB_STREAM_COLOR, 0, 0, 0, Format.OB_FORMAT_RGB);
config.SetFrameAggregateOutputMode(FrameAggregateOutputMode.OB_FRAME_AGGREGATE_OUTPUT_ALL_TYPE_FRAME_REQUIRE);

// Enable frame synchronization
pipeline.EnableFrameSync();

// Start the stream
pipeline.Start(config);

// Create the point cloud and align filters
PointCloudFilter pointCloud = new PointCloudFilter();
AlignFilter align = new AlignFilter(StreamType.OB_STREAM_COLOR);

// Set the output point format
pointCloud.SetCreatePointFormat(Format.OB_FORMAT_RGB_POINT);

// Capture one frameset
var frames = pipeline.WaitForFrames(100);

// Align depth to the color stream
Frame alignedFrameset = align.Process(frames);

// Apply the point cloud filter
Frame frame = pointCloud.Process(alignedFrameset);
PointsFrame pointsFrame = frame.As<PointsFrame>();

// Save the point cloud data
if (pointsFrame.GetFormat() == Format.OB_FORMAT_RGB_POINT)
{
    SavePointsToPly(pointsFrame, "RGBPoints.ply");
}
```
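Both point cloud snippets call a `SavePointsToPly` helper that comes from the sample code rather than the SDK itself. Below is a minimal sketch for the depth-only case (`OB_FORMAT_POINT`, three floats per point); the data-access calls (`GetDataSize()`, `CopyData()`) and the 12-byte point layout are assumptions based on the SDK's C# samples and the C API's `ob_point` struct, so verify them against your wrapper version. The RGB variant (`OB_FORMAT_RGB_POINT`) is analogous, with additional r, g, b fields per point.

```c#
// Minimal sketch of an ASCII PLY writer for OB_FORMAT_POINT data.
// GetDataSize()/CopyData() and the 12-byte point stride are assumptions
// based on the SDK's C# samples; adjust to your wrapper version if needed.
static void SavePointsToPly(PointsFrame pointsFrame, string path)
{
    byte[] data = new byte[pointsFrame.GetDataSize()];
    pointsFrame.CopyData(ref data);

    const int stride = 12;                  // 3 floats (x, y, z) per point
    int pointCount = data.Length / stride;
    var inv = System.Globalization.CultureInfo.InvariantCulture;

    using (var writer = new System.IO.StreamWriter(path))
    {
        // ASCII PLY header
        writer.WriteLine("ply");
        writer.WriteLine("format ascii 1.0");
        writer.WriteLine($"element vertex {pointCount}");
        writer.WriteLine("property float x");
        writer.WriteLine("property float y");
        writer.WriteLine("property float z");
        writer.WriteLine("end_header");

        // One "x y z" line per point
        for (int i = 0; i < pointCount; i++)
        {
            float x = BitConverter.ToSingle(data, i * stride);
            float y = BitConverter.ToSingle(data, i * stride + 4);
            float z = BitConverter.ToSingle(data, i * stride + 8);
            writer.WriteLine(string.Format(inv, "{0} {1} {2}", x, y, z));
        }
    }
}
```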