OrbbecSDK V2 ROS1 Wrapper
1. Overview
1.1. Introduction
1.1.1. Supported Hardware Products
1.1.2. Orbbec camera datasheet
1.1.3. Supported Platforms
1.2. Orbbec SDK Overview
1.2.1. Terms
1.2.2. Orbbec SDK v2 Architecture Overview
1.2.3. SDK Concept Overview
1.2.4. SDK Programming Model
2. Installation
2.1. Build from Source
2.1.1. Environment
2.1.2. Create ROS Workspace and Build
2.2. Registration script (required)
3. Quickstarts
3.1. ROS Package QuickStarts
3.1.1. Introduction
3.1.2. Build your First Camera Application
3.1.2.1. Step 1: Source ROS 1 and Workspace
3.1.2.2. Step 2: Launch the Camera Node
3.1.2.3. Step 3: Visualize in RViz
3.1.3. Sample Features
3.1.3.1. List available topics / services / parameters
3.1.3.2. Echo a topic
3.1.3.3. Call a service
3.1.3.4. Record with rosbag
3.2. OrbbecViewer QuickStarts
3.2.1. Download
3.2.2. Connect the device
3.2.3. Camera Control
3.2.4. Device information and firmware upgrade
4. Application Guide
4.1. Launch parameters
4.1.1. Core & Stream Configuration
4.1.2. Sensor Controls
4.1.2.1. Color Stream
4.1.2.2. Depth Stream
4.1.2.3. IR Stream
4.1.2.4. Laser / LDP
4.1.3. Device, Sync & Advanced Features
4.1.3.1. Multi-Camera Synchronization
4.1.3.2. Network Cameras
4.1.3.3. Device-Specific
4.1.3.4. Disparity
4.1.3.5. Interleave AE Mode
4.1.3.6. Intra-Camera Synchronization
4.1.4. Basic & General Parameters
4.1.4.1. Firmware & Backend
4.1.4.2. TF, Extrinsics & Calibration
4.1.4.3. Time Synchronization
4.1.4.4. Logging & Diagnostics
4.1.4.5. Miscellaneous
4.1.5. IMU
4.1.6. Depth Filters
4.2. All available services for camera control
4.2.1. Color Stream
4.2.2. Depth Stream
4.2.3. IR Stream
4.2.4. Sensor & Emitter Control
4.2.5. Device Information & Management
4.2.6. Synchronization
4.2.7. Depth Filter Configuration
4.2.8. Data Capture & Calibration Management
4.3. Available Topics
4.3.1. Image Streams
4.3.2. Point Cloud Topics
4.3.3. Device Status & Diagnostics
4.4. Coordinate Systems and TF Transforms
4.4.1. Camera sensor structure
4.4.2. ROS Robot Coordinate System vs Camera Optical Coordinate System
4.4.3. Using ROS1 TF Tools
4.4.3.1. View TF Tree Structure
4.4.3.2. Visualize the TF Tree in rviz
4.4.4. ROS1 Camera TF Calculation and Publication Mechanism
4.4.4.1. Core Function: OBCameraNode::calcAndPublishStaticTransform()
4.4.4.2. Function Explanation
4.5. Enabling and Visualizing Point Cloud in ROS 1
4.5.1. Enabling Depth Point Cloud
4.5.1.1. Command to Enable Depth Point Cloud
4.5.1.2. Visualizing Depth Point Cloud in RViz
4.5.2. Enabling Colored Point Cloud
4.5.2.1. Command to Enable Colored Point Cloud
4.5.2.2. Visualizing Colored Point Cloud in RViz
5. Advanced Guide
5.1. Performance & Optimization
5.1.1. Reducing CPU Usage with Orbbec ROS Package
5.1.1.1. Recommended Settings for Lower CPU Usage
5.1.1.2. Launch Files Used for Testing
5.1.1.3. Test environment
5.1.1.4. Test Setup
5.1.1.5. Test Results
5.1.1.6. uvc_backend Comparison (RGB format)
5.1.1.7. color_format Comparison (MJPG vs RGB)
5.1.1.8. Filter Configuration Impact
5.1.1.9. Further Optimization
5.2. Multi-Camera
5.2.1. Using Multiple Cameras with the Orbbec ROS Package
5.2.1.1. Prerequisites
5.2.1.2. Increase usbfs_memory_mb Value (CRITICAL STEP)
5.2.1.3. Identifying Camera USB Ports
5.2.1.4. Launching Multiple Cameras
5.2.1.5. Configuring the TF Tree for Multiple Cameras
5.2.2. Multi_camera synced Instructions
5.2.2.1. Setup instructions
5.2.2.2. Checking camera port with OrbbecSDK_ROS1
5.2.2.3. OrbbecSDK_ROS1 multi-camera synced configuration
5.2.2.4. Run the following command to start the multi-camera synced launch
5.2.3. Multi-Camera Synchronization Verification Node
5.2.3.1. Usage Guide
5.2.3.2. System Configuration Requirements
5.2.4. Configuring Multiple Cameras with Orbbec ROS Package (Nodelet)
5.2.4.1. Configuring the Nodelet Manager
5.2.4.2. Configuring Camera Nodelets
5.2.4.3. Configuring Camera Parameters
5.2.5. Usage Limitations of GMSL Cameras
5.3. Configuration & Modes
5.3.1. Aligning Depth to Color in ROS
5.3.1.1. Commands to Align and View Depth and Color Images
5.3.1.2. Selecting Topics in RViz
5.3.1.3. Example of Depth to Color Overlay
5.3.2. Configuration of depth NFOV and WFOV modes
5.3.3. Depth work mode switch
5.3.4. Disparity_search_offset
5.3.4.1. Function Introduction
5.3.4.2. Parameter Introduction
5.3.4.3. Run the launch
5.3.5. Using interleave_ae with Gemini330 series cameras
5.3.5.1. Parameter Introduction
5.3.5.2. Run the launch
5.3.5.3. Multi_camera_synced + Interleave_ae
5.3.6. Predefined presets
5.3.7. Net_camera
5.3.7.1. Femto Mega
5.3.7.2. set_device_ip Utility
5.3.7.3. Force IP Function
6. Benchmark
6.1. Introduction
6.1.1. common benchmark node
6.1.2. service benchmark node
6.2. Benchmark Usage
6.2.1. Using common benchmark node
6.2.2. Using service benchmark node
6.2.2.1. ROS1 C++
6.2.2.2. ROS1 Python
6.2.2.3. Example YAML configuration
6.3. Benchmark Data
6.3.1. Common Benchmark Data
6.3.2. Service Benchmark Data
7. Developer Guide
7.1. Migrating from main to Open-Source v2-main
7.1.1. Introduction
7.1.2. Advantages of Migrating from main to v2-main
7.1.2.1. Comprehensive Device Support
7.1.2.2. Transparency and Extensibility
7.1.2.3. Advantages in Maintenance and Updates
7.1.2.4. Community and Ecosystem Support
7.1.3. Comparison Between main and v2-main Branches
7.1.3.1. Launch File Differences
7.1.3.2. Parameter Differences
7.1.3.3. Topic Differences
7.1.3.4. Service Differences
7.2. Building a Debian Package
8. Frequently Asked Questions
8.1. Unexpected Crash
8.2. No Data Stream from Multiple Cameras
8.3. Compilation Failure Due to OpenCV Version Issues
8.4. Additional Troubleshooting
8.5. Why Are There So Many Launch Files?
8.6. How to Launch a Specific Camera When Multiple Cameras Are Connected