We have been working on automating our BlueROV2 using a visual navigation system. Our prototype system has been tested in a pool and we are working toward some field tests soon. The hardware and software components are diagrammed below, and some videos demonstrating early trials are available at (, BlueROV_2 - YouTube). The key hardware components are a Stereolabs ZED mini interfaced to an NVIDIA Jetson TX1, which communicates with the BlueROV2 Raspberry Pi via Ethernet (substituting for the topside computer). All of the visual navigation software runs on the TX1. We're using a slightly modified version of ORB-SLAM2 for localization and mapping (achieving ~10 fps with VGA images) and a basic trajectory controller to navigate predefined routes. The camera/computer enclosure was made by Sexton ( ). The videos above show the ROV completing a 'lawn mower' pattern, as our ultimate goal is to use the system for mapping marine ecosystems.

A few questions for either / both, and to get your opinions for my own (albeit poor coding knowledge) benefit. Not taking anything away from either of your fantastic work (it makes great sense in driving the ArduSub project towards AUV navigation and is a huge plus in general). But given that the BlueROV2 is a ROV, has a tether back to the surface, and is typically driven by a laptop (and yes Andrew, I understand you are running the control as a web server down on the ROV, so in your case it may be lacking this topside computing grunt): would it be a reasonable, workable solution to have ORB-SLAM2 for localization and mapping running on the topside computer? Even without complete loop closure, these flight tracks would be of assistance to any operator. In your opinion, is the IMU used in the standard BlueROV2 Pixhawk (the InvenSense MPU-6000: sample rate up to 8 kHz, angular rate ranges of ±250, ±500, ±1000, and ±2000°/sec) anywhere near suitable? If not, what sort of accuracy and sample rate do you believe is necessary to get usable data for determining the camera pose? (I couldn't find what IMU is running in the ZED mini, but it looks reasonably up-specced, going by a Twitter quote: "The IMU of the ZED mini is running at a fixed 500Hz, both the RAW and the fused data are available from the SDK".) And how well do you think this style of system would work as a monocular system (with the camera down-facing and assuming it is being fed reasonable bottom imagery)?

Most SLAM systems will work with monocular cameras (ORB-SLAM does, and I've tested it with some underwater video sequences; it performed reasonably). Using only a monocular camera, the path/map will lack an absolute scale, but depending on your needs that may not be a major issue. The IMUs on both the BlueROV and the ZED mini are more than capable of supplying fairly accurate camera pose, and the IMU update rates far exceed the camera frame rates. More recent SLAM systems (VINS-Fusion, OKVIS, etc.) make use of the IMU data to partially infer position, velocity, and acceleration changes, and I imagine these systems would be more sensitive to IMU quality (but we haven't tried them yet). Finally, you could definitely run a SLAM system on the topside computer to aid navigation. You'd have to get the video, IMU, and other data up the tether, but the topside computer would typically be more powerful, and more capable of running SLAM at high frame rates (which is critical), than an embedded computer.
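The 'lawn mower' survey pattern mentioned above is straightforward to generate programmatically. Here is a minimal sketch of how such a predefined route might be laid out as waypoints for a trajectory controller; the function name and parameters are illustrative, not taken from the authors' code:

```python
def lawnmower_waypoints(width, height, spacing, origin=(0.0, 0.0)):
    """Generate (x, y) waypoints for a boustrophedon ('lawn mower')
    survey over a width x height rectangle, with `spacing` metres
    between parallel transects. Purely illustrative geometry."""
    x0, y0 = origin
    waypoints = []
    n_lines = int(width // spacing) + 1
    for i in range(n_lines):
        x = x0 + i * spacing
        # Alternate transect direction so the ROV sweeps back and forth.
        if i % 2 == 0:
            waypoints.append((x, y0))
            waypoints.append((x, y0 + height))
        else:
            waypoints.append((x, y0 + height))
            waypoints.append((x, y0))
    return waypoints

# A 4 m x 10 m survey area with 2 m transect spacing gives three
# parallel legs joined at alternating ends.
route = lawnmower_waypoints(4.0, 10.0, 2.0)
```

In practice the transect spacing would be chosen from the camera's footprint on the bottom (altitude and field of view) so that adjacent image strips overlap for mapping.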
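The point about IMU update rates far exceeding camera frame rates can be made concrete: with a 500 Hz IMU and ~10 fps video there are roughly 50 inertial samples between consecutive frames, which a visual-inertial front end can integrate to predict motion across the frame gap. A toy 1-D dead-reckoning sketch (illustrative only; real systems such as VINS-Fusion use full IMU preintegration with bias and gravity handling):

```python
def integrate_imu(accels, dt, v0=0.0, p0=0.0):
    """Dead-reckon 1-D position from accelerometer samples taken
    between two camera frames (semi-implicit Euler integration).
    This only illustrates the rate mismatch, not a real VIO front end."""
    v, p = v0, p0
    for a in accels:
        v += a * dt   # update velocity from acceleration
        p += v * dt   # then position from the updated velocity
    return p, v

# 500 Hz IMU, 10 fps camera -> ~50 IMU samples per inter-frame gap.
dt = 1.0 / 500.0
samples = [0.2] * 50   # constant 0.2 m/s^2 over one 0.1 s frame interval
p, v = integrate_imu(samples, dt)
# p is close to the analytic 0.5 * a * t^2 = 0.001 m, v to a * t = 0.02 m/s
```

Over a single frame interval the integration error is tiny; the reason full VIO systems are sensitive to IMU quality is that bias and noise errors accumulate quadratically in position when integrated over longer horizons.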