
General ‐ Calibration

JonasFrey96 edited this page Oct 15, 2024 · 7 revisions

Equipment

Box

  • The box
  • Tripod for the box, with correct spider mount.
  • Power supply for box
  • Ethernet
  • OPC
  • Leica total station
  • Total station tripod

Calibration

  • Calibration board
  • Calibration lights from ASL
  • DC Power supply for calibration lights
  • Spotlight (above cabinets in RSL)
  • Tripod for spotlight

Setup

Environment

  1. Connect calibration lights to 24V, 1.35A max.
  2. Connect spotlight
  3. Connect to box
  4. Shut out natural light sources (so light sources are consistent across recordings)

Box

  1. Make sure time is synchronized across PCs.
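Time sync can be sanity-checked before recording. Below is a minimal sketch, assuming you have collected one epoch timestamp per PC at roughly the same instant; the hostnames and the 5 ms tolerance are illustrative, not from this wiki:

```python
# Sketch: sanity-check clock skew between the box PCs before recording.
# Hostnames and the 5 ms tolerance are illustrative assumptions.

def max_clock_skew(clocks):
    """Return the largest pairwise offset (seconds) between host clocks."""
    times = list(clocks.values())
    return max(times) - min(times)

# Epoch timestamps collected from each PC at (roughly) the same instant.
clocks = {"jetson": 1718000000.001, "nuc": 1718000000.003, "opc": 1718000000.002}
skew = max_clock_skew(clocks)
assert skew < 0.005, f"clocks differ by {skew * 1000:.1f} ms; re-sync before recording"
```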

Calibration Data Collection

Tip: Write down bag names as you go.
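A small helper can keep the names you write down consistent with the files on disk. The naming scheme below (timestamp plus calib config name) is a suggestion, not the project's convention:

```python
# Sketch: generate consistent bag names so the note you write down matches
# the file on disk. The naming scheme here is hypothetical.
from datetime import datetime
from typing import Optional

def bag_name(calib_cfg: str, when: Optional[datetime] = None) -> str:
    """Timestamped bag name for a given calibration config."""
    when = when or datetime.now()
    return f"{when:%Y-%m-%d_%H-%M-%S}_{calib_cfg}.bag"

print(bag_name("calib_camera_front", datetime(2024, 10, 15, 9, 30, 0)))
# 2024-10-15_09-30-00_calib_camera_front.bag
```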

Camera Intrinsics

Record one bag for each of the 3 directions:

  • Camera Front. Calib cfg file: calib_camera_front
  • Camera Left. Calib cfg file: calib_camera_left
  • Camera Right. Calib cfg file: calib_camera_right

Cover the full field of view: record the checkerboard very close, then at 2 meters, then at 5 meters away. Vary the orientation of the checkerboard a lot while recording, and cover all the edges of every camera. Good lighting is important: set the exposure to manual for the Alphasense, following the best practices for intrinsic calibration. Make sure to record the Alphasense, HDR, and ZED 2i left and right images together.
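Whether the edges were actually covered can be checked offline by bucketing the detected corners into a grid over the image. This is a rough sketch, not project tooling; the 4x4 grid and image size are arbitrary assumptions:

```python
# Sketch: estimate how much of the field of view the checkerboard detections
# covered. The 4x4 grid and 640x480 image size are arbitrary assumptions.

def coverage(corners, width, height, nx=4, ny=4):
    """Fraction of grid cells containing at least one detected corner."""
    hit = set()
    for (u, v) in corners:
        cx = min(int(u / width * nx), nx - 1)
        cy = min(int(v / height * ny), ny - 1)
        hit.add((cx, cy))
    return len(hit) / (nx * ny)

# Corners clustered in the image centre -> poor coverage, keep recording.
centre_only = [(300 + i, 200 + i) for i in range(20)]
print(coverage(centre_only, 640, 480))  # 0.0625: only 1 of 16 cells covered
```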

Camera Extrinsics

Record two bags:

  • Center to Right Calib cfg file: calib_camera_front_right
  • Center to Left Calib cfg file: calib_camera_front_left

These bags calibrate the left-to-center and center-to-right camera pairs. Move the checkerboard very close to the Alphasense, such that both cameras can see part of the checkerboard. Here too, try to rotate the checkerboard as much as possible.
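A quick way to judge whether an extrinsics bag is usable is to count frames in which both cameras saw the board at the same time. A rough sketch with synthetic detection timestamps; the 10 ms pairing window is an assumption:

```python
# Sketch: count frames where the board was co-visible in both cameras.
# Timestamps are synthetic; the 10 ms pairing tolerance is an assumption.

def covisible(ts_a, ts_b, tol=0.01):
    """Number of detections in camera A with a matching detection in B."""
    count = 0
    for t in ts_a:
        # A nearest-neighbour search would scale better; a linear scan keeps it short.
        if any(abs(t - s) <= tol for s in ts_b):
            count += 1
    return count

front = [0.0, 0.1, 0.2, 0.3]
left = [0.101, 0.301, 0.9]
print(covisible(front, left))  # 2
```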

Camera to IMU

Record three bags:

  • Slow Calib cfg file: calib_camera_front_imu
  • Medium Calib cfg file: calib_camera_front_imu
  • Fast Calib cfg file: calib_camera_front_imu

Reduce the exposure to the minimum and ensure very good lighting conditions, then record all front-facing cameras plus all IMUs in the box. Start recording before beginning the smooth motions, and ensure there are no impacts while performing the calibration routine. This should be done at a range of roughly 2-3 meters from the checkerboard. Ideally, collect three bags (slow, medium, and fast motions) and excite roll, pitch, and yaw as much as possible.
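Whether roll, pitch, and yaw were all excited enough can be checked from the recorded gyro samples. A minimal sketch with synthetic data; the 1.0 rad/s peak-rate threshold is a guess, not a project requirement:

```python
# Sketch: check that a camera-IMU bag excited all three rotation axes.
# Gyro samples are synthetic (rad/s); the 1.0 rad/s threshold is a guess.

def axes_excited(gyro, threshold=1.0):
    """Return which of (roll, pitch, yaw) rates exceeded the threshold."""
    peaks = [max(abs(sample[i]) for sample in gyro) for i in range(3)]
    return tuple(p >= threshold for p in peaks)

gyro = [(1.5, 0.2, 0.1), (-1.2, 1.4, 0.0), (0.3, -1.1, 0.2)]
print(axes_excited(gyro))  # (True, True, False): yaw barely moved, redo the bag
```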

LiDAR to Camera

Record two bags:

  • 5 positions per side Calib cfg file: calib_camera_lidar
  • 25 positions per side Calib cfg file: calib_camera_lidar

Keep the QR code static and mount the box on the tripod. Track the prism with the total station while recording all front and side cameras at a low frame rate (e.g. 1 Hz), then move the box around. Go to extremes with the rotations and distances to the checkerboard: get very close, but also go up to 3 meters away, and make sure the checkerboard is visible in all cameras. Use roughly 25 positions per side, leaving the box static for 5 s at each position before moving to the next. You can first record a quick bag with just 5 positions per side, and then a longer bag.
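The 5 s static holds can be counted offline from the tracked positions to confirm enough usable poses were collected. A sketch with synthetic data; the 2 cm motion tolerance and fixed sample rate are assumptions:

```python
# Sketch: count the 5 s static dwells in a LiDAR-camera bag, given the
# total-station track. The 2 cm motion tolerance is an assumption.

def count_dwells(positions, dt, min_hold=5.0, tol=0.02):
    """positions: (x, y, z) samples at fixed period dt; returns completed holds."""
    dwells, hold = 0, 0.0
    for prev, cur in zip(positions, positions[1:]):
        dist = sum((a - b) ** 2 for a, b in zip(prev, cur)) ** 0.5
        hold = 0.0 if dist > tol else hold + dt
        if abs(hold - min_hold) < 1e-9:  # just reached the required hold time
            dwells += 1
    return dwells

# Two static holds of 7 samples each, separated by one large move.
track = [(0, 0, 0)] * 7 + [(1, 0, 0)] * 7
print(count_dwells(track, dt=1.0))  # 2
```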

Loading Calib data into box URDF

Default CAD Values

Exporting from CAD Software

Note: this PR contains the boxi changes that enable this: https://github.com/leggedrobotics/grand_tour_box/pull/444. Either check out dev/calib or merge this PR before you do this part.

  • Export CAD TFs as an HTML file, selecting the transforms that our URDF .xacro needs (e.g. box base to each sensor group base, group base to each sensor).
  • Make sure the names of the HTML calibrations are mapped correctly in box_utils/box_calibration/config/html_tf_name_mappings.yaml
  • Run boxi load_calib --html_file /path/to/cad/file.html to load the data into the box_model calib directory. This should be reflected in RVIZ the next time you start up.
  • Alternatively, you can run the box_utils/box_calibration/cad_html_parser.py script standalone if you want to test it, or to get the names of the TFs in order to fill out html_tf_name_mappings.yaml.
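Conceptually, the parsing step pulls TF rows out of the CAD-exported HTML and renames them via the mapping file. The sketch below illustrates the idea only; the HTML layout, row contents, and mapping keys are assumptions, not the real export format used by cad_html_parser.py:

```python
# Conceptual sketch of the CAD HTML parsing step: extract table rows and
# rename TFs via a mapping like html_tf_name_mappings.yaml. The HTML layout
# and mapping keys here are assumptions, not the real export format.
from html.parser import HTMLParser

class TFTableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = []

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

html_export = "<table><tr><td>Base_to_Alphasense</td><td>0.1</td></tr></table>"
mapping = {"Base_to_Alphasense": "box_base_to_alphasense_base"}  # hypothetical names
parser = TFTableParser()
parser.feed(html_export)
tfs = {mapping.get(row[0], row[0]): row[1:] for row in parser.rows}
print(tfs)  # {'box_base_to_alphasense_base': ['0.1']}
```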

Calibration Introduction

  • Jetson: runs l-jetson-viz. Add an l-jetson-calib shortcut here that directly starts the detectors:
roslaunch grand_tour_camera_detectors detectors_on_jetson.launch use_april_grid:=true
  • NUC: add an l-nuc-calib shortcut here that starts the detectors:
roslaunch grand_tour_camera_detectors detectors_on_nuc.launch use_april_grid:=true

Check that sufficient images are being received and that the detection count is increasing.
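The "detections increasing" check can be automated from sampled detection counts. A minimal sketch; in practice the counts would come from the detector topics, and the per-sample minimum is an assumption:

```python
# Sketch: confirm detections keep increasing while the detectors run.
# Counts are synthetic; in practice they come from the detector topics.

def detections_increasing(counts, min_new=1):
    """True if every successive sample added at least min_new detections."""
    return all(b - a >= min_new for a, b in zip(counts, counts[1:]))

print(detections_increasing([10, 14, 19, 25]))  # True
print(detections_increasing([10, 14, 14, 25]))  # False: detector stalled
```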

On OPC

rosrun grand_tour_ceres_apps camera_camera_online_calibration  # run with --help for options

On startup this first locates the config .yaml that specifies the setup, and it should report that it is receiving samples. This is the Ceres code that performs the optimization.

roslaunch grand_tour_calibration_viewers all_cameras.launch needs some magic ID passed in.

Each image viewer subscribes for one message at a time; the expected update rate is once every 10 s: https://github.com/leggedrobotics/grand_tour_box/blob/dev/camera_calibration_detectors/box_calibration/grand_tour_calibration_viewers/scripts/view_camera_camera_optimization.py#L106
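The once-every-10 s behavior amounts to a throttling pattern: process at most one message per interval and drop the rest. This is a guess at the viewer's logic, not a copy of it:

```python
# Sketch of a throttled-subscription pattern: handle at most one message per
# interval and drop the rest. The 10 s interval matches the rate noted above;
# the class itself is illustrative, not the viewer's actual code.

class Throttle:
    def __init__(self, interval=10.0):
        self.interval, self.last = interval, None

    def accept(self, stamp):
        """Return True if a message with this timestamp should be processed."""
        if self.last is None or stamp - self.last >= self.interval:
            self.last = stamp
            return True
        return False

t = Throttle(10.0)
print([t.accept(s) for s in [0.0, 3.0, 9.9, 10.0, 15.0, 20.1]])
# [True, False, False, True, False, True]
```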

It is not clear when the GUI is hanging because of a solve versus when data is being added to the solve, nor how many detections have been accumulated and how many are required for a new solve.
