
Calibration of Omnidirectional Vision Sensors

Author: LinYing
Tutor: LiuJiLin
School: Zhejiang University
Course: Communication and Information System
Keywords: single viewpoint; omnidirectional cameras; lidar; viewing sphere; low-rank texture; sensor calibration
CLC: TP212
Type: PhD thesis
Year: 2013
Downloads: 63
Citations: 0


The omnidirectional vision sensors discussed in this dissertation include a passive vision sensor, the omnidirectional camera, and an active vision sensor, the omnidirectional lidar. With their large field of view, such sensors are widely used for environment perception on autonomous ground platforms. Owing to their special geometric characteristics, their calibration remains a fundamental problem in computer vision. This dissertation studies the calibration of omnidirectional cameras and lidars, focusing on three aspects: calibration of omnidirectional cameras, self-calibration of omnidirectional cameras, and extrinsic calibration of a lidar-camera system. To achieve precise omnidirectional camera calibration, we propose a robust calibration method based on the viewing sphere, which improves the accuracy of the results. We exploit the idea of compressive-sensing-based low-rank texture recovery to achieve self-calibration of omnidirectional cameras, obtaining reliable results. Geometric constraints and motion estimation are adopted to solve the joint calibration of an omnidirectional lidar and a camera.

The main contributions are outlined as follows:

1. To provide accurate correspondences between image and spatial information, we propose an omnidirectional camera calibration method based on the viewing sphere. The geometric properties of two mutually orthogonal sets of parallel lines on the viewing sphere yield a closed-form solution for the intrinsic and extrinsic parameters. Benefiting from this relatively precise estimate of the intrinsic and extrinsic parameters, the method further reduces the uncertainty of the calibration results compared with most state-of-the-art methods.

2. We propose an omnidirectional camera self-calibration method based on compressive sensing, with which the sensor can be calibrated quickly from a simple scene. The method calibrates the camera by recovering the low-rank texture in the image, and only one image is required.
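The rank criterion behind this kind of self-calibration can be illustrated with a toy numpy example. This is only a sketch of the underlying principle, not the dissertation's algorithm: a regular planar texture seen without distortion forms a (near) low-rank matrix, while a geometric distortion, simulated here by a shear implemented as per-row shifts, raises its effective rank. A low-rank-texture method searches for the calibration parameters that bring the rank back down.

```python
import numpy as np

def effective_rank(M, tol=1e-8):
    """Count singular values above a relative threshold."""
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

rng = np.random.default_rng(0)

# An undistorted "texture": identical rows, i.e. an exactly rank-1 matrix.
texture = np.outer(np.ones(32), rng.standard_normal(32))
print(effective_rank(texture))   # → 1

# Simulate a geometric distortion with a shear (row i shifted by i // 4).
sheared = np.stack([np.roll(row, i // 4) for i, row in enumerate(texture)])
print(effective_rank(sheared))   # → 8: the distortion destroys the low rank
```

In the dissertation's setting the distortion is the (unknown) omnidirectional projection rather than a shear, and the rank is minimized over the camera parameters instead of being merely measured.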
Furthermore, we define a projection function for spherical large-field-of-view low-rank textures to match the imaging characteristics of omnidirectional cameras. Unlike most self-calibration methods, this method does not rely on low-level features such as edges and corners, and is only weakly affected by external factors such as lighting and shadow, so more reliable results can be obtained.

3. We put forward two methods for extrinsic calibration of a lidar-camera system based on natural scenes. Compared with a stereo camera system, an omnidirectional lidar-camera system offers low computational complexity and high accuracy, and is less affected by the environment when reconstructing 3D scenes. To fuse the lidar and camera data effectively, the extrinsic parameters of the lidar-camera system must be calibrated. By defining a reference world coordinate frame from a trihedron in the scene, we use geometric constraints or matched features of the trihedron to estimate the relative motions between the lidar or camera frame and the world frame. Once these relative motions are known, the extrinsic parameters between the lidar and the camera are easy to compute. The method is flexible and needs no specific calibration object; moreover, it does not rely heavily on the amount of input data, and only two frames are enough to obtain reliable results.
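The final composition step of this extrinsic calibration can be sketched as follows. The two 4x4 poses below are hypothetical stand-ins for the relative motions actually estimated from the trihedron; the point is only that, once each sensor's pose with respect to the shared world frame is known, the camera-to-lidar extrinsics follow by a single matrix composition.

```python
import numpy as np

def rt_to_T(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical sensor poses w.r.t. the world frame anchored to the trihedron
# (in practice these come from the geometric fitting / feature matching):
T_world_lidar = rt_to_T(rot_z(0.3), np.array([1.0, 0.0, 0.5]))
T_world_camera = rt_to_T(rot_z(-0.1), np.array([1.2, 0.1, 0.4]))

# Camera-to-lidar extrinsics by composing through the world frame:
T_lidar_camera = np.linalg.inv(T_world_lidar) @ T_world_camera

# Sanity check: mapping a camera-frame point directly equals going
# camera -> world -> lidar.
p_cam = np.array([0.5, -0.2, 2.0, 1.0])
direct = T_lidar_camera @ p_cam
via_world = np.linalg.inv(T_world_lidar) @ (T_world_camera @ p_cam)
assert np.allclose(direct, via_world)
```

The dissertation's contribution lies in estimating the two sensor-to-world motions from natural scenes; the composition itself is the easy part, which is why only two frames of data suffice once those motions are reliable.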

Related Dissertations

  1. The Airborne LiDAR and LiDAR Points-cloud’s Quick Processing Method,TN959.73
  2. Building Feature Extraction from LiDAR Point Cloud and CCD Image,P225.2
  3. Research on High-resolution 3D Imaging Lidar Techniques,TN958.98
  4. The Study on Modeling and Simulation of Infrared Lidar to Measure Methane Concentration,TN958.98
  5. Study on Airborne LiDAR Strip Adjustment Based on Attitude Correction,P225.1
  6. Airborne LiDAR Filtering by Fusing Multi-features,TN958.98
  7. Real-time Detecting and Tracking of Moving Objects Using 3D Lidar,TN957.52
  8. Investigation on Characteristics of Staring Lidar System,TN958.98
  9. Dynamic Balance Test Technology and Its Data Processing,TH877
  10. Gas Concentration Measurement Based on Laser Absorption Spectroscopy,O433.51
  11. Research on Ground Target Recognition Technology Based on a Typical 3D Lidar,TP391.41
  12. Research on Key Techniques for Large Transport Vehicle Platform Autonomous Positioning,TH22
  13. Research on Key Technologies of a Compact 3D Imaging Lidar System,TN958.98
  14. Study on Algorithms of Airborne LiDAR Point Cloud Data Filtering,TN713
  15. Study of Survey Technology Based on the Domestic AOE Airborne Lidar,P231
  16. Study on Optical Properties of Haze Aerosol at Shanghai with Micro-Pulse Lidar,X513
  17. Hardware Design and Field Experiments of Airborne Lidar System for Oil Spill Monitoring,U698.7
  18. Target Detection and Recognition Base on 1-D Lidar Range Image,TJ439.2
  19. Study of Ground Points Extraction and Curving Surface Fitting Algorithm from ALS Data,TP722
  20. Study on Registration Technologies of Different Species Remote Sensing Images,TP751
  21. Research on Realtime Calibration System of Pressure Sensor in MWD,TP212.12

CLC: > Industrial Technology > Automation technology, computer technology > Automation technology and equipment > Automation components, parts > Transmitter (converter), the sensor