diff --git a/thesis/Main.tex b/thesis/Main.tex index fe002fb..bb13f02 100755 --- a/thesis/Main.tex +++ b/thesis/Main.tex @@ -419,21 +419,39 @@ Autoencoders have been shown to be useful in the anomaly detection domain by ass {Explain how lidars work and what data they produce} {understand why data is degraded, and how data looks} {explain how radar/lidar works, usecases, output = pointclouds, what errors} -{lidar used in automotive $\rightarrow$ related work - rain degradation} +{rain degradation paper used deepsad $\rightarrow$ explained in detail in next chapter} -\todo[inline]{related work in lidar} -\todo[inline, color=green!40]{the older more commonly known radar works by sending out an electromagnetic wave in the radiofrequency and detecting the time it takes to return (if it returns at all) signalling a reflective object in the path of the radiowave. lidar works on the same principle but sends out a lightray produced by a laser (citation needed) and measuring the time it takes for the ray to return to the sensor. since the speed of light is constant in air the system can calculate the distance between the sensor and the measured point. 
modern lidar systems send out multiple, often millions of measurement rays per second which results in a three dimensional point cloud, constructed from the information in which direction the ray was cast and the distance that was measured} -\todo[inline, color=green!40]{lidar is used in most domains reliant on accurate 3d representations of the world like autonomous driving, robot navigation, (+ maybe quickly look up two other domains), its main advantage is high measurement accuracy, precision (use correct term), and high resolution (possible due to single point measurements instead of cones like radar, ToF, Ultrasonic) which enables more detailed mappings of the environment} -\todo[inline, color=green!40]{due to point precision, lidar is sensitive to noise/degradation of airborne particles, which may produce early returns, deflections, errrors of light rays, this results in noise in the 3d point cloud and possibly missing data of the measurement behind the aerosol particle.} -\todo[inline, color=green!40]{because of the given advantages of lidar it is most commonly used nowadays on robot platforms for environment mapping and navigiation - so we chose to demonstrate our method based on degraded data collected by a lidar sensor as discussed in more dtail in section (data section)} +LiDAR (Light Detection and Ranging) measures distance by emitting short laser pulses and timing how long they take to return, a working principle familiar from the older and more widely known radar technology, which uses radio-frequency pulses and measures their return time to gauge an object's range. Unlike radar, however, LiDAR operates at much shorter wavelengths and can fire millions of pulses per second, achieving centimeter-level precision and dense, high-resolution 3D point clouds. This fine granularity makes LiDAR well suited to applications such as detailed obstacle mapping, surface reconstruction, and autonomous navigation in complex environments. 
-\newsection{related_work}{Related Work} +A LiDAR sensor emits a laser pulse in a specific direction, then waits for the faint flash of returned light. Because the speed of light in air is effectively constant, multiplying half the round-trip time by that speed gives the distance to the reflecting surface. Modern spinning multi-beam LiDAR systems emit millions of these pulses every second. Each pulse is sent at a known combination of horizontal and vertical angles, creating a regular grid of measurements: for example, 32 vertical channels swept through 360° horizontally at a fixed angular spacing. While newer solid-state designs (flash, MEMS, phased-array) are emerging, spinning multi-beam LiDAR remains the most common type in autonomous vehicles and robotics because of its proven range, reliability, and mature manufacturing base. -\threadtodo -{What other research has been done on this topic} -{reader knows all background, what is starting point of research} -{talk about rain degradation paper from automotive, cleaning pointclouds?} -{Rain paper successful with DeepSAD $\rightarrow$ what is DeepSAD} +\fig{lidar_working_principle}{figures/bg_lidar_principle_placeholder.png}{PLACEHOLDER - An illustration of lidar sensors' working principle.} + +Every time a pulse returns, the LiDAR records its direction (based on the angles at emission) and its range, producing a single three-dimensional point. By collecting millions of such points each second, the sensor constructs a ``point cloud'': a dense set of 3D coordinates relative to the LiDAR's own position. In addition to X, Y, and Z, many LiDARs also record the intensity or reflectivity of each return, providing extra information about the surface properties of the object hit by the pulse. 
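+The two quantities described above can also be written out explicitly; the following is a sketch, since exact angle conventions vary between sensor models. With pulse round-trip time $\Delta t$ and the speed of light $c$, the measured range is
+\begin{equation}
+    d = \frac{c \, \Delta t}{2},
+\end{equation}
+and, given the azimuth angle $\theta$ and elevation angle $\phi$ known at emission time, the resulting Cartesian point in the sensor frame is
+\begin{equation}
+    x = d \cos\phi \cos\theta, \qquad y = d \cos\phi \sin\theta, \qquad z = d \sin\phi.
+\end{equation}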
+ +%LiDAR’s high accuracy, long range, and full-circle field of view make it indispensable for tasks like obstacle detection, simultaneous localization and mapping (SLAM), and terrain modeling in autonomous driving and mobile robotics. While vehicles and robots often carry complementary sensors like time-of-flight cameras, ultrasonic sensors and RGB cameras, LiDAR outperforms them when it comes to precise 3D measurements over medium to long distances, operates reliably in varying light conditions, and delivers the spatial density needed for safe navigation. However, intrinsic sensor noise remains: range quantization can introduce discrete jumps in measured distance, angle jitter can blur fine features, and multi‐return ambiguities may arise when a single pulse generates several echoes (for example, from foliage or layered surfaces). Environmental factors further degrade data quality: specular reflections and multipath echoes can produce ghost points, beam occlusion by intermediate objects leads to missing measurements, and atmospheric scattering in rain, fog, snow, dust, or smoke causes early returns and spurious points. + +%In subterranean and disaster environments—such as collapsed tunnels or earthquake‐damaged structures—LiDAR has become the de facto sensing modality for mapping and navigation. Its ability to rapidly generate accurate 3D point clouds enables simultaneous localization and mapping (SLAM) even in GPS‐denied conditions. Yet, the airborne particles prevalent in these scenarios—dust stirred by collapse, smoke from fires—introduce significant noise: early returns from nearby aerosols obscure real obstacles, and missing returns behind particle clouds can conceal hazards. While many SLAM and perception algorithms assume clean, high‐quality data, real‐world rescue deployments must contend with degraded point clouds. 
This mismatch motivates our work: rather than simply removing noise, we aim to quantify the degree of degradation so that downstream mapping and decision‐making algorithms can adapt to and compensate for varying data quality. + +LiDAR's high accuracy, long range, and full-circle field of view make it indispensable for tasks like obstacle detection, simultaneous localization and mapping (SLAM), and terrain modeling in autonomous driving and mobile robotics. While complementary sensors such as time-of-flight cameras, ultrasonic sensors, and RGB cameras have their strengths at short range or under particular lighting conditions, only LiDAR delivers the combination of precise 3D measurements over medium to long distances, consistent performance regardless of illumination, and the point-cloud density needed for safe navigation. LiDAR systems do exhibit intrinsic noise (e.g., range quantization or occasional multi-return ambiguities), but in most robotic applications these effects are minor compared to environmental degradation. + +In subterranean and disaster scenarios (collapsed tunnels, mine shafts, or earthquake-damaged structures) the dominant challenge is airborne particles: dust kicked up by debris or smoke from fires. These aerosols create early returns that can mask real obstacles and cause missing data behind particle clouds, undermining SLAM and perception algorithms designed for cleaner data. Our work focuses on quantifying the degree of LiDAR degradation so that mapping and decision-making processes can adapt dynamically to the true quality of the sensor input. 
+ +\todo[inline]{related work, survey on lidar denoising, noise removal in subt - quantifying same as us in rain, also used deepsad - transition} + +%\todo[inline]{related work in lidar} +%\todo[inline, color=green!40]{the older more commonly known radar works by sending out an electromagnetic wave in the radiofrequency and detecting the time it takes to return (if it returns at all) signalling a reflective object in the path of the radiowave. lidar works on the same principle but sends out a lightray produced by a laser (citation needed) and measuring the time it takes for the ray to return to the sensor. since the speed of light is constant in air the system can calculate the distance between the sensor and the measured point. modern lidar systems send out multiple, often millions of measurement rays per second which results in a three dimensional point cloud, constructed from the information in which direction the ray was cast and the distance that was measured} +%\todo[inline, color=green!40]{lidar is used in most domains reliant on accurate 3d representations of the world like autonomous driving, robot navigation, (+ maybe quickly look up two other domains), its main advantage is high measurement accuracy, precision (use correct term), and high resolution (possible due to single point measurements instead of cones like radar, ToF, Ultrasonic) which enables more detailed mappings of the environment} +%\todo[inline, color=green!40]{due to point precision, lidar is sensitive to noise/degradation of airborne particles, which may produce early returns, deflections, errrors of light rays, this results in noise in the 3d point cloud and possibly missing data of the measurement behind the aerosol particle.} +%\todo[inline, color=green!40]{because of the given advantages of lidar it is most commonly used nowadays on robot platforms for environment mapping and navigiation - so we chose to demonstrate our method based on degraded data collected by a lidar sensor as 
discussed in more dtail in section (data section)} + +%\newsection{related_work}{Related Work} + +%\threadtodo +%{What other research has been done on this topic} +%{reader knows all background, what is starting point of research} +%{talk about rain degradation paper from automotive, cleaning pointclouds?} +%{Rain paper successful with DeepSAD $\rightarrow$ what is DeepSAD} \newchapter{deepsad}{Deep SAD: Semi-Supervised Anomaly Detection} diff --git a/thesis/figures/bg_lidar_principle_placeholder.png b/thesis/figures/bg_lidar_principle_placeholder.png new file mode 100644 index 0000000..4b17833 Binary files /dev/null and b/thesis/figures/bg_lidar_principle_placeholder.png differ