Assessing the effect of lithological setting, block characteristics and slope topography on the runout length of rockfalls in the Alps and on the island of La Réunion

2020 ◽  
Author(s):  
Kerstin Wegner ◽  
Florian Haas ◽  
Tobias Heckmann ◽  
Anne Mangeney ◽  
Virginie Durand ◽  
...  

Abstract. In high mountain regions, rockfalls are common processes that transport material volumes of varying size and thereby endanger populated areas and infrastructure. In four study areas with different lithological settings, LiDAR (light detection and ranging) data were acquired for a morphometric analysis of block sizes, block shapes and talus cone characteristics. Based on these high-resolution terrestrial laser scanning (TLS) data, the three axes of every block larger than 0.5 m in the referenced point cloud were measured. Block sizes and shapes were then related to runout distances and to the spatial distribution of blocks on the talus cones. We also investigate the influence of terrain parameters such as slope inclination, roughness and profile curvature (longitudinal profiles). Our study shows that the relationship between block size and runout length across different lithological settings is complex, as we can neither confirm nor reject the theory of gravitational sorting. We also found that block shape (axial ratio) does not have a simple influence on runout length: it acts as a moderating parameter in two study sites (Gampenalm: GA; Dreitorspitze: DTS), while we could not confirm this for Piton de la Fournaise (PF) and the Zwieselbach valley (ZBT). The derived roughness values show a clear difference between the four study sites; the same applies to slope inclination and the longitudinal profiles.
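
Since each block measurement reduces to three axis lengths, the shape parameter mentioned above can be illustrated in a few lines. The following sketch is hypothetical (the axis values and ratio definitions are assumptions, not taken from the paper) and simply derives axial ratios that separate equant from platy or elongated blocks:

```python
import numpy as np

# Hypothetical block measurements: longest (a), intermediate (b) and
# shortest (c) axis in metres, as they could be digitised from a TLS
# point cloud (values are illustrative, not from the study).
blocks = np.array([
    [1.2, 0.9, 0.6],   # roughly equant block
    [0.8, 0.7, 0.7],   # cube-like block
    [2.1, 0.6, 0.4],   # elongated block
])

a, b, c = blocks.T

# Simple axial ratios: values near 1 indicate equant (cube-like)
# blocks, small values indicate platy or elongated blocks.
elongation = b / a   # intermediate vs. longest axis
flatness = c / b     # shortest vs. intermediate axis

for axes, e, f in zip(blocks, elongation, flatness):
    print(f"axes={axes}, elongation={e:.2f}, flatness={f:.2f}")
```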

2021 ◽  
Vol 21 (3) ◽  
pp. 1159-1177
Author(s):  
Kerstin Wegner ◽  
Florian Haas ◽  
Tobias Heckmann ◽  
Anne Mangeney ◽  
Virginie Durand ◽  
...  

Abstract. In four study areas with different lithological settings and rockfall activity, lidar data were used for a morphometric analysis of block sizes, block shapes and talus cone characteristics. This information served to investigate the dependencies between block size, block shape and lithology on the one hand and runout distances on the other. We were able to show that lithology seems to influence block size and shape, and that gravitational sorting did not occur on all of the studied debris cones; other parameters apparently control the runout length of boulders. One such parameter seems to be block shape, which acts as a moderating parameter in two of the four study sites, while we could not confirm this for the other study sites. We also investigated the influence of terrain parameters such as slope inclination, profile curvature and roughness. The derived roughness values show a clear difference between the four study sites and seem to be a good proxy for the block size distribution on the talus cones; they could therefore be used in further studies to analyse a larger sample of talus cones with different lithologies.
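
The abstract does not state which roughness measure was derived; a common DEM-based choice, assumed here purely for illustration, is the local standard deviation of elevation within a moving window:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_roughness(dem, size=5):
    # Roughness as the standard deviation of elevation within a
    # size x size window (one of several common definitions; the
    # paper's exact measure is not given in the abstract).
    mean = uniform_filter(dem, size=size)
    mean_sq = uniform_filter(dem * dem, size=size)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

# Synthetic 1 m resolution talus-cone DEM, just to run the function.
rng = np.random.default_rng(0)
dem = np.linspace(0, 50, 100)[:, None] + rng.normal(0, 0.2, (100, 100))
print(local_roughness(dem).mean())
```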


Author(s):  
M. Rutzinger ◽  
M. Bremer ◽  
B. Höfle ◽  
M. Hämmerle ◽  
R. Lindenbergh ◽  
...  

The 2nd international summer school “Close-range sensing techniques in Alpine terrain” was held in July 2017 in Obergurgl, Austria. Participants were trained in selected close-range sensing methods, such as photogrammetry, laser scanning and thermography. The program included keynotes, lectures and hands-on assignments combining field project planning, data acquisition, processing, quality assessment and interpretation. Close-range sensing was applied to different research questions of environmental monitoring in high mountain environments, such as geomorphologic process quantification, natural hazard management and vegetation mapping. The participants completed an online questionnaire evaluating the summer school, its content and organisation; this feedback will help to improve future summer schools.


2022 ◽  
pp. 811-822
Author(s):  
B.V. Dhandra ◽  
Satishkumar Mallappa ◽  
Gururaj Mukarambi

In this article, an exhaustive experiment is carried out to test the performance of Segmentation-based Fractal Texture Analysis (SFTA) features with nt = 4 and nt = 8 threshold pairs, geometric features, and their combinations. A unified algorithm is designed to identify the scripts of camera-captured bilingual document images containing the international language English together with one of the Hindi, Kannada, Telugu, Malayalam, Bengali, Oriya, Punjabi and Urdu scripts. The SFTA algorithm decomposes the input image into a set of binary images, from which the fractal dimensions of the resulting regions are computed in order to describe the segmented texture patterns. This motivates the use of SFTA features as texture features for identifying the scripts of camera-based document images, which are affected by non-homogeneous illumination and varying resolution. An experiment is carried out on eleven scripts, each with 1000 sample images, at block sizes of 128 × 128, 256 × 256, 512 × 512 and 1024 × 1024. It is observed that a block size of 512 × 512 gives the maximum accuracy of 86.45% for the Gujarati and English script combination and is therefore the optimal size. The novelty of this article is that a unified algorithm is developed for script identification in bilingual document images.
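
As a rough illustration of the SFTA pipeline described above, the sketch below decomposes a grey-level image with multi-Otsu thresholds and computes a box-counting fractal dimension per binary channel. It is a simplification under stated assumptions: the published algorithm uses two-threshold pairs and region border images, whereas this sketch uses single thresholds and whole regions, and the function names are mine:

```python
import numpy as np
from skimage.filters import threshold_multiotsu

def box_counting_dimension(binary):
    # Estimate the fractal dimension of a binary image by counting
    # occupied boxes at several box sizes and fitting a power law.
    sizes = [2, 4, 8, 16, 32]
    counts = []
    h, w = binary.shape
    for s in sizes:
        view = binary[:h - h % s, :w - w % s]
        boxes = view.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(max(int(boxes.sum()), 1))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

def sfta_like_features(gray, nt=4):
    # Simplified SFTA: one binary image per Otsu threshold; each is
    # described by fractal dimension, mean grey level and pixel count.
    feats = []
    for t in threshold_multiotsu(gray, classes=nt + 1):
        binary = gray > t
        feats += [box_counting_dimension(binary),
                  float(gray[binary].mean()) if binary.any() else 0.0,
                  int(binary.sum())]
    return np.array(feats)
```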


Author(s):  
Ada H. V. Repetto-Llamazares ◽  
Ove T. Gudmestad ◽  
Arne Gürtner ◽  
Knut V. Høyland

When studying ice interaction with sloped structures, a key parameter usually reported after experiments and observations, at full scale or model scale, is the breaking length associated with ice failure. Moreover, for both numerical modeling and load calculations, the size of the ice blocks that accumulate as rubble during ice-structure interaction is of importance. In this paper, image analysis has been used to obtain values of the breaking length and the ice block sizes generated during model tests of Shoulder Ice Barrier (SIB)-ice interaction. The model tests were performed in the Hamburg Ship Model Basin (HSVA) in July 2007. Since the SIB represents a new concept in ice barrier structures, the model tests were intended to evaluate its general performance. A brief description of the model tests and of the image analysis technique used to analyze the data is given. A total of five experiments, in which ice thickness, ice flexural strength and shoulder inclination were varied, are analyzed. Results of the breaking length analysis show a characteristic change in breaking length associated with the transition from ice interaction with the bare structure (Phase 1) to interaction with accumulated rubble (Phase 2). Average values of the breaking length in both phases are presented for each experiment. Since information on breaking length for structures that accumulate rubble is sparse, the experimental results of Phase 1, where rubble accumulation is still small, are compared with predictions from three models in the literature for sloped structures under similar ice conditions that do not accumulate rubble. The comparison indicates that the breaking phenomenon is reasonably well modeled in the experiments. The block sizes in the upper layer of the accumulated rubble were analyzed, and the block length and width distributions were determined for each experiment. A linear trend was found between block size and ice thickness. A linear fit of the data was performed to obtain simple equations giving an upper limit for the length and width of the ice blocks generated during SIB-ice interaction as a function of ice thickness. The results may also apply to ice interaction with sloped structures in general.
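
The upper-limit equations mentioned above can be reproduced in spirit with a least-squares line shifted to bound the data. The numbers below are made up for illustration; only the procedure (linear fit plus envelope offset) mirrors what the abstract describes:

```python
import numpy as np

# Hypothetical (ice thickness, block length) pairs in metres, standing
# in for values digitised from the model tests (not the real data).
h = np.array([0.02, 0.03, 0.04, 0.05, 0.06])
L = np.array([0.10, 0.16, 0.21, 0.27, 0.31])

# Least-squares linear fit L = k * h + d ...
k, d = np.polyfit(h, L, 1)

# ... shifted upward so it bounds all observations, giving a simple
# upper-limit envelope of block length versus ice thickness.
offset = np.max(L - (k * h + d))
print(f"upper limit: L <= {k:.2f} * h + {d + offset:.3f}")
```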


2018 ◽  
Vol 10 (11) ◽  
pp. 1677
Author(s):  
Virpi Junttila ◽  
Tuomo Kauranne

Remotely sensed data-based models used in operational forest inventory usually give precise and accurate predictions on average, but they often suffer from systematic under- or over-estimation of extreme attribute values, resulting in too narrow or skewed attribute distributions. We use a post-processing method based on the statistics of a proper, representative training set to correct the predictions and their probability intervals, attaining corrected predictions that reproduce the statistics of the whole population. Performance of the method is validated with three forest attributes from seven study sites in Finland, with training set sizes from 50 to over 400 field plots. The results are compared to those of the uncorrected predictions given by linear models using airborne laser scanning data. The post-processing method improves the linear fit between the predictions and the reference set by 35.4–51.8% and the distribution fit by 44.5–95.0%. The prediction root mean square error declines on average by 6.3%. The systematic under- and over-estimation are reduced consistently for all training set sizes. The level of uncertainty is maintained well, as the probability intervals cover the real uncertainty while keeping the average probability interval width similar to that of the uncorrected predictions.
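
The abstract does not spell out the correction itself; one standard way to make predictions reproduce training-set statistics, shown here purely as an assumed stand-in, is empirical quantile mapping:

```python
import numpy as np

def quantile_map(predictions, reference):
    # Replace each prediction by the reference-distribution value at
    # the prediction's empirical quantile, so the corrected values
    # reproduce the spread of the (training) reference set.
    ranks = np.argsort(np.argsort(predictions))
    q = (ranks + 0.5) / len(predictions)
    return np.quantile(reference, q)

rng = np.random.default_rng(1)
truth = rng.gamma(4.0, 50.0, 500)              # e.g. plot-level volume
pred = 0.6 * truth + 0.4 * truth.mean()        # compressed to the mean
pred = pred + rng.normal(0.0, 15.0, truth.size)

corrected = quantile_map(pred, truth[:100])    # 100-plot training set
print(pred.std(), corrected.std(), truth.std())  # spread is restored
```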


2011 ◽  
Vol 11 (3) ◽  
pp. 807-817 ◽  
Author(s):  
H. T. Nguyen ◽  
T. M. Fernandez-Steeger ◽  
T. Wiatr ◽  
D. Rodrigues ◽  
R. Azzam

Abstract. This study focuses on the application of terrestrial laser scanning (TLS), a modern and widely used technique, to investigate volcanic rock slopes in the Ribeira de João Gomes valley (Funchal, Madeira Island). The TLS data acquisition in May and December 2008 provided information for a characterization of the volcanic environment, a detailed structural analysis and the detection of potentially unstable rock masses on a slope. Using this information, it was possible to determine specific parameters for numerical rockfall simulations, such as average block size, block shape and potential source areas. By including additional data such as surface roughness, the results of the numerical rockfall simulations allowed us to classify different hazardous areas based on runout distances, frequency of impacts and the related kinetic energy. Monitoring of these hazardous areas can subsequently be performed in order to establish a rockfall inventory.
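
The paper relies on full numerical rockfall simulations; as a first-order companion, runout distances are often sanity-checked with the empirical energy-line (Fahrböschung) concept. The sketch below uses typical literature travel angles, not results from this study:

```python
import math

def energy_line_runout(drop_height_m, travel_angle_deg):
    # Energy-line estimate: a block comes to rest where a line dipping
    # at the travel angle from the source meets the ground, so the
    # horizontal runout is drop height / tan(travel angle).
    return drop_height_m / math.tan(math.radians(travel_angle_deg))

# 150 m source cliff and typical rockfall travel angles (assumed).
for angle in (28, 32, 35):
    print(f"{angle} deg -> {energy_line_runout(150.0, angle):.0f} m")
```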


1960 ◽  
Vol 40 (2) ◽  
pp. 396-404
Author(s):  
I. L. Nonnecke

In 1957, vine and shelled pea weights of canning peas from an irrigated uniformity trial were recorded to determine the effect of varying plot and block sizes and shapes on yield variability. With each increase in plot length, the most uniform reduction in variation occurred in blocks one plot long and six plots wide. These results agree with those of other workers in showing that long, narrow blocks are more efficient than square blocks. The optimum plot size was found to be 5 feet long and 10 feet wide. Considerably more shelled peas were required for processing than could be obtained from a plot of the optimum size for yield.
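
Uniformity-trial comparisons of block shapes can be mimicked with a toy computation: aggregate basic-plot yields into blocks of a given shape and compare the variability among blocks. The grid, the fertility gradient and the block shapes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic uniformity trial: 12 x 12 basic plots with random noise
# plus a fertility gradient along one direction (all values invented).
yields = rng.normal(100.0, 10.0, (12, 12)) + np.linspace(0, 8, 12)[:, None]

def block_cv(y, rows, cols):
    # Coefficient of variation among block totals for blocks of
    # shape rows x cols basic plots.
    h, w = y.shape
    trimmed = y[:h - h % rows, :w - w % cols]
    totals = trimmed.reshape(h // rows, rows, w // cols, cols).sum(axis=(1, 3))
    return totals.std() / totals.mean()

# Long narrow blocks vs. square-ish blocks of the same area (6 plots).
print("1 x 6 blocks:", round(block_cv(yields, 1, 6), 4))
print("2 x 3 blocks:", round(block_cv(yields, 2, 3), 4))
```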


2014 ◽  
Vol 24 (03) ◽  
pp. 1441006 ◽  
Author(s):  
Tobias Weinzierl ◽  
Michael Bader ◽  
Kristof Unterweger ◽  
Roland Wittmann

Spacetrees are a popular formalism for describing dynamically adaptive Cartesian grids. Even though they directly yield a mesh, it is often computationally reasonable to embed regular Cartesian blocks into their leaves, as this promotes stencils working on homogeneous data chunks. The choice of a proper block size is delicate: while large block sizes foster loop parallelism and vectorisation, they restrict the granularity of the adaptivity and hence increase the memory footprint and lower the numerical accuracy per byte. In the present paper, we therefore use a multiscale spacetree-block coupling admitting blocks on all spacetree nodes. We propose to find sets of blocks on the finest scale throughout the simulation and to replace them by fused big blocks. Such a replacement strategy can pick up hardware characteristics, i.e. which block size yields the highest throughput, while the dynamic adaptivity of the fine grid mesh is not constrained: applications can work with fine granular blocks. We study the fusion with a state-of-the-art shallow water solver on an Intel Sandy Bridge and a Xeon Phi processor, where we examine how they respond to selected block optimisation and vectorisation.
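
A bottom-up fusion pass of the kind proposed here can be sketched in a few lines. The sketch assumes a k-ary 2-D spacetree with leaves addressed as (level, i, j); k = 3 mimics tripartitioning in Peano-style spacetrees, but the data structure and function are mine, not the paper's:

```python
def fuse_blocks(leaves, k=3):
    # Greedy bottom-up fusion: whenever all k*k sibling cells of a
    # parent node carry leaf blocks, replace them by one fused block
    # on the parent, and repeat until nothing changes.
    fused = set(leaves)
    changed = True
    while changed:
        changed = False
        parents = {(l - 1, i // k, j // k) for (l, i, j) in fused if l > 0}
        for (l, pi, pj) in parents:
            children = {(l + 1, pi * k + di, pj * k + dj)
                        for di in range(k) for dj in range(k)}
            if children <= fused:
                fused -= children
                fused.add((l, pi, pj))
                changed = True
    return fused

# A fully refined 9 x 9 patch of level-2 leaves fuses into one block.
leaves = {(2, i, j) for i in range(9) for j in range(9)}
print(fuse_blocks(leaves))  # -> {(0, 0, 0)}
```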


2022 ◽  
Author(s):  
Lukas Winiwarter ◽  
Katharina Anders ◽  
Daniel Schröder ◽  
Bernhard Höfle

Abstract. 4D topographic point cloud data contain information on surface change processes and their spatial and temporal characteristics, such as the duration, location and extent of mass movements, e.g., rockfalls or debris flows. To automatically extract and analyse change and activity patterns from these data, methods that consider both the spatial and the temporal properties are required. The commonly used M3C2 point cloud distance reduces uncertainty through spatial averaging for bitemporal analysis. To extend this concept into the full 4D domain, we use a Kalman filter for point cloud change analysis. The filter incorporates M3C2 distances together with uncertainties obtained through error propagation as Bayesian priors in a dynamic model. The Kalman filter yields a smoothed estimate of the change time series for each spatial location, again associated with an uncertainty. Through the temporal smoothing, the Kalman filter uncertainty is generally lower than the individual bitemporal uncertainties, which allows more change to be detected as significant. In our example, a time series of bi-hourly terrestrial laser scanning point clouds spanning around 6 days (71 epochs) of a rockfall-affected high-mountain slope in Tyrol, Austria, we are able to almost double the number of points where change is deemed significant (from 14.9 % to 28.6 % of the area of interest). Since the Kalman filter allows interpolation and, under certain constraints, also extrapolation of the time series, the estimated change values can be temporally resampled. This can be critical for subsequent analyses that cannot deal with missing data, as may be caused by, e.g., foggy or rainy weather conditions. We demonstrate two different clustering approaches that transform the 4D data into 2D map visualisations which can be easily interpreted by analysts. By comparison with two state-of-the-art 4D point cloud change methods, we highlight the main advantage of our method: the extraction of a smoothed best-estimate time series for the change at each location. A main disadvantage remains: spatially overlapping change objects cannot be detected in a single pass. In conclusion, the combined consideration of temporal and spatial data enables a notable reduction in the uncertainty associated with the quantified change value for each point in space and time, in turn allowing more information to be extracted from the 4D point cloud dataset.
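
A minimal 1-D version of such a filter, assuming a constant-velocity process model and treating the M3C2 uncertainties as per-epoch measurement noise, could look as follows. Note that this sketch runs only the forward filtering pass; the smoothed estimate reported in the paper would additionally require a backward (e.g. Rauch-Tung-Striebel) pass, and all parameter values here are assumptions:

```python
import numpy as np

def kalman_filter_change(d, sigma, dt=2.0, q=1e-5):
    # State x = [change, change rate]; d holds the M3C2 distances per
    # epoch and sigma their propagated 1-sigma uncertainties.
    F = np.array([[1.0, dt], [0.0, 1.0]])          # state transition
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])            # process noise
    H = np.array([[1.0, 0.0]])                     # only change observed
    x = np.array([d[0], 0.0])
    P = np.diag([sigma[0]**2, 1.0])
    est, est_sd = [], []
    for z, s in zip(d, sigma):
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + s**2                     # update
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        est.append(x[0])
        est_sd.append(np.sqrt(P[0, 0]))
    return np.array(est), np.array(est_sd)

# Synthetic bi-hourly series over ~6 days (71 epochs), slow creep.
rng = np.random.default_rng(3)
t = np.arange(71) * 2.0
d = 0.002 * t + rng.normal(0.0, 0.01, 71)
est, sd = kalman_filter_change(d, np.full(71, 0.01))
print(sd.mean() < 0.01)  # filtered uncertainty below raw M3C2 sigma
```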


2020 ◽  
Vol 14 (9) ◽  
pp. 2925-2940 ◽  
Author(s):  
César Deschamps-Berger ◽  
Simon Gascoin ◽  
Etienne Berthier ◽  
Jeffrey Deems ◽  
Ethan Gutmann ◽  
...  

Abstract. Accurate knowledge of snow depth distributions in mountain catchments is critical for applications in hydrology and ecology. Recently, a method was proposed to map snow depth at meter-scale resolution from very-high-resolution stereo satellite imagery (e.g., Pléiades) with an accuracy close to 0.5 m. However, the validation was limited to probe measurements and unmanned aerial vehicle (UAV) photogrammetry, which sampled only a limited fraction of the topographic and snow depth variability. We improve upon this evaluation using accurate maps of snow depth derived from Airborne Snow Observatory laser-scanning measurements in the Tuolumne river basin, USA. We find good agreement between the two datasets over a snow-covered area of 138 km2 on a 3 m grid, with a positive bias in the Pléiades snow depth of 0.08 m, a root mean square error of 0.80 m and a normalized median absolute deviation (NMAD) of 0.69 m. The satellite data capture the relationship between snow depth and elevation at the catchment scale, as well as small-scale features like snow drifts and avalanche deposits at a typical scale of tens of meters. The random error at the pixel level is lower in snow-free areas than in snow-covered areas, but it is reduced by a factor of 2 (NMAD of approximately 0.40 m for snow depth) when averaged to a 36 m grid. We conclude that satellite photogrammetry stands out as a convenient method to estimate the spatial distribution of snow depth in high mountain catchments.
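
The error metrics quoted above are straightforward to reproduce given two co-registered snow-depth rasters; the helper below is a generic sketch (array names and the synthetic data are assumptions), with NMAD defined as 1.4826 times the median absolute deviation from the median difference:

```python
import numpy as np

def snow_depth_error_stats(satellite, lidar):
    # Bias, RMSE and NMAD of the per-pixel difference between two
    # co-registered snow-depth maps; NaNs mark missing data.
    diff = (satellite - lidar).ravel()
    diff = diff[np.isfinite(diff)]
    bias = diff.mean()
    rmse = np.sqrt(np.mean(diff**2))
    nmad = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return bias, rmse, nmad

# Synthetic 3 m grid tiles standing in for Pleiades and ASO rasters.
rng = np.random.default_rng(4)
lidar = rng.gamma(2.0, 1.0, (200, 200))
satellite = lidar + 0.08 + rng.normal(0.0, 0.8, lidar.shape)
print(snow_depth_error_stats(satellite, lidar))
```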

