Abstract. Realistic visualization is crucial for a more intuitive representation of complex data, medical imaging, simulation, and entertainment systems. In this respect, multiview autostereoscopic displays are a great step toward achieving a completely immersive user experience, although providing high-quality content for these types of displays remains a great challenge. Due to the differing characteristics and settings of the cameras in a multiview setup and the varying photometric characteristics of the objects in the scene, the same object may have a different appearance in the sequences acquired by different cameras. In practice, images representing views recorded by different cameras have different local noise, color, and sharpness characteristics. View synthesis algorithms introduce artifacts due to errors in disparity estimation, poor occlusion handling, or erroneous estimation of the warping function. If the input multiview images are of insufficient quality and have mismatched color and sharpness characteristics, these artifacts may become even more disturbing. Accordingly, the main goal of our method is to simultaneously perform multiview image sequence denoising, color correction, and sharpening of slightly defocused regions. Results show that the proposed method significantly reduces artifacts in multiview video sequences, resulting in a better visual experience.
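The abstract does not specify the color-correction step; a common minimal approach to equalizing per-view color statistics is cumulative-histogram matching of one view onto a reference view. The sketch below is an illustrative assumption, not the paper's actual method, and the parameter names are hypothetical:

```python
import numpy as np

def match_histogram(source, reference):
    """Map the intensity distribution of `source` onto that of
    `reference` via CDF matching -- one simple way to reduce color
    mismatch between views before view synthesis."""
    src_vals, src_counts = np.unique(source.ravel(), return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source intensity, pick the reference intensity whose
    # cumulative probability is closest (linear interpolation).
    mapped = np.interp(src_cdf, ref_cdf, ref_vals.astype(float))
    lookup = dict(zip(src_vals, mapped))
    return np.vectorize(lookup.get)(source)
```

In a multiview pipeline this would be applied per color channel, with the central view typically chosen as the reference.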
Record type: conference. Authors: Patrick De Smet [801001015325], Ghent University, PatrickR.DeSmet@UGent.be; Johan De Bock [801001748380], Ghent University, Johan.DeBock@UGent.be; Quang Luong [801001749087], Ghent University, Hiep.Luong@UGent. ...
Although the use of SEM/EDX equipment for GunShot Residue (GSR) analysis was already introduced in the 1980s and has since been used for decades in the search for microscopic primer particles, the technique has continuously evolved. Part of the drive for this ongoing development comes from the continuous changes in the composition of ammunition primers. Recently, munition manufacturers have been moving away from the 'classic' compositions containing (heavy) metals, introducing primers that contain no metallic elements. This recent innovation in particular poses severe problems for modern analysis systems. The commercial GSR analysis software depends on the BackScattered Electron signal of the metal GSR particles to set them apart from the Environmental Particles (EP), which are present in abundance on any sampler. However, as the mean Z of these metal-free GSR particles will approach that of the EP, the standard procedures and the parameter settings of these search algorithms will probably fail. Although, as a partial solution, other signals could be used for the detection of the relevant particles, such as Secondary Electrons or Cathodoluminescence, a much larger number of potential GSR particles will have to be analysed, because a number of EP will also be selected as potential GSR particles. Finally, the EDX classification algorithms may encounter problems in discerning GSR particles from EP because of their similar chemical composition. The use of Big Data Analysis (BDA) techniques is a novel approach in the GSR field, which may yield a solution for a number of the problems posed by these new primers. In order to implement these BDA techniques, a database of GSR particles is compiled, together with databases of EP.
Against these 'Ground Truth' databases, a test sample's particle populations can be compared as a group, potentially yielding a shortlist of munition types that produce similar particle groups. In order to develop and test these techniques, databases were compiled using classic munition data that was readily available from case samples. In this presentation, the preliminary results of this study, which involves researchers from three Belgian universities, all working within the iMinds ICON BAHAMAS project (2015-2016), will be discussed.
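The population-level comparison described above can be sketched as ranking reference munition types by the similarity of their particle-class histograms to the test sample. The class names, descriptors, and similarity measure below are illustrative assumptions; real GSR comparison would operate on full EDX spectra:

```python
import numpy as np

# Hypothetical particle classes used to bin a sample's particles.
CLASSES = ["PbBaSb", "BaSb", "Pb", "TiZn", "organic", "other"]

def population_vector(particle_classes):
    """Normalized class-frequency histogram describing one sample."""
    counts = np.array([particle_classes.count(c) for c in CLASSES], float)
    return counts / counts.sum()

def shortlist(sample, ground_truth, k=2):
    """Rank reference munition types by cosine similarity of their
    particle-population histograms to the test sample's histogram."""
    v = population_vector(sample)
    scores = {}
    for name, particles in ground_truth.items():
        g = population_vector(particles)
        scores[name] = float(np.dot(v, g) /
                             (np.linalg.norm(v) * np.linalg.norm(g)))
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Here `ground_truth` maps a munition type name to the particle classes observed for that type; the top-k entries form the shortlist.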
In this paper we describe a novel approach to image interpolation that preserves sharp edge information. Many interpolation methods have already been proposed in the literature, but they suffer from one or more artifacts such as aliasing, blurring, and ringing. Non-linear or edge-directed methods produce sharp interpolated images, but these often look segmented or show severe visual degradation in finely structured textures. In this paper we concentrate on tackling blurred edges by mapping the image's level curves. An image's level curves, or isophotes, are spatial curves of constant intensity. The mapping of these intensity levels can be seen as a local contrast enhancement problem, so we can rely on contrast enhancement techniques. A great advantage of this approach is that the shape of the level curves (and hence of the objects) is preserved, and no explicit edge detection is needed. Additional constraints specific to image interpolation are defined within this flexible framework. Different strategies for extending greyscale interpolation to colour images are also discussed. The results show a large improvement in visual quality: the edges are sharper and ringing effects are removed.
Proceedings of the ... International Conference on Document Analysis and Recognition, Sep 1, 2007
Abstract In this paper we present a novel method for reconstructing low-resolution text images. Unlike conventional interpolation methods, the unknown pixel value is not estimated from its local surrounding neighbourhood, but from the whole text image. In ...
Abstract The huge amount of incoming synthetic aperture radar (SAR) data nowadays demands automatic image registration. Due to the presence of speckle noise and the huge size of SAR images, registering SAR images is more difficult than traditional ...