Vladislav Kaplan
Raanana, Tel Aviv, Israel

New approach for LER/LWR PSD calculations
Abstract
Monitoring of pattern roughness for advanced technology nodes is crucial, as this roughness can adversely affect device yield and degrade device performance. The main industry workhorse for in-line roughness measurements is the CD-SEM; however, no adequate reference metrology tools exist today that allow evaluation of its roughness measurement sensitivity and precision. To bridge this gap, this work investigates the roughness measurement capabilities of different analytical techniques. Different metrology methods are used to evaluate roughness on the same set of samples, and the results are compared and used in a holistic approach to better characterize and quantify the measured pattern roughness. To facilitate correlation between the various metrology techniques and evaluation of CD-SEM sensitivity, an effective approach is to induce pattern roughness in a controlled way by adding well-defined levels of roughness to the designed patterns on an EUV mask, and to measure the response and sensitivity of the CD-SEM and the other techniques to these different pattern roughness levels once printed on wafers. This paper presents the roughness measurement results obtained with various metrology technologies, including CD-SEM, OCD, S-TEM, and XCD, on EUV lithography patterned wafers both post-lithography and post-etch. The benefits of recently developed metrology enhancements are demonstrated as well: automated TEM allows generation of accurate and rather precise reference roughness data, machine learning enables OCD-based roughness metrology with good correlation to CD-SEM and STEM, and the improved sensitivity of EUV and X-ray scattering systems allows extraction of roughness information that correlates to CD-SEM.
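The entry does not include the PSD calculation itself; as a hedged illustration of the kind of computation involved, here is a minimal sketch of a line-width-roughness PSD from evenly sampled width measurements (the windowing and normalization choices are our assumptions, not the paper's method):

```python
import numpy as np

def lwr_psd(widths, step_nm):
    """One-sided power spectral density of line-width roughness from
    width samples taken at a constant step along the line (in nm)."""
    w = np.array(widths, dtype=float)
    w -= w.mean()                          # roughness = deviation from mean CD
    n = w.size
    win = np.hanning(n)                    # taper to limit spectral leakage
    spec = np.fft.rfft(w * win)
    # periodogram normalization: integrating the PSD recovers the variance
    psd = 2.0 * step_nm * np.abs(spec) ** 2 / np.sum(win ** 2)
    freq = np.fft.rfftfreq(n, d=step_nm)   # spatial frequency in 1/nm
    return freq, psd

# The commonly quoted 3-sigma LWR is then simply 3 * np.std(widths).
```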
Project summary
Application of improved Steganographic Genetic Shifting algorithm against RS analysis
Vadim Porinson, Vladislav Kaplan (MScEE)
Abstract
Steganography is a "science", a method of hiding transmitted information. It embeds the secret message in cover media (image, audio, video, text, etc.). The most popular steganography method is LSB (Least Significant Bit) replacement in the cover image. The most notable steganalysis algorithm is the RS method [1], which detects stego-messages by statistical analysis applied to image pixels. The goal of the project is to demonstrate the effectiveness of the improved GSM algorithm proposed by Shen Wang et al. [2] against RS analysis.
Keywords: steganography, steganalysis, LSB, digital image.
1. Introduction and problem definition
In the modern world, information has great value. With the appearance of global computer networks, the volume of transmitted and received information has increased, and a great deal of data is transferred over global networks. As a result of this easy accessibility to information, sometimes highly sensitive information, there is a need to protect data security and prevent unauthorized access. On the other hand, with advancements in digital communication technology and the growth of computing power and storage, ensuring individual privacy becomes increasingly challenging. Data, intellectual property, and privacy protection is a thorny problem that we face on a daily basis. Various methods have been investigated and developed for data protection and personal privacy; encryption is probably the most obvious one, and then comes steganography. Steganography is the art and science of invisible communication, accomplished by hiding information within other information, thus hiding the very existence of the communicated message. The word steganography is derived from the Greek words "stegos" meaning "cover" and "grafia" meaning "writing", defining it as "covered writing". In general, steganographic approaches hide a message in a cover (text, image, audio file, etc.) in such a way that it is assumed to look innocent and therefore does not raise suspicion [3]. Besides transferring secret information or embedding secret messages into media, an important and promising application of steganography is protecting intellectual property and copyright on digital media, images, and books, to prevent unauthorized copying and theft. The main purpose of this work is to study LSB-based steganographic and steganalysis methods.
The first part of the work implements and studies the Fridrich RS algorithm [1]. The second part introduces a modified "Genetic Shifting Algorithm" based on the method proposed by Shen Wang [2]: a way of embedding a secret message into a digital image without causing visual degradation of the cover/stego image while avoiding detection of the stego-message by the RS analysis algorithm.
In contrast to Shen Wang's steganography method, which manipulates bits of the final stego-image, this work deals with the original cover image. Changes are made in the cover image with the aim of "worsening" its bit statistics; as a result of these permutations, embedding the secret message produces "positive" statistical changes that hinder RS analysis from determining that a message exists.
2. Project steps
The current project objectives are:
 Perform comparative visual and statistical analysis for different message lengths.
 Check what message length can be embedded into a cover image without visual or statistical image degradation.
 Check the dependence of image degradation on embedded message length.
 Define a new stego-optimized Genetic Shifting Algorithm.
 Confirm the effectiveness of the new method against the Fridrich RS algorithm for grayscale images.
All experiments and research were performed in the LabVIEW environment and include the following steps:
1. Build a working LabVIEW-based steganography model for embedding a text message into a digital image with the LSB embedding algorithm (a code sketch follows this list). According to the LSB method, any text and any image can be converted into binary form; after that, the least significant bits of the image can be replaced by bits of the message. After successful LSB-1 encoding and image recovery, perform the same experiment on the same images up to LSB-4. LSB-1 and LSB-4 denote the depth of bit replacement: LSB-1 uses only the single least significant bit, while LSB-4 uses the four least significant bits.
2. Perform basic message coding (cover image) and recovery up to LSB-4 for grayscale images.
Figure 1 demonstrates the message embedding steps.
Figure 1. Basic diagram for message embedding.
3. Compare visual image degradation.
4. Compare visual degradation using common tools (histogram, STD). With the help of these statistical tools, perform image degradation analysis versus LSB level and embedded message volume.
5. Study coded-message saturation (messages of different lengths) versus recovery and image degradation for the different LSB coding depths on grayscale images.
6. Build an RS analysis (Fridrich algorithm) [1] routine.
7. Confirm the validity of RS analysis on grayscale images.
Figure 2 demonstrates the resulting RS analysis plot.
Figure 2. RS analysis plot.
8. Implement the secure genetic steganography method for RS baseline shifting at LSB-1 (GSM for RS shifting).
9. Perform basic message recovery with GSM for RS shifting at LSB-1.
10. Compare RS analysis results for different message lengths with and without GSM for RS shifting, using different "snake" division array representations of the image.
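For concreteness, here is a minimal sketch of the LSB-1 embedding and recovery from step 1, written in Python as a generic illustration rather than the project's LabVIEW implementation (function and variable names are ours):

```python
import numpy as np

def embed_lsb1(cover, bits):
    """Replace the least significant bit of the first len(bits) pixels
    (row-major order) with the message bits. cover: 2-D uint8 array."""
    flat = cover.flatten()                 # copy; the cover stays untouched
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb1(stego, n_bits):
    """Read back the first n_bits least significant bits."""
    return stego.flatten()[:n_bits] & 1

# Usage: embed the text "hello" and recover it intact.
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
bits = np.unpackbits(np.frombuffer(b"hello", dtype=np.uint8))
stego = embed_lsb1(cover, bits)
assert np.array_equal(extract_lsb1(stego, bits.size), bits)
```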
3. Results and discussion
This project met all the objectives targeted at the start of the work:
 A working LabVIEW-based model for encoding/decoding a secret message into an image was implemented.
 Visual image degradation was studied by the naked eye and compared using common tools (histogram, STD).
 The Fridrich RS analysis algorithm was studied.
 The validity of the RS algorithm for detecting the presence of a secret message in grayscale images was checked and proven.
 A strong dependence of visual image degradation on message length (volume) and on the depth of LSB-level manipulation was determined.
 The existing Genetic Shifting algorithm of Shen Wang et al. [2] was defined and improved.
 The ability of the proposed Shifting algorithm to withstand an RS attack was checked and proven.
4. Summary and conclusions
In most original digital images there is a high correlation between adjacent pixels [2]; any bit manipulation worsens this correlation, which is the reason for the histogram's high sensitivity to bit replacement. At the same time, LSB-1 manipulation does not dramatically impact the image histogram, and when no clean image histogram is available for comparison, it is impossible to determine from the histogram alone whether a stego-message exists. Using the collected statistical data, we can determine with high probability the existence of an embedded message and its approximate length. In other words, an Rm − Sm difference of less than 7% (under normal conditions) indicates LSB manipulation with high probability. Going deeper into the LSB levels improves the ability of RS analysis to detect embedded messages, but in that case a visual attack becomes preferable and easy.
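As a rough illustration of the statistic behind that 7% figure, here is a simplified sketch of the regular/singular group counting at the core of RS analysis [1]. It uses only the positive flipping mask on fixed row-major groups; Fridrich's full method also evaluates the negative mask and the shifted flipping, which are omitted here:

```python
import numpy as np

def rs_statistics(image, mask=(0, 1, 1, 0)):
    """Fractions of regular (Rm) and singular (Sm) pixel groups under
    LSB flipping, per the RS discrimination function f(G) = sum of
    absolute differences between neighboring pixels in a group."""
    m = np.array(mask, dtype=np.uint8)
    flat = image.flatten()
    groups = flat[: flat.size // m.size * m.size].reshape(-1, m.size)
    flipped = groups ^ m                              # flip LSB where mask == 1
    f_before = np.abs(np.diff(groups.astype(np.int32), axis=1)).sum(axis=1)
    f_after = np.abs(np.diff(flipped.astype(np.int32), axis=1)).sum(axis=1)
    rm = float(np.mean(f_after > f_before))           # regular groups
    sm = float(np.mean(f_after < f_before))           # singular groups
    return rm, sm

# For a clean image Rm >> Sm; LSB embedding pushes the two fractions
# together, so a small Rm - Sm gap flags likely LSB manipulation.
```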
5. Literature
[1] Jessica Fridrich and Miroslav Goljan, "Practical Steganalysis of Digital Images – State of the Art", SUNY Binghamton, Department of Electrical Engineering, Binghamton, NY 13902-6000.
[2] Shen Wang, Bian Yang and Xiamu Niu, "A Secure Steganography Method Based on Genetic Algorithm", School of Computer Science and Technology, Harbin Institute of Technology.
[3] E.L. Zorin and N.V. Chichvarin, "Steganography", Bauman Moscow State Technical University, Faculty of Information and Management, Department of Information Security, Moscow, 2011.
[4] E.V. Selantev, "Fundamentals of Computer Steganography", Moscow Institute of Electronic Engineering, Faculty of Information, Moscow, 2009.
[5] C.P. Sumathi, T. Santanam and G. Umamaheswari, "A Study of Various Steganographic Techniques Used for Information Hiding", International Journal of Computer Science & Engineering Survey (IJCSES), Vol. 4, No. 6, December 2013.
[6] Arooj Nissar and A.H. Mir, "Classification of Steganalysis Techniques", Department of Information Technology, National Institute of Technology, Srinagar 190006, India.
[7] National Instruments, "Getting Started with LabVIEW", June 2009, 373427F-01.
[8] "A Survey of Steganographic Techniques", Chapter 3. http://bilder.buecher.de/zusatz/22/22359/22359536_lese_1.pdf
The significant challenges created by the introduction of advanced flash technologies mean increasingly higher importance of daily maintenance procedures based on the Beam Alignment (BA) routine. Although CD-SEM tool manufacturers currently provide automated, self-adjusting BA software, the ability to support various generations of toolsets on a single platform still does not exist. The obvious alternative for older-generation toolsets still involved in manufacturing is an increase in matching procedures, requiring day-to-day comparison data as part of a preventive maintenance package. As a result, the amount of negative charging to which the Critical Dimension (CD) feature is exposed impacts its measurement stability. Conventional ways to support high demands (smaller delta) for CD matching lead to measurement instability and require measurement site renewal. In order to resolve this conflict between the desire to trace the matching performance of a tool on a daily basis and the disproportionate increase of negative charging impact on that same measurement, we turned to one of the basic characteristics of the CD-SEM: its resolution performance. Using non-repeatable image data from a Hitachi standard microscale, with the help of basic image processing operations, simple spatial techniques allow us to trace BA performance on a daily basis and thus effectively substitute the daily matching procedure. The developed approach simplifies tool maintenance, uses clear judgment criteria for equipment performance, and moreover traces resolution stability over a significant amount of time.
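The abstract does not specify the image-processing operations; as a hedged illustration, a simple spatial sharpness metric of the kind that could serve as a daily beam-alignment proxy might look like the sketch below (the gradient-energy metric and the tolerance are our assumptions, not the tool vendor's algorithm):

```python
import numpy as np

def sharpness_score(image):
    """Mean gradient energy of a microscale image. The score degrades
    when the beam defocuses or drifts out of alignment, so its trend
    can flag BA problems without a full matching procedure."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    return float(np.mean(gx ** 2 + gy ** 2))

def ba_within_spec(recent_scores, baseline, tol=0.10):
    """Daily judgment criterion: the mean of the latest scores must
    stay within a relative tolerance of the established baseline."""
    return abs(np.mean(recent_scores) - baseline) <= tol * baseline
```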
Significant challenges for various Critical Dimension (CD) measurement matching procedures are reaching comparable complexity as a result of the negative effects of roughness on the features. Due to the constant trend of feature reduction in integrated circuits, the impact of roughness is becoming more destructive for various sets of measurement algorithms. Commonly used attempts to increase magnification for pattern recognition in measurement mode can in turn detect higher deviation from predefined patterns and thus shift the placement of the measurement gate. The purpose of this paper is to discuss how to reduce the impact of measurement gate (MG) placement variation and filter acquired data using an edge-correlation approach. The essence of the approach is to create a set of width-correlation functions representing the particular feature under test and compare it to a "golden" one as a means of detecting uncorrelated scans, which in turn should be excluded from the overall computation of matching results. We describe the general approach for algorithm stepping and various techniques for judging the validity of measurement comparison. The presented approach is also of particular interest for determining specified tool performance for a predefined pattern recognition feature, as well as for studying pattern recognition algorithm robustness, which is of direct interest to the manufacturer. Precise matching estimation as part of Round Robin (RR) routines creates the possibility to work with a restricted amount of data and perform quick, reliable qualification procedures. This paper concentrates on a practical approach and uses both simulation data and actual measurement data, before and after the proposed optimization, taken by various generations of Hitachi tools (S-8840, S-9300, S-9380) in a production environment.
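A hedged sketch of the edge-correlation filtering idea described above: correlate each scan's width profile against the "golden" one and exclude scans whose correlation falls below a threshold (the Pearson correlation and the 0.8 cutoff are illustrative assumptions):

```python
import numpy as np

def width_correlation(profile, golden):
    """Pearson correlation between a measured width profile and the
    golden reference profile of the same feature."""
    p = (profile - profile.mean()) / profile.std()
    g = (golden - golden.mean()) / golden.std()
    return float(np.mean(p * g))

def filter_scans(profiles, golden, threshold=0.8):
    """Keep only scans that correlate well with the golden profile;
    the rest are excluded from the matching computation."""
    return [p for p in profiles if width_correlation(p, golden) >= threshold]
```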
Problem statement:
Challenges in achieving critical-layer CD stability in advanced flash processes are reaching significant complexity as a result of the tight process window imposed on the expected focus/dose variation in the lithography tool. The common way to perform feed-forward prediction of focus/dose variation is the creation of an FE (Focus/Exposure) map. In the current methodology, CD measurements are the only source for FE map construction. For critical layers, a significant fraction of the most sensitive features are created using OPC techniques, which in turn affect the shape/roughness of the targeted CD with minimal variation of focus/dose. Thus CD measurements can be highly unreliable as an FE map predictor under the commonly used quadratic approximation, and manual filtering of CD measurement data becomes a necessary condition. Furthermore, for some features the quadratic approximation itself raises additional concerns about focus shifts at different dose levels. Also, no well-defined techniques exist for online focus performance tracing and focus trend detection, partly due to the aforementioned approximation approach.
Solution:
The purpose of this paper is to discuss how to perform reliable feed-forward FE prediction taking into account the challenges of advanced flash processes. We introduce an additional variable for FE determination, the pattern recognition score, thus eliminating manual data preprocessing. We also modify the commonly used approximation techniques with the sole purpose of differentiating positive and negative focus trends, as part of a superposition of the classical FE map with score-based FE maps.
A description of the general approach for algorithm stepping, and of various techniques for judging measurement validity, is presented in the paper as well.
Benefits, results & summary:
Elimination of manual data preprocessing and construction of a reliable FE map predictor, which can in turn be used for online FE drift estimation as part of the routine DCCD/FCCD check, as well as a drastic reduction of FE measurements.
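A minimal sketch of the kind of quadratic focus/exposure fit the methodology relies on, extended with a pattern-recognition-score weight in place of manual filtering (the score-to-weight mapping is our illustrative assumption):

```python
import numpy as np

def fit_fe_map(focus, dose, cd, score):
    """Weighted least-squares fit of CD = f(focus, dose) with a full
    quadratic model. The pattern recognition score (0..100) down-weights
    unreliable measurements instead of filtering them by hand."""
    f, d = np.asarray(focus, float), np.asarray(dose, float)
    X = np.column_stack([np.ones_like(f), f, d, f * d, f ** 2, d ** 2])
    w = np.sqrt(np.asarray(score, float) / 100.0)   # score-based weights
    coef, *_ = np.linalg.lstsq(X * w[:, None], np.asarray(cd, float) * w,
                               rcond=None)
    return coef          # coefficients of the FE-map predictor
```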
Motivation:
Significant challenges for various CD measurement matching procedures are reaching comparable complexity as a result of the negative effects of roughness on the features. Due to the constant trend of feature reduction in integrated circuits, the impact of roughness is becoming more destructive for various sets of measurement algorithms. Commonly used attempts to increase magnification for pattern recognition in addressing mode can in turn detect higher deviation from predefined patterns and thus shift the placement of the measurement gate.
Description of the approach:
The purpose of this paper is to discuss how to reduce the impact of measurement gate placement variation and filter acquired data using an edge-correlation approach: creation of a width-correlation function representing the particular feature under test and its comparison to a "golden" one as a means of detecting uncorrelated scans, which in turn should be excluded from the overall computation of matching results.
We describe the general approach for algorithm stepping and various techniques for judging the validity of measurement comparison. The presented approach is also of particular interest for determining specified tool performance for a predefined pattern recognition feature, as well as for studying pattern recognition algorithm robustness, which is of direct interest to the manufacturer.
Evaluation of results:
Precise matching estimation as part of Round Robin routines creates the possibility to work with a restricted amount of data and perform quick, reliable qualification procedures.
This paper concentrates on a practical approach and uses both simulation data and actual measurement data, before and after the proposed optimization, taken by various generations of Hitachi tools (S-8840, S-9300, S-9380) in a production environment.
Abstract (short description): Common imaging metrology (litho metrology, defect metrology, optical inspection, etc.) is characterized by a high level of specialization and optimization for certain measurement conditions and requirements. Yet the measurement process produces an enormous amount of information in the form of the images taken during PR and PM, and a significant amount of this information is simply wasted. The question is whether we can use this additional information, encapsulated in the image itself, as a means of control over the process. The purpose of this project is to demonstrate a new approach to metrology: decision making based not just on the plain X/Y measurement, but on the image itself.
Project long description (free style, with/without charts/pictures, etc.): The common measurement practices embedded in SEM/optical tool algorithms are very limited in their ability to apprehend and use this vast amount of information, so the need for an advanced set of image processing (IP) functions is obvious. Because it is impossible to create additional IP functions in the embedded environment without manufacturer involvement, a much easier approach is to store the images acquired during PR and PM and run a set of IP functions over them. For example, litho metrology has a wide spectrum of measurement algorithms associated with litho tool parameters (line/space width, via/contact diameter, LER). Pattern recognition, as the step preceding measurement, is likewise tuned to recognize common lithography features and create a stable output to control the lithography process, and the key output parameter is the standard CD measurement. As an example, we could process the PR and PM images so as to compare them to POR images, thus creating a binary (good vs. bad) control over the process that is independent of the X/Y measurement control and able to detect non-litho issues/defects.
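A hedged sketch of that binary image-based check: compare a stored PR/PM image against the POR reference with normalized cross-correlation and flag the run when similarity drops (the metric and the 0.95 threshold are illustrative assumptions):

```python
import numpy as np

def image_based_verdict(image, reference, threshold=0.95):
    """Binary good/bad control independent of the X/Y CD measurement:
    normalized cross-correlation of the acquired PR/PM image against
    the POR reference; low similarity flags a non-litho issue."""
    a = image.astype(float)
    b = reference.astype(float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return bool(np.mean(a * b) >= threshold)
```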
The proposed system is created in the form of a framework supporting the basic requirements for a fab automated tool: user tracing and association, tool communication, an alarm and warning interface, mail communication, a template library, local SPC, and databases; it includes the National Instruments Vision Builder interface.
Project status: the depicted configuration is active and working.
Conclusion and expected gain: a totally new approach; the latest TFs show huge potential for inline monitoring of non-trivial process issues.
Problem statement:
The ability to command a set of CD-SEM tools in remote mode and instantly address any failures and issues during POR runs of products (pattern recognition failures, measurement failures, lot release, beam adjustment, etc.) is highly valuable in an HVM environment. CD-SEM manufacturers have been marketing such features for a fairly long period of time, but significant issues remain related to cost and flexibility (layout dependence, tool type restrictions, etc.).
Solution:
Development of a low-cost remote operation center that is virtually independent of the shortcomings listed above and has the basic operational capabilities of the manufacturers' systems, with a wide range of flexibility regarding layout and tool type independence.
Such a system was successfully developed in F18 and implemented for litho metrology operation.
Benefits, results & summary:
1. HC reduction.
2. Cost: up to 20 times less than OEM.
3. Flexibility: can work with any tool type where remote operation would be beneficial.
4. No impact on the local network.
In order to compare the performance of fuzzy APC against weighted-average (WA) APC, a simulation of the system was performed in LabVIEW.
Dose values were taken as the input variables; Focus values are also present but are not used in the simulation.
Membership functions were created for both the Dose and Focus variables.
The rules include both Dose and Focus impact, but the feedback loop updates only the Dose (a close simulation of fab litho tool activity).
The simulation does not include any translation of Dose values to CD values for a given Focus; it assumes that any inconsistencies appear as white noise or a trend in the final measurement.
WA APC was simulated as a 5-tag window with weights 0.35/0.25/0.2/0.14/0.06, which effectively matches the NSO exponential weighted-average approach.
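For reference, a minimal sketch of the 5-tag weighted-average controller described above, next to the exponentially weighted moving average it effectively matches (simulation scaffolding only; the fuzzy controller itself is not reproduced here):

```python
import numpy as np

WEIGHTS = np.array([0.35, 0.25, 0.20, 0.14, 0.06])   # newest tag first

def wa_apc(dose_history):
    """Weighted average over the last 5 dose observations."""
    recent = np.asarray(dose_history[-5:][::-1], float)   # newest first
    w = WEIGHTS[: recent.size]
    return float(np.sum(w * recent) / np.sum(w))

def ewma(dose_history, lam=0.35):
    """EWMA baseline; lam ~ 0.35 reproduces the window weights above."""
    est = float(dose_history[0])
    for x in dose_history[1:]:
        est = lam * x + (1.0 - lam) * est
    return est
```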
Comparison of denoising by classical optimized filters versus deep learning filters for CD-SEM metrology, and their impact on resolution, signal-to-noise ratio, and measurement.