CQE
Certified Quality Engineer
Information
The Certified Quality Engineer is a professional who understands the principles of product and service quality evaluation and control. This body of knowledge and applied technologies include, but are not limited to, development and operation of quality control systems, application and analysis of testing and inspection procedures, the ability to use metrology and statistical methods to diagnose and correct improper quality control practices, an understanding of human factors and motivation, facility with quality cost concepts and techniques, and the knowledge and ability to develop and administer management information systems and to audit quality systems for deficiency identification and correction.

Proof of Professionalism
Proof of professionalism may be demonstrated in one of three ways:
• Membership in ASQ, an international affiliate society of ASQ, or another society that is a member of the American Association of Engineering Societies or the Accreditation Board for Engineering and Technology.
• Registration as a Professional Engineer.
• The signatures of two persons—ASQ members, members of an international affiliate society, or members of another recognized professional society—verifying that you are a qualified practitioner of the quality sciences.

Examination
Each certification candidate is required to pass a written examination that consists of multiple-choice questions that measure comprehension of the Body of Knowledge. The Quality Engineer examination is a one-part, 160-question, five-hour exam. It is offered in English.

Certified Quality Engineer Expectations
Will have a fundamental understanding of quality philosophies, principles, systems, methods, tools, standards, organizational and team dynamics, customer expectations and satisfaction, supplier relations and performance, leadership, training, interpersonal relationships, improvement systems, and professional ethics.

Will have a fundamental understanding of a quality system and its development, documentation, and implementation to domestic and international standards or requirements.

Will have a basic understanding of the audit process including types of audits, planning, preparation, execution, reporting results, and follow-up.

Will be able to develop and implement quality programs, including tracking, analyzing, reporting, and problem solving.

Will be able to plan, control, and assure product and process quality in accordance with quality principles, which include planning processes, material control, acceptance sampling, and measurement systems.

Will have basic knowledge of reliability, maintainability, and risk management, including key terms and definitions, modeling, systems design, assessment tools, and reporting.

Will have a thorough understanding of problem-solving and quality improvement tools and techniques. This includes knowledge of management and planning tools, quality tools, preventive and corrective actions, and how to overcome barriers to quality improvements.

Will be able to acquire and analyze data using appropriate standard quantitative methods across a spectrum of business environments to facilitate process analysis and improvements.

Education and/or Experience
You must have eight years of on-the-job experience in one or more of the areas of the Certified Quality Engineer Body of Knowledge. A minimum of three years of this experience must be in a decision-making position. “Decision-making” is defined as the authority to define, execute, or control projects/processes and to be responsible for the outcome. This may or may not include management or supervisory positions.

If you were ever certified by ASQ as a Quality Auditor, Reliability Engineer, Software Quality Engineer, or Manager, experience used to qualify for certification in these fields applies to certification as a Quality Engineer.

If you have completed a degree* from a college, university, or technical school with accreditation accepted by ASQ, part of the eight-year experience requirement will be waived, as follows (only one of these waivers may be claimed):
• Diploma from a technical or trade school—one year will be waived.
• Associate degree—two years waived.
• Bachelor’s degree—four years waived.
• Master’s or doctorate—five years waived.

*Degrees/diplomas from educational institutions outside the United States must be equivalent to degrees from U.S. educational institutions.

For comprehensive exam information on the Quality Engineer certification, visit www.asq.org/certification.
I Management and Leadership (15 Questions)

A. Quality Philosophies and Foundations
   Explain how modern quality has evolved from quality control through statistical process control (SPC) to total quality management and leadership principles (including Deming’s 14 Points), and how quality has helped form various continuous improvement tools including lean, six sigma, theory of constraints, etc. (Remember)

B. The Quality Management System (QMS)
   1. Strategic planning
      Identify and define top management’s responsibility for the QMS, including establishing policies and objectives, setting organization-wide goals, supporting quality initiatives, etc. (Apply)
   2. Deployment techniques
      Define, describe, and use various deployment tools in support of the QMS: benchmarking, stakeholder identification and analysis, performance measurement tools, and project management tools such as PERT charts, Gantt charts, critical path method (CPM), resource allocation, etc. (Apply)
   3. Quality information system (QIS)
      Identify and define the basic elements of a QIS, including who will contribute data, the kind of data to be managed, who will have access to the data, the level of flexibility for future information needs, data analysis, etc. (Remember)

C. ASQ Code of Ethics for Professional Conduct
   Determine appropriate behavior in situations requiring ethical decisions. (Evaluate)

D. Leadership Principles and Techniques
   Describe and apply various principles and techniques for developing and organizing teams and leading quality initiatives. (Analyze)

E. Facilitation Principles and Techniques
   Define and describe the facilitator’s role and responsibilities on a team. Define and apply various tools used with teams, including brainstorming, nominal group technique, conflict resolution, force-field analysis, etc. (Analyze)

F. Communication Skills
   Describe and distinguish between various communication methods for delivering information and messages in a variety of situations across all levels of the organization. (Analyze)

G. Customer Relations
   Define, apply, and analyze the results of customer relation measures such as quality function deployment (QFD), customer satisfaction surveys, etc. (Analyze)

H. Supplier Management
   Define, select, and apply various techniques including supplier qualification, certification, evaluation, ratings, performance improvement, etc. (Analyze)

I. Barriers to Quality Improvement
   Identify barriers to quality improvement, their causes and impact, and describe methods for overcoming them. (Analyze)

II The Quality System (15 Questions)

A. Elements of the Quality System
   Define, describe, and interpret the basic elements of a quality system, including planning, control, and improvement, from product and process design through quality cost systems, audit programs, etc. (Evaluate)

B. Documentation of the Quality System
   Identify and apply quality system documentation components, including quality policies, procedures to support the system, configuration management and document control to manage work instructions, quality records, etc. (Apply)

C. Quality Standards and Other Guidelines
   Define and distinguish between national and international standards and other requirements and guidelines, including the Malcolm Baldrige National Quality Award (MBNQA), and describe key points of the ISO 9000 series of standards and how they are used. [Note: Industry-specific standards will not be tested.] (Apply)

D. Quality Audits
   1. Types of audits
      Describe and distinguish between various types of quality audits such as product, process, management (system), registration (certification), compliance (regulatory), first, second, and third party, etc. (Apply)
   2. Roles and responsibilities in audits
      Identify and define roles and responsibilities for audit participants such as audit team (leader and members), client, auditee, etc. (Understand)
   3. Audit planning and implementation
      Describe and apply the steps of a quality audit, from the audit planning stage through conducting the audit, from the perspective of an audit team member. (Apply)
   4. Audit reporting and follow-up
      Identify, describe, and apply the steps of audit reporting and follow-up, including the need to verify corrective action. (Apply)

E. Cost of Quality (COQ)
   Identify and apply COQ concepts, including cost categories, data collection methods and classification, and reporting and interpreting results. (Analyze)

F. Quality Training
   Identify and define key elements of a training program, including conducting a needs analysis, developing curricula and materials, and determining the program’s effectiveness. (Apply)

III Product and Process Design (25 Questions)

A. Classification of Quality Characteristics
   Define, interpret, and classify quality characteristics for new products and processes. [Note: The classification of product defects is covered in IV.B.3.] (Evaluate)

B. Design Inputs and Review
   Identify sources of design inputs such as customer needs, regulatory requirements, etc., and how they translate into design concepts such as robust design, QFD, and Design for X (DFX, where X can mean six sigma (DFSS), manufacturability (DFM), cost (DFC), etc.). Identify and apply common elements of the design review process, including roles and responsibilities of participants. (Analyze)

C. Technical Drawings and Specifications
   Interpret technical drawings including characteristics such as views, title blocks, dimensioning, tolerancing, GD&T symbols, etc. Interpret specification requirements in relation to product and process characteristics. (Evaluate)

D. Design Verification
   Identify and apply various evaluations and tests to qualify and validate the design of new products and processes to ensure their fitness for use. (Evaluate)

E. Reliability and Maintainability
   1. Predictive and preventive maintenance tools
      Describe and apply these tools and techniques to maintain and improve process and product reliability. (Analyze)
   2. Reliability and maintainability indices
      Review and analyze indices such as MTTF, MTBF, MTTR, availability, failure rate, etc. (Analyze)
   3. Bathtub curve
      Identify, define, and distinguish between the basic elements of the bathtub curve. (Analyze)
   4. Reliability/Safety/Hazard Assessment Tools
      Define, construct, and interpret the results of failure mode and effects analysis (FMEA), failure mode, effects, and criticality analysis (FMECA), and fault tree analysis (FTA). (Analyze)

IV Product and Process Control (32 Questions)

A. Tools
   Define, identify, and apply product and process control methods such as developing control plans, identifying critical control points, developing and validating work instructions, etc. (Analyze)

B. Material Control
   1. Material identification, status, and traceability
      Define and distinguish these concepts, and describe methods for applying them in various situations. [Note: Product recall procedures will not be tested.] (Analyze)
   2. Material segregation
      Describe material segregation and its importance, and evaluate appropriate methods for applying it in various situations. (Evaluate)
   3. Classification of defects
      Define, describe, and classify the seriousness of product and process defects. (Evaluate)
   4. Material review board (MRB)
      Identify the purpose and function of an MRB, and make appropriate disposition decisions in various situations. (Analyze)

C. Acceptance Sampling
   1. Sampling concepts
      Define, describe, and apply the concepts of producer and consumer risk and related terms, including operating characteristic (OC) curves, acceptable quality limit (AQL), lot tolerance percent defective (LTPD), average outgoing quality (AOQ), average outgoing quality limit (AOQL), etc. (Analyze)
   2. Sampling standards and plans
      Interpret and apply ANSI/ASQ Z1.4 and Z1.9 standards for attributes and variables sampling. Identify and distinguish between single, double, multiple, sequential, and continuous sampling methods. Identify the characteristics of Dodge-Romig sampling tables and when they should be used. (Analyze)
   3. Sample integrity
      Identify the techniques for establishing and maintaining sample integrity. (Analyze)

D. Measurement and Test
   1. Measurement tools
      Select and describe appropriate uses of inspection tools such as gage blocks, calipers, micrometers, optical comparators, etc. (Analyze)
   2. Destructive and nondestructive tests
      Distinguish between destructive and nondestructive measurement test methods and apply them appropriately. (Analyze)

E. Metrology
   Identify, describe, and apply metrology techniques such as calibration systems, traceability to calibration standards, measurement error and its sources, and control and maintenance of measurement standards and devices. (Analyze)

F. Measurement System Analysis (MSA)
   Calculate, analyze, and interpret repeatability and reproducibility (Gage R&R) studies, measurement correlation, capability, bias, linearity, etc., including both conventional and control chart methods. (Evaluate)

V Continuous Improvement (30 Questions)

A. Quality Control Tools
   Select, construct, apply, and interpret tools such as 1) flowcharts, 2) Pareto charts, 3) cause and effect diagrams, 4) control charts, 5) check sheets, 6) scatter diagrams, and 7) histograms. (Analyze)

B. Quality Management and Planning Tools
   Select, construct, apply, and interpret tools such as 1) affinity diagrams, 2) tree diagrams, 3) process decision program charts (PDPC), 4) matrix diagrams, 5) interrelationship digraphs, 6) prioritization matrices, and 7) activity network diagrams. (Analyze)

C. Continuous Improvement Techniques
   Define, describe, and distinguish between various continuous improvement models: total quality management (TQM), kaizen, Plan-Do-Check-Act (PDCA), six sigma, theory of constraints (TOC), lean, etc. (Analyze)

D. Corrective Action
   Identify, describe, and apply elements of the corrective action process including problem identification, failure analysis, root cause analysis, problem correction, recurrence control, verification of effectiveness, etc. (Evaluate)

E. Preventive Action
   Identify, describe, and apply various preventive action tools such as error-proofing/poka-yoke, robust design, etc., and analyze their effectiveness. (Evaluate)

VI Quantitative Methods and Tools (43 Questions)

A. Collecting and Summarizing Data
   1. Types of data
      Define, classify, and compare discrete (attributes) and continuous (variables) data. (Apply)
   2. Measurement scales
      Define, describe, and use nominal, ordinal, interval, and ratio scales. (Apply)
   3. Data collection methods
      Describe various methods for collecting data, including tally or check sheets, data coding, automatic gaging, etc., and identify their strengths and weaknesses. (Apply)
   4. Data accuracy
      Describe the characteristics or properties of data (e.g., source/resource issues, flexibility, versatility, etc.) and various types of data errors or poor quality such as low accuracy, inconsistency, interpretation of data values, and redundancy. Identify factors that can influence data accuracy, and apply techniques for error detection and correction. (Apply)
   5. Descriptive statistics
      Describe, calculate, and interpret measures of central tendency and dispersion (central limit theorem), and construct and interpret frequency distributions including simple, categorical, grouped, ungrouped, and cumulative. (Evaluate)
   6. Graphical methods for depicting relationships
      Construct, apply, and interpret diagrams and charts such as stem-and-leaf plots, box-and-whisker plots, etc. [Note: Run charts and scatter diagrams are covered in V.A.] (Analyze)
   7. Graphical methods for depicting distributions
      Construct, apply, and interpret diagrams such as normal probability plots, Weibull plots, etc. [Note: Histograms are covered in V.A.] (Analyze)

B. Quantitative Concepts
   1. Terminology
      Define and apply quantitative terms, including population, parameter, sample, statistic, random sampling, expected value, etc. (Analyze)
   2. Drawing statistical conclusions
      Distinguish between numeric and analytical studies. Assess the validity of statistical conclusions by analyzing the assumptions used and the robustness of the technique used. (Evaluate)
   3. Probability terms and concepts
      Describe and apply concepts such as independence, mutually exclusive, multiplication rules, complementary probability, joint occurrence of events, etc. (Apply)

C. Probability Distributions
   1. Continuous distributions
      Define and distinguish between these distributions: normal, uniform, bivariate normal, exponential, lognormal, Weibull, chi square, Student’s t, F, etc. (Analyze)
   2. Discrete distributions
      Define and distinguish between these distributions: binomial, Poisson, hypergeometric, multinomial, etc. (Analyze)

D. Statistical Decision-Making
   1. Point estimates and confidence intervals
      Define, describe, and assess the efficiency and bias of estimators. Calculate and interpret standard error, tolerance intervals, and confidence intervals. (Evaluate)
   2. Hypothesis testing
      Define, interpret, and apply hypothesis tests for means, variances, and proportions. Apply and interpret the concepts of significance level, power, type I and type II errors. Define and distinguish between statistical and practical significance. (Evaluate)
   3. Paired-comparison tests
      Define and use paired-comparison (parametric) hypothesis tests, and interpret the results. (Apply)
   4. Goodness-of-fit tests
      Define and use chi square and other goodness-of-fit tests, and interpret the results. (Apply)
   5. Analysis of variance (ANOVA)
      Define and use ANOVAs and interpret the results. (Analyze)
   6. Contingency tables
      Define, construct, and use contingency tables to evaluate statistical significance. (Analyze)

E. Relationships Between Variables
   1. Linear regression
      Calculate the regression equation for simple regressions and least squares estimates. Construct and interpret hypothesis tests for regression statistics. Use regression models for estimation and prediction, and analyze the uncertainty in the estimate. [Note: Nonlinear models and parameters will not be tested.] (Analyze)
   2. Simple linear correlation
      Calculate the correlation coefficient and its confidence interval, and construct and interpret a hypothesis test for correlation statistics. [Note: Serial correlation will not be tested.] (Analyze)
   3. Time-series analysis
      Define, describe, and use time-series analysis including moving average, and interpret time-series graphs to identify trends and seasonal or cyclical variation. (Analyze)

F. Statistical Process Control (SPC)
   1. Objectives and benefits
      Identify and explain objectives and benefits of SPC such as assessing process performance. (Understand)
   2. Common and special causes
      Describe, identify, and distinguish between these types of causes. (Analyze)
   3. Selection of variable
      Identify and select characteristics for monitoring by control chart. (Analyze)
   4. Rational subgrouping
      Define and apply the principles of rational subgrouping. (Apply)
   5. Control charts
      Identify, select, construct, and use various control charts, including X̄-R, X̄-s, individuals and moving range (ImR or XmR), moving average and moving range (MamR), p, np, c, u, and CUSUM charts. (Analyze)
   6. Control chart analysis
      Read and interpret control charts, and use rules for determining statistical control. (Evaluate)
   7. PRE-control charts
      Define and describe how these charts differ from other control charts and how they should be used. (Apply)
   8. Short-run SPC
      Identify, define, and use short-run SPC rules. (Apply)

G. Process and Performance Capability
   1. Process capability studies
      Define, describe, calculate, and use process capability studies, including identifying characteristics, specifications, and tolerances, developing sampling plans for such studies, establishing statistical control, etc. (Analyze)
   2. Process performance vs. specifications
      Distinguish between natural process limits and specification limits, and calculate percent defective. (Analyze)
   3. Process capability indices
      Define, select, and calculate Cp, Cpk, Cpm, and Cr, and evaluate process capability. (Evaluate)
   4. Process performance indices
      Define, select, and calculate Pp and Ppk and evaluate process performance. (Evaluate)

H. Design and Analysis of Experiments
   1. Terminology
      Define terms such as dependent and independent variables, factors, levels, response, treatment, error, and replication. (Understand)
   2. Planning and organizing experiments
      Define, describe, and apply the basic elements of designed experiments, including determining the experiment objective, selecting factors, responses, and measurement methods, choosing the appropriate design, etc. (Analyze)
   3. Design principles
      Define and apply the principles of power and sample size, balance, replication, order, efficiency, randomization, blocking, interaction, and confounding. (Apply)
   4. One-factor experiments
      Construct one-factor experiments such as completely randomized, randomized block, and Latin square designs, and use computational and graphical methods to analyze the significance of results. (Analyze)
   5. Full-factorial experiments
      Construct full-factorial designs and use computational and graphical methods to analyze the significance of results. (Analyze)
   6. Two-level fractional factorial experiments
      Construct two-level fractional factorial designs (including Taguchi designs) and apply computational and graphical methods to analyze the significance of results. (Analyze)
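As a worked illustration of the acceptance-sampling concepts above (producer/consumer risk and OC curves), the sketch below evaluates two points on the operating characteristic curve of a single sampling plan under the usual binomial lot model. The plan parameters (n = 50, c = 1) and the defect rates are illustrative assumptions, not values from any standard.

```python
from math import comb

def prob_accept(n, c, p):
    """Probability of accepting a lot under a single sampling plan
    (inspect n items, accept if at most c are defective), using the
    binomial model: Pa = sum_{d=0}^{c} C(n,d) * p^d * (1-p)^(n-d)."""
    return sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(c + 1))

# Hypothetical plan: n = 50, c = 1; evaluate two points on its OC curve
pa_good = prob_accept(50, 1, 0.01)   # lot submitted at 1% defective
pa_bad = prob_accept(50, 1, 0.10)    # lot submitted at 10% defective
```

A good lot (1% defective) is accepted roughly 91% of the time, while a poor lot (10% defective) is almost always rejected; sweeping p over a range of values traces the full OC curve.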
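The confidence-interval calculation named under Statistical Decision-Making can be sketched in a few lines: x̄ ± t·s/√n. The sample data are hypothetical, and the t critical value (2.262 for 95% confidence with 9 degrees of freedom) is taken from a standard t-table rather than computed.

```python
import math
import statistics

def mean_confidence_interval(data, t_crit):
    """Two-sided confidence interval for the population mean:
    x_bar +/- t * s / sqrt(n), with t taken from a t-table for the
    chosen confidence level and n - 1 degrees of freedom."""
    n = len(data)
    xbar = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)   # standard error of the mean
    return xbar - t_crit * se, xbar + t_crit * se

# Illustrative sample of 10 measurements; 2.262 is t(0.025, df=9)
sample = [12.1, 11.8, 12.3, 12.0, 11.9, 12.2, 12.0, 12.1, 11.7, 12.0]
low, high = mean_confidence_interval(sample, t_crit=2.262)
```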
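The X̄-R control chart construction listed under Statistical Process Control can be sketched as follows. The subgroup data are hypothetical; the constants A2, D3, and D4 come from standard SPC factor tables for subgroup size n = 5.

```python
import statistics

# Shewhart chart factors for subgroup size n = 5 (standard SPC tables)
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Compute X-bar and R chart centerlines and control limits
    from a list of rational subgroups of equal size."""
    xbars = [statistics.mean(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = statistics.mean(xbars)   # grand average: X-bar chart centerline
    rbar = statistics.mean(ranges)     # average range: R chart centerline
    return {
        "xbar_CL": xbarbar,
        "xbar_UCL": xbarbar + A2 * rbar,
        "xbar_LCL": xbarbar - A2 * rbar,
        "R_CL": rbar,
        "R_UCL": D4 * rbar,
        "R_LCL": D3 * rbar,
    }

# Hypothetical data: 4 subgroups of 5 measurements each
data = [
    [5.0, 5.2, 4.9, 5.1, 4.8],
    [5.1, 5.0, 5.2, 4.9, 5.0],
    [4.9, 5.1, 5.0, 5.2, 5.1],
    [5.0, 4.8, 5.1, 5.0, 4.9],
]
limits = xbar_r_limits(data)
```

Subgroup means plotted against `xbar_UCL`/`xbar_LCL` monitor process centering; ranges plotted against `R_UCL`/`R_LCL` monitor dispersion.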
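The capability indices Cp and Cpk from the Process and Performance Capability section can be made concrete with a minimal sketch. The measurements and specification limits below are hypothetical, and the usual prerequisites apply: the process must be in statistical control and approximately normal.

```python
import statistics

def cp_cpk(data, lsl, usl):
    """Compute Cp and Cpk from sample data.

    Cp  = (USL - LSL) / (6 * sigma)                  -- potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  -- actual capability,
    where sigma is the sample standard deviation. Cpk < Cp indicates the
    process is off-center within its specification limits.
    """
    mean = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative measurements against hypothetical spec limits 9.0-11.0
measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
cp, cpk = cp_cpk(measurements, lsl=9.0, usl=11.0)
```

Here the sample mean sits exactly at the spec midpoint, so Cp and Cpk coincide; shifting the mean toward either limit would lower Cpk while leaving Cp unchanged.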
Levels of Cognition
Based on Bloom’s Taxonomy – Revised (2001)
In addition to content specifics, the subtext for
each topic in this BOK also indicates the intended
complexity level of the test questions for that
topic. These levels are based on “Levels of
Cognition” (from Bloom’s Taxonomy – Revised,
2001) and are presented below in rank order,
from least complex to most complex.
Remember (Knowledge Level) Recall or recognize
terms, definitions, facts, ideas, materials,
patterns, sequences, methods, principles, etc.
Understand (Comprehension Level) Read and
understand descriptions, communications, reports,
tables, diagrams, directions, regulations, etc.
Apply (Application Level) Know when and how
to use ideas, procedures, methods, formulas,
principles, theories, etc.
Analyze (Analysis Level) Break down information
into its constituent parts and recognize their
relationship to one another and how they are
organized; identify sublevel factors or salient
data from a complex scenario.
Evaluate (Evaluation Level) Make judgments
about the value of proposed ideas, solutions,
etc., by comparing the proposal to specific
criteria or standards.
Create (Synthesis Level) Put parts or elements
together in such a way as to reveal a pattern or
structure not clearly there before; identify which
data or information from a complex set are
appropriate to examine further or from which
supported conclusions can be drawn.
Item B0050