
FACE MASK DETECTION USING PYTHON

1 – Introduction
The coronavirus COVID-19 pandemic has triggered a worldwide
health catastrophe, and the World Health Organization
recommends wearing a face mask in designated areas. Face
mask detection and hand sanitization have therefore become well-known
subjects in recent times, both in image processing and in
computer vision. Many new algorithms are being devised that
use convolutional architectures to make the detection as
accurate as possible. These convolutional architectures have made
it possible to capture even pixel-level nuances. We propose to design a
binary face classifier that can recognize every face in the frame,
independent of its alignment. We offer a method for
generating precise face segmentation masks from an input image of any
size. Training is carried out using Fully Convolutional
Networks to semantically segment the faces in the image.
The system is realized by combining a microprocessor board such as the
Raspberry Pi 3 Model B with a Pi camera, a relay, an infrared non-
contact temperature sensor, and other sensors, and thereafter
developing a model by connecting each of these components.
DECLARATION

We hereby declare that this submission is our own work and that, to the best of our knowledge and
belief, it contains no material previously published or written by another person, nor material
which to a substantial extent has been accepted for the award of any other degree or diploma of the university
or other institute of higher learning, except where due acknowledgement has been made in the text.

Signature: Shubham Pundir

Name: (Shubham Pundir)

Roll No:( A9929723000145(el))

Date: (04/01/2025)
2 – Flowchart

3 – Methodology
Methodology: Face Mask Detection Using Python

The methodology for developing a face mask detection system using Python involves a
systematic approach, combining image processing, machine learning, and computer vision
techniques. The process begins with data collection, where a dataset comprising images of
people with and without face masks is gathered. The dataset is then preprocessed, including
resizing images, normalizing pixel values, and performing data augmentation to enhance
model robustness.
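As an illustration, the preprocessing step described above could look like the minimal sketch below. The dataset layout (with_mask / without_mask sub-folders) and the 224x224 target size are taken from the training script in Section 6; the simple division by 255 is only one possible way of normalizing pixel values.

# Minimal preprocessing sketch (assumed layout: dataset/with_mask, dataset/without_mask)
import os
import numpy as np
from tensorflow.keras.preprocessing.image import load_img, img_to_array

def load_dataset(directory, categories=("with_mask", "without_mask"), size=(224, 224)):
    data, labels = [], []
    for label, category in enumerate(categories):
        folder = os.path.join(directory, category)
        for name in os.listdir(folder):
            image = load_img(os.path.join(folder, name), target_size=size)  # resize
            array = img_to_array(image) / 255.0   # normalize pixel values to [0, 1]
            data.append(array)
            labels.append(label)
    return np.array(data, dtype="float32"), np.array(labels)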

Next, feature extraction is carried out using a deep learning-based approach. Pretrained
convolutional neural networks (CNNs), such as MobileNetV2 or ResNet, are employed to
extract high-level features from the images. Transfer learning is applied by fine-tuning these
models on the face mask dataset, reducing training time while improving accuracy.
The core of the project is model training. The dataset is divided into training, validation, and
test sets, and the model is trained using a suitable optimizer (e.g., Adam) and a categorical
cross-entropy loss function. Key metrics such as accuracy, precision, recall, and F1-score are
monitored to evaluate performance.
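A condensed sketch of the transfer-learning and training steps described above is shown next. The 128-unit dense head, the frozen MobileNetV2 backbone, and the Adam optimizer mirror the full training script in Section 6; the commented fit call assumes the training arrays have already been prepared.

# Transfer-learning sketch: MobileNetV2 base + small classification head
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Input, AveragePooling2D, Flatten, Dense, Dropout
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

base = MobileNetV2(weights="imagenet", include_top=False,
                   input_tensor=Input(shape=(224, 224, 3)))
head = AveragePooling2D(pool_size=(7, 7))(base.output)
head = Flatten()(head)
head = Dense(128, activation="relu")(head)
head = Dropout(0.5)(head)
head = Dense(2, activation="softmax")(head)       # with_mask / without_mask

model = Model(inputs=base.input, outputs=head)
for layer in base.layers:                         # freeze the pretrained backbone
    layer.trainable = False

model.compile(optimizer=Adam(learning_rate=1e-4),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(trainX, trainY, validation_data=(valX, valY), epochs=20, batch_size=32)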

Finally, the trained model is integrated with a real-time face detection system using OpenCV.
This allows for real-time face mask detection through a webcam. Extensive testing is
conducted to ensure reliable detection across diverse conditions, including varying lighting,
angles, and face orientations.

This methodology ensures a scalable, accurate, and efficient face mask detection system.

Block diagram of the proposed model:

4 – Python
Python is a high-level, object-oriented, interpreted programming language. It was created
by Guido van Rossum in the late 1980s and first released in 1991.
Python's syntax is close to the English language, allowing developers to construct programs
with fewer lines than in some other programming languages. Python is an interpreter-based
language, which means that code can be executed as soon as it is written, so prototyping can be
done quickly.

Characteristics of Python :-

Following are the important characteristics of python programming -


• Python is a dynamic, high-level, free, open-source, and interpreted programming language.
• It supports object-oriented as well as procedure-oriented programming.
• It can be used as a scripting language or compiled to byte code for building large
applications.
• It provides very high-level dynamic data types and supports dynamic type checking.
• It supports automatic garbage collection.
• It can be easily integrated with C, C++, COM, ActiveX, CORBA, and Java.

Features of python:-

• Interpreted Language: Python is an interpreted language because Python code is
executed line by line. Unlike languages such as C, C++, and Java, there is no separate
compilation step, which makes it easier to debug our code. The source code of Python is
converted into an intermediate form called byte code.
• Easy to code: Python is a high-level programming language and is very easy to learn
compared to languages such as C, C#, JavaScript, and Java. It is very easy to code in
Python, and it is also a developer-friendly language.
• Portable: Python is also a portable language. For example, if we have Python
code for Windows and want to run it on another platform such as Linux, UNIX, or
Mac, we do not need to change it; we can run the same code on any platform.
• Extensible: Python is an extensible language. We can write some of our code in C or
C++, compile it, and call it from Python.
• GUI Programming Support: Graphical user interfaces can be made using modules such
as PyQt5, PyQt4, wxPython, or Tkinter. PyQt5 is one of the most popular options for creating
graphical apps with Python.
• Scalable: Python provides a better structure and support for large programs than shell
scripts.
• Integrated Language: Python is also an integrated language because we can easily
integrate Python with other languages such as C and C++.

WHY PYTHON?
 Python is an excellent cross-platform language and works on different platforms
(Windows, Mac, Linux, UNIX, Raspberry Pi, and so on).
 Python has a simple syntax similar to the English language, which allows developers to
write programs with fewer lines than some other programming languages.
 Python can be used in a procedural way, an object-oriented way, or a functional way.
 Python runs on an interpreter system, meaning that code can be executed as soon as it is
written. This means that prototyping can be very quick.
 The simple and straightforward formation of Python syntax also makes it popular.
 Python code can be run on a wide variety of hardware platforms through the same
interface.
 Python is a preferred high-level, server-side programming language for websites and
mobile apps.
 For both new and experienced developers, Python has managed to stay a language of
choice with ease.
 Python is also foraying into Big Data in a significant way.
Use of NumPy : -
NumPy is a Python package whose name stands for 'Numerical Python'. It is a library that includes
multidimensional array objects as well as routines for array processing.
Jim Hugunin created Numeric, the predecessor to NumPy. Another package, Numarray, was
also created, with some additional features.
Travis Oliphant created NumPy in 2005 by merging Numarray's features into the Numeric
code base. There are numerous contributors to this open-source project.

Operations with NumPy: NumPy allows developers to perform logical and mathematical
operations on arrays, as well as Fourier transforms and shape-manipulation routines, together
with operations related to linear algebra. NumPy has built-in functions for linear algebra and
random number generation.

Simple program to create a matrix: first we import the NumPy package, then pass a nested
list to a NumPy function to create a matrix. Many more operations can be performed in the
same way, such as taking the sine of the given values or creating a zero matrix; an image can
also be represented as an array, as shown in the sketch below.
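The following small illustrative script (not part of the original report) demonstrates these operations:

import numpy as np

# create a 2x3 matrix from a nested list
matrix = np.array([[1, 2, 3], [4, 5, 6]])
print(matrix.shape)          # (2, 3)

# element-wise sine of the matrix values
print(np.sin(matrix))

# a 3x3 matrix of zeros
print(np.zeros((3, 3)))

# an image can likewise be represented as an array of pixel values,
# e.g. a dummy 8x8 grayscale "image"
image = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
print(image)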

Data Structure in Python:-


The data stored in memory can be of many types. For example, a person's age is stored as a
numeric value and his or her address is stored as alphanumeric characters. Python has
various standard data types that define the operations possible on them and the
storage method for each of them. Python's standard data types include the following:

1. LISTS -
• Ordered collection of data or sequence of values.
• A list is a sequence type, so sequence operations are applicable.
• Supports slicing and indexing functionality similar to strings.
• Lists are mutable.
• Advantages of a list over a conventional array:
• Lists have no size or type constraints (no restrictions have to be set beforehand).
• They can contain different object types.
• Example:
my_list = ['one', 'two', 'three', 4, 5]
len(my_list) would output 5.
2. DICTIONARY-
• Lists are sequences, but dictionaries are mappings.
• They are mappings between a unique key and a value pair.
• Each key is separated from its value by a colon (:), the items are separated by
commas, and the whole thing is enclosed in curly braces.
• These mappings did not retain order in older Python versions (since Python 3.7,
dictionaries preserve insertion order).
• Constructing the dictionary.
• Accessing objects from the dictionary.
• Nesting dictionaries.
• Basic dictionary methods.
• Basic syntax:
d = {} generates an empty dictionary; keys and values can then be assigned to it, like
d['animal'] = 'Dog'
d = {'k1': 'v1', 'k2': 'v2'}
d['k1'] outputs 'v1'

3. TUPLES-
• A tuple is a sequence of immutable Python objects.
• Tuples are sequences, just like lists.
• The differences between tuples and lists are that tuples cannot be changed, unlike
lists, and tuples use parentheses, whereas lists use square brackets.
• Immutable in nature, i.e. they cannot be changed.
• No type restriction.
• Indexing and slicing work the same as for strings and lists.
• Constructing tuples.
• Basic tuple property: immutability.
• We can use tuples to represent things that shouldn't change, such as days of the week,
or dates on a calendar, etc.
• (In contrast, we can delete elements from a list by using del list_name[index_val].)

Built-in Tuple Functions:-


1. cmp(tuple1, tuple2): compares elements of both tuples (Python 2 only; removed in Python 3).
2. len(tuple): gives the total length of the tuple.
3. max(tuple): returns the item from the tuple with the maximum value.
4. min(tuple): returns the item from the tuple with the minimum value.
5. tuple(seq): converts a list into a tuple.
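A short Python 3 demonstration of these functions (with illustrative values):

t = (3, 1, 4, 1, 5)

print(len(t))             # 5 -> total length of the tuple
print(max(t))             # 5 -> largest item
print(min(t))             # 1 -> smallest item
print(tuple([9, 8, 7]))   # (9, 8, 7) -> converts a list into a tuple
# cmp(t1, t2) existed only in Python 2; in Python 3 use comparison operators instead
print((1, 2) < (1, 3))    # True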

4. SETS-
• A set contains unique, unordered elements, and we can construct one by using the
set() function.
• Converting a list into a set: given l = [1, 2, 3, 4, 1, 1, 2, 3, 6, 7],
k = set(l) makes k equal to {1, 2, 3, 4, 6, 7}.
• Basic syntax:
x = set()
x.add(1)
x is now {1}
x.add(1)
This second add makes no change in x, because a set only keeps unique elements.

Python – Basic Operators


Operators are the constructs which can manipulate the value of operands. In other words,
operators are special symbols that perform operations on two or more operands.

Consider the mathematical expression 4 + 5 = 9.

Here, '4' and '5' are called operands, and '+' and '=' are called operators.
Python supports the following types of operators, illustrated below:
1. Arithmetic Operators [+, -, *, /, %]
2. Assignment Operator [=]
3. Relational or Comparison Operators [>, <, ==, >=, <=, !=]
4. Logical Operators [and, or, not]
5. Bitwise Operators [~, &, |, ^, <<, >>]
6. Identity Operators [is, is not]
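A brief illustration of each operator category (example values chosen for demonstration only):

a, b = 4, 5

print(a + b, a % b)                 # arithmetic: 9 1
c = a
c += b                              # assignment / augmented assignment: c is now 9
print(a < b, a == b, a != b)        # comparison: True False True
print(a > 0 and b > 0, not a)       # logical: True False
print(a & b, a | b, a ^ b, a << 1)  # bitwise: 4 5 1 8
print(a is b, a is not b)           # identity: False True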
Python – Decision Making
Decision making is the anticipation of conditions occurring during execution of the program and
the specification of the actions to be taken according to those conditions. Decision structures evaluate
expressions which produce TRUE or FALSE as the outcome. You need to determine which action
to take and which statements to execute depending on whether the outcome is TRUE or FALSE.
Python assumes any non-zero and non-null value to be TRUE; if a value is
zero or null, it is assumed to be FALSE. The Python programming language
provides the following types of decision-making statements.
1. if statements:
An if statement consists of a Boolean expression followed by one or more statements.
Syntax:
if condition:
    statement   # executed only when the condition is true
The body of an if statement is a block, or in Python a suite (where you can write multiple statements).
In an if statement we must follow proper indentation when writing the code.
Example:

x = int(input("enter any number"))

if x > 0:
    print("Positive Number")
if x < 0:
    print("Negative Number")
if x == 0:
    print("Zero")

2. if-else statements:
An if statement can be followed by an optional else statement, which executes when the
Boolean expression is false.
Example:

x = 8
r = x % 2
if r == 0:
    print("Even")
else:
    print("Odd")

3. nested if statements:
A nested if statement uses one if or if-else statement inside another if or else
block (and can be followed by many optional elif or else statements).
Example:

x = 8
r = x % 2
if r == 0:
    print("Even")
    if x > 5:
        print("Great")
    else:
        print("Not so great!")
else:
    print("Odd")

Python – Strings
Strings are among the most popular types in Python. We can create them simply by
enclosing characters in quotes. Python treats single quotes the same as double quotes.
Creating strings is as simple as assigning a value to a variable.
For example:
var1 = "Hello World!"
var2 = "Python Programming"
Indexing-
 Strings can be indexed.
 The first character has index 0.
 Negative indices start counting from the right.
 Negative indices start from -1.
 -1 means the last character, -2 the second last, and so on.

Slicing-
 Slicing is used to obtain a substring.
 s[start:end] means the substring of s starting at index start and ending at index end - 1.
 s[0:len(s)] is the same as s.
 Both start and end are optional:
- If start is omitted, it defaults to 0.
- If end is omitted, it defaults to the length of the string.
 s[:] is the same as s[0:len(s)], essentially a copy of s.
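A short illustrative example of indexing and slicing (values chosen for demonstration):

s = "Python"

print(s[0], s[-1])     # 'P' 'n' -> first and last character
print(s[1:4])          # 'yth'   -> characters at indices 1, 2, 3
print(s[:3], s[3:])    # 'Pyt' 'hon' -> omitted start/end default to the ends
print(s[:] == s)       # True    -> s[:] is a copy of s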

Python string built-in methods-

1. capitalize(): capitalizes the first letter of the string.
2. center(width, fillchar): returns a padded string with the original string centered to a
total of width columns.
3. count(str, beg=0, end=len(string)): counts how many times str occurs in the string, or in a
substring of the string if starting index beg and ending index end are given.
4. isalnum(): returns True if the string has at least one character and all characters are alphanumeric,
and False otherwise.
5. isalpha(): returns True if the string has at least one character and all characters are alphabetic,
and False otherwise.
6. isdigit(): returns True if the string contains only digits, and False otherwise.
7. islower(): returns True if the string has at least one cased character and all cased characters are in
lowercase, and False otherwise.
8. isnumeric(): returns True if a Unicode string contains only numeric characters, and False
otherwise.
9. isspace(): returns True if the string contains only whitespace characters, and False otherwise.
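A few of these methods in action (illustrative string chosen for demonstration):

text = "face mask detection 2021"

print(text.capitalize())      # 'Face mask detection 2021'
print(text.center(30, "*"))   # padded with '*' to a total width of 30 columns
print(text.count("a"))        # 2
print("abc123".isalnum())     # True
print("abc".isalpha())        # True
print("2021".isdigit())       # True
print("   ".isspace())        # True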

Use of Pandas : -
Pandas is an open-source Python library licensed under BSD that provides high-performance,
user-friendly data structures and data analysis tools for the Python programming language.
Python with Pandas is used in a variety of sectors, both academic and commercial, such as
finance, economics, statistics, and analytics. Pandas is named after "panel data", a term from
econometrics for multidimensional structured data sets.

Key Features of Pandas:

• A fast and efficient DataFrame object with default and customizable indexing.
• Tools for loading data into in-memory data objects from various file formats.
• Data alignment and integrated handling of missing data.
• Reshaping and pivoting of data sets.
• Label-based slicing, indexing, and subsetting of large data sets.
• Columns in a data structure can be deleted or inserted.
• Grouping of data for aggregation and transformation.

Pandas works with the following three data structures:

1 – Series

2 – DataFrame

3 – Panel (deprecated and removed in recent pandas versions)

These data structures are built on top of NumPy arrays, which means they are fast.
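A small illustrative example of the first two structures (the data values are invented for demonstration):

import pandas as pd

# Series: one-dimensional labelled array
ages = pd.Series([23, 31, 27], index=["A", "B", "C"])

# DataFrame: two-dimensional labelled table
df = pd.DataFrame({
    "name": ["Asha", "Ben", "Chris"],
    "department": ["IT", "HR", "IT"],
    "experience_years": [2, 5, 3],
})

print(ages.mean())                                           # 27.0
print(df[df["department"] == "IT"])                          # label-based subsetting
print(df.groupby("department")["experience_years"].mean())   # grouping / aggregation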

Use of Plotly: -
Plotly is another great Python module for building interactive and publication-quality visuals. It
supports a wide range of chart types and customization possibilities, making it ideal for
building interactive dashboards, web apps, and presentations. Plotly is an open-source data
visualization module that supports a variety of graphs such as line charts, scatter plots, bar
charts, histograms, and area plots. The graphs it creates are interactive, can be embedded in
websites, and offer a wide range of advanced plotting choices. Here are some of its main uses
and features:

1- Interactivity: Plotly specializes in creating interactive visualizations that allow users to explore
data points, zoom in on specific areas, and reveal extra information on hover.
2- Scatter plots: Plotly supports scatter plot customization with markers, colors, sizes,
and tooltips for each data point.
3- Bar charts: Plotly allows you to build interactive bar charts with options for stacked, grouped,
and horizontal bars, as well as color and annotation customization.
4- Heatmaps: Plotly can create interactive heatmaps to visualize matrix or categorical
data.
5- Dashboard and layout customization: Plotly lets you customize dashboard layouts
with many subplots, annotations, and responsive designs.
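As a minimal illustration (the data values and labels are invented for demonstration), Plotly Express can produce an interactive chart in a few lines:

import plotly.express as px

# a small interactive scatter plot with hover tooltips, zoom and pan
fig = px.scatter(x=[1, 2, 3, 4], y=[10, 11, 12, 13],
                 labels={"x": "Frame", "y": "Detections"},
                 title="Interactive scatter plot with Plotly")
fig.show()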

5 – Dataset
Creating a dataset for analyzing employee career surveys requires multiple steps. Here is
an organized approach to generating such a dataset:

1. Define the variables and survey questions.

Employee information: ID (anonymized if required), age, gender, department/division, and job title/position.

Career and work experience:
Years of experience
Previous companies worked for
Previous positions
Education level

Career development: training and development opportunities, career development within the firm,
and career objectives (short and long term).

Feedback and suggestions:
Ideas for improvement
Feedback on company culture
Suggestions for career development programs

2. Data collection.

Conduct the survey using a platform that supports structured data export (e.g., Google Forms,
SurveyMonkey).
Maintain anonymity and confidentiality according to corporate policies.

3. Data preparation.

Clean the data to eliminate inconsistencies and inaccuracies.
Encode categorical variables correctly (for example, gender, department).

4. Data structure.

Organize the data into columns (variables) and rows (individual survey responses).
Ensure that each variable has a clear definition and data type.

5. Data analysis.

Analyze the dataset with statistical methods and visualizations.
Look for trends and connections between variables, such as job satisfaction versus age and
career stage; a minimal pandas sketch is given below.
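The following sketch illustrates steps 3 to 5 with pandas. The file name and column names (career_survey.csv, gender, age, job_satisfaction) are hypothetical placeholders for whatever the exported survey actually contains.

import pandas as pd

# load the exported survey responses (hypothetical file and columns)
df = pd.read_csv("career_survey.csv")

# encode a categorical variable numerically
df["gender_code"] = df["gender"].astype("category").cat.codes

# look for trends, e.g. average job satisfaction by age group
df["age_group"] = pd.cut(df["age"], bins=[18, 30, 40, 50, 65])
print(df.groupby("age_group")["job_satisfaction"].mean())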
6 - Source Code

TRAINING MASK DETECTION CODE


# import the necessary packages
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import AveragePooling2D
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
from tensorflow.keras.preprocessing.image import img_to_array
from tensorflow.keras.preprocessing.image import load_img
from tensorflow.keras.utils import to_categorical
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from imutils import paths
import matplotlib.pyplot as plt
import numpy as np
import os

# initialize the initial learning rate, number of epochs to train for,
# and batch size
INIT_LR = 1e-4
EPOCHS = 20
BS = 32

DIRECTORY = r"C:\Mask Detection\CODE\Face-Mask-Detection-master\dataset"
CATEGORIES = ["with_mask", "without_mask"]

# grab the list of images in our dataset directory, then initialize
# the list of data (i.e., images) and class labels
print("[INFO] loading images...")

data = []
labels = []

for category in CATEGORIES:
    path = os.path.join(DIRECTORY, category)
    for img in os.listdir(path):
        img_path = os.path.join(path, img)
        image = load_img(img_path, target_size=(224, 224))
        image = img_to_array(image)
        image = preprocess_input(image)

        data.append(image)
        labels.append(category)

# perform one-hot encoding on the labels
lb = LabelBinarizer()
labels = lb.fit_transform(labels)
labels = to_categorical(labels)

data = np.array(data, dtype="float32")
labels = np.array(labels)

(trainX, testX, trainY, testY) = train_test_split(data, labels,
    test_size=0.20, stratify=labels, random_state=42)

# construct the training image generator for data augmentation
aug = ImageDataGenerator(
    rotation_range=20,
    zoom_range=0.15,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.15,
    horizontal_flip=True,
    fill_mode="nearest")

# load the MobileNetV2 network, ensuring the head FC layer sets are
# left off
baseModel = MobileNetV2(weights="imagenet", include_top=False,
    input_tensor=Input(shape=(224, 224, 3)))

# construct the head of the model that will be placed on top of
# the base model
headModel = baseModel.output
headModel = AveragePooling2D(pool_size=(7, 7))(headModel)
headModel = Flatten(name="flatten")(headModel)
headModel = Dense(128, activation="relu")(headModel)
headModel = Dropout(0.5)(headModel)
headModel = Dense(2, activation="softmax")(headModel)

# place the head FC model on top of the base model (this will become
# the actual model we will train)
model = Model(inputs=baseModel.input, outputs=headModel)

# loop over all layers in the base model and freeze them so they will
# not be updated during the first training process
for layer in baseModel.layers:
    layer.trainable = False

# compile our model
print("[INFO] compiling model...")
opt = Adam(lr=INIT_LR, decay=INIT_LR / EPOCHS)
model.compile(loss="binary_crossentropy", optimizer=opt,
    metrics=["accuracy"])

# train the head of the network
print("[INFO] training head...")
H = model.fit(
    aug.flow(trainX, trainY, batch_size=BS),
    steps_per_epoch=len(trainX) // BS,
    validation_data=(testX, testY),
    validation_steps=len(testX) // BS,
    epochs=EPOCHS)

# make predictions on the testing set
print("[INFO] evaluating network...")
predIdxs = model.predict(testX, batch_size=BS)

# for each image in the testing set we need to find the index of the
# label with corresponding largest predicted probability
predIdxs = np.argmax(predIdxs, axis=1)

# show a nicely formatted classification report
print(classification_report(testY.argmax(axis=1), predIdxs,
    target_names=lb.classes_))

# serialize the model to disk
print("[INFO] saving mask detector model...")
model.save("mask_detector.model", save_format="h5")

# plot the training loss and accuracy
N = EPOCHS
plt.style.use("ggplot")
plt.figure()
plt.plot(np.arange(0, N), H.history["loss"], label="train_loss")
plt.plot(np.arange(0, N), H.history["val_loss"], label="val_loss")
plt.plot(np.arange(0, N), H.history["accuracy"], label="train_acc")
plt.plot(np.arange(0, N), H.history["val_accuracy"], label="val_acc")
plt.title("Training Loss and Accuracy")
plt.xlabel("Epoch #")
plt.ylabel("Loss/Accuracy")
plt.legend(loc="lower left")
plt.savefig("plot.png")

TESTING MASK DETECTION CODE


# import the necessary packages
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
from tensorflow.keras.preprocessing.image import img_to_array
from tensorflow.keras.models import load_model
from imutils.video import VideoStream
import numpy as np
import imutils
import time
import cv2
import os

def detect_and_predict_mask(frame, faceNet, maskNet):
    # grab the dimensions of the frame and then construct a blob
    # from it
    (h, w) = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1.0, (224, 224),
        (104.0, 177.0, 123.0))

    # pass the blob through the network and obtain the face detections
    faceNet.setInput(blob)
    detections = faceNet.forward()
    print(detections.shape)

    # initialize our list of faces, their corresponding locations,
    # and the list of predictions from our face mask network
    faces = []
    locs = []
    preds = []

    # loop over the detections
    for i in range(0, detections.shape[2]):
        # extract the confidence (i.e., probability) associated with
        # the detection
        confidence = detections[0, 0, i, 2]

        # filter out weak detections by ensuring the confidence is
        # greater than the minimum confidence
        if confidence > 0.5:
            # compute the (x, y)-coordinates of the bounding box for
            # the object
            box = detections[0, 0, i, 3:7] * np.array([w, h, w, h])
            (startX, startY, endX, endY) = box.astype("int")

            # ensure the bounding boxes fall within the dimensions of
            # the frame
            (startX, startY) = (max(0, startX), max(0, startY))
            (endX, endY) = (min(w - 1, endX), min(h - 1, endY))

            # extract the face ROI, convert it from BGR to RGB channel
            # ordering, resize it to 224x224, and preprocess it
            face = frame[startY:endY, startX:endX]
            face = cv2.cvtColor(face, cv2.COLOR_BGR2RGB)
            face = cv2.resize(face, (224, 224))
            face = img_to_array(face)
            face = preprocess_input(face)

            # add the face and bounding boxes to their respective
            # lists
            faces.append(face)
            locs.append((startX, startY, endX, endY))

    # only make predictions if at least one face was detected
    if len(faces) > 0:
        # for faster inference we'll make batch predictions on all
        # faces at the same time rather than one-by-one predictions
        # in the above for loop
        faces = np.array(faces, dtype="float32")
        preds = maskNet.predict(faces, batch_size=32)

    # return a 2-tuple of the face locations and their corresponding
    # predictions
    return (locs, preds)

# load our serialized face detector model from disk
prototxtPath = r"face_detector\deploy.prototxt"
weightsPath = r"face_detector\res10_300x300_ssd_iter_140000.caffemodel"
faceNet = cv2.dnn.readNet(prototxtPath, weightsPath)

# load the face mask detector model from disk
maskNet = load_model("mask_detector.model")

# initialize the video stream
print("[INFO] starting video stream...")
vs = VideoStream(src=0).start()

# loop over the frames from the video stream
while True:
    # grab the frame from the threaded video stream and resize it
    # to have a maximum width of 400 pixels
    frame = vs.read()
    frame = imutils.resize(frame, width=400)

    # detect faces in the frame and determine if they are wearing a
    # face mask or not
    (locs, preds) = detect_and_predict_mask(frame, faceNet, maskNet)

    # loop over the detected face locations and their corresponding
    # predictions
    for (box, pred) in zip(locs, preds):
        # unpack the bounding box and predictions
        (startX, startY, endX, endY) = box
        (mask, withoutMask) = pred

        # determine the class label and color we'll use to draw
        # the bounding box and text
        label = "Mask" if mask > withoutMask else "No Mask"
        color = (0, 255, 0) if label == "Mask" else (0, 0, 255)

        # include the probability in the label
        label = "{}: {:.2f}%".format(label, max(mask, withoutMask) * 100)

        # display the label and bounding box rectangle on the output
        # frame
        cv2.putText(frame, label, (startX, startY - 10),
            cv2.FONT_HERSHEY_SIMPLEX, 0.45, color, 2)
        cv2.rectangle(frame, (startX, startY), (endX, endY), color, 2)

    # show the output frame
    cv2.imshow("Frame", frame)
    key = cv2.waitKey(1) & 0xFF

    # if the 'q' key was pressed, break from the loop
    if key == ord("q"):
        break

# do a bit of cleanup
cv2.destroyAllWindows()
vs.stop()
Figure 1
Figure 2

Proposed Face-Feature Based Method to Validate the Wearing of the Mask

The designed preliminary method combines Haar-like feature descriptors to detect the face, as
well as key features of the face (e.g., eyes, mouth, nose), from the camera-based acquisition of
a mobile phone. Nowadays, Haar-related analysis techniques are heavily investigated for face
feature analysis: authors exploit Haar-based feature analysis in hybrid processing schemes, and
comparative studies show their efficiency. In this paper, the design of our method likewise
employs Haar-based face feature detection techniques. Fig. 2 illustrates how the preliminary
method analyses a face wearing conventional masks across a range of mask-wearing
configurations.

The basic principle of the method exploits the real-time detection of multiple faces at different
resolutions in video streams, using Haar feature-based cascade classifiers. In particular, the
goal of Haar-like features is to encode the variations of pixel content in the image. To this end,
a small detection window, composed for instance of two adjacent rectangular zones (black and
white), is positioned on the image; the variation on this part of the image is then calculated by
subtracting the sums of pixel intensities in the areas covered by the black and white zones,
respectively.
The value obtained from this calculation corresponds to an encoded Haar-like feature that can
detect a texture change or the location of a boundary in the image. The window is moved in
such a way as to scan the whole surface of the image, and its size is increased to ensure
robustness against scale variations. Moreover, several patterns are exploited in the detection
window (not only adjacent rectangles) in order to encode the different types of relevant
information present in the image.
To optimize the computation time of these features, the use of integral images is
highly recommended.
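As a minimal sketch of this idea, the pretrained Haar cascade classifiers shipped with OpenCV can be used to detect a face and its eye regions in a still image. The input file name person.jpg is a hypothetical placeholder, and this is only an illustration of the cascade mechanism, not the full validation method described above.

import cv2

# OpenCV ships Haar cascade XML files for frontal faces and eyes
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

image = cv2.imread("person.jpg")              # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale scans the image at several scales with a sliding detection window
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
    roi = gray[y:y + h, x:x + w]
    # look for eyes inside the detected face region
    eyes = eye_cascade.detectMultiScale(roi)
    for (ex, ey, ew, eh) in eyes:
        cv2.rectangle(image, (x + ex, y + ey), (x + ex + ew, y + ey + eh), (255, 0, 0), 2)

cv2.imshow("Haar face features", image)
cv2.waitKey(0)
cv2.destroyAllWindows()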
