
Unit III - Software


UNIT III: Software Design & Testing methods

Prof. S. D. Shirbahadurkar
Contents:
• Types of software
• Waterfall Model
• Metrics & software limitations
• Risk abatement & failure prevention
• Software bugs & testing
• Good programming practices & user
interfaces
• Embedded, Real time software
Why Software w.r.t. Electronics Products
•All electronic computing products, automobile products
•Communication industries, complex traffic networks
•Software is a major cost in the product
•Software maintenance & testability
•Problems with software can become uncontrollable
•Personnel capability & product complexity
•Common mistake: too much confidence in software
•Improve software in areas like code generation, reliability,
maintainability, correctness
Process to produce software: plan, measure & document
Types of Software: Prerequisite
• Algorithm: a list of instructions, a recipe for action
• Algorithms determine the utility, success and failure of s/w
• Data structure: the way the processing architecture is designed
• Handle the amount of data; understand each algorithm, its
limitations, its boundaries
• Build the habit of collecting algorithms for future programming
• Language: w.r.t. embedded systems –
firmware, peripheral interfaces & drivers, OS,
user interface,
application programs
Types of Software:
• Algorithms
• Languages
• Methods
• Selection
• Purchase

SE vs. Computer Science (CS)
SE deals with practical problems
• Complex software products (I)
• Processes (II)
• Methods/Models (III)
• People (IV)

CS is concerned with
• Theories
• Methods
Algorithms, data structures, programs,
formal grammars, abstract machines,
complexity, numerical methods…

Software applications
Potential applications
• System software
• Real-time software
• Business software
• Engineering and scientific software
• Embedded software
• Personal computer software
• Web-based software
• Artificial Intelligence software
• Research software
• ML, Data Analysis, Application base

Software failures
Complex software systems failures and bugs:
• Taurus (1993): the planned automated transaction
settlement system for London Stock Exchange cancelled
after five years of development
• Ariane 5 (1996): rocket exploded soon after launch due to a
conversion error (64-bit floating point into 16-bit integer)
• The Mars Climate Orbiter declared lost by NASA officials
(1999): mismatch between measurement systems (Imperial
and metric)
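The Ariane 5 failure mode can be sketched in a few lines. This is an illustration, not the flight code: Python integers never overflow, so the snippet emulates the 16-bit two's-complement wrap-around that a narrowing cast performs silently.

```python
# Sketch of the Ariane 5 failure mode: a 64-bit float value too large
# for a 16-bit signed integer. Python ints do not overflow, so we
# emulate the 16-bit two's-complement wrap-around explicitly.

def to_int16(value: float) -> int:
    """Truncate a float and wrap it into the signed 16-bit range."""
    n = int(value)                      # truncation, as in a C cast
    return (n + 2**15) % 2**16 - 2**15  # two's-complement wrap

horizontal_bias = 32767.0              # largest 16-bit signed value
print(to_int16(horizontal_bias))       # 32767 - conversion is safe

horizontal_bias = 40000.0              # a faster trajectory produced
print(to_int16(horizontal_bias))       # -25536 - silent overflow
```

The dangerous part is that nothing signals the overflow: the conversion simply yields a wrong value, which downstream guidance logic then trusts.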

http://infotech.fanshawec.on.ca/gsantor/Computing/FamousBugs.htm

However…
Important progress:

• Ability to produce more complex software has increased


• New technologies have led to new SE approaches
• A better understanding of the activities involved in
software development
• Effective methods to specify, design, implement software
have been developed
• New notations and tools have been produced

Processes
A software process (II) consists of a set of activities and
associated results which lead to the production of a software
product (I) (Sommerville, p. 43)

Fundamental activities:
• Software specification
• Software design and implementation
• Software validation
• Software evolution
Software developed from scratch or by extending and modifying
existing systems
Software specifications (Ss)
Ss refers to services requested (functional aspects) and
constraints (non-functional aspects) – together called
requirements engineering

• Feasibility study
• Requirements elicitation and analysis
• Requirements specification
• Requirements validation

Lead to reports, models, documents



Software design and implementation


Software design process - a set of activities transforming
(iteratively) the set of requirements into design products

• Abstract specification of each sub-system


• Component design
• Interface design
• Data structure
• Algorithm design

A set of reports, models (notations), documents is generated


Software design and implementation (cont.)
Implementation (programming) stage – transforms the design model(s) into
code

• Sometimes interleaved with design


• Tools used to (partially) convert into code
• Programming strategies: top-down, bottom-up
• Use of coding standards
• Quality aspects
• Debugging and testing

Software product
Software validation
Validation is the process of checking that “the correct system” was
built (building the right product) – inspections and reviews

Verification – “building the product right”; formal verification, testing

• Unit testing
• Module testing
• Sub-system testing
• Systems testing
• Acceptance testing
Software evolution
Software evolution process: changes made to a software product after
system development (though not always) – maintenance

• Changes to repair software faults


• Changes to adapt a software system to different operating environment
• Changes regarding system’s functionality

Increasingly maintenance is part of system’s development (open source,


generic frameworks etc)
Purchase
• Qualifications required from the vendor:
Acceptance testing
Review of the vendor's quality assurance
Verification testing
Qualification report
• Documents required from the vendor:
Requirements specification
Interface specifications
Test plans, procedures, results
Configuration management plan
Hazards analysis
Summary
• Software engineering covers all aspects of software production
• Historically motivated by a lack of suitable methods to specify and
develop complex software systems
• Software engineering includes software products, processes,
models and people
• Failures in software production demand adequate approaches and
require setting aside mythical beliefs
• Software engineering processes cover the whole cycle of
developing a software product (specification through to
implementation and maintenance)
Traditional software Life cycle/
Linear Sequential Life cycle model
• Quality Assurance: For development of reliable &
useful software
• Concept & Analysis: S/W problem & project
• Requirements: tell WHAT the s/w does
• Design: tells HOW the s/w does it
• Programming: coding stage
• Testing & Verification: ensure that the program
performs according to the requirements
• Maintenance: Update, correct & modify for
future applications.
Quality Assurance

• Produce useful reliable s/w


• Prepare a s/w development plan
• The plan must include hazard & fault tree
analysis
• It must include configuration management,
documentation & traceability.
Concept & Analysis

• Technical Trade off


• Performance & Timing
• Human Factors
• Hazards or Risk Analysis
• Fault tree Analysis
Requirements

• General Specification
• Functional performance specification
• Requirements specification
• Design specifications
Design
• System preparation & setup
• Operating system & procedure
• Communication & I/O
• Monitoring procedures
• Fault recovery & special
procedure
• Diagnostic features
Programming:
Models, Metrics & Software
Limitations:
• Models:
 Waterfall
Prototyping
Spiral
Prototyping model for s/w development
Spiral model of s/w development
Testing & Verification
• Internal Reviews
• Black Box Testing
• White Box Testing
• Alpha Testing
• Beta Testing
Software Bugs & Testing
Phases of Bugs
• Intent
• Translation
• Execution
• Operation
Nature of Bugs
Debugging
Inspection & Reviews
Testability
Maintenance: each bug record should include:
• Bug number
• Bug description
• Severity
• Date
• Target device configuration
• Person responsible for fixing the bug
• Current status
Characteristics of Testable Software
• Operable
▫ The better it works (i.e., better quality), the easier it is to test
• Observable
▫ Incorrect output is easily identified; internal errors are automatically
detected
• Controllable
▫ The states and variables of the software can be controlled directly by the
tester
• Decomposable
▫ The software is built from independent modules that can be tested
independently
continued..

• Simple
▫ The program should exhibit functional, structural, and code simplicity
• Stable
▫ Changes to the software during testing are infrequent and do not
invalidate existing tests
• Understandable
▫ The architectural design is well understood; documentation is available
and organized
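The "controllable" and "observable" characteristics above can be illustrated with a small invented example (not from the slides): a function whose state arrives as parameters and leaves as a return value can be driven and inspected directly by a test.

```python
# Illustration of "controllable" and "observable" code: state goes in
# as parameters and comes out as return values, so a tester can set
# any state and see any outcome directly. Names are hypothetical.

def apply_discount(price: float, loyalty_years: int) -> float:
    """Pure function: fully controllable inputs, observable output."""
    rate = 0.10 if loyalty_years >= 5 else 0.0
    return round(price * (1 - rate), 2)

# Every state is reachable from a test, and every outcome is visible:
assert apply_discount(100.0, 2) == 100.0
assert apply_discount(100.0, 5) == 90.0
```

Had the discount rate lived in a hidden global or a database, the function would be harder to control; had it logged instead of returning, the result would be harder to observe.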
Test Characteristics
• A good test has a high probability of finding an error
▫ The tester must understand the software and how it might fail
• A good test is not redundant
▫ Testing time is limited; one test should not serve the same purpose as
another test
• A good test should be “best of breed”
▫ Tests that have the highest likelihood of uncovering a whole class of errors
should be used
• A good test should be neither too simple nor too complex
▫ Each test should be executed separately; combining a series of tests could
cause side effects and mask certain errors
Two Unit Testing Techniques
• Black-box testing
▫ Knowing the specified function that a product has been designed to
perform, test to see if that function is fully operational and error free
▫ Includes tests that are conducted at the software interface
▫ Not concerned with internal logical structure of the software
• White-box testing
▫ Knowing the internal workings of a product, test that all internal
operations are performed according to specifications and all internal
components have been exercised
▫ Involves tests that concentrate on close examination of procedural detail
▫ Logical paths through the software are tested
▫ Test cases exercise specific sets of conditions and loops
White-box Testing
• Uses the control structure part of component-level design to derive the
test cases
• These test cases
▫ Guarantee that all independent paths within a module have been exercised at
least once
▫ Exercise all logical decisions on their true and false sides
▫ Execute all loops at their boundaries and within their operational bounds
• Exercise internal data structures to ensure their validity
• Enables the test case designer to derive a logical complexity measure of
a procedural design
• Uses this measure as a guide for defining a basis set of execution paths
• Test cases derived to exercise the basis set are guaranteed to execute
every statement in the program at least one time during testing
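A minimal sketch of what a basis set of execution paths looks like, using a hypothetical function. With two decisions, the cyclomatic complexity is V(G) = decisions + 1 = 3, so three test cases suffice to exercise every independent path and therefore every statement.

```python
# Hypothetical function used to sketch basis-path (white-box) testing.
# Two decisions give cyclomatic complexity V(G) = 2 + 1 = 3, so a
# basis set needs three test cases - one per independent path.

def classify(n: int) -> str:
    if n < 0:            # decision 1
        return "negative"
    if n == 0:           # decision 2
        return "zero"
    return "positive"

# One test per independent path guarantees every statement runs:
assert classify(-3) == "negative"   # decision 1 taken true
assert classify(0) == "zero"        # decision 1 false, decision 2 true
assert classify(7) == "positive"    # both decisions false
```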
Black-box Testing
• Complements white-box testing by uncovering different classes of
errors
• Focuses on the functional requirements and the information domain
of the software
• Used during the later stages of testing after white box testing has
been performed
• The tester identifies a set of input conditions that will fully exercise
all functional requirements for a program
• The test cases satisfy the following:
▫ Reduce, by a count greater than one, the number of additional test cases
that must be designed to achieve reasonable testing
▫ Tell us something about the presence or absence of classes of errors, rather
than an error associated only with the specific task at hand
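One standard way to meet these criteria is equivalence partitioning with boundary values (the slides do not name a technique, so this choice is an assumption). The spec alone, without looking at the code, defines the input classes; one test per class plus the boundaries covers the requirement economically.

```python
# Black-box test selection by equivalence partitioning for a
# hypothetical validator whose spec says "accept ages 18-65".

def is_eligible(age: int) -> bool:
    return 18 <= age <= 65

# One representative per equivalence class (derived from the spec):
assert is_eligible(30) is True    # valid class
assert is_eligible(10) is False   # below-range class
assert is_eligible(70) is False   # above-range class
# Boundary values, where off-by-one errors cluster:
assert is_eligible(18) is True
assert is_eligible(65) is True
```

Five tests stand in for thousands of possible inputs: any further test inside an already-covered class would be redundant in the sense defined above.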
Black-box Testing Categories

• Incorrect or missing functions


• Interface errors
• Errors in data structures or external data base access
• Behavior or performance errors
• Initialization and termination error
What are Metrics? ( for S/W Measurement)
 Basic quality and productivity data are collected
 These data are analyzed, compared against past
averages, and assessed
 The goal is to determine whether quality and
productivity improvements have occurred
 The data can also be used to pinpoint problem
areas
 Remedies can then be developed and the software
process can be improved
Software metrics are important for measuring:

• Software performance
• Productivity
• Planning work items
• Debugging
• Estimating cost
Software Metrics Baseline Process
(figure) Software engineering process, project and product →
measures / data collection → metrics computation → metrics
evaluation → indicators
Categories of Software Measurement

• Two categories of software measurement


▫ Direct measures of the
 Software process (cost, effort, etc.)
 Software product (lines of code produced, execution speed,
defects reported over time, etc.)
▫ Indirect measures of the
 Software product (functionality, quality, complexity, efficiency,
reliability, maintainability, etc.)
• Project metrics can be consolidated to create process metrics for an
organization
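Direct measures are often combined into indirect indicators. A common example (with illustrative numbers only, not data from the slides) is defect density: defects reported, a direct product measure, normalized by lines of code.

```python
# Sketch of turning direct measures into an indicator:
# defects reported per thousand lines of code (KLOC).

def defect_density(defects: int, lines_of_code: int) -> float:
    """Defects per KLOC, a common project-level quality metric."""
    return defects / (lines_of_code / 1000)

# A 12,500-line module with 30 reported defects:
print(defect_density(30, 12_500))   # 2.4 defects/KLOC
```

Tracked over successive projects, such an indicator shows whether quality and productivity improvements have actually occurred, which is the goal stated above.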
Software Limitations:

• Not all problems can be solved.


• Specifications can’t anticipate all possible uses &
problems.
• Errors creep into development in number of ways.
• Software simulation can predict only known
outcomes.
• Human error can occur in operating the software.
What is Risk? (software projects)
• Risk is the possibility of suffering loss
• Possible – but not certain – so it is expressed as a probability
• Loss – any unwanted consequence that might occur
• Risk in projects:
In a development project, the loss describes the impact on the
project, which could take the form of diminished quality of the
end product, increased costs, delayed completion, or failure.
Risk Management
• Four main steps:
Risk Identification → Risk Assessment → Risk Response
Development → Risk Response Control
• Before these activities:
Plan the risk management process
Risk abatement & failure prevention

1. Issues
2. Safety & Reliability
3. Fault Tolerance
4. Development Plan
1. Issues
• Consider the nature of s/w
• s/w never fits within conventional boundaries of
stability
• s/w faults are more difficult to handle than physical
defects
2.Safety & Reliability
• Make each module independent
• Reduce complexity of each task
• Isolate tasks from external influences, both h/w
& timing
• Communicate through a single, well defined
interface between tasks.
3. Fault Tolerance
• How does the system prevent or respond to bugs,
errors, faults or failures?
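One concrete fault-tolerant response is to detect an implausible input and degrade gracefully instead of failing. The sketch below is illustrative; the sensor name and valid range are assumptions, not from the slides.

```python
# Sketch of a fault-tolerant response: detect a bad sensor reading
# and fall back to the last known-good value instead of crashing.

LAST_GOOD = 25.0
VALID_RANGE = (-40.0, 125.0)   # plausible temperature-sensor range

def read_with_fallback(raw: float, last_good: float) -> float:
    low, high = VALID_RANGE
    if low <= raw <= high:
        return raw             # reading is plausible: accept it
    return last_good           # fault detected: degrade gracefully

assert read_with_fallback(30.5, LAST_GOOD) == 30.5
assert read_with_fallback(999.0, LAST_GOOD) == 25.0  # out-of-range fault
```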
4.Development Plan

• Develop formal requirements documents


• Perform hazard analysis
• Review quality metrics
• Include quality assurance team in the process
• Establish design reviews & a s/w monitoring
plan
• Plan for validation & verification testing
Good programming practices

1. Style & Format


2. Structured Programming
3. Coupling & Cohesion
4. Documentation & Source Control
5. Scheduling
1. Style & Format
Design: determines the utility of the s/w

Comments: should be clear & correct

Variables: name variables for code clarity

Don't use global variables
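The "no global variables" guideline can be shown with a small invented example: the second version makes its dependency explicit, so its behavior is predictable and testable in isolation.

```python
# Sketch of the "don't use global variables" guideline.
# Version 1 hides its state; version 2 makes the dependency explicit.

threshold = 50            # global: any code anywhere can change this

def over_limit_global(reading: int) -> bool:
    return reading > threshold          # hidden dependency on a global

def over_limit(reading: int, limit: int) -> bool:
    return reading > limit              # explicit, self-documenting

assert over_limit(75, 50) is True
assert over_limit(75, 100) is False
```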
2. Structured Programming
• Describes a framework based on modules or
procedures
• Design the architecture for debugging & testing
• Code small modules that you can test & forget
• Code a single entry & exit in routines
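The single-entry, single-exit rule can be sketched as follows (an invented example): one return point, and a routine small enough to test once and then "forget".

```python
# Sketch of the single-entry, single-exit style: one return statement
# and a small self-contained routine that is easy to test in isolation.

def average(samples: list) -> float:
    """Small module: one way in, one way out."""
    result = 0.0                 # the single exit value, set on each path
    if samples:
        result = sum(samples) / len(samples)
    return result                # the only return statement

assert average([2.0, 4.0]) == 3.0
assert average([]) == 0.0
```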
3. Coupling & Cohesion
• They help to define tasks & design modules.
• Coupling – keep dependence between modules low.
• Cohesion – everything within a module must be
closely related.
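A small invented example of a cohesive module with a single well-defined interface: everything inside concerns one job, and callers never touch the internals.

```python
# Sketch of high cohesion / low coupling: every member concerns one
# responsibility, and callers use a single well-defined interface
# rather than reaching into the module's internals.

class TemperatureSensor:
    """Cohesive module: all members relate to one job."""
    def __init__(self, raw_celsius: float):
        self._celsius = raw_celsius      # internal detail, not exposed

    def read_fahrenheit(self) -> float:  # the single public interface
        return self._celsius * 9 / 5 + 32

sensor = TemperatureSensor(100.0)
assert sensor.read_fahrenheit() == 212.0
```

Because other modules depend only on `read_fahrenheit`, the internal representation can change without rippling through the system, which is exactly what low coupling buys.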
4. Documentation & Source Control

• It describes the overall system function
• It is the first thing you begin & the last thing you finish
• It ensures completeness & veracity.
5.Scheduling:

• Meeting
• Planning
• Designing
• Testing
• Debugging
User Interface:
Defines Guidelines & Development
Embedded & Real Time s/w
• Case studies & design examples:
1. Golf course / farm sprinkler system
2. Agro-electronics – parameter control for
polyhouses, open farming, remote control
applications, GIS & control
3. Household applications