
Software Engineering

Unit 1
Introduction to Software Engineering
Importance of Software Engineering as a Discipline
Software Applications
Software Crisis
Software Processes & Characteristics
Waterfall Model:
Prototype and Prototyping Model:
Evolutionary Model:
Spiral Model
Software Requirements Analysis & Specifications:
Requirements Engineering:
Functional and Non-Functional Requirements:
User Requirements:
System Requirements:
Requirement Elicitation Techniques: FAST, QFD, and Use Case Approach
Data Flow Diagrams
Levels in Data Flow Diagrams (DFD)
Requirements Analysis Using Data Flow Diagrams (DFD):
Data Dictionary
Components of Data Dictionary:
Data Dictionary Notation Tables:
Features of Data Dictionary:
Uses of Data Dictionary:
Importance of Data Dictionary:
Entity-Relationship (ER) Diagrams:
Requirements Documentation:
Software Requirements Specification (SRS) - Nature and Characteristics:
Requirement Management:
IEEE Std 830-1998 - Recommended Practice for Software Requirements Specifications:
Unit 2
Software Project Planning:
Project size estimation
Size Estimation in Software Development: Lines of Code and Function Count:
Cost Estimation Models in Software Development:
COCOMO (Constructive Cost Model):
Putnam Resource Allocation Model

Validating Software Estimates:
Risk Management in Software Development:
Software Design: Cohesion and Coupling
Function-Oriented Design in Software Engineering:
Object-Oriented Design (OOD) in Software Engineering:
User Interface Design

Unit 1
Introduction to Software Engineering
Software Engineering is a systematic approach to the design, development,
maintenance, and documentation of software. It encompasses a set of methods,
tools, and processes to create high-quality software efficiently.

Key Concepts:

1. Software Development Process: Software engineering follows a well-defined process to manage and control the development of software. This process typically includes stages such as requirements analysis, design, coding, testing, and maintenance.

2. Software Development Life Cycle (SDLC): SDLC is a framework for understanding and managing the software development process. Common SDLC models include the Waterfall model, Agile, and Iterative models.

3. Requirements Engineering: Understanding and documenting the software requirements is a critical phase. It involves gathering, analyzing, and specifying what the software should do.

4. Design: During this phase, the software architecture is planned. It includes creating a high-level structure, defining data structures, and laying out the overall system design.

5. Implementation: Writing code and building the software based on the design
specifications. It involves programming, coding, and unit testing.

6. Testing: Rigorous testing is performed to ensure that the software functions correctly and meets the specified requirements. This includes unit testing, integration testing, and system testing.

7. Maintenance: Software maintenance is an ongoing process that includes making enhancements, fixing bugs, and adapting the software to changing requirements.
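The unit-testing step mentioned above can be illustrated concretely. A minimal sketch in Python, where `word_count` is a hypothetical function standing in for the unit under test:

```python
def word_count(text):
    """Hypothetical unit under test: count whitespace-separated words."""
    return len(text.split())

# A unit test exercises one unit in isolation against expected results.
def test_word_count():
    assert word_count("hello world") == 2        # typical input
    assert word_count("") == 0                   # edge case: empty string
    assert word_count("  spaced   out ") == 2    # edge case: irregular whitespace

test_word_count()
print("all unit tests passed")
```

Integration and system testing then check the same code in combination with other modules and as a complete product, respectively.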

Why Software Engineering?

Software engineering is essential because it ensures that software is developed and maintained in a systematic and cost-effective manner. It helps in reducing errors, managing complexity, and delivering high-quality software products.

Challenges in Software Engineering:

1. Complexity: Software can become incredibly complex, and managing this complexity is a significant challenge in software engineering.

2. Changing Requirements: Customer requirements often change during the development process, and software engineers must adapt to these changes.

3. Quality Assurance: Ensuring that software is of high quality and free of defects
is a continuous challenge.

Software Engineering Tools and Techniques:

Various tools and techniques are used in software engineering, such as version
control systems (e.g., Git), integrated development environments (IDEs),
modeling tools, and project management software.

Career Opportunities:

Software engineers have a wide range of career opportunities, including software developer, quality assurance engineer, systems analyst, software architect, and more.


Importance of Software Engineering as a Discipline


1. Management of Complexity:

Software engineering plays a crucial role in managing the complexity of modern software systems. As software applications become more intricate, the discipline provides methodologies and tools to design, develop, and maintain software in an organized and comprehensible manner.

2. Quality Assurance:

It ensures the delivery of high-quality software products. Through rigorous testing and quality control processes, software engineering helps in identifying and rectifying defects, reducing errors, and ensuring reliability.

3. Cost-Efficiency:

By following systematic development processes and adhering to best practices, software engineering contributes to cost-efficiency. It helps in reducing development costs and minimizes the expenses associated with post-development maintenance and bug fixing.

4. Predictable Timelines:

Software engineering methodologies provide project management techniques that enable the estimation of project timelines and deliverables more accurately. This is crucial for planning and ensuring projects are completed on schedule.

5. Adaptation to Changing Requirements:

In today's dynamic business environment, software requirements often change. Software engineering methodologies, such as Agile, allow for flexibility and adaptability, enabling software projects to accommodate changing needs without causing significant disruptions.

6. Risk Management:

Software engineering helps in identifying and mitigating risks associated with software development. By evaluating potential challenges and implementing strategies to address them, it reduces the likelihood of project failures.

7. Reusability:

Software engineering promotes the concept of code and component reusability. This not only accelerates development but also improves software quality and consistency.

8. Scalability:

As software systems grow and evolve, scalability becomes a vital consideration. Software engineering principles support the design of scalable architectures, ensuring that systems can expand to handle increased loads.

9. Documentation:

Proper documentation is an essential aspect of software engineering. It ensures that the knowledge about a software system is preserved, making it easier for future development, maintenance, and troubleshooting.

10. Industry Standards and Best Practices:

Software engineering adheres to industry standards and best practices. This consistency across the discipline fosters a common understanding of how to develop and maintain software systems, promoting professionalism and quality.

11. User Satisfaction:

Effective software engineering results in software products that meet or exceed user expectations. This leads to higher user satisfaction and trust in the software.

12. Innovation:

The discipline of software engineering drives innovation. By developing new methods, tools, and techniques, it continually evolves to meet the demands of an ever-changing technology landscape.

Overall, software engineering is of paramount importance in the IT industry, as it is the foundation for creating reliable, high-quality software systems that drive businesses, enhance user experiences, and address the challenges of complexity and change in the digital age.

Software Applications
Definition:

Software applications, commonly known as "apps," are computer programs or sets of instructions designed to perform specific tasks or functions on electronic devices, such as computers, smartphones, tablets, and more.

Types of Software Applications:

1. Desktop Applications:

These are software programs designed to run on personal computers or workstations. Examples include word processors (e.g., Microsoft Word), spreadsheet software (e.g., Microsoft Excel), and graphic design tools (e.g., Adobe Photoshop).

2. Mobile Applications (Mobile Apps):

These applications are developed for smartphones and tablets. They can be
categorized into two major platforms:

iOS Apps: Designed for Apple devices like iPhones and iPads.

Android Apps: Developed for devices running the Android operating system.
3. Web Applications:

These are accessed through web browsers and run on remote servers.
Users can interact with web applications through a web page. Examples
include email services (e.g., Gmail), social media platforms (e.g., Facebook),
and online shopping websites (e.g., Amazon).

4. Enterprise Applications:

These are software solutions designed for business and organizational use.
Enterprise applications often include Customer Relationship Management
(CRM) software, Enterprise Resource Planning (ERP) systems, and project
management tools.

5. Gaming Applications:

Video games are a significant category of software applications, encompassing a wide range of genres and platforms, including console games, PC games, and mobile games.

6. Utility Applications:

These serve specific utility purposes. Examples include antivirus software, file compression tools, and system maintenance applications.

Key Characteristics and Functions of Software Applications:

1. User Interface (UI):

Most applications have a graphical user interface (GUI) that allows users to
interact with the software.

2. Functionality:

Applications are designed to perform specific tasks or functions, such as word processing, data analysis, communication, and entertainment.

3. Platform Compatibility:

Applications are developed for specific operating systems (e.g., Windows, macOS, iOS, Android) and may not be compatible with all platforms.

4. Connectivity:

Many applications require internet connectivity for updates, data synchronization, and real-time communication.

5. Data Storage:

Applications may store data locally on the device or in remote servers, depending on their design and purpose.

6. Updates and Maintenance:

Developers regularly release updates to improve functionality, security, and performance. Users are encouraged to keep their applications up to date.

Development Process:

The development of software applications involves stages such as requirements gathering, design, coding, testing, and deployment. Different methodologies, like Agile and Waterfall, can be used for software development.

User Experience (UX) Design:

A critical aspect of application development, UX design focuses on creating an enjoyable and intuitive user experience, including user interface design, usability, and user interaction.

App Stores:

Many applications are distributed through app stores specific to their platforms
(e.g., Apple App Store, Google Play Store, Microsoft Store). These platforms
provide a centralized marketplace for users to discover and download apps.

Monetization:

Developers may monetize their applications through various models, including one-time purchases, in-app purchases, subscription models, and advertising.

Security:

Security is a significant concern for software applications, and developers must implement measures to protect user data and prevent unauthorized access.

Software applications have become an integral part of daily life, serving diverse
purposes from productivity and communication to entertainment and business
operations. Their development and continuous improvement contribute significantly
to the digital world's evolution and functionality.

Software Crisis
Definition:

The software crisis refers to a period in the early history of software development when the industry faced significant challenges and difficulties in producing software that met the desired quality, cost, and delivery targets. It was a time when software projects often ran over budget, exceeded timelines, and resulted in systems that were error-prone and unreliable.

Causes of the Software Crisis:

1. Complexity: The increasing complexity of software systems was a major contributing factor. As software applications grew in size and functionality, managing and understanding them became challenging.

2. Lack of Methodology: In the early days of software development, there was a lack of well-defined methodologies and processes for managing and developing software. This led to ad-hoc approaches that often resulted in chaotic development.

3. Limited Tools and Resources: The absence of sophisticated tools and resources for software development hindered the efficiency of the process. Programmers had to write code manually, and debugging was time-consuming.

4. Changing Requirements: Customers frequently changed their software requirements during the development process, leading to scope creep and project delays.

5. Inadequate Testing: Testing procedures were often insufficient, and the absence of systematic testing resulted in software systems with numerous defects.

6. Limited Communication: Communication between developers and customers was challenging, which often led to misunderstandings and misaligned expectations.

Consequences of the Software Crisis:

1. Project Failures: Many software projects failed to meet their objectives, resulting in wasted time and resources.

2. Budget Overruns: Software projects frequently exceeded their budgeted costs, causing financial strain on organizations.

3. Delayed Deliveries: Timelines for software projects were often extended, impacting organizations' ability to respond to changing business needs.

4. Quality Issues: Software systems produced during this period often had numerous defects, making them unreliable and requiring frequent updates and maintenance.
Solutions to the Software Crisis:

1. Development Methodologies: The development of systematic software engineering methodologies, such as the Waterfall model, Agile, and Iterative approaches, improved the organization and management of software projects.

2. Standardization: The introduction of coding standards and best practices improved code quality and maintainability.

3. Testing Procedures: Rigorous testing processes, including unit testing, integration testing, and system testing, were established to ensure software quality.

4. Communication: Improved communication between developers and customers led to better-defined requirements and expectations.

5. Advancements in Tools: The development of integrated development environments (IDEs) and other software development tools improved efficiency and productivity.

6. Training and Education: The software engineering discipline expanded with universities offering formal education in software development.

The software crisis prompted the development of modern software engineering practices and methodologies that have significantly improved the quality, efficiency, and predictability of software development projects. It marked a pivotal moment in the history of software engineering, leading to the industry's continued growth and evolution.

Software Processes & Characteristics


Software Processes:

1. Definition:

A software process, also known as a software development process or software engineering process, is a set of activities and tasks that are systematically organized to design, develop, test, deploy, and maintain software. These processes provide a structured approach to managing and controlling software development projects.

2. Software Development Life Cycle (SDLC):

A software process typically follows a specific Software Development Life Cycle (SDLC). Common SDLC models include the Waterfall model, Agile, V-Model, and Iterative models. Each SDLC model prescribes a series of phases and activities to guide the development process.

3. Phases of a Typical SDLC:

While specific SDLC models may vary, a typical software development process includes phases such as Requirements Gathering, Design, Implementation, Testing, Deployment, and Maintenance.

4. Activities in Each Phase:

Each phase involves specific activities. For example, the Requirements Gathering phase includes eliciting, analyzing, and documenting the software requirements, while the Testing phase encompasses unit testing, integration testing, and system testing.

Characteristics of Software Processes:

1. Systematic Approach:

Software processes provide a systematic and structured approach to software development. They ensure that the development activities are organized and follow a predefined order.

2. Repeatability:

Software processes are designed to be repeatable. When a process is established, it can be reused for similar projects, improving efficiency and consistency.

3. Quality Assurance:

Quality is a key characteristic of software processes. They include quality assurance activities such as testing, code reviews, and verification to ensure the final product meets specified requirements and quality standards.

4. Project Management:

Software processes facilitate project management by providing a framework for estimating project timelines, managing resources, and tracking progress.

5. Flexibility:

While processes provide structure, they can be adapted to fit the needs of
different projects. Agile methodologies, for example, prioritize flexibility and
adaptability in response to changing requirements.

6. Documentation:

Software processes emphasize the importance of documentation. This includes requirement documents, design specifications, code documentation, and test plans to ensure that the project's progress and outcomes are well-documented.

7. Risk Management:

Software processes often incorporate risk management activities to identify potential challenges and develop strategies to address them. This helps in mitigating project risks.

8. Iterative Improvement:

Many software processes include a feedback loop for continuous improvement. Lessons learned from previous projects are used to enhance the process for future projects.

9. Communication:

Effective communication is an integral part of software processes. Clear communication between team members and stakeholders ensures that everyone has a shared understanding of project goals and requirements.

10. Measurable Outcomes:

Software processes are designed to produce measurable outcomes. This allows for objective evaluation of the project's progress and success.

Software processes are a fundamental component of software engineering. They help ensure that software projects are well-organized, efficient, and deliver high-quality software products. The choice of a specific process model can vary based on the project's requirements, size, and other factors.

Waterfall Model:
Description:

The Waterfall Model is a traditional and linear software development life cycle
model. It is often considered a classic approach, where the project is divided into
distinct phases, and each phase must be completed before the next one begins.
It follows a sequential, top-down flow where the output of one phase becomes
the input for the next.

Phases:

1. Requirements Gathering: This is the initial phase where the project's requirements are gathered, documented, and analyzed. It involves interactions with stakeholders to understand their needs.

2. System Design: In this phase, the system architecture is designed based on the gathered requirements. This includes defining system components, their relationships, and a high-level design.

3. Implementation: The actual coding and development of the software take place in this phase. Programmers write code according to the system design specifications.

4. Testing: Once the implementation is complete, the software is subjected to rigorous testing to detect and rectify defects or issues. Testing includes unit testing, integration testing, system testing, and user acceptance testing.

5. Deployment: After successful testing, the software is deployed to the production environment or made available to users.

6. Maintenance: This phase involves ongoing maintenance and updates as necessary to address issues, implement enhancements, or adapt to changing requirements.
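The sequential hand-off between these phases can be sketched as a pipeline in which each phase runs to completion and its output becomes the next phase's input. All names and deliverables below are invented for illustration, not a real framework:

```python
def gather_requirements():
    # Deliverable: a requirements document (here, just a list of features).
    return ["login", "search", "checkout"]

def design(requirements):
    # Deliverable: a high-level design mapping each requirement to a module.
    return {req: f"module_{req}" for req in requirements}

def implement(design_doc):
    # Deliverable: a code base, one implementation per designed module.
    return {mod: f"code for {mod}" for mod in design_doc.values()}

def verify(code_base):
    # Deliverable: a test verdict over the whole code base.
    return all(code.startswith("code for") for code in code_base.values())

# Each phase starts only after the previous one completes,
# and its output is the next phase's input.
reqs = gather_requirements()
design_doc = design(reqs)
code = implement(design_doc)
assert verify(code)  # deployment proceeds only if testing passes
```

The rigidity of the model is visible in the sketch: there is no path back from a later phase to revise an earlier deliverable.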

Characteristics:

Sequential: The phases in the Waterfall Model proceed sequentially, and each phase depends on the deliverables of the previous one.

Inflexible: It can be rigid and less adaptable to changing requirements or evolving user needs once the project is underway.

Well-Documented: It emphasizes comprehensive documentation at each stage, ensuring a clear record of the project's progress.

Risk Management: It's challenging to accommodate changing requirements, which can be a significant risk for the project.

Suitability: Best suited for projects with well-understood and stable requirements, where changes are minimal during the development process.

Advantages:

Well-structured and easy to understand.

Clear documentation at each stage helps with future maintenance and understanding.

Suitable for projects with stable, well-defined requirements.

Progress can be monitored at each phase.

Disadvantages:

Inflexible to changes in requirements during the development process.

Risky if initial requirements are not well-understood or if users' needs change.

Can lead to long development cycles, potentially resulting in late delivery.

Testing and user feedback often occur late in the project, which may lead to
costly defects.

The Waterfall Model is a straightforward and structured approach to software development, making it suitable for projects with well-defined and stable requirements. However, it may not be the best choice for projects where requirements are likely to change during the development process or for projects with high levels of uncertainty.

Prototype and Prototyping Model:

Prototype:

A prototype is a working model of a software system or a part of it. It is created to provide a tangible representation of the software's functionality and features before the final system is developed. Prototypes can be of various types, including:

Throwaway Prototypes: These are created to explore specific design ideas or user requirements. Once the design or requirements are validated, the prototype is discarded, and development begins from scratch.

Evolutionary Prototypes: These prototypes are built with the intention of evolving them into the final system. As development progresses, the prototype is incrementally improved and expanded upon.

Prototyping Model:

The Prototyping Model is a software development approach that emphasizes the creation of prototypes during the requirements and design phase. It is an iterative and feedback-driven process that allows stakeholders to better understand the software's requirements and functionality.

Phases:

1. Requirements Gathering: Initial requirements are collected, but they may be incomplete or not well-defined.

2. Prototyping: A basic working model (prototype) is built to help stakeholders visualize the final product. This can be a throwaway or evolutionary prototype.

3. Feedback and Refinement: Stakeholders provide feedback on the prototype, and this feedback is used to refine both the requirements and the prototype itself.

4. Development: Once the prototype is approved, full-scale software development takes place, often based on the refined requirements.

5. Testing: The final software is tested to ensure quality and compliance with the requirements.
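The feedback loop at the heart of these phases can be sketched as a bounded build-review-refine cycle. The `build_prototype` and `stakeholder_feedback` functions below are hypothetical stand-ins, with the feedback scripted so the example is self-contained:

```python
def build_prototype(requirements):
    """Assemble a quick working model from the current requirements (illustrative)."""
    return {"features": list(requirements)}

def stakeholder_feedback(prototype):
    """Stand-in for a real review session; the responses here are scripted."""
    if "reports" not in prototype["features"]:
        return ["add reports"]          # stakeholders request a missing feature
    return []                           # no requests left: prototype approved

requirements = ["login", "dashboard"]
for iteration in range(1, 10):          # bounded loop in case feedback never converges
    prototype = build_prototype(requirements)
    feedback = stakeholder_feedback(prototype)
    if not feedback:                    # approved: hand off to full-scale development
        break
    for request in feedback:            # refine the requirements from the feedback
        requirements.append(request.removeprefix("add "))

print(f"approved after {iteration} iteration(s): {requirements}")
```

In a real project the review step is a human activity, of course; the point of the sketch is that requirements and prototype are revised together until stakeholders approve.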

Characteristics:

Iterative: Prototyping involves multiple iterations, allowing for refinements and improvements based on user feedback.

Early User Involvement: Stakeholders are involved early in the development process, leading to a better understanding of their needs.

Adaptable to Changing Requirements: Prototyping is flexible and can accommodate evolving requirements.

Risk Management: It reduces the risk of delivering a final product that doesn't meet user needs.

Suitability: Effective for projects with evolving or unclear requirements and those that require strong user involvement.

Advantages of Prototyping:

Better user understanding and satisfaction.

Early detection of design flaws and issues.

Reduced project risk through feedback.

Increased collaboration between stakeholders.

Disadvantages of Prototyping:

Can be time-consuming, especially if multiple iterations are needed.

May not be suitable for projects with well-defined requirements.

Prototypes may not always accurately reflect the final product.

May require additional effort to convert prototypes into production-ready software.

In summary, prototypes are working models used to represent software functionality, while the Prototyping Model is an iterative approach that uses prototypes to improve requirements understanding and user satisfaction. The choice between throwaway and evolutionary prototypes depends on project goals and requirements.

Evolutionary Model:
Description:

The Evolutionary Model is a software development life cycle model that emphasizes iterative and incremental development. It is particularly suited for projects with evolving and changing requirements, often found in complex and dynamic environments. The model allows for the early delivery of a basic working system and then iteratively enhances the software.

Phases:

1. Baseline System: In the initial phase, an essential, minimal system is developed, often with the most critical features. This version serves as a baseline or starting point.

2. Iterative Enhancement: In subsequent iterations, the software is enhanced by adding more features, functionality, and improvements based on user feedback and evolving requirements.

3. Feedback and Refinement: Stakeholder feedback from each iteration guides further iterations and improvements, allowing the system to evolve over time.
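The baseline-plus-increments idea can be sketched in a few lines, assuming a hypothetical, priority-ordered feature backlog where the first increment is the baseline system:

```python
# Feature backlog ordered by priority; the first increment is the baseline system.
backlog = [["user accounts"], ["search"], ["recommendations"]]

system = []     # the delivered system grows with each increment
releases = []   # one working release per iteration
for increment in backlog:
    system.extend(increment)        # iterative enhancement on top of the baseline
    releases.append(list(system))   # each iteration ships a usable system

# The baseline release already works, and every later release builds on it.
assert releases[0] == ["user accounts"]
assert releases[-1] == ["user accounts", "search", "recommendations"]
```

The contrast with Waterfall is that something usable exists after the very first pass, and stakeholder feedback can reorder or rewrite the remaining backlog between iterations.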

Characteristics:

Incremental: Software development occurs in increments or stages, with each stage building upon the previous one.

Feedback-Driven: Stakeholder feedback is central to the model, shaping the evolution of the software.

Adaptable: Suited for projects with changing requirements and high uncertainty.

Early Delivery: The model allows for the early delivery of a basic working system, which can provide value to users.

Complex Projects: It is effective for complex and large-scale projects where requirements may evolve.

Advantages of the Evolutionary Model:

Adaptability: Ideal for projects with changing or unclear requirements.

Early User Feedback: Stakeholder involvement leads to better understanding and user satisfaction.

Reduced Risk: Iterative nature helps identify issues early and allows for course
corrections.

Early Delivery: Provides a basic working system in early iterations.

Disadvantages of the Evolutionary Model:

Management Complexity: Managing multiple iterations can be complex.

Potential Scope Creep: Iterative enhancements may lead to expanding the project scope beyond the original plan.

Resource Intensive: Requires ongoing stakeholder involvement, which can be resource-intensive.

Documentation: Ongoing changes may require frequent updates to project documentation.

The Evolutionary Model is a valuable approach for software projects where requirements are not well-defined, change frequently, or when early user feedback is critical. It provides the flexibility to adapt to evolving needs and allows for the delivery of working software in early iterations, ensuring that the system remains aligned with user expectations.

Spiral Model
The Spiral Model is a software development and project management approach that
combines iterative development with elements of the Waterfall model. It was first
introduced by Barry Boehm in 1986 and is especially suitable for large and complex
projects. The Spiral Model is characterized by a series of cycles, or "spirals," each of
which represents a phase in the software development process. Here are the key
components and principles of the Spiral Model:

1. Phases:

The Spiral Model divides the software development process into several phases, each of which represents a complete cycle of the model. The typical phases include Planning, Risk Analysis, Engineering (or Development), and Evaluation (or Testing).

2. Iterative and Incremental:

The Spiral Model is inherently iterative and incremental. It doesn't follow a linear path like the Waterfall model but instead repeats a series of cycles, with each cycle building upon the previous one. This allows for flexibility and continuous improvement.

3. Risk Analysis:

The Risk Analysis phase is a unique feature of the Spiral Model. It involves
identifying and assessing project risks, such as technical, schedule, and cost
risks. The goal is to make informed decisions about whether to proceed with
the project based on risk analysis.

4. Prototyping:

Prototyping is often incorporated into the Spiral Model to manage uncertainties and gather user feedback. Prototypes can be developed in early cycles to help stakeholders better understand the requirements and design.

5. Flexibility:

The Spiral Model provides flexibility in accommodating changes and adjustments to the project as it progresses. This adaptability is particularly valuable for projects where requirements may evolve or are not well-understood initially.

6. Customer Involvement:

Continuous customer involvement is encouraged throughout the development process. Stakeholder feedback is collected in each cycle, allowing for adjustments to be made based on changing requirements and priorities.

7. Documentation:

Documentation is created and updated at each phase of the Spiral Model. This ensures that project progress is well-documented, which is valuable for both project management and future maintenance.

8. Monitoring and Control:

The project is continually monitored, and control mechanisms are in place to manage risks and resources. This ensures that the project remains on track and aligned with its goals.
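The risk-analysis step above is often quantified as risk exposure: the probability of a risk occurring multiplied by its cost if it does, a calculation associated with Boehm's risk-driven approach. A small sketch of that calculation; the risks, figures, and decision threshold are invented for illustration:

```python
# Risk exposure = probability of occurrence x cost (impact) if it occurs.
# Probabilities are given as whole percentages to keep the arithmetic exact.
risks = [
    ("key dependency unavailable", 30, 100_000),   # (description, probability %, impact)
    ("requirements misunderstood", 50, 40_000),
    ("schedule slip",              20, 25_000),
]

exposures = {desc: prob_pct * impact // 100 for desc, prob_pct, impact in risks}
total_exposure = sum(exposures.values())

THRESHOLD = 60_000  # invented go/no-go limit for this project
decision = ("proceed to next spiral" if total_exposure < THRESHOLD
            else "re-plan or mitigate first")
print(exposures, total_exposure, decision)
```

Ranking risks by exposure also tells the team which mitigations to fund first in the next cycle.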

Advantages of the Spiral Model:

Effective for large and complex projects where risks and uncertainties are high.

Incorporates risk management as a central element, allowing for informed decision-making.

Supports flexibility and adaptability to changing requirements.

Promotes customer involvement and feedback throughout the development process.

Limitations of the Spiral Model:

Requires a significant level of expertise in risk assessment and management.

May involve higher development costs due to its iterative nature.

The potential for project scope creep or endless iteration if not properly
controlled.

Not suitable for small projects with well-defined requirements.

The Spiral Model is a robust approach for projects that require risk management,
flexibility, and a focus on iterative development. It is particularly useful in domains
where requirements are complex, evolving, or not well-understood initially. However,
it does require a disciplined approach to risk assessment and management to be
effective.

Software Requirements Analysis & Specifications:
1. Software Requirements Analysis:
Definition:

Software Requirements Analysis is the process of gathering, documenting, and understanding the needs and constraints of stakeholders to define what a software system should achieve and how it should function.

Key Activities:

1. Requirements Elicitation: This phase involves interacting with stakeholders, such as clients, users, and domain experts, to collect their needs and expectations. Techniques like interviews, surveys, and workshops are used.

2. Requirements Documentation: Capturing and recording requirements in a structured manner is crucial. This typically involves creating requirement documents that can take the form of textual descriptions, diagrams, or use cases.

3. Requirements Analysis: The collected requirements are analyzed to ensure that they are clear, complete, consistent, and feasible. Ambiguities and contradictions are addressed during this phase.

4. Requirements Validation: Validation involves ensuring that the requirements align with the overall goals of the project and are achievable within budget and time constraints.

5. Requirements Verification: Verification ensures that the requirements accurately represent the stakeholders' needs and are free from errors. It involves reviews and inspections.

Challenges:

Ambiguous or changing requirements, poor communication with stakeholders, and balancing competing needs can be challenging during requirements analysis.

2. Software Specifications:
Definition:

Software Specifications refer to the detailed documentation that translates the collected requirements into a precise and unambiguous description of the software's behavior, functionality, and constraints. Specifications serve as the basis for design and implementation.

Key Components:

1. Functional Specifications: These describe the functions, features, and interactions the software should provide. Use cases, flowcharts, and state diagrams are common tools for describing functionality.

2. Non-Functional Specifications: Non-functional specifications address qualities like performance, security, usability, and reliability. They define how the software should perform in various conditions.

3. User Interface (UI) Specifications: For software with a graphical user interface,
these specifications outline the layout, design, and behavior of the user interface
elements.

4. Data Specifications: Describes data structures, storage, and database requirements, including data types, relationships, and constraints.

5. Interface Specifications: In cases where the software interacts with other systems, these specify the data exchange formats and protocols.

Importance:

Clear and detailed specifications serve as a common reference point for all
stakeholders, including designers, developers, testers, and users. They help
ensure that the software is built as per the requirements and can be tested
effectively.

Documentation Standards:

Depending on the project and organization, various standards may be followed for documenting requirements and specifications, such as IEEE Std 830-1998 for software requirements specifications.

Tools:

Various software tools, such as requirements management software and modeling tools, can aid in documenting and managing requirements and specifications.

Traceability:

Traceability matrices are used to establish links between requirements, specifications, and design elements to ensure that all aspects of the software align with the original requirements.

Software Requirements Analysis and Specifications are critical phases in software development as they lay the foundation for the entire project. Clear, well-documented requirements and specifications are essential for building software that meets stakeholder needs and performs as intended.

Requirements Engineering:
Definition:

Requirements Engineering (RE) is a systematic and disciplined approach to elicit, document, analyze, validate, and manage software requirements. It is a crucial phase in software development that focuses on understanding and defining what a software system should do, how it should behave, and its constraints.

Key Activities in Requirements Engineering:

1. Elicitation: The process of collecting requirements from various stakeholders, including clients, end-users, domain experts, and project teams. Techniques like interviews, surveys, and workshops are used to elicit requirements.

2. Documentation: Capturing and recording requirements in a structured and
comprehensible format. Requirement documents can take the form of textual
descriptions, diagrams, or use cases.

3. Analysis: Analyzing requirements to ensure they are clear, complete, consistent, and feasible. This involves identifying ambiguities, contradictions, and omissions in the requirements.

4. Specification: Translating the gathered requirements into precise and unambiguous descriptions of the software's functionality, constraints, and behavior. This often includes functional and non-functional specifications.

5. Validation: Validating requirements to ensure that they align with the project's
goals, are achievable within budget and time constraints, and meet the needs of
stakeholders.

6. Verification: Verifying that the requirements accurately represent the stakeholders' needs and are free from errors. This typically involves reviews and inspections.

Importance of Requirements Engineering:

Foundation of Software Development: Well-defined requirements serve as the foundation for designing, developing, and testing software. They are the basis for the entire software development life cycle.

Alignment with Stakeholder Needs: Effective requirements engineering ensures that the software system aligns with the needs, expectations, and constraints of stakeholders.

Risk Management: Identifying and addressing issues and ambiguities in requirements during the early stages of development reduces the risk of costly errors and changes later in the project.

Communication: Requirements documents serve as a common reference point for all project stakeholders, promoting effective communication and understanding.

Change Management: Requirements engineering provides a structured process for handling changing requirements and scope throughout the project.

Quality Assurance: Clear and validated requirements contribute to the quality and reliability of the final software product.

Challenges in Requirements Engineering:

Ambiguity and Incompleteness: Requirements are often stated vaguely or
may not cover all necessary aspects.

Changing Requirements: Stakeholder needs can evolve over time, leading to changing requirements during the project.

Communication: Ensuring that all stakeholders have a shared understanding of requirements can be challenging.

Conflicting Requirements: Different stakeholders may have conflicting or competing requirements.

Managing Scope: Defining the project's scope and ensuring it does not expand
beyond the original intent can be complex.

Traceability:

Traceability matrices are used to establish and maintain links between requirements, specifications, and design elements to ensure alignment throughout the software development process.
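
The idea behind a traceability matrix can be sketched as a simple mapping; the requirement, design, and test-case IDs below are hypothetical, chosen only to illustrate the structure:

```python
# Minimal sketch of a requirements traceability matrix.
# All REQ/DES/TC identifiers are invented for illustration.
trace = {
    "REQ-001": {"design": ["DES-010"], "tests": ["TC-101", "TC-102"]},
    "REQ-002": {"design": ["DES-011"], "tests": []},  # not yet covered by tests
}

def untraced_requirements(matrix):
    """Return requirement IDs that have no linked test cases."""
    return [req for req, links in matrix.items() if not links["tests"]]

print(untraced_requirements(trace))  # ['REQ-002']
```

A check like this is what makes traceability useful in practice: gaps between requirements and downstream artifacts become mechanically detectable.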

Effective requirements engineering is critical for the success of software projects, as it ensures that software is developed to meet the needs of stakeholders, is of high quality, and can adapt to changing requirements.

Functional and Non-Functional Requirements:


In software engineering, requirements are typically categorized into two main types:
functional requirements and non-functional requirements. These categories help in
clearly defining what a software system should do and how it should perform.
1. Functional Requirements:
Definition:

Functional requirements specify the specific functions, features, capabilities, and interactions that a software system must provide. They define the behavior of the system under various conditions and outline what actions or processes the software should perform.

Characteristics of Functional Requirements:

What the System Does: Functional requirements describe what the system
does in response to specific inputs or under certain conditions.

Specific and Testable: They are typically specific, well-defined, and testable,
allowing for validation and verification.

User-Centric: Often, functional requirements focus on user interactions and
system behavior from the user's perspective.

Interactions and Use Cases: They often include use cases, scenarios, and user
stories that describe how the system functions in real-world situations.

Examples: Functional requirements might include actions like "user registration," "inventory management," "calculate total order cost," or "generate monthly reports."

2. Non-Functional Requirements:
Definition:

Non-functional requirements, sometimes referred to as quality attributes or constraints, define the characteristics and constraints of the software system other than its specific functionality. They describe how the system should perform, rather than what it should do.

Characteristics of Non-Functional Requirements:

How the System Performs: Non-functional requirements focus on aspects like performance, reliability, usability, security, and scalability.

Qualities and Constraints: They define the qualities or constraints that the
software must adhere to, such as response times, data storage, and security
measures.

Cross-Cutting Concerns: Non-functional requirements often affect multiple parts of the system and cut across various functional areas.

Examples: Non-functional requirements might include aspects like "response time should be less than 2 seconds," "the system should be available 24/7," "data should be stored securely," or "the user interface should be user-friendly."

Examples of Non-Functional Requirements Categories:

1. Performance: Response time, throughput, and resource utilization, e.g., the system should handle 1000 concurrent users.

2. Reliability: Availability, fault tolerance, and error handling, e.g., the system
should have 99.9% uptime.

3. Usability: User interface design, accessibility, and user satisfaction, e.g., the
system should be intuitive for novice users.

4. Security: Authentication, authorization, and data protection, e.g., user
passwords must be stored securely.

5. Scalability: The system's ability to handle increased load or data, e.g., the
system should scale to accommodate ten times the current user base.

6. Compatibility: Compatibility with various devices, browsers, and operating systems, e.g., the system should work on the latest versions of major browsers.

7. Regulatory Compliance: Adherence to legal and industry-specific regulations, e.g., the system must comply with GDPR for data protection.

8. Maintainability: Ease of system maintenance and code changes, e.g., code should be well-documented for easy maintenance.

Importance:
Both functional and non-functional requirements are vital in ensuring that software
meets the needs and expectations of users, performs well, and complies with quality
and performance standards. Balancing and satisfying both types of requirements is
crucial for successful software development and user satisfaction.
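
One practical difference between the two types is that many non-functional requirements can be turned into automated checks. A minimal Python sketch, measuring a hypothetical "response time under 2 seconds" requirement (`handle_request` is a stand-in for the real operation being measured):

```python
import time

RESPONSE_TIME_LIMIT = 2.0  # seconds, taken from the stated requirement

def handle_request():
    """Placeholder for the real operation under test."""
    time.sleep(0.01)  # simulate a small amount of work
    return "ok"

def meets_response_time(operation, limit=RESPONSE_TIME_LIMIT):
    """Time one call to `operation` and compare against the limit."""
    start = time.perf_counter()
    operation()
    return (time.perf_counter() - start) < limit

print(meets_response_time(handle_request))  # True
```

Functional requirements, by contrast, are usually verified by asserting on the operation's output rather than on how it performs.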

User Requirements:
User requirements, also known as user needs or user stories, are a critical
component of software development. These requirements describe what the users,
stakeholders, or customers expect from a software system. User requirements are
typically expressed in non-technical language to ensure clear communication
between developers and end-users.
Key Characteristics of User Requirements:

1. User-Centric: User requirements are focused on the needs and expectations of the end-users or customers of the software. They represent the features and functions that users consider essential for the system to be valuable and effective.

2. Non-Technical Language: User requirements are expressed in plain, non-technical language, making them accessible to a wide audience. This helps bridge the communication gap between users and developers.

3. Functional and Non-Functional: User requirements can encompass both functional aspects (what the system should do) and non-functional aspects (how the system should perform). This includes features, user interactions, performance expectations, security requirements, and more.

4. User Stories: User requirements are often framed as user stories, which are
short, narrative descriptions that explain a specific user's need and the expected
outcome. User stories typically follow the "As a [user], I want [feature] so that
[benefit]" format.

Examples of User Requirements:

1. As a customer, I want to be able to browse products, add them to my cart, and complete the checkout process online so that I can easily make purchases from the e-commerce website.

2. As a student, I want the e-learning platform to provide interactive quizzes and assignments to help me assess my understanding of the course material.

3. As a financial analyst, I need the software to generate detailed financial reports, including income statements and balance sheets, to support my financial analysis and decision-making.

4. As a mobile app user, I expect the application to load within two seconds and
respond quickly to my interactions to provide a smooth and responsive user
experience.

5. As a healthcare provider, I require the system to comply with all relevant data
security and privacy regulations to safeguard patient information.

Importance of User Requirements:


User requirements play a central role in the software development process for
several reasons:

User-Centered Design: Focusing on user requirements ensures that the software is designed and developed with the end-users' needs and preferences in mind, leading to a user-friendly product.

Clear Communication: User requirements help facilitate clear communication between development teams and users, reducing misunderstandings and misalignment.

Validation: User requirements serve as a basis for validating the final product to
ensure it meets user expectations.

Prioritization: They help prioritize features and functions based on their importance to users, guiding the development process.

User Satisfaction: Meeting user requirements is crucial for user satisfaction, which, in turn, affects user adoption and the software's success.

Reducing Development Risk: A clear understanding of user needs helps
mitigate the risk of building features that users do not value or neglecting
essential functionality.

User requirements are a fundamental aspect of the requirements engineering process and are essential for creating software that fulfills user needs and delivers a positive user experience.

System Requirements:
System requirements, also known as technical requirements or software
requirements specifications, describe the technical and operational characteristics
that a software system must possess to meet the user requirements and function
effectively. These requirements provide guidance to the development and testing
teams on how to design, build, and maintain the software.
Key Characteristics of System Requirements:

1. Technical Details: System requirements delve into technical specifics, including hardware, software, networking, and data-related aspects.

2. Implementation Guidance: They provide guidance on how to implement the software, such as programming languages, frameworks, and databases to be used.

3. Performance Metrics: System requirements specify performance criteria, such as response times, scalability, and resource usage, that the software must meet.

4. Compatibility: These requirements outline the compatibility of the software with various operating systems, browsers, databases, and other software components.

5. Constraints: System requirements may include constraints, such as regulatory compliance, security standards, and data storage limitations.

6. Integration: They specify how the software will integrate with other systems or
components, if applicable.

Examples of System Requirements:

1. The system must run on Windows 10 and macOS 11 operating systems.

2. The software shall be built using Java and utilize the Spring framework for web
development.

3. The database management system must be PostgreSQL version 13.2.

4. The system should support a minimum of 500 concurrent users without a
significant degradation in performance.

5. Data backup must be performed every day at midnight and stored securely for a
minimum of one year.

6. The software should integrate with the company's single sign-on (SSO) system
for user authentication.

7. Security requirements: The system must adhere to industry standards, such as OWASP Top Ten, and employ encryption for sensitive data.

Importance of System Requirements:

System requirements serve several crucial functions in the software development process:

Technical Guidance: They guide developers in making technical decisions during the implementation of the software.

Performance Metrics: System requirements set performance expectations and help ensure the software performs adequately.

Interoperability: Compatibility requirements ensure that the software can work seamlessly with other systems and components.

Resource Planning: They assist in resource allocation and infrastructure planning.

Risk Mitigation: By addressing regulatory compliance and security standards, system requirements help mitigate risks related to legal and security issues.

Documentation: They provide a clear reference for the development and testing
teams and are essential for future maintenance and updates.

System requirements are integral to the development of a software system. They help bridge the gap between user needs and technical implementation, ensuring that the software is designed, built, and operated effectively and in alignment with user expectations.

Requirement Elicitation Techniques: FAST, QFD, and Use Case Approach
Requirement elicitation techniques are methods used to gather and capture user needs and system requirements effectively. Here, we'll discuss three popular techniques in detail: Function Analysis System Technique (FAST), Quality Function Deployment (QFD), and the Use Case Approach.
1. Function Analysis System Technique (FAST):
Definition:

FAST is a structured method used to decompose complex systems into smaller, more manageable parts, allowing for a comprehensive understanding of system functions and their relationships.

Key Concepts and Steps:

1. Identify Functions: Begin by identifying the primary functions of the system or process under analysis.

2. Functional Decomposition: Decompose each identified function into sub-functions, breaking them down into smaller, more detailed parts. This process continues hierarchically.

3. Hierarchical Diagram: Create a hierarchical diagram that visually represents the decomposition, showing the relationships between functions at different levels.

4. Linkages and Constraints: Identify linkages and constraints between functions. This helps in understanding dependencies and relationships.

5. Constraints Analysis: Examine any constraints, limitations, or requirements associated with specific functions.

Use Cases:

FAST is particularly useful in engineering, design, and systems analysis, helping teams gain a comprehensive understanding of system functions, dependencies, and constraints.
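
The hierarchical decomposition that FAST produces can be sketched as a nested structure. The system and function names below are invented purely for illustration:

```python
# Sketch of a FAST-style functional decomposition as a nested dict:
# each key is a function, each value holds its sub-functions.
fast_tree = {
    "Process customer order": {
        "Validate order": {
            "Check item availability": {},
            "Verify payment details": {},
        },
        "Fulfil order": {
            "Reserve stock": {},
            "Schedule shipment": {},
        },
    }
}

def list_functions(tree, level=0):
    """Flatten the hierarchy, recording each function with its depth."""
    out = []
    for name, subs in tree.items():
        out.append((level, name))
        out.extend(list_functions(subs, level + 1))
    return out

for depth, name in list_functions(fast_tree):
    print("  " * depth + name)
```

Printing the flattened list reproduces the hierarchical diagram in indented text form, which is often enough to review a decomposition with stakeholders.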

2. Quality Function Deployment (QFD):
Definition:

QFD is a structured approach used to translate customer needs and requirements into specific product or system features, ensuring that the final product aligns with customer expectations.

Key Concepts and Steps:

1. Gather Customer Needs: Begin by gathering and prioritizing customer needs and expectations through surveys, interviews, or other feedback mechanisms.

2. Create the House of Quality: This is a visual matrix that correlates customer
needs with specific product features, indicating the strength and nature of the
relationship.

3. Technical Requirements: Define technical requirements or product characteristics that are essential for fulfilling customer needs.

4. Prioritization: Prioritize technical requirements based on their importance in meeting customer needs.

5. Deployment Matrix: Create a deployment matrix to map how technical requirements are linked to specific design and manufacturing processes.

Use Cases:

QFD is commonly used in product development, particularly in industries where customer satisfaction is a key driver. It ensures that products or systems are designed with a clear focus on meeting customer expectations.
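
The core House of Quality calculation, in which each need-to-feature relationship is weighted by the customer need's importance, can be sketched as follows. The needs, features, and numbers are illustrative only; the 0/1/3/9 strength scale is one commonly used convention:

```python
# Customer needs with importance ratings (1-5); invented for illustration.
needs = {"easy checkout": 5, "fast page loads": 3}

# Relationship strength (0, 1, 3, or 9) between each need and each
# candidate technical feature.
relationships = {
    "easy checkout":   {"one-click pay": 9, "page caching": 1},
    "fast page loads": {"one-click pay": 0, "page caching": 9},
}

def feature_priorities(needs, relationships):
    """Score each feature by summing importance x relationship strength."""
    scores = {}
    for need, weight in needs.items():
        for feature, strength in relationships[need].items():
            scores[feature] = scores.get(feature, 0) + weight * strength
    return scores

print(feature_priorities(needs, relationships))
# {'one-click pay': 45, 'page caching': 32}
```

The resulting scores drive step 4 (Prioritization): here "one-click pay" would be ranked above "page caching".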

3. Use Case Approach:
Definition:

The Use Case Approach is a technique used to capture and describe the
interactions between an actor (usually a user) and a software system. Use cases
provide a clear understanding of system functionality from a user's perspective.

Key Concepts and Steps:

1. Identify Actors: Identify the different actors or users who interact with the
software system. Actors can be individuals, other systems, or entities.

2. Define Use Cases: Describe specific use cases, which are scenarios of
interactions between actors and the system. Each use case represents a
discrete piece of functionality.

3. Use Case Diagrams: Create use case diagrams to visualize the relationships
between actors and use cases.

4. Detail Scenarios: Write detailed descriptions of each use case, including the
steps involved, preconditions, postconditions, and any exceptions.

5. Validate and Refine: Use cases are reviewed and refined to ensure they
accurately represent user needs and system functionality.

Use Cases:

The Use Case Approach is a standard technique in software requirements
engineering. It provides a user-centric view of system functionality, making it a
valuable tool for software development teams.
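
A detailed use-case description of the kind produced in step 4 can be captured as a structured record. A minimal Python sketch, with the ATM content invented as a typical example:

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """One use case: actor, preconditions, main flow, postconditions."""
    name: str
    actor: str
    preconditions: list = field(default_factory=list)
    main_flow: list = field(default_factory=list)
    postconditions: list = field(default_factory=list)

withdraw = UseCase(
    name="Withdraw cash",
    actor="Bank customer",
    preconditions=["Card is valid", "Account has sufficient balance"],
    main_flow=[
        "Customer inserts card and enters PIN",
        "System authenticates the customer",
        "Customer selects amount",
        "System dispenses cash and updates the balance",
    ],
    postconditions=["Balance reduced by the withdrawn amount"],
)

print(f"{withdraw.name}: {len(withdraw.main_flow)} steps")  # Withdraw cash: 4 steps
```

Keeping use cases in a structured form like this makes them easy to review, count, and trace to tests later in the project.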

Each of these techniques offers a structured approach to requirement elicitation, ensuring that user needs and system requirements are thoroughly understood and effectively translated into the design and development process. The choice of technique depends on the project's specific needs and context.

Data Flow Diagrams


A Data Flow Diagram (DFD) is a traditional visual representation of the information
flows within a system. A neat and clear DFD can depict the right amount of the
system requirement graphically. It can be manual, automated, or a combination of
both.
It shows how data enters and leaves the system, what changes the information, and
where data is stored.
The objective of a DFD is to show the scope and boundaries of a system as a whole.
It may be used as a communication tool between a system analyst and any person
who plays a part in the order that acts as a starting point for redesigning a system.
The DFD is also called a data flow graph or bubble chart.
The following observations about DFDs are essential:

1. All names should be unique. This makes it easier to refer to elements in the
DFD.

2. Remember that a DFD is not a flowchart. Arrows in a flowchart represent the order of events; arrows in a DFD represent flowing data. A DFD does not involve any order of events.

3. Suppress logical decisions. If we ever have the urge to draw a diamond-shaped box in a DFD, suppress that urge! A diamond-shaped box is used in flowcharts to represent decision points with multiple exit paths, of which only one is taken. This implies an ordering of events, which makes no sense in a DFD.

4. Do not become bogged down with details. Defer error conditions and error
handling until the end of the analysis.

Standard symbols for DFDs are derived from the electric circuit diagram analysis and
are shown in fig:

Circle: A circle (bubble) shows a process that transforms data inputs into data
outputs.
Data Flow: A curved line shows the flow of data into or out of a process or data
store.
Data Store: A set of parallel lines shows a place for the collection of data items. A
data store indicates that the data is stored which can be used at a later stage or by
the other processes in a different order. The data store can have an element or group
of elements.
Source or Sink: Source or Sink is an external entity and acts as a source of system
inputs or sink of system outputs.

Levels in Data Flow Diagrams (DFD)


The DFD may be used to represent a system or software at any level of abstraction. In fact, DFDs may be partitioned into levels that represent increasing information flow and functional detail. Levels in DFD are numbered 0, 1, 2 or beyond. Here, we will see primarily three levels in the data flow diagram, which are: 0-level DFD, 1-level DFD, and 2-level DFD.
0-level DFD
It is also known as the fundamental system model or context diagram. It represents the entire software requirement as a single bubble with input and output data denoted by incoming and outgoing arrows. Then the system is decomposed and described as a DFD with multiple bubbles. Parts of the system represented by each of these bubbles are then decomposed and documented as more and more detailed DFDs. This process may be repeated at as many levels as necessary until the program at hand is well understood. It is essential to preserve the number of inputs and outputs between levels; this concept is called leveling by DeMarco. Thus, if bubble "A" has two inputs x1 and x2 and one output y, then the expanded DFD that represents "A" should have exactly two external inputs and one external output, as shown in fig:

The Level-0 DFD, also called the context diagram of the result management system, is shown in fig. As the bubbles are decomposed into less and less abstract bubbles, the corresponding data flow may also need to be decomposed.
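
DeMarco's leveling rule can be checked mechanically: the external inputs and outputs of a bubble must equal those of the DFD that expands it. A minimal sketch using the x1, x2, y example above:

```python
# Leveling (balancing) check: a bubble's external inputs/outputs must
# match those of the child DFD that decomposes it. Flow names follow
# the x1/x2/y example from the text.
parent_bubble = {"inputs": {"x1", "x2"}, "outputs": {"y"}}

# External flows of the child DFD that expands bubble "A".
child_dfd = {"inputs": {"x1", "x2"}, "outputs": {"y"}}

def is_balanced(parent, child):
    """True when the decomposition preserves all external data flows."""
    return (parent["inputs"] == child["inputs"]
            and parent["outputs"] == child["outputs"])

print(is_balanced(parent_bubble, child_dfd))  # True
```

If the child DFD dropped x2 or introduced a new external output, the check would fail, flagging an unbalanced decomposition.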

1-level DFD
In 1-level DFD, a context diagram is decomposed into multiple bubbles/processes. In this level, we highlight the main objectives of the system and break down the high-level process of the 0-level DFD into subprocesses.

2-Level DFD
2-level DFD goes one process deeper into parts of 1-level DFD. It can be used to
project or record the specific/necessary detail about the system's functioning.

Requirements Analysis Using Data Flow Diagrams
(DFD):
Data Flow Diagrams (DFDs) are a visual modeling technique used in software
engineering to represent the flow of data and processes within a system. They are
also a valuable tool for analyzing system requirements. Here's how you can perform
requirements analysis using DFDs:

1. Identify Key Stakeholders:

Before you start with DFDs, identify the key stakeholders who will be involved in
the requirements analysis. This typically includes end-users, business analysts,
and subject matter experts.

2. Gather Initial Requirements:

Begin by gathering the initial set of high-level requirements. These can be in the
form of user stories, business use cases, or textual descriptions of what the
system should do.

3. Create Context Diagram:

The first step in using DFDs is to create a context diagram. This diagram shows
the system as a single process or entity and its interactions with external entities
(e.g., users, other systems, data sources). This provides an overview of the
system's boundaries and external interfaces.

4. Decompose the Context Diagram:

Once you have the context diagram, you can start decomposing the system into
more detailed processes. Each process represents a specific function or task
within the system.

5. Identify Data Flows:

As you decompose processes, identify the data flows between them. Data flows
represent the transfer of data from one process to another. This helps in
understanding how data is shared and processed within the system.

6. Define Data Stores:

Data stores are repositories where data is stored within the system. Identify the
data stores and their relationships with processes and data flows.

7. Specify Data Transformations:

For each process, describe the data transformations that occur. What happens to
the data as it moves from input to output within a process? This helps in
understanding how data is processed or transformed.

8. Analyze Process Logic:

For each process, analyze the logic and rules governing it. What conditions
trigger the process? What are the expected outcomes? This analysis helps in
capturing detailed process requirements.

9. Identify Constraints and Rules:

DFDs can also capture constraints and business rules that apply to the system.
These can include validation rules, security requirements, and any other specific
constraints.

10. Verify and Validate Requirements:

Use the DFDs to verify and validate the requirements with stakeholders. This involves ensuring that the DFDs accurately represent the system and its behavior and that they align with the stakeholders' needs and expectations.

11. Document the Requirements:

Translate the information captured in the DFDs into a formal requirements document. This document should include a detailed description of the system's processes, data flows, data stores, transformations, and constraints.

12. Iterate and Refine:

Requirements analysis using DFDs is often an iterative process. You may need to refine the DFDs and requirements as you gain a deeper understanding of the system and as stakeholder feedback is incorporated.

Using Data Flow Diagrams for requirements analysis provides a visual
representation of how data and processes interact within a system. It helps in
uncovering requirements, understanding the system's behavior, and communicating
with stakeholders effectively. It's a valuable technique in the early stages of the
software development lifecycle.

Data Dictionary
Data Dictionary is a major component in the structured analysis model of the system. It lists all the data items appearing in a DFD. A data dictionary in Software Engineering means a file or a set of files that includes a database's metadata (records about other objects in the database), like data ownership, relationships of the data to other objects, and some other data.
Example of a data dictionary entry: GrossPay = regular pay + overtime pay
CASE tools are used to maintain the data dictionary, as they capture the data items appearing in a DFD automatically to generate the data dictionary.

Components of Data Dictionary:


In Software Engineering, the data dictionary contains the following information:

Name of the item: The primary name of the data item.

Aliases: Any alternative names used for the same item.

Description: A short description of what the data item represents.

Related data items: Relationships with other data items.

Range of values: The set of all possible values the item can take.
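
An entry holding the fields above can be sketched as a simple structure. The GrossPay composition follows the example earlier in this section; the other field values are invented for illustration:

```python
# Minimal sketch of a data dictionary keyed by item name. The GrossPay
# composition comes from the text; aliases/description/range are invented.
data_dictionary = {
    "GrossPay": {
        "aliases": ["TotalPay"],
        "description": "Employee pay before deductions",
        "composition": "regular pay + overtime pay",
        "related_items": ["regular pay", "overtime pay"],
        "range": "0.00 and above",
    }
}

def lookup(name):
    """Find an item by its primary name or one of its aliases."""
    for item, entry in data_dictionary.items():
        if name == item or name in entry["aliases"]:
            return entry
    return None

print(lookup("TotalPay")["composition"])  # regular pay + overtime pay
```

Resolving aliases back to a single entry is exactly what gives the team one standard definition per data item.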

Data Dictionary Notations tables :


The Notations used within the data dictionary are given in the table below as follows:

Notations Meaning

X = a+b X consists of data elements a and b.

X = [a/b] X consists of either data element a or b.

X = (a) X consists of an optional data element a.

X = y{a} X consists of y or more occurrences of data element a.

X = {a}z X consists of z or fewer occurrences of data element a.

X = y{a}z X consists of between y and z occurrences of data element a.

Features of Data Dictionary :


Here, we will discuss some features of the data dictionary as follows.

It helps in designing test cases and designing the software.

It is very important for creating an ordered list from a subset of the items list.

It is very important for creating an ordered list from a complete items list.

The data dictionary is also important to find the specific data item object from the
list.

Uses of Data Dictionary :


Here, we will discuss some use cases of the data dictionary as follows.

Used for creating the ordered list of data items

Used for creating the ordered list of a subset of the data items

Used for Designing and testing software in Software Engineering

Used for finding data items from a description in Software Engineering

Importance of Data Dictionary:

It provides developers with standard terminology for all data.

It helps developers recognize when different terms (aliases) refer to the same data.

It provides definitions for different data items.

Query handling is facilitated if a data dictionary is used in an RDBMS.

Advantages of Data Dictionary:

Consistency and Standardization: A data dictionary helps to ensure that all data
elements and attributes are consistently defined and named across the
organization, promoting standardization and consistency in data management
practices.

Data Quality: A data dictionary can help improve data quality by providing a
single source of truth for data definitions, allowing users to easily verify the
accuracy and completeness of data.

Data Integration: A data dictionary can facilitate data integration efforts by
providing a common language and framework for understanding data elements
and their relationships across different systems.

Improved Collaboration: A data dictionary can help promote collaboration
between business and technical teams by providing a shared understanding of
data definitions and structures, reducing misunderstandings and communication
gaps.

Improved Efficiency: A data dictionary can help improve efficiency by reducing
the time and effort required to define, document, and manage data elements and
attributes.

Disadvantages of Data Dictionary:

Implementation and Maintenance Costs: Implementing and maintaining a data
dictionary can be costly, requiring significant resources in terms of time, money,
and personnel.

Complexity: A data dictionary can be complex and difficult to manage,
particularly in large organizations with multiple systems and data sources.

Resistance to Change: Some stakeholders may be resistant to using a data
dictionary, either due to a lack of understanding or because they prefer to use
their own terminology or definitions.

Data Security: A data dictionary can contain sensitive information, and
therefore, proper security measures must be in place to ensure that unauthorized
users do not access or modify the data.

Data Governance: A data dictionary requires strong data governance practices
to ensure that data elements and attributes are managed effectively and
consistently across the organization.

Entity-Relationship (ER) Diagrams:


Entity-Relationship (ER) diagrams are visual representations used to model and
describe the structure of a database. They consist of entities (objects) and
relationships between them, helping in conceptualizing the data schema.
Key Components of an ER Diagram:

1. Entity: Represents a real-world object, such as a person, place, or thing. It
corresponds to a table in a relational database.

2. Attribute: Describes a property or characteristic of an entity and corresponds to
a column in a database table.

3. Relationship: Indicates how entities are related to each other. Relationships
show how data from one entity connects to data from another.

4. Cardinality: Specifies the number of instances of one entity that can be
associated with another entity.

Types of Relationships:

One-to-One (1:1): One instance in Entity A is associated with one instance in
Entity B.

One-to-Many (1:N): One instance in Entity A is associated with many instances
in Entity B.

Many-to-One (N:1): Many instances in Entity A are associated with one instance
in Entity B.

Many-to-Many (N:N): Many instances in Entity A are associated with many
instances in Entity B.
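As a rough sketch, a one-to-many relationship can be mirrored directly in code: one object on the "one" side holds a collection of objects on the "many" side. The Customer and Order entities below are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Order:                # the "many" side of the relationship
    order_id: int
    amount: float

@dataclass
class Customer:             # the "one" side of the relationship
    customer_id: int
    name: str
    orders: List[Order] = field(default_factory=list)  # 1:N - one customer, many orders

alice = Customer(customer_id=1, name="Alice")
alice.orders.append(Order(order_id=101, amount=250.0))
alice.orders.append(Order(order_id=102, amount=99.5))
print(len(alice.orders))    # 2 orders associated with one customer
```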

Benefits of ER Diagrams:

Visualization: Provides a visual representation of the database structure,
making it easier to understand and communicate.

Design Aid: Helps in designing the database schema, including tables,
attributes, and relationships.

Normalization: Supports the process of database normalization to reduce data
redundancy and improve data integrity.

Database Query Planning: Assists in understanding how data should be
queried and retrieved from the database.

Integration of Data Dictionaries and ER Diagrams:


Data dictionaries and ER diagrams can complement each other in database design
and system analysis. A data dictionary can provide detailed information about data
elements, while ER diagrams offer a visual representation of how these elements
and entities are related within the system. Together, they aid in creating a
comprehensive and well-documented data model, making it easier to design, build,
and manage databases and information systems.

Requirements Documentation:
Requirements documentation is a critical aspect of the software development
process. It involves capturing, organizing, and presenting detailed information about
the software's functional and non-functional requirements, as well as any additional
information necessary for understanding and implementing the project. Effective
requirements documentation is essential for ensuring that the software meets user
expectations, aligns with stakeholder needs, and serves as a reference throughout
the development and testing phases.
Key Elements of Requirements Documentation:

1. Introduction:

An introductory section provides an overview of the document, explaining its
purpose, scope, and intended audience.

2. Scope Statement:

The scope statement defines the boundaries of the project and specifies
what is included and excluded. It helps in managing project expectations.

3. Functional Requirements:

This section outlines the specific functions, features, and capabilities that the
software must provide. It includes detailed descriptions of how the system
should behave in response to various inputs or under specific conditions.

4. Non-Functional Requirements:

Non-functional requirements describe the quality attributes of the software,
such as performance, security, usability, scalability, and reliability. They
specify how the system should perform rather than what it should do.

5. User Requirements:

User requirements capture the needs and expectations of end-users. These
requirements often take the form of user stories or use cases, describing the
system from a user's perspective.

6. System Requirements:

System requirements provide technical details, including hardware and
software specifications, data structures, data storage, and compatibility
requirements. They guide the implementation and deployment of the
software.

7. Constraints and Assumptions:

Constraints and assumptions refer to factors that limit or impact the project.
This section outlines any restrictions or assumptions made during the
requirement-gathering process.

8. Use Cases or Scenarios:

If applicable, use cases or scenarios describe how the system functions in
real-world situations. They provide detailed interactions between users and
the software.

9. Data Models:

Data models may include Entity-Relationship Diagrams (ERDs), data
dictionaries, or schema descriptions to represent the structure of data in the
system.

10. Business Rules:

Business rules outline specific guidelines, regulations, or policies that the
software must adhere to. These rules help ensure that the system operates
in compliance with business or industry standards.

11. Dependencies:

Dependencies describe any relationships or interdependencies between
different requirements. Understanding these relationships is important for
managing changes and project impact.

12. Verification and Validation:

This section outlines the methods and criteria used to verify and validate the
requirements, ensuring that they are complete, accurate, and testable.

Importance of Requirements Documentation:

Clear Communication: Requirements documentation serves as a common
reference point for all project stakeholders, ensuring clear communication and
understanding of project goals.

Quality Assurance: Well-documented requirements contribute to the quality and
reliability of the final software product.

Change Management: Requirements documentation facilitates change
management by providing a baseline for tracking and evaluating changes.

Risk Management: Identifying and addressing issues in the documentation
early in the project reduces the risk of costly errors and changes later on.

Compliance and Legal Protection: In some industries, thorough requirements
documentation is essential for compliance with regulatory standards and for legal
protection.

Project Planning: Requirements documentation serves as a basis for project
planning, including resource allocation and project timelines.

Creating comprehensive and well-organized requirements documentation is a crucial
step in the software development process. It helps ensure that the software is built to
meet stakeholder needs, performs as intended, and can adapt to changing
requirements.

Software Requirements Specification (SRS) - Nature and Characteristics:
A Software Requirements Specification (SRS) is a comprehensive document that
serves as the foundation of a software development project. It outlines the functional
and non-functional requirements of the software, providing a clear and unambiguous
description of what the system should do and how it should perform. Here are the
key characteristics and the nature of an SRS:

1. Detailed and Comprehensive:

An SRS is a detailed document that leaves no room for ambiguity. It covers
all aspects of the software's functionality and performance, ensuring that
developers have a clear understanding of what is expected.

2. Functional and Non-Functional Requirements:

An SRS includes both functional requirements (what the system should do)
and non-functional requirements (how the system should perform). This
encompasses features, user interactions, performance criteria, security
requirements, and more.

3. Clear and Unambiguous:

The SRS uses clear and concise language to ensure that there is no room
for misinterpretation. Ambiguities and contradictions are eliminated during
the development of the document.

4. User-Centric:

The SRS focuses on meeting the needs and expectations of end-users and
stakeholders. It ensures that the software serves its intended purpose
effectively.

5. Traceability:

The SRS establishes traceability between requirements and their sources,
allowing for tracking and validation. This ensures that each requirement can
be linked back to user needs or business goals.

6. Structured Format:

SRS documents typically follow a structured format, including sections for
the introduction, scope, functional and non-functional requirements, use
cases, data models, and more. This format helps organize and present the
information systematically.

7. Feasibility Analysis:

The SRS often includes a feasibility analysis that examines whether the
project can be realistically completed within budget and time constraints.
This analysis may assess technical, operational, and economic feasibility.

8. Dependencies and Interactions:

The SRS identifies dependencies and interactions between different
requirements and components of the system. This helps in understanding
how changes in one part of the system can affect others.

9. Verification and Validation Criteria:

Verification and validation criteria are specified to ensure that each
requirement is testable and that there is a method to verify that it has been
met.

10. Change Control:

The SRS includes a change control process, describing how changes to
requirements will be managed and assessed for impact.

11. Legal and Regulatory Considerations:

In some industries, the SRS may include information related to legal and
regulatory compliance to ensure that the software adheres to relevant
standards and guidelines.

12. End-User Involvement:

The SRS may involve end-users and stakeholders in its development,
ensuring that their input and feedback are incorporated.

13. Evolutionary Document:

The SRS may evolve throughout the project as requirements change or new
insights are gained. It is important to maintain version control and document
changes carefully.

14. Alignment with Project Goals:

The SRS aligns with the overarching goals and objectives of the project,
ensuring that the software supports the business or organizational strategy.

15. Basis for Project Planning:

The SRS serves as the basis for project planning, including resource
allocation, scheduling, and budgeting.

16. Quality Assurance:

Ensuring the SRS is accurate and complete is essential for maintaining the
quality and reliability of the final software product.

The nature of an SRS is such that it provides a comprehensive and well-structured
foundation for software development. It is a dynamic document that evolves with the
project and serves as a critical reference for all project stakeholders, from developers
and testers to project managers and clients.

Requirement Management:

Requirement management is a critical process in software development and project
management. It involves the systematic and structured handling of requirements
throughout the project lifecycle. Effective requirement management ensures that
requirements are captured, documented, tracked, and maintained to meet the needs
and expectations of stakeholders and deliver a successful project. Here are the key
aspects and practices of requirement management:
1. Requirement Elicitation:

This is the process of gathering requirements from various stakeholders,
including users, customers, and subject matter experts. It often involves
techniques such as interviews, surveys, workshops, and observations to
understand user needs.

2. Requirement Analysis:

Once requirements are collected, they need to be analyzed for clarity,
consistency, and feasibility. This involves refining requirements and identifying
any potential conflicts or gaps.

3. Requirement Documentation:

Well-documented requirements are essential. A Software Requirements
Specification (SRS) is typically created to capture and describe requirements in
detail, ensuring that they are clear and unambiguous.

4. Requirement Prioritization:

Not all requirements are of equal importance. Prioritization helps in determining
which requirements are critical and should be addressed first. Techniques like
MoSCoW (Must have, Should have, Could have, Won't have) are often used.

5. Requirement Traceability:

Traceability ensures that each requirement is linked to its source and that there
is a mechanism to track changes and updates throughout the project.

6. Change Control:

Projects are dynamic, and requirements may change. A formal change control
process is established to evaluate and manage requested changes, assessing
their impact on the project scope, schedule, and budget.

7. Version Control:

Keeping track of different versions of requirements documents and ensuring that
all stakeholders are working with the latest version is crucial to avoid confusion
and errors.

8. Requirement Validation:

Requirements are validated to ensure that they are accurate, complete, and
aligned with stakeholder needs. Validation often involves reviews, inspections,
and walkthroughs.

9. Requirement Verification:

Verification ensures that the software developed meets the specified
requirements. This involves testing and quality assurance activities to confirm
that the requirements are correctly implemented.

10. Requirement Communication:

Effective communication of requirements is essential to ensure that all stakeholders
understand and are aligned with the project objectives. Various communication
channels and tools are used for this purpose.

11. Requirement Baselining:

Baseline requirements are the approved, unchanging set of requirements that serve
as the foundation for the project. Any changes must go through the change control
process.

12. Requirement Metrics:

Metrics and Key Performance Indicators (KPIs) are used to measure the progress
and quality of requirement management, providing insights into how well the project
is adhering to its requirements.

13. Requirement Reviews:

Periodic reviews of requirements are conducted to assess their relevance,
accuracy, and alignment with project objectives. These reviews help in identifying
and resolving issues early.

14. Requirement Tools:

Various software tools and platforms are available to support requirement
management, helping in documenting, tracking, and reporting on requirements
efficiently.

15. Requirement Alignment with Project Goals:

Every requirement must align with the overarching project goals and business or
organizational strategy.

16. Requirement Maintenance:

Requirement management doesn't end with project delivery. Requirements must be
maintained and updated as necessary to accommodate changes in the software or
evolving stakeholder needs.
Requirement management is an ongoing process that ensures that a project stays on
track, delivers what stakeholders expect, and manages change effectively. It is an
integral part of project management, quality assurance, and the software
development lifecycle.
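Two of the practices above, MoSCoW prioritization and requirement traceability, can be sketched together in code. All requirement IDs, names, and test-case IDs below are hypothetical.

```python
# Rank table for MoSCoW: "Must have" requirements sort first.
MOSCOW_RANK = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

# Hypothetical requirement records with priority and linked test cases.
requirements = [
    {"id": "REQ-003", "name": "Export report as PDF", "priority": "Could", "tests": ["TC-301"]},
    {"id": "REQ-001", "name": "User login", "priority": "Must", "tests": ["TC-101", "TC-102"]},
    {"id": "REQ-002", "name": "Password reset", "priority": "Should", "tests": []},
]

# Prioritization: order requirements by MoSCoW rank.
prioritized = sorted(requirements, key=lambda r: MOSCOW_RANK[r["priority"]])
print([r["id"] for r in prioritized])   # ['REQ-001', 'REQ-002', 'REQ-003']

# Traceability check: a requirement with no linked test case is a verification gap.
untested = [r["id"] for r in requirements if not r["tests"]]
print(untested)                         # ['REQ-002']
```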

The IEEE (Institute of Electrical and Electronics Engineers) has developed standards
for various aspects of software engineering, including the Software Requirements
Specification (SRS). The standard that pertains to SRS is IEEE Std 830-1998, titled
"IEEE Recommended Practice for Software Requirements Specifications."

IEEE Std 830-1998 - Recommended Practice for Software Requirements Specifications:
This IEEE standard provides guidelines and recommendations for creating software
requirements specifications. It is widely recognized and used in the software
engineering industry. Here are some key aspects of IEEE Std 830-1998:
1. Purpose:

The standard aims to establish a common framework for creating high-quality
SRS documents that effectively communicate the software requirements to all
project stakeholders.

2. Format and Structure:

IEEE Std 830-1998 outlines a specific format and structure for SRS documents,
including sections and subsections that should be included in the document. This
structured approach helps ensure consistency and completeness.

3. Content Guidelines:

The standard provides guidance on what should be included in each section of
the SRS. It covers topics such as system functionality, external interfaces,
performance requirements, design constraints, and more.

4. Language and Style:

IEEE Std 830-1998 recommends a clear and unambiguous language and style
to avoid misunderstandings and ambiguities in the requirements.

5. Requirements Attributes:

The standard suggests using attributes for each requirement to provide
additional information, such as the source of the requirement, its priority, and its
verification method.

6. Traceability:

IEEE Std 830-1998 emphasizes the importance of traceability, indicating that
each requirement should be traceable to its source and to the design and test
cases.

7. Appendices:

The standard allows for the inclusion of appendices, which can provide
supplementary information, such as data dictionaries, use case descriptions, and
diagrams.

8. Review and Verification:

IEEE Std 830-1998 recommends that the SRS undergo reviews and verification
to ensure its accuracy and completeness.

9. Change Control:

The standard suggests establishing a formal change control process to manage
changes to the requirements throughout the project.

10. Examples:

The standard provides examples and templates to help illustrate how to structure
and format an SRS document effectively.

11. References:

IEEE Std 830-1998 may reference other IEEE standards and guidelines that are
relevant to software requirements engineering.
It's important to note that standards may evolve over time, and there may be more
recent versions or related standards that update or complement IEEE Std 830-1998.
Therefore, it's advisable to check for the latest version of the standard and any
supplementary standards that may provide additional guidance on SRS creation and
management.
Following the IEEE Std 830-1998 or other relevant IEEE standards can help software
development teams create well-structured and comprehensive Software
Requirements Specifications, contributing to successful project outcomes and
effective communication with stakeholders.

Unit 2
Software Project Planning:
Software project planning is the process of defining the scope, objectives, and
approach for a software development project. It involves the creation of a detailed
plan that outlines the project's tasks, timelines, resource allocation, and budget.
Effective project planning is crucial for delivering software projects on time, within
budget, and meeting stakeholder expectations. Here are the key aspects and steps
involved in software project planning:
1. Project Initiation:

Define the project's purpose, objectives, and scope. Identify the key
stakeholders, project team members, and their roles. Determine the feasibility of
the project and its alignment with organizational goals.

2. Requirements Analysis:

Gather and analyze the project requirements, including functional and non-
functional requirements. Ensure a clear understanding of what the software
should achieve and the needs of the end-users.

3. Project Scope Definition:

Clearly define the scope of the project, specifying what is included and excluded.
This helps in managing expectations and avoiding scope creep.

4. Work Breakdown Structure (WBS):

Create a hierarchical breakdown of the project tasks and deliverables. This is
known as the Work Breakdown Structure (WBS) and helps in organizing and
planning project work.

5. Task Estimation:

Estimate the effort, time, and resources required for each task or activity. Use
estimation techniques like expert judgment, historical data, and parametric
modeling.

6. Resource Allocation:

Identify the required resources, including personnel, hardware, software, and
tools. Allocate resources based on task requirements and availability.

7. Project Scheduling:

Develop a project schedule that includes task sequences, dependencies, and
durations. Use project management software to create Gantt charts or other
scheduling tools.

8. Risk Assessment and Management:

Identify potential risks that may impact the project's success. Develop a risk
management plan to mitigate and manage these risks.

9. Quality Planning:

Define the quality standards and processes that will be followed throughout the
project to ensure the software meets quality requirements.

10. Budget Planning:

Develop a budget that includes cost estimates for resources, tools, and other
project-related expenses. Monitor and manage project expenditures.

11. Communication Plan:

Define a communication plan that outlines how project information will be
communicated to stakeholders, team members, and other parties involved in the
project.

12. Change Management:

Establish a change control process to handle requested changes to project scope,
requirements, or other project aspects. Evaluate changes for impact and approval.

13. Project Monitoring and Control:

Develop methods and metrics for monitoring project progress. Use project
management tools to track task completion, identify issues, and make necessary
adjustments.

14. Documentation:

Maintain project documentation, including project plans, status reports, meeting
minutes, and other relevant records. Ensure that project documents are well-
organized and accessible.

15. Stakeholder Engagement:

Engage with stakeholders through regular updates, meetings, and feedback
sessions to ensure that their expectations and concerns are addressed.

16. Project Closure:

Develop a closure plan for ending the project, including tasks like final testing,
documentation, knowledge transfer, and project evaluation. Celebrate project
achievements and capture lessons learned for future projects.

Effective software project planning is an iterative process, with adjustments and
refinements made as the project progresses. It's essential to maintain clear
communication and coordination among team members and stakeholders and to
adapt the plan as necessary to achieve project success. Project managers and
teams often use project management methodologies and tools to facilitate the
planning and execution of software projects.
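The WBS and task-estimation steps described above can be sketched as a nested structure whose leaf estimates roll up into the project total. The task names and person-day figures below are made up for illustration.

```python
# Hypothetical Work Breakdown Structure as nested dicts;
# leaves are effort estimates in person-days.
wbs = {
    "Requirements":   {"Elicitation": 5, "Documentation": 3},
    "Design":         {"Architecture": 4, "UI design": 6},
    "Implementation": {"Backend": 20, "Frontend": 15},
}

def total_effort(node):
    """Roll leaf estimates up through the hierarchy (bottom-up estimation)."""
    if isinstance(node, dict):
        return sum(total_effort(child) for child in node.values())
    return node

print(total_effort(wbs))  # 53 person-days for the whole project
```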

Project size estimation


Project size estimation is a crucial aspect of software engineering, as it helps in
planning and allocating resources for the project. Here are some of the popular
project size estimation techniques used in software engineering:
Expert Judgment: In this technique, a group of experts in the relevant field
estimates the project size based on their experience and expertise. This technique is
often used when there is limited information available about the project.
Analogous Estimation: This technique involves estimating the project size based
on the similarities between the current project and previously completed projects.
This technique is useful when historical data is available for similar projects.
Bottom-up Estimation: In this technique, the project is divided into smaller modules
or tasks, and each task is estimated separately. The estimates are then aggregated
to arrive at the overall project estimate.
Three-point Estimation: This technique involves estimating the project size using
three values: optimistic, pessimistic, and most likely. These values are then used to
calculate the expected project size using a formula such as the PERT formula.
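The PERT formula referred to above weights the most likely value four times as heavily as the two extremes: E = (O + 4M + P) / 6. A minimal sketch, with hypothetical size estimates in KLOC:

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Three-point (PERT) expected value: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical size estimates in KLOC
print(pert_estimate(8, 10, 18))  # (8 + 40 + 18) / 6 = 11.0
```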
Function Points: This technique involves estimating the project size based on the
functionality provided by the software. Function points consider factors such as
inputs, outputs, inquiries, and files to arrive at the project size estimate.
Use Case Points: This technique involves estimating the project size based on the
number of use cases that the software must support. Use case points consider
factors such as the complexity of each use case, the number of actors involved, and
the number of use cases.
Each of these techniques has its strengths and weaknesses, and the choice of
technique depends on various factors such as the project’s complexity, available
data, and the expertise of the team.
Estimation of the size of the software is an essential part of Software Project
Management. It helps the project manager to further predict the effort and time which
will be needed to build the project. Various measures are used in project size
estimation. Some of these are:

Lines of Code

Number of entities in ER diagram

Total number of processes in detailed data flow diagram

Function points

1. Lines of Code (LOC): As the name suggests, LOC counts the total number of
lines of source code in a project. The units of LOC are:

KLOC- Thousand lines of code

NLOC- Non-comment lines of code

KDSI- Thousands of delivered source instruction

The size is estimated by comparing it with the existing systems of the same kind. The
experts use it to predict the required size of various components of software and then
add them to get the total size.
It’s tough to estimate LOC by analyzing the problem definition. Only after the whole
code has been developed can accurate LOC be estimated. This statistic is of little
utility to project managers because project planning must be completed before
development activity can begin.
Two separate source files having a similar number of lines may not require the same
effort. A file with complicated logic would take longer to create than one with simple
logic. Proper estimation may not be attainable based on LOC.
LOC for the same problem will also differ greatly from one programmer to the next. A
seasoned programmer can write the same logic in fewer lines than a newbie coder.
Advantages:

Universally accepted, used by people throughout the world, and applied in many
models like COCOMO.

Estimation is closer to the developer's perspective.

At project completion, LOC is easily quantified.

It has a specific connection to the result.

Simple to use.

Disadvantages:

Different programming languages require a different number of lines for the same functionality.

No proper industry standard exists for this technique.

It is difficult to estimate the size using this technique in the early stages of the
project.

When platforms and languages are different, LOC cannot be used to normalize.
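The distinction between LOC and NLOC above can be sketched with a toy line counter. This handles only Python-style `#` line comments; real counting tools also deal with block comments, strings, and other cases.

```python
def count_loc(source: str):
    """Return (LOC, NLOC): non-blank lines and non-comment non-blank lines."""
    lines = [ln.strip() for ln in source.splitlines()]
    loc = sum(1 for ln in lines if ln)                            # non-blank lines
    nloc = sum(1 for ln in lines if ln and not ln.startswith("#"))  # skip comments
    return loc, nloc

sample = """# compute area
width = 3
height = 4
# result
area = width * height
"""
print(count_loc(sample))  # (5, 3): 5 lines of code, 3 non-comment lines
```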

2. Number of entities in ER diagram: The ER model provides a static view of the
project. It describes the entities and their relationships. The number of entities in the
ER model can be used to estimate the size of the project, because more entities
need more classes/structures, thus leading to more coding.
Advantages:

Size estimation can be done during the initial stages of planning.

The number of entities is independent of the programming technologies used.

Disadvantages:

No fixed standards exist. Some entities contribute more to project size than
others.

Like FPA, it is rarely used directly in cost estimation models. Hence, it must be
converted to LOC.

3. Total number of processes in detailed data flow diagram: The Data Flow
Diagram (DFD) represents the functional view of software. The model depicts the
main processes/functions involved in the software and the flow of data between
them. The number of processes in the DFD can be used to predict software size:
already existing processes of a similar type are studied and used to estimate the size
of each process, and the sum of the estimated sizes of the processes gives the final
estimated size.
Advantages:

It is independent of the programming language.

Each major process can be decomposed into smaller processes. This will
increase the accuracy of the estimation.

Disadvantages:

Studying similar kinds of processes to estimate size takes additional time and
effort.

Not all software projects require the construction of a DFD.

4. Function Point Analysis: In this method, the number and type of functions
supported by the software are utilized to find FPC(function point count). The steps in
function point analysis are:

Count the number of functions of each proposed type.

Compute the Unadjusted Function Points(UFP).

Find the Total Degree of Influence(TDI).

Compute Value Adjustment Factor(VAF).

Find the Function Point Count(FPC).

The explanation of the above points is given below:

Count the number of functions of each proposed type: Find the number of
functions belonging to the following types:

External Inputs: Functions related to data entering the system.

External outputs: Functions related to data exiting the system.

External Inquiries: They lead to data retrieval from the system but don’t
change the system.

Internal Files: Logical files maintained within the system. Log files are not
included here.

External interface Files: These are logical files for other applications which
are used by our system.

Compute the Unadjusted Function Points (UFP): Categorise each of the five function types as simple, average, or complex based on its complexity. Multiply the count of each function type by its weighting factor and find the weighted sum. The weighting factors for each type, based on complexity, are as follows:

Function type               Simple   Average   Complex
External Inputs                3        4         6
External Outputs               4        5         7
External Inquiries             3        4         6
Internal Logical Files         7       10        15
External Interface Files       5        7        10

Find the Total Degree of Influence (TDI): Use the '14 general characteristics' of a system to find the degree of influence of each of them. The sum of all 14 degrees of influence gives the TDI; the range of TDI is 0 to 70. The 14 general characteristics are: Data Communications, Distributed Data Processing, Performance, Heavily Used Configuration, Transaction Rate, On-Line Data Entry, End-User Efficiency, Online Update, Complex Processing, Reusability, Installation Ease, Operational Ease, Multiple Sites, and Facilitate Change. Each characteristic is evaluated on a scale of 0-5.

Compute the Value Adjustment Factor (VAF): Use the following formula to calculate VAF:

VAF = (TDI * 0.01) + 0.65

Find the Function Point Count (FPC): Use the following formula to calculate FPC:

FPC = UFP * VAF
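The steps above can be sketched as a small calculator. The weights reproduce the table in this section; the example function counts and influence ratings are made up for illustration:

```python
# Sketch of the FPA computation. WEIGHTS reproduces the weighting table
# in this section; the example counts and ratings are illustrative.

WEIGHTS = {
    "EI":  {"simple": 3, "average": 4,  "complex": 6},   # External Inputs
    "EO":  {"simple": 4, "average": 5,  "complex": 7},   # External Outputs
    "EQ":  {"simple": 3, "average": 4,  "complex": 6},   # External Inquiries
    "ILF": {"simple": 7, "average": 10, "complex": 15},  # Internal Logical Files
    "EIF": {"simple": 5, "average": 7,  "complex": 10},  # External Interface Files
}

def function_point_count(counts, degrees_of_influence):
    """counts: {(function_type, complexity): n}
    degrees_of_influence: the 14 ratings, each on a 0-5 scale."""
    ufp = sum(WEIGHTS[ftype][cx] * n for (ftype, cx), n in counts.items())
    tdi = sum(degrees_of_influence)   # 0 .. 70
    vaf = (tdi * 0.01) + 0.65         # 0.65 .. 1.35
    return ufp * vaf

fpc = function_point_count(
    {("EI", "simple"): 5, ("EO", "average"): 4, ("ILF", "complex"): 2},
    [3] * 14,                         # TDI = 42, so VAF = 1.07
)
# UFP = 5*3 + 4*5 + 2*15 = 65, FPC ≈ 69.55
```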

Advantages:

It can be easily used in the early stages of project planning.

It is independent of the programming language.

It can be used to compare different projects even if they use different technologies (database, language, etc.).

Disadvantages:

It is not good for real-time systems and embedded systems.

Many cost estimation models like COCOMO use LOC and hence FPC must be
converted to LOC.

Size Estimation in Software Development: Lines of Code and Function Count:
Size estimation in software development is a critical process that helps in assessing
the scale and complexity of a project. It involves quantifying the size of the software
in terms of lines of code (LOC) and function points (FP). These metrics are valuable
for project planning, resource allocation, cost estimation, and measuring productivity.
Here's an overview of both size estimation methods:

1. Lines of Code (LOC):
Definition: Lines of code (LOC) is a metric that measures the number of lines or
statements in the source code of a software application. It's a simple and widely used
method for estimating the size of a software project.
Pros:

Simplicity: It's straightforward and easy to understand.

Widely Accepted: LOC is a standard metric and is widely accepted in the software development industry.

Useful for Cost Estimation: It can be used to estimate project costs and
schedule.

Cons:

Not Always Accurate: LOC may not accurately represent the complexity or
functionality of the software.

Dependent on Coding Style: The number of lines of code can vary based on
the coding style, making it subjective.
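As a concrete illustration of that coding-style caveat, a naive LOC counter might skip blank and comment lines; exactly what counts as a "line of code" is a convention that has to be fixed up front:

```python
# A naive LOC counter sketch: counts non-blank, non-comment lines of
# Python source. Different counting conventions give different numbers,
# which is exactly the subjectivity noted above.

def count_loc(source: str) -> int:
    loc = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            loc += 1
    return loc

sample = """\
# a comment
x = 1

y = x + 1
"""
# count_loc(sample) == 2
```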

2. Function Points (FP):


Definition: Function points (FP) are a standardized measure of the functionality
provided by a software application. They assess the complexity of the software
based on user interactions, data inputs and outputs, and system functionality.
Pros:

Objective Measurement: FP provides an objective measurement of the software's functionality, making it less dependent on coding style.

Effective for Comparisons: It's useful for comparing the complexity and size of
different software projects.

Supports Project Planning: FP can be used for estimating effort, resources, and project duration.

Cons:

Complex Calculation: Calculating function points can be more complex compared to LOC.

Requires Expertise: It often requires expertise to accurately identify and classify user interactions and data elements.

Key Steps in Estimating Size with Function Points:

1. Identify User Inputs: Determine the types and quantity of user inputs (external
inputs, external outputs, and external inquiries).

2. Identify User Outputs: Identify the types and quantity of user outputs.

3. Identify User Inquiries: Identify user inquiries and their types.

4. Identify Logical Files: Determine the logical files used by the application.

5. Identify External Interfaces: Consider any external interfaces that the application uses.

6. Calculate Function Points: Calculate the function points based on the identified
elements and their weights according to the Function Point Analysis guidelines.

Both LOC and FP have their merits and are often used in conjunction for more
accurate size estimation. The choice of which method to use may depend on project
characteristics and goals. FP is particularly useful for assessing the functionality of a
software system, while LOC is more closely related to coding effort and project
management. Accurate size estimation is crucial for effective project planning, cost
estimation, and resource allocation in software development.

Cost Estimation Models in Software Development:


Cost estimation is a critical aspect of software development project management.
Accurate cost estimation helps in budgeting, resource allocation, and project
planning. Several models and techniques are used for cost estimation in the software
development process. Here are some of the most widely recognized cost estimation
models:

1. COCOMO (Constructive Cost Model):

COCOMO is a widely used software cost estimation model developed by Barry Boehm. It comes in three variants: Basic COCOMO, Intermediate COCOMO, and Detailed COCOMO. These models consider factors such as project size, complexity, and the development environment to estimate effort and cost.

2. Function Point Analysis (FPA):

Function Point Analysis estimates the size of a software project based on the
functionality it provides to users. The function points are then converted into
effort and cost estimates using established conversion factors.

3. Use Case Points (UCP):

UCP is a method that estimates software development effort based on the number and complexity of use cases in the project. It considers factors like actors, use cases, and transactions.

4. Estimation by Analogy:

This method involves comparing the current project with previous similar
projects and using historical data to estimate costs. It's based on the
assumption that past project experiences can be applied to the current
project.

5. PERT (Program Evaluation and Review Technique):

PERT is a technique that uses a three-point estimation approach, incorporating optimistic, most likely, and pessimistic estimates to calculate an expected value. It's often used for estimating project durations, which can be translated into cost estimates.

6. Expert Judgment:

Expert judgment involves seeking input and insights from experienced individuals or teams who have expertise in software development. Experts assess the project's requirements, complexity, and other factors to estimate costs.

7. Parametric Models:

Parametric models use mathematical formulas and historical data to estimate project costs. One example is the Putnam Model, which considers factors like project size, productivity, and complexity to estimate effort and cost.

8. Bottom-Up Estimation:

This method involves breaking the project into smaller components, estimating the cost of each component, and then aggregating the costs to derive the overall project cost. It's particularly useful for detailed cost estimation.

9. Top-Down Estimation:

Top-down estimation starts with an overall project estimate and then breaks
it down into smaller components. It's useful for high-level cost estimation
before detailed project planning.

10. Expert Estimation with Delphi Technique:

The Delphi Technique involves gathering estimates from experts and using a
systematic approach to achieve consensus on project cost estimates. It's
often used in situations where there is uncertainty or limited historical data.

11. Monte Carlo Simulation:

Monte Carlo simulation involves running multiple simulations to estimate project costs. It takes into account uncertainty and variability in project parameters to produce a range of possible cost outcomes.

12. Machine Learning Models:

Machine learning can be applied to historical project data to create predictive models for cost estimation. These models use features like project size, complexity, and resource availability to predict costs.
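Two of the techniques above, PERT three-point estimation and Monte Carlo simulation, can be sketched briefly. The task estimates below are purely illustrative:

```python
# Sketch of PERT three-point estimation and a Monte Carlo cost
# simulation. Task triples are (optimistic, most likely, pessimistic)
# person-month estimates; all figures are illustrative.
import random

def pert_expected(optimistic, most_likely, pessimistic):
    """PERT expected value: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def monte_carlo_cost(tasks, runs=10_000, seed=42):
    """Samples each task from a triangular distribution and returns the
    mean and 90th-percentile total effort across all runs."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(o, p, m) for o, m, p in tasks)
        for _ in range(runs)
    )
    mean = sum(totals) / runs
    p90 = totals[int(0.9 * runs)]
    return mean, p90

tasks = [(2, 4, 8), (1, 2, 4), (5, 8, 14)]
expected = sum(pert_expected(*t) for t in tasks)   # PERT total ≈ 15.0
mean, p90 = monte_carlo_cost(tasks)                # p90 gives a contingency margin
```

The 90th-percentile figure is what contingency planning typically budgets against, rather than the mean.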

The choice of a cost estimation model or technique depends on the project's nature,
available data, and the level of detail required. It's common to use multiple models
and compare their estimates to ensure accuracy. Additionally, ongoing monitoring
and refinement of cost estimates are essential as the project progresses and more
information becomes available.

COCOMO (Constructive Cost Model):


COCOMO, which stands for "Constructive Cost Model," is a widely recognized and
influential software cost estimation model developed by Dr. Barry Boehm in the late
1970s. COCOMO is designed to estimate the effort, time, and cost required for
software development projects. It has evolved into several versions, with the two
most prominent ones being Basic COCOMO and Intermediate COCOMO.
1. Basic COCOMO:
Basic COCOMO is a simple and early version of the model. It estimates project effort
based on lines of code (LOC) and project type. It considers three modes, each with a
different level of complexity:

Organic Mode: Suitable for relatively small and straightforward projects with
experienced developers. Effort is primarily based on LOC.

Semi-Detached Mode: Suitable for projects with moderate complexity. LOC, as well as some level of innovation and complexity, are considered.

Embedded Mode: Suitable for large and complex projects, often involving real-time or mission-critical systems. LOC, innovation, complexity, and other factors are considered.

2. Intermediate COCOMO:
Intermediate COCOMO is a more detailed version of the model. It provides a
framework for estimating effort, project duration, and cost based on a range of cost
drivers and factors, including project attributes, product attributes, hardware
attributes, and personnel attributes. The formula for Intermediate COCOMO is:
Effort (E) = a * (KLOC)^b * EAF
Where:

E is the effort in person-months.

a and b are constants specific to the project type.

KLOC is the estimated size of the software in thousands of lines of code.

EAF (Effort Adjustment Factor) accounts for various project-specific and environmental factors that can impact effort.

Intermediate COCOMO allows for a more nuanced and tailored estimation, taking
into account the specific characteristics of the project. It considers factors such as
personnel capability, development flexibility, and the use of modern tools and
techniques.
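The Basic and Intermediate effort formulas can be sketched using the commonly published coefficients for the three modes; the 32-KLOC size and the EAF of 0.9 below are illustrative values, not recommendations:

```python
# Sketch of Basic and Intermediate COCOMO using the commonly published
# (a, b) coefficients per mode. Example inputs are illustrative.

BASIC = {            # effort = a * KLOC ** b   (person-months)
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}
INTERMEDIATE = {     # effort = a * KLOC ** b * EAF
    "organic":       (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (2.8, 1.20),
}

def basic_effort(kloc, mode):
    a, b = BASIC[mode]
    return a * kloc ** b

def intermediate_effort(kloc, mode, eaf=1.0):
    a, b = INTERMEDIATE[mode]
    return a * kloc ** b * eaf

e_basic = basic_effort(32, "organic")              # ≈ 91 person-months
e_inter = intermediate_effort(32, "organic", 0.9)  # ≈ 110 person-months
```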
3. Detailed COCOMO:
Detailed COCOMO is the most comprehensive version of the model, and it offers a
highly detailed estimation process. It takes into account additional factors like
software reuse, documentation, and quality control. This version of COCOMO is
particularly suitable for very large and complex projects.
Key Advantages of COCOMO:

It provides a structured and systematic approach to software cost estimation.

It allows for tailoring the estimation process to project-specific characteristics.

It offers a range of cost drivers and parameters for a more accurate estimation.

Key Limitations of COCOMO:

It relies heavily on lines of code (LOC) as a primary input, which may not
accurately capture the complexity and functionality of modern software.

It may require a significant amount of data and expertise to make accurate


estimates.

COCOMO estimates are based on historical data, and the accuracy of the model
depends on the relevance of that data to the current project.

COCOMO remains a valuable tool for initial software project cost estimation and
serves as a foundation for more advanced models and techniques in the field of
software engineering. It can be particularly useful for comparing different project
scenarios and making informed decisions about project planning and resource
allocation.

Putnam Resource Allocation Model


The Putnam Resource Allocation Model, developed by Lawrence H. Putnam in the
1970s, is a widely used software cost estimation model. It focuses on estimating the
amount of effort, time, and resources required for a software development project.
The model is particularly suited for large and complex projects. The Putnam model
takes a different approach compared to other models like COCOMO, emphasizing
the relationship between project size, effort, and the number of resources applied.
Key Components of the Putnam Resource Allocation Model:

1. Project Size (S): The size of the software project is a critical factor in the
Putnam model. It is often measured in thousands of source lines of code (KLOC)
or function points (FP), depending on the context of the project.

2. Effort per Size (E/S): The Putnam model assumes that the effort required to
complete a project is proportional to its size. This factor represents the effort
required for each unit of project size (e.g., person-months per KLOC).

3. Productivity (P): Productivity is defined as the reciprocal of Effort per Size (1 / (E/S)). It indicates how many lines of code (or function points) can be produced per person-month.

4. Resource Constraint (R): The resource constraint represents the availability of resources, primarily in terms of person-months. It reflects the maximum number of person-months that can be allocated to the project.

Key Formulas in the Putnam Model:

1. Effort (E): The effort required for the project is calculated as follows:
E = S / P
Where:

E is the effort in person-months.

S is the project size.

P is the productivity.

2. Schedule (T): The project schedule is estimated by dividing the effort by the
number of available resources:
T = E / R
Where:

T is the schedule in months.

E is the effort.

R is the resource constraint.
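The two formulas can be sketched directly. Here productivity P is taken as KLOC per person-month and R as team size (persons), which is one common reading of the resource constraint; all figures are illustrative:

```python
# Sketch of the Putnam-style effort and schedule formulas above.
# P is assumed to be KLOC per person-month and R the team size.

def putnam_effort(size_kloc, productivity):
    """E = S / P"""
    return size_kloc / productivity

def putnam_schedule(effort, resources):
    """T = E / R"""
    return effort / resources

effort = putnam_effort(100, 0.5)      # 100 KLOC at 0.5 KLOC/person-month
months = putnam_schedule(effort, 10)  # a team of 10
# effort == 200.0 person-months, months == 20.0
```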

Advantages of the Putnam Resource Allocation Model:

Focuses on the relationship between project size, effort, and productivity.

Provides a simple and intuitive approach to cost estimation.

Suitable for large and complex projects where resource allocation is a critical
factor.

Limitations of the Putnam Model:

Assumes a linear relationship between size, effort, and productivity, which may
not hold true for all types of projects.

Does not account for variations in productivity that can occur during different
project phases.

The model may require extensive historical data to accurately estimate productivity and effort.

The Putnam Resource Allocation Model is a valuable tool for estimating the effort
and schedule for software development projects, particularly for projects with well-
established productivity rates and resource constraints. However, as with any
estimation model, its accuracy depends on the quality of the data and the
applicability of its assumptions to the specific project at hand.

Validating Software Estimates:


Software estimates are crucial for project planning, resource allocation, and budgeting. Validating these estimates is essential to ensure that they are accurate and reliable. Validating estimates helps in managing expectations, reducing the risk of project overruns, and ensuring the successful completion of software projects. Here are some key methods and best practices for validating software estimates:

1. Historical Data Analysis:

Compare current project estimates with historical data from similar projects.
This analysis can reveal patterns and trends that help in validating the
accuracy of the estimates.

2. Expert Judgment:

Seek input and validation from experienced professionals, including project managers, developers, and subject matter experts. Their insights can help assess the feasibility and accuracy of the estimates.

3. Peer Review:

Conduct peer reviews of the estimates. Bringing in team members or stakeholders to review and challenge the estimates can help identify potential issues and provide alternative perspectives.

4. Prototyping:

In some cases, creating a prototype or proof of concept can help validate certain aspects of the estimates, especially in terms of functionality, complexity, and technical feasibility.

5. Benchmarking:

Compare the estimates to industry benchmarks or standards. Benchmarking can help gauge the reasonableness of the estimates in relation to industry norms.

6. Analogous Estimation:

Compare the current project with past projects that are similar in nature.
Analogous estimation involves adjusting past project data to account for
differences and validating the current estimates.

7. Use of Estimation Tools:

Utilize software estimation tools and techniques, such as COCOMO, Function Point Analysis, or machine learning models. These tools can provide quantitative data to validate the estimates.

8. Simulation and Monte Carlo Analysis:

Employ simulation techniques to test the sensitivity of the estimates to
various parameters and uncertainties. Monte Carlo analysis, in particular,
can provide a range of possible outcomes to assess the reliability of the
estimates.

9. Contingency Planning:

Develop contingency plans that account for potential deviations from the
estimates. This proactive approach helps in managing risks and mitigating
the impact of uncertainties.

10. Progress Tracking:

As the project progresses, track actual effort, costs, and schedule against
the initial estimates. Continuous monitoring and adjustment help in validating
and refining the estimates.

11. Independent Estimation:

Consider having an independent third party or team provide their own estimates for the project. Comparing these estimates with the in-house estimates can reveal discrepancies and potential areas for validation.

12. Stakeholder Involvement:

Engage stakeholders throughout the project to validate their expectations and ensure that the estimates align with their needs and objectives.

13. Lessons Learned:

Document lessons learned from past projects, especially if they deviated significantly from their estimates. Use these lessons to improve future estimating practices.

14. Documentation and Transparency:

Maintain clear and transparent documentation of the estimating process, assumptions, and constraints. This documentation facilitates validation and understanding.

15. Regular Reviews:

Conduct regular reviews of the estimates at key project milestones to validate and update them based on evolving project conditions and insights.
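Progress tracking (point 10) can be reduced to a simple variance check at each milestone. The threshold and effort figures below are illustrative:

```python
# Sketch of estimate-vs-actual tracking: compute the percentage
# variance per milestone and flag the ones that exceed a tolerance.
# Threshold and figures are illustrative.

def variance_pct(estimated, actual):
    return (actual - estimated) / estimated * 100

def review_milestones(milestones, threshold_pct=15.0):
    """milestones: [(name, estimated_pm, actual_pm)] -> names over threshold."""
    return [
        name for name, est, act in milestones
        if abs(variance_pct(est, act)) > threshold_pct
    ]

flagged = review_milestones([
    ("requirements", 10, 11),   # +10% -> within tolerance
    ("design",       20, 26),   # +30% -> flagged for re-estimation
    ("coding",       40, 38),   # -5%  -> within tolerance
])
# flagged == ["design"]
```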

Validating software estimates is an ongoing process that involves continuous monitoring, adjustment, and stakeholder communication. It is essential for maintaining project control, managing risks, and ensuring successful project delivery.

Risk Management in Software Development:


Risk management is a critical component of software development, aimed at
identifying, assessing, mitigating, and monitoring potential risks that could impact a
project's success. Effective risk management helps minimize unexpected issues,
cost overruns, and project delays. Here are the key aspects and steps involved in
risk management for software development:

1. Risk Identification:

Identify potential risks that may affect the project. Risks can be technical
(e.g., software bugs), external (e.g., changes in requirements), or operational
(e.g., resource constraints).

2. Risk Assessment:

Evaluate the potential impact and probability of each identified risk. High-
impact, high-probability risks require more attention than low-impact, low-
probability risks.

3. Risk Prioritization:

Prioritize risks based on their severity and the level of impact they may have
on the project. This helps in allocating resources and focus to the most
critical risks.

4. Risk Mitigation Planning:

Develop mitigation plans for the high-priority risks. These plans should
outline specific actions to reduce the likelihood or impact of the risks.
Mitigation strategies can include code reviews, testing, contingency planning,
and resource allocation.

5. Risk Monitoring and Tracking:

Continuously monitor the identified risks and their status throughout the
project's lifecycle. Regularly review the effectiveness of mitigation measures
and adjust them as necessary.

6. Risk Response Planning:

For risks that cannot be completely mitigated, develop response plans to manage the consequences. Response plans can include contingency planning, crisis management, and communication strategies.

7. Risk Documentation:

Maintain a risk register or risk log that documents each identified risk, its
assessment, mitigation plans, and tracking information. This serves as a
reference for the project team.

8. Communication:

Effective communication is essential in risk management. Ensure that all stakeholders are aware of potential risks, mitigation plans, and response strategies. Transparent communication helps in managing expectations.

9. Risk Reviews:

Conduct periodic risk reviews to reassess the project's risk landscape. New
risks may emerge, and the significance of existing risks may change as the
project progresses.

10. Contingency Planning:

Develop contingency plans for high-impact risks. Contingency plans define actions that will be taken if a risk materializes, ensuring that the project can continue without severe disruption.

11. Risk Ownership:

Assign ownership of specific risks to responsible individuals or teams. This ensures accountability and clear lines of responsibility for managing risks.

12. Risk Reporting:

Provide regular updates on the status of risks to project stakeholders. Reports should include information on risk assessment, mitigation progress, and any changes in risk profiles.

13. Lessons Learned:

After project completion, conduct a lessons-learned review to analyze how risks were managed and to capture insights for future projects.

14. Change Control:

Any proposed changes to the project, such as scope changes or schedule adjustments, should go through a formal change control process. This process assesses the potential impact of changes on project risks.
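The identification, assessment, and prioritization steps are often implemented as a risk register. A minimal sketch, with entirely illustrative risks and ratings:

```python
# A minimal risk-register sketch covering identification, assessment,
# and prioritization (steps 1-3), with risk ownership (step 11).
# All risks, probabilities, and impact ratings are illustrative.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float   # 0.0 .. 1.0
    impact: int          # 1 (low) .. 5 (severe)
    owner: str           # risk ownership

    @property
    def exposure(self) -> float:
        return self.probability * self.impact

def prioritize(register):
    """Highest exposure first, so mitigation planning starts at the top."""
    return sorted(register, key=lambda r: r.exposure, reverse=True)

register = [
    Risk("requirements churn", 0.6, 4, "product owner"),
    Risk("key developer leaves", 0.2, 5, "project manager"),
    Risk("third-party API changes", 0.3, 3, "tech lead"),
]
top = prioritize(register)[0]
# top.name == "requirements churn" (exposure 2.4)
```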

Effective risk management is an iterative and ongoing process that adapts to the evolving nature of software development projects. It is a proactive approach that can significantly contribute to project success by reducing the likelihood of negative outcomes and enhancing project predictability.

Software Design: Cohesion and Coupling


In software design, cohesion and coupling are two essential principles that play a
critical role in creating maintainable, modular, and efficient software systems. Let's
explore these concepts in more detail:
1. Cohesion:
Cohesion refers to the degree to which the components (modules, classes,
functions) within a software system are focused on performing a single, well-defined
task or responsibility. In other words, it measures how closely related the elements
within a module are. High cohesion is generally a desirable characteristic of a well-
designed system.
There are several types of cohesion, ordered from the lowest to the highest:

Coincidental Cohesion: This is the lowest level of cohesion, where the elements within a module are not related to each other and appear to be put together coincidentally. Such modules are difficult to understand and maintain.

Logical Cohesion: In this case, the elements within a module are grouped
together because they share a logical relationship. For example, a module that
contains file I/O functions may exhibit logical cohesion.

Temporal Cohesion: Elements within a module are grouped together because they are used at the same time, even though they may perform different functions. For example, a module that contains initialization and cleanup code may have temporal cohesion.

Procedural Cohesion: Elements within a module are grouped together because they are part of a common procedure or algorithm. While this is better than the previous types, it's still not the highest form of cohesion.

Communicational Cohesion: In this case, elements within a module are grouped together because they operate on the same data or exchange information. This is better than procedural cohesion but still has room for improvement.

Sequential Cohesion: Elements within a module are grouped together because they must be executed in a specific order. This is more focused than communicational cohesion.

Functional Cohesion: This is the highest level of cohesion. Elements within a
module are grouped together because they collectively perform a single, well-
defined function or task. Modules with functional cohesion are easier to
understand, maintain, and reuse.

Aim for achieving functional cohesion in your software design, as it results in more
modular and maintainable code.
2. Coupling:
Coupling refers to the degree of interconnectedness between modules or
components within a software system. It measures how one module relies on the
functionality of another. Low coupling is generally preferred as it leads to more
flexible and maintainable systems.
There are different levels of coupling:

Tight Coupling: In this scenario, modules are highly dependent on each other
and are closely connected. Changes in one module can have a significant impact
on others. Tight coupling reduces the system's flexibility and maintainability.

Loose Coupling: In loose coupling, modules are less dependent on each other,
and changes in one module are less likely to affect others. This results in more
flexibility and ease of maintenance.

Reducing coupling and achieving loose coupling is a crucial design goal in software
development. One way to achieve this is by using well-defined interfaces and
ensuring that modules communicate through those interfaces rather than directly with
each other.
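As an illustrative sketch (the module and class names are hypothetical), loose coupling can be achieved by having the report logic depend on a small interface rather than on any concrete storage class:

```python
# Loose coupling via an interface: revenue_report depends only on the
# OrderSource Protocol, so storage implementations can be swapped
# without touching the report logic. All names are hypothetical.

from typing import Protocol

class OrderSource(Protocol):
    def fetch_orders(self) -> list[dict]: ...

class SqlOrderStore:                  # one concrete implementation
    def fetch_orders(self) -> list[dict]:
        return [{"id": 1, "total": 40.0}, {"id": 2, "total": 60.0}]

class InMemoryOrderStore:             # a drop-in test double
    def __init__(self, orders):
        self.orders = orders

    def fetch_orders(self):
        return self.orders

def revenue_report(source: OrderSource) -> float:
    # Depends on the interface, not on any concrete store.
    return sum(order["total"] for order in source.fetch_orders())

# revenue_report(SqlOrderStore()) == 100.0
```

Changing the database layer here never forces a change in `revenue_report`, which is precisely what low coupling buys.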
In summary, cohesion and coupling are fundamental principles in software design
that impact the quality and maintainability of software systems. High cohesion and
low coupling are desirable design characteristics that lead to more modular,
understandable, and flexible software architectures.

Function-Oriented Design in Software Engineering:


Function-oriented design is an approach to software design that emphasizes
breaking down a system into functional components or modules, with each module
responsible for performing a specific function or task. This design paradigm is
particularly suitable for systems with well-defined functions, and it is often associated
with procedural programming and structured programming.
Here are the key principles and concepts of function-oriented design:

1. Modularity: Function-oriented design promotes modularity, where the software
system is divided into separate, self-contained modules, each responsible for a
specific function. These modules can be designed, developed, tested, and
maintained independently.

2. Functional Decomposition: The software system is decomposed into a hierarchy of functions or procedures. At each level of the hierarchy, functions are broken down into smaller, more manageable sub-functions until the system's functionality is adequately represented.

3. Top-Down Design: In function-oriented design, the design process often starts from the top level, representing the overall system, and proceeds to the lower levels, where functions are further detailed. This top-down approach helps in structuring the design and understanding the system's architecture.

4. Abstraction: Abstraction is a key concept in function-oriented design. Functions are designed to abstract specific operations, data processing, or tasks, which makes the design more understandable and manageable.

5. Information Hiding: Modules in function-oriented design often encapsulate their internal details, exposing only necessary interfaces to the rest of the system. This concept of information hiding helps in reducing complexity and dependencies.

6. Structured Programming: Function-oriented design aligns well with structured programming practices. It encourages the use of structured constructs like loops, conditionals, and subroutines (functions or procedures) to create clear and maintainable code.

7. Reuse: Modular functions can be reused across the system or in other projects,
promoting code reusability and reducing redundant development efforts.

8. Documentation: Since each module represents a specific function, it's easier to document the purpose, input, and output of each module, contributing to better system documentation.

9. Testing and Debugging: Smaller, modular functions are easier to test and
debug, which simplifies the software development and maintenance process.
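The functional decomposition described above can be sketched as a small module of single-purpose functions; the payroll task and the tax rate are hypothetical:

```python
# Sketch of top-down functional decomposition: a payroll calculation
# broken into smaller functions, each with one responsibility.
# All names and rates are hypothetical.

def gross_pay(hours, rate):
    return hours * rate

def tax(gross, tax_rate=0.25):
    return gross * tax_rate

def net_pay(hours, rate):
    # Top-level function composed from the sub-functions above.
    g = gross_pay(hours, rate)
    return g - tax(g)

# net_pay(40, 25.0) == 750.0
```

Each function can be tested, documented, and reused on its own, which is the modularity benefit described in point 1.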

Function-oriented design is often used for systems where the primary focus is on
data processing, algorithmic operations, and structured procedures. It is well-suited
for scientific and engineering applications, data processing systems, and embedded
software.

While function-oriented design offers benefits in terms of modularity and
maintainability, it may not be the ideal choice for all types of software systems,
particularly those with complex user interfaces or object-oriented requirements. In
such cases, other design paradigms like object-oriented design may be more
appropriate.
Overall, function-oriented design remains a valuable approach in software
engineering, particularly when designing systems where a clear separation of
functionality into modules is crucial for achieving efficiency and maintainability.

Object-Oriented Design (OOD) in Software Engineering:
Object-Oriented Design (OOD) is a widely used and effective software design
paradigm that revolves around the concept of objects. It is a design approach that
models real-world entities and their interactions as objects, which can encapsulate
data and behavior. OOD is closely associated with object-oriented programming
(OOP) languages such as Java, C++, and Python. Here are the key principles and
concepts of Object-Oriented Design:
1. Objects:

Objects are instances of classes and represent real-world entities, concepts, or data structures. They encapsulate both data (attributes) and behavior (methods) related to the entity they represent.

2. Classes:

Classes serve as blueprints or templates for creating objects. They define the
structure and behavior of objects of a certain type. Classes can inherit attributes
and methods from other classes, fostering reusability and hierarchical
organization.

3. Encapsulation:

Encapsulation is the practice of bundling data and methods that operate on the
data within a class, making data private and providing controlled access through
methods (getters and setters). This ensures data integrity and modularity.

4. Inheritance:

Inheritance allows a class (sub-class or derived class) to inherit attributes and methods from another class (super-class or base class). This promotes code reuse and enables the creation of hierarchies of related classes.

5. Polymorphism:

Polymorphism allows objects of different classes to be treated as objects of a common super-class. This concept enables method overriding and dynamic method dispatch, allowing for flexibility in method implementation.

6. Abstraction:

Abstraction involves simplifying complex reality by modeling classes based on the essential characteristics of objects. It focuses on what an object does rather than how it does it.

7. Modularity:

OOD encourages modular design, where complex systems are broken down into
smaller, more manageable components (objects and classes). Each module
encapsulates a specific piece of functionality.

8. Reusability:

Objects and classes can be reused in different contexts, fostering code reuse
and reducing redundancy. Libraries, frameworks, and design patterns are
examples of reusable components in OOD.

9. Association and Composition:

OOD supports modeling associations between objects and the composition of
objects into larger structures. These relationships can be represented through
attributes and methods within classes.
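The difference between composition ("has-a") and association ("uses-a") can be sketched like this (the `Engine`/`Car`/`Driver` classes are invented for the example):

```python
class Engine:                          # component object
    def start(self):
        return "engine running"


class Car:                             # composition: a Car *has an* Engine
    def __init__(self):
        self.engine = Engine()         # the whole creates and owns its part

    def start(self):
        return self.engine.start()     # delegates behavior to the component


class Driver:                          # association: a Driver *uses* a Car
    def drive(self, car):              # the car is passed in, not owned
        return car.start()
```

The engine's lifetime is tied to the car (composition), while the driver merely holds a temporary reference to a car (association).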

10. Design Patterns:


Design patterns are well-established solutions to common design problems in
software development. OOD encourages the use of design patterns to address
recurring design challenges.
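As one small illustration, here is a sketch of the Strategy pattern, where interchangeable algorithms sit behind a common interface (the compression classes are invented for the example):

```python
# Strategy pattern: the algorithm is injected, not hard-coded into the client
class ZipCompression:
    def compress(self, data):
        return f"zip({data})"


class RarCompression:
    def compress(self, data):
        return f"rar({data})"


class Archiver:
    def __init__(self, strategy):
        self.strategy = strategy          # any object with a compress() method

    def archive(self, data):
        return self.strategy.compress(data)


out = Archiver(ZipCompression()).archive("report")
```

Swapping `ZipCompression()` for `RarCompression()` changes the behavior without touching `Archiver`, which is the recurring problem this pattern solves.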
11. UML (Unified Modeling Language):

UML is a standardized notation for visualizing, specifying, and documenting the
artifacts of software systems. It includes diagrams like class diagrams, sequence
diagrams, and use case diagrams that facilitate OOD.
Object-Oriented Design is well-suited for modeling complex systems, promoting code
maintainability, and supporting a modular and hierarchical design. It's particularly
valuable for software projects where objects in the problem domain can be naturally
mapped to software objects. OOD is widely used in various domains, including web
development, game development, and enterprise software.

It's important to note that Object-Oriented Design is just one of several design
paradigms in software engineering. The choice of design paradigm depends on the
nature of the project, the problem domain, and the specific requirements.

User Interface Design


The user interface is the front-end application view through which the user
interacts with the software. The software becomes more popular if its user
interface is:

Attractive

Simple to use

Responsive in short time

Clear to understand

Consistent on all interface screens

There are two types of User Interface:

1. Command Line Interface: A command-line interface provides a command
prompt where the user types a command and feeds it to the system. The user
needs to remember the syntax of the command and its use.

2. Graphical User Interface: A graphical user interface provides a simple,
interactive way to work with the system. A GUI can be a combination of both
hardware and software. Using a GUI, the user interacts with the software
visually.

User Interface Design Process:

The analysis and design process of a user interface is iterative and can be
represented by a spiral model. It consists of four framework activities:

1. User, task, environmental analysis, and modeling: Initially, the focus is on the
profile of the users who will interact with the system: their understanding, skill,
knowledge, type of user, etc. Based on these profiles, users are grouped into
categories, and requirements are gathered from each category. These
requirements tell the developer how the interface should be built. Once all the
requirements are gathered, a detailed analysis is conducted: the tasks the user
performs to establish the goals of the system are identified, described, and
elaborated. The analysis of the user environment focuses on the physical work
environment. Among the questions to be asked are:

Where will the interface be located physically?

Will the user be sitting, standing, or performing other tasks unrelated to the
interface?

Does the interface hardware accommodate space, light, or noise constraints?

Are there special human factors considerations driven by environmental
factors?

2. Interface Design: The goal of this phase is to define the set of interface objects
and actions, i.e. the control mechanisms that enable the user to perform desired
tasks, and to indicate how these control mechanisms affect the system. The
action sequence of tasks and subtasks (also called a user scenario) is specified,
along with the state of the system when the user performs a particular task.
Always follow the three golden rules stated by Theo Mandel. Design issues such
as response time, command and action structure, error handling, and help
facilities are considered as the design model is refined. This phase serves as the
foundation for the implementation phase.

3. Interface construction and implementation: The implementation activity
begins with the creation of a prototype (model) that enables usage scenarios to
be evaluated. As the iterative design process continues, a user interface toolkit
that allows the creation of windows, menus, device interaction, error messages,
commands, and many other elements of an interactive environment can be used
to complete the construction of the interface.

4. Interface Validation: This phase focuses on testing the interface. The interface
should perform tasks correctly and be able to handle a variety of tasks. It should
achieve all the user's requirements, be easy to use and easy to learn, and be
accepted by users as a useful tool in their work.

made by yashs using chatgpt so content maybe wrong :)


updated version here: https://yashnote.notion.site/Software-Engineering-
973dbb6cce4a44098a92231036aefdf4?pvs=4

