Abdul
PROFESSIONAL SUMMARY
Senior Performance Engineer with 15+ years of significant experience in system analysis, data validation, script design, load test execution, performance testing, observability testing and performance engineering of major software applications across industries such as Banking, Finance, Retail and Healthcare.
Proficient in testing and tuning Java, .NET, LDAP, web, DB and Citrix applications, CI/CD implementation, and defect lifecycle management with HP ALM and the defect tracking tool JIRA.
Strong experience in using load testing tools such as Micro Focus LoadRunner, Performance Center, Apache JMeter, LoginVSI and BlazeMeter.
Good understanding of queuing theory principles and capacity planning (a worked example follows this summary).
Well versed with monitoring tools SiteScope, Splunk, Dynatrace, Prometheus and Wily Introscope.
Well versed with other load testing tools such as OpenSTA and cloud-based tools, and with cloud technologies such as AWS, Azure and GCP, integrating the CI/CD process with Jules/Jenkins integration.
Implemented CI/CD framework integration with BlazeMeter.
Worked with protocols such as Web, Web Services, REST APIs, Oracle NCA, SAP and Ajax TruClient.
Specialized in load testing virtual desktop environments and cloud platforms.
Solved major production issues and reduced critical business impact.
Executed performance scripts on different browsers and reported defects/results to the team.
Involved in developing performance test plans, scenario design, workload design, test environment setup and analysis of test results.
Experience in testing database applications (Oracle DB, DB2, MS SQL Server, Sybase and MS Access).
Excellent testing experience in all phases of the Performance Testing Life Cycle (PTLC) and agile methodologies.
Expertise in backend testing of batch job processes that extract data from homogeneous or heterogeneous sources.
Skilled in developing and maintaining test plans and scripts.
Good knowledge of all phases of the Project Life Cycle/SDLC, including requirement gathering, analysis, design, development, implementation, testing, software quality standards, configuration management, change management and quality procedures.
Good understanding of data warehousing concepts and data analysis.
Strong knowledge of relational database design concepts.
Strong ability to identify performance defects.
Well versed with RDBMS, with excellent SQL skills for building complex queries.
Versatile team player with excellent analytical and interpersonal skills and an aptitude for learning new technologies.
Expertise in project documentation, testing and quality audit reviews.
Extensively worked with MS Office tools (Excel, Word, PowerPoint and Access).
Team building and team handling experience.
Excellent communication, interpersonal and presentation skills, which have helped achieve excellent results and meet client expectations throughout; able to communicate ideas, issues and solutions clearly to both functional and technology audiences.
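A quick illustration of how queuing theory feeds capacity planning: Little's Law, N = X * (R + Z), relates concurrent users, throughput, response time and think time. The figures in this Python sketch are hypothetical, purely to show the arithmetic.

    # Little's Law: N = X * (R + Z), where N = concurrent Vusers,
    # X = throughput (tx/sec), R = response time (sec), Z = think time (sec).
    target_tps = 50.0      # hypothetical target throughput
    response_time = 1.5    # hypothetical average response time (sec)
    think_time = 8.0       # hypothetical user think time (sec)

    vusers = target_tps * (response_time + think_time)
    print(f"Concurrent Vusers needed: {vusers:.0f}")  # 50 * 9.5 = 475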
KEY ACHIEVEMENTS
Executed and provided thought leadership on various large-scale performance testing projects
Assisted in tuning various system components and utilized various diagnostic and monitoring
tools to detect, isolate, and resolve performance issues
Liaised with development teams, technology leads and project managers to plan, schedule and execute the performance tests and meet project deadlines
Mentored many junior engineers on load testing with multiple tools and techniques
Received numerous letters of recommendation and appreciation from customers & management
TECHNICAL SKILLS
Roles: Performance Test Architect, Performance Test Manager, Performance Test Lead
Disciplines: Performance Testing, Performance Monitoring, Performance Tuning, Troubleshooting, VDI Load Testing
Tools: LoadRunner, Performance Center, LoginVSI, JMeter, Dynatrace, Wily Introscope, Grafana
EXPERIENCE SUMMARY:
Client: NORTHERN TRUST BANK  MAY 2023 – TILL DATE
Performance Test Lead
The project primarily involves testing the test environment migrated from Oracle 12c to Oracle 19c (Remediation). The project consists of 23 TLAs comprising web-based applications, web services, REST APIs and batch jobs migrating to the 19c environment. The objective of this performance testing is to compare performance metrics against the new updates in the migrated environment, including response time, CPU and memory utilization, before deployment to production.
Responsibilities:
Leading a team of 4 performance testers.
Creation and maintenance of test scripts and execution of the test scenarios for UI, web services and batch jobs using Control-M.
Guiding and mentoring offshore team members on creating effective test plans, test scripts, test scenarios and test executions, raising defects, and handling client calls and report walkthrough calls with the product owners.
Raising performance defects in the Azure DevOps platform.
Configuring the load tests with JMeter in Azure DevOps (a minimal runner sketch follows this list).
Coordinating with the application team on gathering performance test requirements.
Involved in all phases of the performance testing engagement, from planning to report analysis.
Service virtualization in the Azure AD platform.
Scheduling and automating the execution of tests using Azure Pipelines.
Interacting with the developers, stakeholders and product owners on the final report walkthrough.
Involved in load testing and volume testing for UI, web services and batch jobs using the Control-M scheduler.
Designed the test scenarios based on the business requirements.
Involved in analyzing AWR reports and server health utilization, and performed deep-dive analysis of utilization during the load tests.
Worked with the TDM team to design and execute the SQL queries to generate the test data required for load, stress and endurance testing.
Performed various load tests to identify the memory leaks in the applications.
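A minimal sketch of the kind of JMeter runner an Azure Pipelines step can invoke; the plan and output paths are hypothetical, and it assumes the jmeter binary is on the agent's PATH.

    import subprocess
    import sys

    # Run a JMeter test plan in non-GUI mode, as a pipeline step would.
    # test_plan.jmx, results.jtl and report/ are hypothetical paths.
    cmd = [
        "jmeter", "-n",            # -n: non-GUI mode
        "-t", "test_plan.jmx",     # -t: test plan to execute
        "-l", "results.jtl",       # -l: raw sample log
        "-e", "-o", "report",      # -e/-o: generate the HTML dashboard
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    sys.exit(result.returncode)    # non-zero exit fails the pipeline step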
RACPAD, the largest domain in acquiring logistics and warehouses, involved testing the migration of a legacy environment to AWS cloud infrastructure. The project has UI applications, REST APIs and DB objects migrating to the new cloud-based environment, plus a few changes to existing applications customized for the on-premise environment. The objective of this performance testing is to capture performance metrics for the new updates in the monthly releases and implement the changes in the CI/CD framework, along with capacity planning and scalability assessment for the current infrastructure.
Responsibilities:
Involved in planning phase by interacting with application owners to gather the performance
requirements.
Creation and maintenance of test scripts and execution of the test scenarios for UI, web services and REST API integration.
Guiding and mentoring offshore team members on creating effective test plans, test scripts, test scenarios and test executions, raising defects, and handling client calls and report walkthrough calls with the product owners.
Execute performance, Stress and Endurance test runs for holiday readiness.
Involved in performing distributed load testing on AWS solutions to automate the process of testing
at scale and to help to identify potential performance bottlenecks.
Configuring and installing JMeter on the AWS EC2 instances for load distribution and simulation (see the provisioning sketch after this list).
Analyze the performance test results – Back-end performance using Dynatrace, Log
Analysis using Splunk and front-end performance using APM tools.
Analyze JVM heap memory usage, Garbage collection activity, thread pools statistics.
Analyze historical traffic patterns and design workload model for holiday season.
Performance strategy design for multiple applications within Tailored Brands and Jos. A. Bank.
Daily production performance analysis using Dynatrace, Akamai mPulse and Grafana.
Execute performance tests from AWS and data center.
Report performance test results to various stakeholders.
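A sketch of provisioning additional JMeter load-generator EC2 instances with boto3 for the distributed testing described above; the region, AMI ID, instance type and tag values are placeholders, not values from the engagement.

    import boto3

    # Launch load-generator instances for distributed JMeter testing.
    ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI pre-baked with JMeter
        InstanceType="c5.xlarge",         # placeholder sizing
        MinCount=1,
        MaxCount=4,                       # e.g. four remote injectors
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "jmeter-loadgen"}],
        }],
    )
    ids = [i["InstanceId"] for i in resp["Instances"]]
    print("Launched load generators:", ids)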
The project primarily involves testing the test environment migrated from Oracle 12c to Oracle 19c. The project has web-based applications, web services, REST APIs and batch jobs migrating to a new cloud-based environment, plus a few changes to existing applications customized in a virtual environment. The objective of this performance testing is to compare performance metrics against the new updates in the migrated cloud environment, including response time, CPU and memory utilization, before deployment to production.
Responsibilities:
Leading a team of 10 performance testers.
Creation and maintenance of test scripts and execution of the test scenarios for UI, web services and REST API integration.
Guiding and mentoring offshore team members on creating effective test plans, test scripts, test scenarios and test executions, raising defects, and handling client calls and report walkthrough calls with the product owners.
Coordinating with the application team on gathering performance test requirements.
Involved in all the phases of performance testing engagement from plan to report analysis.
Interacting with the developers, stakeholders and product owners on the final report walkthrough.
Worked on CI/CD and Blazemeter integration with Jules pipeline/Jenkins in GAIA private cloud
platform.
Automation of the framework with Python scripting.
Involved in load testing and volume testing for UI, web services and batch jobs using the Control-M scheduler.
Designed the test scenarios based on the business requirements.
Involved in analyzing server health utilization and performed deep-dive analysis of utilization during the load tests.
Worked with the TDM team to design and execute the SQL queries to generate the test data required for load, stress and endurance testing.
Performed various load tests to identify memory leaks in the applications (a trend-check sketch follows this list).
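One way to flag a leak during a soak test is to check whether heap usage after each GC keeps trending upward; a simple least-squares slope over sampled values works as a first pass. The sample data below is made up for illustration.

    from statistics import mean

    def leak_suspected(heap_mb, threshold_mb_per_sample=1.0):
        """Return True if post-GC heap samples trend upward faster
        than the threshold, which suggests a memory leak."""
        xs = list(range(len(heap_mb)))
        x_bar, y_bar = mean(xs), mean(heap_mb)
        slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, heap_mb)) \
                / sum((x - x_bar) ** 2 for x in xs)
        return slope > threshold_mb_per_sample

    # Hypothetical post-GC heap readings (MB), sampled every 10 minutes:
    samples = [512, 530, 555, 571, 590, 612, 640]
    print(leak_suspected(samples))  # True: heap never returns to baseline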
Project: BSC SA Testing (BSC). The project is to test the performance of the BSC web portals and batch job applications, such as the member, producer, provider and employer portals.
Responsibilities:
Requirement gathering: understanding the requirements, creating the test plan document and suitable scenarios based on the critical transactions, and analyzing the production logs to establish the benchmark response times to be reached (SLAs).
Interacting with the development team and setting up standing and working sessions on the issues observed during the performance testing phase.
Creating the workload model based on the Vuser count and the number of transactions to be achieved in a particular duration, and executing these scenarios in the Controller (a pacing calculation sketch follows this list).
Analyzing the statistics of the run and helping the team use custom code to debug the scripts in case of errors during the run.
Reporting to the concerned team.
Consolidating the project status for the customers on a daily basis.
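The arithmetic behind such a workload model is straightforward: given a transaction target over a test window and a Vuser count, derive the pacing each Vuser must hold. The numbers below are illustrative, not from the engagement.

    # Derive per-Vuser pacing from a transaction target (illustrative numbers).
    target_transactions = 36_000   # transactions to achieve in the window
    duration_sec = 3_600           # 1-hour steady state
    vusers = 200

    tps = target_transactions / duration_sec   # 10 tx/sec overall
    pacing_sec = vusers / tps                  # 20 sec between iterations per Vuser
    print(f"Required throughput: {tps:.1f} tps")
    print(f"Pacing per Vuser: {pacing_sec:.1f} sec/iteration")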
Project: PCATS provides an enhanced platform to support tracking of core data across the application, including, but not limited to, process-level characteristics, contract-level information, pipeline activities and related reporting. This enables the capture and tracking of the data to replace the current spreadsheet-based process. Users use the system to create, update and report on transition activities, and have access to a streamlined interface to capture, update and view transition activities, pipeline reporting and steady-state process data as well. The objective of this engagement is to identify the behavior of the key performance indicators and the application's sustainability from the users' perspective.
Responsibilities:
Individual Contribution to the overall engagement.
Performance requirements analysis.
Prepare the performance test plan for end-to-end performance testing activities.
Understand the technology stack.
Identified the business flows and documented the same.
Interact with various groups to gather performance requirements.
Monitor hardware utilization.
Prepared the test execution design using Performance Center.
Prepared a detailed test execution report.
Documented observations in the test execution report and provided recommendations.
Project: HomeDepot Link provides suppliers and business partners with access to information and systems used for conducting business with The Home Depot. The site provides users with business-related documentation, contact information, alerts, action items, and access to other HomeDepot applications. Active suppliers and business partners that do not currently have an account on HomeDepot Link can submit a request to their Partner Guardian by "Registering" for an account. The engagement provided end-to-end performance testing solutions.
Technologies: J2EE, jQuery, Windows (load generation), Linux, Apache, DB2, Oracle, WebSphere
Responsibilities:
Thought leadership to the overall engagement.
Performance requirements analysis.
Prepare the performance test plan for end-to-end performance testing activities.
Understand the technology stack.
Understand sequence diagrams.
Interact with various groups to gather performance requirements.
Workload analysis for various components including third party systems.
Manage a team of 5 resources.
Monitor hardware utilization using SiteScope.
Component level performance testing.
Prepared the test execution design using Performance Center.
Prepared a detailed test execution report.
Documented observations in the test execution report and provided recommendations if any.
Project: Liberty Mutual Agency Corporation's companies offer their products and services exclusively through professionally licensed independent insurance agents. With Liberty Mutual Agency Corporation's companies, policyholders and independent insurance agents can expect: effective relationships; meaningful partnerships in the marketplace; a robust product set and risk appetite; consistent, disciplined underwriting; and high-quality products, services, and outcomes.
Business scenario: Liberty Mutual uses the BMC Remedy tool for their business solution, for which they need a performance testing solution using LoadRunner for their business scenarios.
Responsibilities:
Performance requirements analysis.
Prepare performance strategy and plan document.
Conduct performance test runs.
Designed and deployed performance monitor.
Validate test data provided by the customer.
Designed the load test for the different scenarios in ALM.
Conducted POCs to identify suitable tools for various aged client-server applications and applications hosted on workspaces.
Technologies: LoadRunner 11.0 with HTTP and Ajax TruClient protocols, WebSphere, IIS, NMON, Nimbus, Oracle 10g
Project: The objective of this engagement is to analyze and understand the performance characteristics of the Banner web application Build 2 from an end-user perspective. Three scenarios were identified as part of the performance testing engagement: Lookup Classes, Add/Drop Courses using CRNs, and Add/Drop Classes from Class Search. The goal of the test is to check the Banner web application using a mix of these scenarios with a sudden ramp-up of concurrent users.
Responsibilities:
Understand the performance requirements.
Automate the identified business scenarios.
Execute the load tests with anticipated user loads using Appcloud.
Analyze the results and report to the customer.
Analyze the results and tune the system to meet performance SLAs.
Technologies: JMeter, Appcloud, WebSphere, Linux, Oracle 10g, Apache, Big IP load balancer, Java, J2EE
Project: The objective of this engagement is to performance test the IBS application's BW Reports with different scenario flows, namely Engagement WIP Analysis Report, WIP Summary Report, Opportunity Summary Report, Staff Performance Work Date, Claim Monitoring, Engagement Transaction Date, Debtors Detail and New Entity Report. The objective is to test the portal with 500 concurrent users and fine-tune the system to meet the performance SLAs.
Responsibilities:
Performance requirements analysis.
Prepare performance strategy and plan document.
Conduct baseline, benchmark tests.
Conduct load test with 500 concurrent users using Performance Center and monitor Servers.
Analyze the results and tune the system to meet performance SLAs.
Technologies: Windows (load generation), IIS, Oracle, JRE 1.5 (Sun), LoadRunner 11.0, Performance Center 9.52, Windows Server 2003
Client: Northeastern University, USA  OCT 2011 – FEB 2012
Role: Senior Test Engineer
Project: Northeastern University and App Labs are working towards analyzing and understanding the performance characteristics of the Banner web application from an end-user perspective. Three scenarios were identified as part of the performance testing engagement: Lookup Classes, Add/Drop Courses using CRNs, and Add/Drop Classes from Class Search. The goal of the test is to check the Banner web application using a mix of these scenarios with concurrent users.
Responsibilities:
Technical Understanding of Application and architecture.
Walkthrough of the Application.
Involved in Test Plan Preparation.
Involved in Test Data Preparation.
Walkthrough of the scenario for testing.
Documentation of Step-by-Step workflow of Scenario.
Establish performance goals and criteria for the application.
Designed Performance Scenarios as per the business.
Designed Simulation Scripts.
Designed and deployed performance monitor.
Validate test data provided by Customer.
Analyzed client side and server-side performance counters information.
Prepared the test execution design using Appcloud instances.
Prepared a detailed test execution report.
Analyzed observations in the test execution report and provided recommendations if any.
Project: Safeway Just for U International is a retail application with three user interface applications: the Safeway, Vons and Dominick's banners. Each deals with three pages (the CC page, the YCS page and the personalized page) where users can book orders and receive offers by selecting clips into the cart. All three banners have coupons, stores and a list of available products.
Responsibilities:
Technical Understanding of Application and architecture.
Identifying the business Scenarios.
Conducted POC on Application using Loadrunner tool.
Involved in validating the Test data.
Finalizing the Scenarios for testing.
Establish performance goals and criteria for the application.
Designed Performance Scenarios.
Designed Simulation Scripts.
Designed and deployed performance monitor.
Validate test data provided by Customer.
Analyzed client-side performance counter information.
Prepared a detailed test execution report.
Documented observations in the test execution report and provided recommendations if any.
Other Projects: