Lab Guide: 44+ Labs for Amazon AWS Associates
Faisal Khan
Contents
Access and tour AWS console ........................................................................................................... 18
Lab Details: .............................................................................................................................................. 18
Tasks: ........................................................................................................................................................ 18
Introduction to AWS Identity and Access Management (IAM) ............................................................... 18
Lab Details: .............................................................................................................................................. 18
Introduction: ............................................................................................................................................ 19
Tasks: ........................................................................................................................................................ 19
Launching Lab Environment............................................................................................................... 19
Lab Steps: ................................................................................................................................................ 20
Create New IAM User .......................................................................................................................... 20
Create New IAM Group ....................................................................................................................... 23
2. Set Group Name: .............................................................................................................................. 23
Adding IAM User to IAM Group: ........................................................................................................ 23
Completion and Conclusion: .............................................................................................................. 24
Completion and Conclusion: ............................................................................................................... 27
Introduction to Amazon Simple Storage Service (S3) .................................................................... 28
Lab Details: .............................................................................................................................................. 28
Introduction: ............................................................................................................................................ 28
Tasks: ........................................................................................................................................................ 29
Launching Lab Environment............................................................................................................... 29
Lab Steps: ................................................................................................................................................ 30
Create S3 Bucket ............................................................................................................................ 30
Upload a file to S3 bucket ............................................................................................................ 30
Change Bucket Permission ......................................................................................................... 31
Create a Bucket Policy.................................................................................................................. 34
Test Public access ......................................................................................................................... 35
Completion and Conclusion................................................................................................................ 36
How to Enable Versioning on Amazon S3 .............................................................................................. 36
Lab Details: ........................................................................................................................................... 36
Introduction: .......................................................................................................................................... 36
Tasks: ..................................................................................................................................................... 36
Steps: ........................................................................................................................................................ 37
Create an S3 Bucket .............................................................................................................. 37
Enable Versioning on S3 bucket ........................................................................................................ 37
Upload first version of object .............................................................................................................. 38
Creating S3 Lifecycle Policy ............................................................................................................... 41
Lab Details: .............................................................................................................................................. 41
Tasks: ........................................................................................................................................................ 41
Architecture Diagram ............................................................................................................................ 42
Launching Lab Environment............................................................................................................... 42
Steps: ........................................................................................................................................................ 43
Create an S3 Bucket............................................................................................................................ 43
Upload an Object.................................................................................................................................. 43
Creating a Lifecycle Rule .................................................................................................................... 44
Completion and Conclusion................................................................................................................ 48
Introduction to Amazon CloudFront.................................................................................................... 48
Introduction to Amazon Elastic Compute Cloud (EC2) ...................................................................... 56
Introduction............................................................................................................................................ 56
What is EC2? .................................................................................................................. 56
Launching Lab Environment............................................................................................................... 57
Steps: ........................................................................................................................................................ 57
Launching an EC2 Instance ............................................................................................................... 57
Allocating Elastic IP and Associating it to EC2 Instance................................................................ 62
Lab Details: .............................................................................................................................................. 62
Tasks: ........................................................................................................................................................ 62
Launching Lab Environment............................................................................................................... 62
Steps.......................................................................................................................................................... 63
Launching an EC2 Instance ............................................................................................................... 63
SSH into EC2 Instance........................................................................................................................ 65
Install an Apache Server ..................................................................................................................... 65
Create and publish page ..................................................................................................................... 66
Associating an Elastic IP Address with a Running Instance ......................................................... 67
Completion and Conclusion................................................................................................................ 69
Creating and Subscribing to SNS Topics, Adding SNS event for S3 bucket ....................................... 70
Lab Details: .............................................................................................................................................. 70
Introduction: ............................................................................................................................................ 70
What is SNS?............................................................................................................................................ 70
Launching Lab Environment............................................................................................................... 71
Steps: ........................................................................................................................................................ 71
Create SNS Topic ................................................................................................................................ 71
Subscribe to SNS Topic ...................................................................................................................... 72
Create S3 Bucket ................................................................................................................................. 73
Update SNS Topic Access Policy ...................................................................................................... 73
Create S3 Event ................................................................................................................................... 74
Testing the SNS Notification............................................................................................................... 75
Completion and Conclusion ................................................................................................................ 76
How to Create a static website using Amazon S3 ............................................................................... 76
Lab Details: ....................................................................................................................................... 76
Introduction:....................................................................................................................................... 76
Tasks: ................................................................................................................................................. 77
Launching Lab Environment............................................................................................................... 77
Creating a Bucket ................................................................................................................................. 77
Enable Static Website Hosting ........................................................................................................... 78
Test the website ................................................................................................................................... 80
Test the website error page ................................................................................................................ 81
Completion and Conclusion ................................................................................................................ 81
Accessing S3 with AWS IAM Roles ....................................................................................................... 82
Lab Details ............................................................................................................................................... 82
Introduction ............................................................................................................................................. 82
IAM Policy.............................................................................................................................................. 82
Policy Types .......................................................................................................................................... 82
Identity-Based Policy .................................................................................................... 82
Resource-Based Policy ................................................................................................ 83
IAM Role ................................................................................................................................................ 83
Simple Storage Service (S3) ................................................................................................. 83
Summary of Lab session ..................................................................................................................... 84
Launching Lab Environment............................................................................................................... 84
Steps.......................................................................................................................................................... 85
Creating IAM Role ................................................................................................................................ 85
Launching EC2 Instance ..................................................................................................................... 86
Viewing S3 bucket................................................................................................................................ 88
Accessing S3 bucket via EC2 Instance ............................................................................................ 88
AWS S3 Multipart Upload using AWS CLI............................................................................................ 91
Lab Details: .............................................................................................................................................. 91
Tasks: ........................................................................................................................................................ 91
Launching Lab Environment............................................................................................................... 91
Steps.......................................................................................................................................................... 92
Create an IAM Role ............................................................................................................................. 92
Create an S3 Bucket ......................................................................................................... 93
Launching an EC2 Instance ............................................................................................................... 94
SSH into EC2 Instance........................................................................................................................ 96
View the Original file in EC2 ............................................................................................................... 96
Split the Original file ............................................................................................................................. 96
Create Multipart Upload ...................................................................................................................... 97
Uploading Each Chunk / Split File .................................................................................................. 97
Create a Multipart JSON file ............................................................................................................... 98
Complete Multipart Upload ............................................................................................................... 100
View the File in S3 Bucket ................................................................................................................ 101
Completion and Conclusion.............................................................................................................. 101
Using AWS S3 to Store ELB Access Logs ............................................................................................ 102
Lab Details ............................................................................................................................................. 102
Introduction ........................................................................................................................................... 102
Elastic Load Balancer ........................................................................................................................ 102
Storing ELB Access logs in S3......................................................................................................... 102
Lab Tasks:.............................................................................................................................................. 103
Launching Lab Environment............................................................................................................. 103
Steps........................................................................................................................................................ 104
Creating a Security Group for the Load Balancer ................................................................................... 104
Steps to create Web-servers ............................................................................................................ 104
Creating Load balancer ..................................................................................................................... 106
Configuring Load Balancer to store Access logs in S3 bucket ................................................... 109
Testing that the Load Balancer Stores the Access Logs .............................................................. 110
Completion and Conclusion.............................................................................................................. 112
Introduction to AWS Relational Database Service............................................................................ 113
Lab Details: ............................................................................................................................................ 113
Task Details ........................................................................................................................................... 113
Prerequisites: ........................................................................................................................................ 113
Introduction to AWS Elastic Load Balancing .................................................................................... 120
Lab Details: ............................................................................................................................................. 120
Introduction: ............................................................................................................................................ 120
Tasks: ....................................................................................................................................................... 121
Launching Lab Environment............................................................................................................. 121
Steps: ...................................................................................................................................................... 122
Launching EC2 Instance 1 .............................................................................................. 122
Launching EC2 Instance 2 .............................................................................................. 124
Creating Load Balancer and Target Group .................................................................................... 125
Testing the Elastic Load Balancer ................................................................................................... 128
Completion and Conclusion .............................................................................................................. 130
Creating an application load balancer from AWS CLI ...................................................................... 130
Lab Details ............................................................................................................................................. 130
Introduction ........................................................................................................................................... 130
AWS Elastic Load Balancer .............................................................................................................. 130
Application Load Balancer ................................................................................................................ 131
Lab Tasks ............................................................................................................................................ 132
Launching Lab Environment............................................................................................................. 133
Steps........................................................................................................................................................ 133
Creating EC2 Instance ...................................................................................................................... 133
Creating another EC2 Instance ........................................................................................................ 135
Creating an Application Load Balancer in AWS CLI ..................................................................... 137
SSH into EC2 and Connect to Your Database ...................................................................... 137
Creating Load Balancer .............................................................................................................. 140
Creating 2 Target Groups ................................................................................................................. 141
Register the Targets with the respective Target groups .............................................................. 143
Creating Listeners for Default rules ................................................................................................. 144
Creating Listeners for other rules ................................................................................................... 145
Verifying health of the Target Groups ............................................................................................. 146
Introduction to Amazon Auto Scaling ................................................................................................ 150
Launching Lab Environment............................................................................................................. 150
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs. ......... 151
Steps: ...................................................................................................................................................... 151
Creating Launch Configurations...................................................................................................... 151
Create an Auto Scaling Group ......................................................................................................... 153
Test Auto Scaling Group: .................................................................................................................. 157
Completion and Conclusion .............................................................................................................. 157
Using CloudWatch for Resource Monitoring, Create CloudWatch Alarms and Dashboards ......... 158
Lab Details: ............................................................................................................................................ 158
Task Details ........................................................................................................................................... 158
Launching Lab Environment............................................................................................................. 158
Steps: ...................................................................................................................................................... 159
Launching an EC2 Instance ............................................................................................................. 159
SSH into EC2 Instance and install necessary Softwares ............................................................ 160
Create SNS Topic .............................................................................................................................. 160
Subscribe to SNS Topic .................................................................................................................... 161
Using CloudWatch ............................................................................................................................. 162
Check CPUUtilization Metrics ................................................................................................... 162
Create CloudWatch Alarm ................................................................................................................ 163
Testing CloudWatch Alarm by Stressing CPUUtilization ............................................................. 165
Checking Notification Mail................................................................................................................. 166
Checking CloudWatch Alarm Graph ............................................................................................... 167
Create a CloudWatch Dashboard .................................................................................................... 168
Completion and Conclusion .............................................................................................................. 169
Introduction to AWS Elastic Beanstalk .............................................................................................. 170
Lab Details: ............................................................................................................................................ 170
Tasks: ...................................................................................................................................................... 170
Adding a Database to Elastic Beanstalk Environment ..................................................................... 173
Lab Details: ......................................................................................................................................... 173
Tasks: ...................................................................................................................................................... 173
MySQL Server Setup ................................................................................................................... 173
Launching Lab Environment: ............................................................................................................ 173
Steps: ...................................................................................................................................................... 174
Create Elastic Beanstalk Environment ................................................................................... 174
Adding Database to Beanstalk Environment ................................................................................. 175
Test the RDS Database Connection ............................................................................................... 178
Connecting from local Linux/macOS Machine ............................................................ 179
Connecting from local Windows Machine ............................................................................. 180
Completion and Conclusion:............................................................................................................. 181
Blue/Green Deployments with Elastic Beanstalk ............................................................................. 182
Lab Details ............................................................................................................................................. 182
Introduction ........................................................................................................................................... 182
AWS Elastic Beanstalk ...................................................................................................................... 182
Blue/Green deployments with Elastic Beanstalk ........................................................................... 182
Advantages .................................................................................................................................... 183
Lab Tasks ............................................................................................................................................ 183
Launching Lab Environment............................................................................................................. 183
Steps........................................................................................................................................................ 184
Creating an Elastic BeanStalk Application ................................................................................... 184
Creating Elastic Beanstalk Blue environment.............................................................................. 185
Creating Elastic Beanstalk Green environment ........................................................................... 188
Swapping the URLs from Blue to Green ........................................................................................ 192
Introduction to AWS DynamoDB ....................................................................................................... 195
DynamoDB & Global Secondary Index............................................................................................... 200
Lab Details: ............................................................................................................................................ 200
Introduction ........................................................................................................................................... 200
Definition: ............................................................................................................................................. 200
DynamoDB Tables ............................................................................................................ 200
DynamoDB Primary Keys ............................................................................................... 200
What is an Index in DynamoDB? ....................................................................................................... 201
Local Secondary Index ...................................................................................................................... 201
Global Secondary Index .................................................................................................................... 201
Case Study: Creating a Global Secondary Index.......................................................................... 202
Launching Lab Environment............................................................................................................. 202
Steps........................................................................................................................................................ 203
Create DynamoDB Table .................................................................................................................. 203
Create Item.......................................................................................................................................... 205
Use Global Secondary Index to Fetch Data................................................................................... 210
Completion and Conclusion.............................................................................................................. 211
Import CSV Data into DynamoDB ...................................................................................................... 213
Lab Details ............................................................................................................................................. 213
Introduction ........................................................................................................................................... 213
Amazon DynamoDB .......................................................................................................................... 213
Lab Tasks ............................................................................................................................................... 213
Launching Lab Environment: ........................................................................................................... 213
Steps........................................................................................................................................................ 214
Create DynamoDB Table .................................................................................................................. 214
Create an S3 Bucket and Upload the CSV File ....................................................................................... 215
Creating Lambda Function ............................................................................................................... 216
Test the CSV Data Import using Mock test in Lambda ................................................................ 218
Adding Event Triggers to S3 Bucket ............................................................................................... 220
Test the S3 Event Trigger to Import Data into the DynamoDB Table .................................................... 222
Import JSON file Data into DynamoDB .............................................................................................. 225
Lab Details ............................................................................................................................................. 225
Introduction ........................................................................................................................................... 225
Amazon DynamoDB .......................................................................................................................... 225
Lab Tasks ............................................................................................................................................... 225
Launching Lab Environment............................................................................................................. 225
Steps........................................................................................................................................................ 226
Create DynamoDB Table .................................................................................................................. 226
Create an S3 Bucket and Upload the JSON File ....................................................................................... 227
Creating Lambda Function ............................................................................................................... 228
Test the JSON Data Import using Mock test in Lambda .............................................................. 230
Adding Event Triggers in Lambda for S3 Bucket .......................................................................... 233
Test the Lambda S3 Trigger to Import Data into the DynamoDB Table ............................................... 234
Creating Events in CloudWatch .......................................................................................................... 237
Lab Details: ............................................................................................................................................ 237
Task Details ........................................................................................................................................... 237
Launching Lab Environment............................................................................................................. 237
Steps: ...................................................................................................................................................... 238
Launching an EC2 Instance ............................................................................................................. 238
Create SNS Topic .............................................................................................................................. 239
Subscribe to SNS Topic .................................................................................................................... 239
Create CloudWatch Events .............................................................................................................. 240
Test CloudWatch Event..................................................................................................................... 241
Completion and Conclusion .............................................................................................................. 243
Launch Amazon EC2 Instance, Launch Amazon RDS Instance, Connecting RDS from EC2 Instance .............................. 244
Lab Details: ............................................................................................................................................ 244
Tasks: ...................................................................................................................................................... 244
Launching Lab Environment: ........................................................................................................... 244
Lab Steps: .............................................................................................................................................. 245
Completion and Conclusion: ............................................................................................................ 250
Introduction to Amazon Lambda ....................................................................................................... 251
Lab Details: ............................................................................................................................................ 251
Tasks: ...................................................................................................................................................... 251
Launching Lab Environment............................................................................................................. 251
Steps: ...................................................................................................................................................... 252
Completion and Conclusion.............................................................................................................. 257
Launch an EC2 Instance with Lambda ............................................................................................... 258
Lab Details: ............................................................................................................................................ 258
Launching Lab Environment............................................................................................................. 258
Steps: ...................................................................................................................................................... 259
Create an IAM Policy ......................................................................................................................... 259
Create an IAM Role ........................................................................................................................... 260
Create a Lambda Function ............................................................................................................... 261
Configure Test Event ......................................................................................................................... 263
Provision EC2 Instance using Lambda Function .......................................................................... 263
Check the EC2 instance launched .................................................................................................. 264
Completion and Conclusion .............................................................................................................. 264
Configuring DynamoDB Streams Using Lambda .............................................................................. 265
Lab Details ............................................................................................................................................. 265
Introduction ........................................................................................................................................... 265
Amazon DynamoDB .......................................................................................................................... 265
Amazon DynamoDB Streams .......................................................................................................... 265
Lab Tasks ............................................................................................................................................... 267
Launching Lab Environment: ........................................................................................................... 267
Steps........................................................................................................................................................ 268
Create DynamoDB Table .................................................................................................................. 268
Creating Items and Inserting Data into DynamoDB Table ........................................................... 269
Creating Lambda Function ............................................................................................................... 271
Adding Triggers to DynamoDB Table ............................................................................................. 273
Making Changes to the DynamoDB Table and verifying trigger ................................................. 274
AWS Lambda Versioning and Alias from the CLI .............................................................................. 278
Lab Details ............................................................................................................................................. 278
Introduction ........................................................................................................................................... 278
Lambda .................................................................................................................................. 278
Lambda Version and Alias ................................................................................................................ 278
Summary of the Lab session ............................................................................................................ 279
Launching Lab Environment............................................................................................................. 279
Steps........................................................................................................................................................ 279
Creating IAM Role .............................................................................................................................. 279
Login to EC2 Server........................................................................................................................... 281
Creating a Lambda function in CLI .................................................................................................. 282
Updating and Invoking the lambda function ................................................................................... 284
Publishing Lambda version in CLI ................................................................................................... 285
Creation and Deletion of Lambda Alias .......................................................................................... 287
Deleting Lambda Function ................................................................................................................ 289
Completion and Conclusion.............................................................................................................. 289
Introduction to Amazon CloudFormation ......................................................................................... 290
Tasks: ...................................................................................................................................................... 290
Steps: ...................................................................................................................................................... 291
Create CloudFormation Stack ........................................................................................................... 291
Testing ................................................................................................................................................. 293
Completion and Conclusion .............................................................................................................. 294
AWS EC2 Provisioning - CloudFormation .......................................................................................... 295
Tasks: ...................................................................................................................................................... 295
Steps: ...................................................................................................................................................... 295
Understand the CloudFormation Template ..................................................................................... 295
Create CloudFormation Stack to Provision an EC2 Instance ............................................................. 297
Check the New EC2 instance Provisioned..................................................................................... 298
Completion and Conclusion .............................................................................................................. 299
How to Create a Virtual Private Cloud (VPC) with AWS CloudFormation ......................................... 300
Lab Details: ............................................................................................................................................ 300
Tasks: ...................................................................................................................................................... 300
Launching Lab Environment: ................................................................................................................ 300
Lab Steps: ............................................................................................................................................... 301
Creating Subnets using the VPC_Template CloudFormation Stack .................................................... 301
Creating Subnets using the VPC_II_Template CloudFormation Stack ................................................ 302
Completion and Conclusion .............................................................................................................. 304
Create a VPC using AWS CLI commands............................................................................................ 305
Lab Details ............................................................................................................................................. 305
Tasks: ...................................................................................................................................................... 305
Launching Lab Environment............................................................................................................. 305
Steps: ...................................................................................................................................................... 306
Create an IAM Role .............................................................................................................................. 306
Launching an EC2 Instance .............................................................................................................. 308
SSH into EC2 Instance ........................................................................................................................ 310
Create a VPC using AWS CLI ............................................................................................................ 310
Create a Subnet using AWS CLI ...................................................................................................... 311
Create an Internet Gateway using AWS CLI .................................................................................... 312
Attach Internet Gateway to VPC using AWS CLI ......................................................................... 312
Create a custom Route table for your VPC using AWS CLI...................................................... 313
Create a public route in the Route table that points to the Internet gateway using AWS CLI ................................ 314
Associate the Subnet to your Route table using AWS CLI ....................................................... 314
View the New VPC ................................................................................................................................ 315
Completion and Conclusion.............................................................................................................. 315
AWS CloudFormation Nested Stacks ................................................................................................. 316
Lab Details ............................................................................................................................................. 316
Introduction ........................................................................................................................................... 316
CloudFormation ................................................................................................................... 316
Template .............................................................................................................................................. 316
Stack .................................................................................................................................................... 317
Nested Stack ....................................................................................................................................... 317
Lab Tasks ............................................................................................................................................... 317
Launching Lab Environment............................................................................................................. 318
Case Study ............................................................................................................................................. 318
Steps........................................................................................................................................................ 318
Understand the CloudFormation Template ..................................................................................... 318
Template for Auto Scaling Group .............................................................................. 319
Template for a Load balancer ................................................................................................... 319
Template for Nested stack ......................................................................................................... 320
Editing Nested_stack.yaml file ......................................................................................................... 320
Creating a Web Server with Auto Scaling Group and Load Balancer using a CloudFormation Nested Stack ....................................................................................... 323
Check the resources created by Nested Stack ............................................................................. 325
Checking for Auto Scaling group ............................................................................................ 325
Checking for Launch configuration ........................................................................................ 326
Checking for EC2 instance ........................................................................................................ 327
Checking for Load Balancer...................................................................................................... 327
Testing the Load Balancer ................................................................................................ 328
Deploying Lambda Functions using CloudFormation ...................................................................... 331
Lab Details ............................................................................................................................................. 331
Introduction ........................................................................................................................................... 331
Amazon CloudFormation ................................................................................................................... 331
Amazon Lambda ................................................................................................................................ 332
Lab Tasks ............................................................................................................................................... 332
Launching Lab Environment............................................................................................................. 333
Steps........................................................................................................................................................ 333
Cloudformation Template .................................................................................................................. 333
Template for S3 stack........................................................................................................................ 334
Template for EC2 stack ..................................................................................................................... 335
Creating S3 Stack and testing the Lambda function .................................................................... 337
Creating EC2 Stack and testing the Lambda function ................................................................ 341
Introduction to Amazon Aurora ......................................................................................................... 346
Lab Details: ............................................................................................................................................ 346
MySQL Server Setup ................................................................................................................... 346
Launching Lab Environment: ........................................................................................................... 346
Steps: ...................................................................................................................................................... 347
Create RDS Database Instance....................................................................................................... 347
Connecting to Amazon Aurora MySQL RDS Database on a DB Instance. .............................. 349
Connecting from local Linux/IOS Machine............................................................................ 349
Connecting from local Windows Machine ............................................................................. 350
Execute Database Operations ......................................................................................................... 351
Completion and Conclusion: ............................................................................................................ 353
Build Your Own New Wordpress Website Using AWS Console....................................................... 354
Introduction to Simple Queuing Service (SQS) ................................................................................. 371
Lab Details ............................................................................................................................................. 371
Introduction ........................................................................................................................................... 371
SQS(Simple Queueing Service)........................................................................................................ 371
A Simple Use Case .............................................................................................................................. 372
Tasks ....................................................................................................................................................... 373
Launching Lab Environment............................................................................................................. 373
Steps........................................................................................................................................................ 374
Create FIFO and Standard Queue using console ........................................................................ 374
What is Long Polling & Configuring Long Polling ...................................................................... 379
Let's try to make changes for Long Polling in our existing queue ............................................. 379
What is Visibility TimeOut & Configuring Visibility TimeOut ................................................... 380
What is Delay Queue & Configuring Delay Queue ...................................................................... 381
Purge Queue & Delete Queue ........................................................................................................... 383
SQS points to remember .................................................................................................................... 385
Completion and Conclusion.............................................................................................................. 385
Creating a User Pool in AWS Cognito ................................................................................................. 385
Lab Details ............................................................................................................................................. 385
Lab Tasks ............................................................................................................................................... 385
Launching Lab Environment............................................................................................................. 386
Steps........................................................................................................................................................ 386
Creating a User Pool ........................................................................................................................... 386
Name and Attributes .......................................................................................................................... 387
Policies ................................................................................................................................................. 389
MFA and Verifications ....................................................................................................................... 390
Message Customizations .................................................................................................................. 391
Tags: .................................................................................................................................................... 392
Devices ................................................................................................................................................ 393
App Client ............................................................................................................................................ 393
Customize Workflows ........................................................................................................................ 393
Review: ................................................................................................................................................ 394
Completion and Conclusion.............................................................................................................. 396
API Gateway - Creating Resources and Methods .............................................................................. 397
Lab Details ............................................................................................................................................. 397
Introduction ........................................................................................................................................... 397
Amazon API Gateway ....................................................................................................................... 397
Lab Tasks ............................................................................................................................................... 397
Launching Lab Environment............................................................................................................. 397
Steps........................................................................................................................................................ 398
Create an API......................................................................................................................................... 398
Creating a Resource .......................................................................................................................... 399
Completion and Conclusion.............................................................................................................. 399
Build API Gateway with Lambda Integration.................................................................................... 400
Lab Details ............................................................................................................................................. 400
Introduction ........................................................................................................................................... 400
Amazon API Gateway ....................................................................................................................... 400
Lab Tasks ............................................................................................................................................... 400
Launching Lab Environment............................................................................................................. 401
Steps........................................................................................................................................................ 401
Create a Lambda Function ................................................................................................................ 401
Creating a Resource .......................................................................................................................... 402
Deploy API ......................................................................................................................................... 404
Completion and Conclusion .............................................................................................................. 406
Mount Elastic File system(EFS) on EC2 ............................................................................................. 407
Lab Details: ............................................................................................................................................ 407
Tasks: ...................................................................................................................................................... 407
Launching Lab Environment............................................................................................................. 407
Steps........................................................................................................................................................ 408
Launching two EC2 Instances .......................................................................................................... 408
Creating Elastic File System ............................................................................................ 409
Mount the File System to MyEFS-1 Instance ................................................................................ 412
Mount the File System to MyEFS-2 Instance ................................................................................ 413
Testing the File System ..................................................................................................................... 414
Completion and Conclusion.............................................................................................................. 415
Create AWS EC2 Instance and run AWS CLI Commands .................................................................. 416
Lab Details ............................................................................................................................................. 416
Tasks: ...................................................................................................................................................... 416
Launching Lab Environment............................................................................................................. 416
Create an IAM Role ........................................................................................................................... 417
Launching an EC2 Instance ............................................................................................................. 418
SSH into EC2 Instance...................................................................................................................... 420
AWS CLI command to create KeyPair ............................................................................................ 420
AWS CLI command to create Security Group ............................................................................... 421
AWS CLI command to create EC2 .................................................................................................. 421
View the EC2 instance that has been created .................................................................. 422
AWS CLI command to Delete the EC2 instance ........................................................................... 422
Completion and Conclusion: ............................................................................................................ 423
Lambda Function to Shut Down and Terminate an EC2 Instance ................................................... 424
Lab Details: ............................................................................................................................................ 424
Tasks ....................................................................................................................................................... 424
Launching Lab Environment............................................................................................................. 424
Steps........................................................................................................................................................ 425
Launching two EC2 Instances ............................................................................................ 425
Create an IAM Role ........................................................................................................................... 426
Create a Lambda Function ............................................................................................................... 427
Configure Test Event ......................................................................................................................... 429
Performing Stop and Terminate action on EC2 Instances........................................................... 429
Check the EC2 instances Status ..................................................................................................... 430
Performing Stop and Terminate action again ................................................................................ 430
Check the EC2 instances Status again .......................................................................................... 431
Completion and Conclusion .............................................................................................................. 431
S3 Bucket event trigger lambda function to send Email notification .............................................. 432
Lab Details ............................................................................................................................................. 432
Architecture Diagram:......................................................................................................................... 432
Tasks: ...................................................................................................................................................... 432
Flow Chart .............................................................................................................................................. 432
Launching Lab Environment............................................................................................................. 432
Steps: ...................................................................................................................................................... 433
Create an IAM Role ........................................................................................................................... 433
Create a S3 Bucket .............................................................................................................................. 434
Upload objects to S3 Bucket............................................................................................................. 434
Create a Email verification using SES ............................................................................................ 435
Verify the Email address .................................................................................................................... 436
Create a Lambda Function ................................................................................................................ 437
Configuring the S3 Bucket Event .................................................................................................... 439
Testing the lab ...................................................................................................................................... 439
Completion and Conclusion .............................................................................................................. 441
Running Lambda on a Schedule ......................................................................................................... 442
Lab Details ............................................................................................................................................. 442
AWS Lambda ......................................................................................................................................... 442
Lab Tasks ............................................................................................................................................... 443
Launching Lab Environment............................................................................................................. 443
Steps: ...................................................................................................................................................... 443
Create an EC2 Instance .................................................................................................................... 443
Create an IAM Role ........................................................................................................................... 445
Create a Lambda Function ............................................................................................................... 447
Creating CloudWatch Events ........................................................................................................... 447
Testing the Lambda ........................................................................................................................... 448
Completion and Conclusion .............................................................................................................. 450
Access and tour AWS console
Lab Details:
1. This AWS lab is provided for practicing logging in to AWS. Once logged in, students can navigate
through the AWS console on their own to become familiar with it: understand how the AWS console
looks, search for various AWS resources, and learn where resources are located and how they are
categorized.
2. Duration: 00:15:00 Hrs
3. AWS Region: US East (N. Virginia)
Tasks:
1. Login to AWS Management Console.
2. Since it is a tour, navigate around to see the AWS console.
3. Search for AWS resources.
4. Understand the Navigation inside AWS console properly.
Steps:
Note: The user does not have any access to work with any of the services that are displayed.
Introduction:
What is IAM?
• Stands for Identity and Access Management.
• Web service that helps the user to securely control access to AWS resources.
• Used to control who is authenticated and authorized to use AWS resources.
• The first "identity" is created when you sign up for an AWS account. Providing an email and
password creates an identity, and that is the "root user", which holds permissions to
access all resources in AWS.
• The primary resources in IAM are - user, group, role, policy, and identity provider.
• IAM User is an entity that you create in AWS.
• It represents the person or service who uses the IAM user to interact with AWS.
• IAM Group is a collection of IAM Users.
• You use groups to specify permissions for a collection of users, which can make those
permissions easier to manage for those users.
• An IAM Role is like a user, in that it is an identity with permission policies that determine what
the identity can and cannot do in AWS.
• IAM Role does not have any credentials associated with it.
• IAM Role is intended to be assumable by anyone who needs it.
• IAM can be used from AWS CLI, AWS SDK and AWS Management Console.
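For reference, the IAM operations practiced in this lab can also be performed from the AWS CLI. This is a minimal sketch, assuming credentials with IAM permissions; the user, group, and policy names simply mirror the ones used later in this lab:
# Create a user and a group, then add the user to the group
aws iam create-user --user-name John
aws iam create-group --group-name HR-Team
aws iam add-user-to-group --user-name John --group-name HR-Team
# Attach an existing AWS managed policy to the group
aws iam attach-group-policy --group-name HR-Team \
    --policy-arn arn:aws:iam::aws:policy/AWSCodeDeployFullAccess
# Verify the group membership
aws iam list-groups-for-user --user-name John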
Tasks:
1. Login to AWS Management Console.
2. Create 4 IAM Users.
3. Create 2 IAM Groups.
4. Add IAM Users to different IAM Groups.
5. Attach IAM policies to the IAM Groups.
on , this will open your AWS Console Account for this lab in a new
tab. If you are asked to logout in AWS Management Console page, click on here link and
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Lab Steps:
Create New IAM User
2. Select the in left panel and click on the to create a new IAM
user.
3. In Add user page,
o In Set user details Section,
▪ User name: John
o Click on
o Let's attach some existing IAM policies to the group.
o Attach Policy: Select two policies
▪ AWSCodeDeployFullAccess
▪ AWSCodeDeployRole
▪ Copy the provided policy name into the policy type search box to find the
mentioned policies easily.
o Click on .
o Click on the
o Group Name: HR-Team
o Click on .
o Attach Policy:
▪ Billing
o Click on .
o Once you click on the tab, you will see This group does not
contain any users.
the .
2. Repeat the same steps to add Ted and Rita in the HR-Team group.
To add a shortcut
• Click on the Pushpin icon again. The services will now be available directly on the navigation bar
for easy and direct access. Depending on how frequently you use each service, you can keep the
most-used ones pinned there for quick access.
To choose a Region
1. In the AWS Console Home page, search for a service like EC2 or VPC and go to service
page console.
2. On the navigation bar, select the name of the currently displayed Region. It will be N. Virginia
in our case.
3. Click on any other region, say Asia Pacific (Seoul), to view your AWS resources in that
region.
o Note: AWS resources created in one region will not be visible when you select
another region in the AWS console. We will understand more about this in future labs.
Lab Details:
1. This lab walks you through Amazon Simple Storage Service. Amazon S3 has a simple web
services interface that you can use to store and retrieve any amount of data, at any time, from
anywhere on the web. In this lab we will demonstrate AWS S3 by creating a sample S3 bucket,
uploading an object to the S3 bucket, and setting up bucket permissions and a policy.
2. Duration: 00:30:00 Hrs
3. AWS Region: US East (N. Virginia)
Introduction:
What is S3?
• S3 stands for Simple Storage Service.
• It provides object storage through a web service interface.
• Each object is stored as a file with its metadata included and is given an ID number.
• Objects uploaded to S3 are stored in containers called “Buckets”, whose names are
“unique” and they organize the Amazon S3 namespace at the highest level.
• These buckets are region specific.
• You can assign permissions to these buckets, in order to provide access or restrict data
transaction.
• Applications use this ID number to access an object.
• Developers can access an object via a REST API.
• Supports upload of objects.
• Uses the same scalable storage infrastructure that Amazon.com uses to run its global e-
commerce network.
• Designed for storing online backup and archiving of data and applications on AWS.
• It is designed with a minimal feature set that is easy to set up, making web-scale
computing simpler.
• Storage classes provided are:
1. Standard
2. Standard_IA i.e., Standard Infrequent Access
3. Intelligent_Tiering
4. OneZone_IA
5. Glacier
6. Deep_Archive
7. RRS i.e., Reduced Redundancy Storage (Not recommended by AWS)
• Data access is provided through S3 Console which is a simple web-based interface.
• Data stored can be either Public or Private based on user requirement.
• Data stored can be encrypted.
• We can define life-cycle policies which can help in automation of data transfer, retention
and deletion.
• Amazon Athena can be used to "query" S3 data as per demand.
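As a quick reference, the core S3 operations covered in this lab map to simple AWS CLI calls. This is a minimal sketch; the bucket name is only an example and must be globally unique:
# Create a bucket in us-east-1 (N. Virginia)
aws s3 mb s3://mys3bucketwhizlabs-example
# Upload a local file as an object
aws s3 cp smiley.jpg s3://mys3bucketwhizlabs-example/smiley.jpg
# List the objects stored in the bucket
aws s3 ls s3://mys3bucketwhizlabs-example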
Tasks:
1. Login to AWS Management Console.
2. Create an S3 bucket.
3. Upload an object to S3 Bucket.
4. Access the object on the browser.
5. Change S3 object permissions.
6. Setup the bucket policy and permission and test the object accessibility.
on , this will open your AWS Console Account for this lab in a new tab. If
you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Lab Steps:
Create S3 Bucket
1. Make sure you are in N.Virginia Region.
the section.
o Click on the .
o Close the pop-up window if it's still open.
o Click on the .
o Click on the .
o Browse any local image or the image downloaded by name smiley.jpg.
• Select
• Click
• Return to the browser tab that displayed Access Denied and refresh the page.
• You can see your image is loaded successfully and publicly accessible now.
Create a Bucket Policy
1. In the previous step, you granted read access only to a specific object. If you wish to make
all objects inside the bucket publicly available, you can achieve this by creating a bucket
policy.
2. Go to the bucket list and click on your bucket name - mys3bucketwhizlabs.
4. In the policy below, update your bucket ARN in the Resource key value and copy the policy code.
{
    "Id": "Policy1",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1",
            "Action": ["s3:GetObject"],
            "Effect": "Allow",
            "Resource": "replace-this-string-from-your-bucket-arn/*",
            "Principal": "*"
        }
    ]
}
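If you prefer to apply the same policy from the AWS CLI instead of the console, it can be saved to a file (policy.json is just an illustrative name) and attached with put-bucket-policy; a sketch, assuming the bucket name used in this lab:
# Attach the bucket policy saved in policy.json to the bucket
aws s3api put-bucket-policy --bucket mys3bucketwhizlabs --policy file://policy.json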
• Click on
o Click on the .
o Click on the .
o Browse the image whizlabs_logo.png from your local.
o Click on the .
2. Once the image has uploaded successfully, copy the image link and open it in the browser.
o https://mys3bucketwhizlabs.s3.amazonaws.com/whizlabs-logo.jpg
3. You can see your image is loaded successfully and publicly accessible.
Introduction:
What is Versioning?
• Versioning is a means for keeping multiple variants of the same object in the bucket.
• Versioning is used to preserve, retrieve, and restore every version of every object stored in
S3 bucket.
• Versioning is done at S3 Bucket level.
• Versioning can be enabled from : AWS Console / SDKs / API.
• Once versioning is enabled, it cannot be completely disabled.
• The alternative is to place the bucket in a "versioning-suspended" state.
• The drawback of keeping multiple versions of an object is that you are billed for every version
stored in S3.
• To avoid accumulating multiple versions of the same object, S3 has a feature called
Lifecycle Management, which lets us decide what to do when multiple versions pile up
for an object.
• One advantage of versioning is that we can set permissions on individual versions of an object,
i.e., we can define which version of an object is public and which one is private.
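The same versioning behaviour described above can be driven from the AWS CLI. This is a minimal sketch; the bucket name is a placeholder:
# Enable versioning on the bucket
aws s3api put-bucket-versioning --bucket mybucket \
    --versioning-configuration Status=Enabled
# Versioning cannot be removed, only suspended later if required
aws s3api put-bucket-versioning --bucket mybucket \
    --versioning-configuration Status=Suspended
# List every version of every object in the bucket
aws s3api list-object-versions --bucket mybucket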
Tasks:
1. Login to AWS Management Console.
2. Create a S3 bucket.
3. Enable object versioning on bucket.
4. Upload a text file to S3 Bucket.
5. Test object versioning with update text file and upload.
Steps:
Create a S3 Bucket
1. Make sure you are in N.Virginia Region.
o Click on the .
o Close the pop-up window if it's still open.
2. Click on .
3. Choose .
o In the previous step, you granted read access only to a specific object. If
you wish to grant access to an entire bucket, you need to create
a bucket policy.
o Go to the bucket list and click on your bucket name.
o click the tab to configure:
▪ In the policy below, update your bucket ARN in the Resource key value and copy the policy
code.
{
    "Id": "Policy1",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1",
            "Action": [
                "s3:GetObject"
            ],
            "Effect": "Allow",
            "Resource": "replace-this-string-from-your-bucket-arn/*",
            "Principal": "*"
        }
    ]
}
3. You have successfully uploaded the test file into the Bucket and tested the
versioning.
Tasks:
1. Login to AWS Management Console.
again.
4. Make sure you are in N.Virginia Region.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Create an S3 Bucket
the section.
• Click on the .
• Close the pop-up window if it's still open.
Upload an Object
1. Click on tab.
• Object Creation: Select 'Transition to One Zone-IA after' from the drop-down list.
• Days after creation: Enter '35'.
8. Again Click .
• Select 'Transition to Glacier after' from the drop-down list to move the object to
Glacier. Enter '90' days.
• Click on the checkbox saying 'I acknowledge that this lifecycle rule will increase
the one-time lifecycle request cost if it transitions small objects.'
• Click on .
Note:
• Initially when the Object is uploaded, it will be in Standard storage class.
• When we create the Lifecycle policy, the object you uploaded will be migrated to
One Zone-IA after 35 days. This means the object will be available only in a single
Availability Zone after 35 days.
• Next, the object will be migrated to Glacier after 90 days. This means the object will be in
an archived state; you need to retrieve the object from Glacier first in order to access
it.
• Note: Leave the days as 7 (the default value); this means objects that are not
uploaded properly will be deleted after 7 days.
12. Click on .
13. Before saving, verify the configuration. You can edit it if you want to change
anything. Click .
14. Click on .
15. The Lifecycle rule for the object will be created and enabled if there are no errors in it.
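The rule configured above can also be expressed as a lifecycle configuration document and applied with the AWS CLI. This is a sketch of an equivalent configuration; the bucket name, file name, and rule ID are placeholders, while the transition and cleanup days follow this lab.
Content of lifecycle.json:
{
    "Rules": [
        {
            "ID": "lab-lifecycle-rule",
            "Status": "Enabled",
            "Filter": { "Prefix": "" },
            "Transitions": [
                { "Days": 35, "StorageClass": "ONEZONE_IA" },
                { "Days": 90, "StorageClass": "GLACIER" }
            ],
            "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }
        }
    ]
}
Apply it with:
aws s3api put-bucket-lifecycle-configuration --bucket mybucket \
    --lifecycle-configuration file://lifecycle.json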
Completion and Conclusion
1. You have successfully used the AWS Management Console to create a Lifecycle
rule for the object in the S3 bucket.
2. You have configured the rule details while creating the Lifecycle rule.
on , this will open your AWS Console Account for this lab in a new tab.If
you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Upload Image and make it Public
1. Make sure you are in N.Virginia Region.
o Click on .
5. Make Image Public :
o Click on the image name. You can see the image details like Owner, size, link, etc.
o Open image Link in a new tab.
6. A sample Image URL: https://999886072153.s3.amazonaws.com/Whizlabs_logo.jpg
o You will see an Access Denied message, meaning the object is not publicly accessible.
o Return to the S3 Management Console, go to the S3 bucket, and open your uploaded
image again.
3. Click .
4. Open the Image URL again or refresh the one already open.
5. If you can see your uploaded image in the browser, it means your image is publicly
accessible. If not, check your object permission again to make sure it is accessible by
everyone.
3. Click on the .
the .
8. You can see the Status column showing the status of your distribution. After
Amazon CloudFront has created your distribution, the Status for your distribution will change to Deployed. Click
on .
o Click on the .
o Now we need to setup our custom error page:
▪ Click on the .
o Navigate back to Distributions and wait for your distribution's state to
change to Deployed.
▪ Note: This process will take around 15 minutes.
o Once the state has changed to Deployed, test your error page:
▪ Enter the URL of an image which does not exist in your S3 bucket with
CloudFront domain name
▪ d1hmlwhed8zk6q.cloudfront.net/abc.jpg
▪ If you can see your HTML error page in the browser, it means you have successfully
set up your custom error page.
o Click on .
o
2. Go to the distribution list and wait for your distribution's State to
change to Deployed.
o Once the state has changed, test the restriction by accessing the image
through CloudFront in the browser.
▪ d1hmlwhed8zk6q.cloudfront.net/Whizlabs_logo.jpg
o You can see an error message :
▪ 403: Error The Amazon CloudFront distribution is configured to block
access from your country.
o If you see this error, it means you have successfully restricted image access from your
country.
Introduction
What is EC2
• AWS defines it as Elastic Compute Cloud.
• It's a virtual environment where you "rent" capacity to have your environment created, without
purchasing hardware.
• Amazon refers to these VMs as instances.
• Preconfigured templates for your instances, i.e., images called AMIs (Amazon Machine
Images), are available to quick-start the job.
• Allows you to install custom applications, services and all those that you use for your activity.
• Scaling of infrastructure i.e., up or down is easy based on the demand you face.
• AWS provides multiple configurations of CPU, memory, storage etc., through which you can
pick the flavor that's required for your environment.
• No limitation on storage. You can pick the storage based on the flavor of the instance that
you are working on.
• Temporary storage volumes are provided, called Instance Store
Volumes. Data stored in them is deleted once the instance is terminated.
• Persistent storage volumes are available and are referred to as EBS (Elastic Block Store).
• These instances can be placed in multiple locations, referred to as Regions and
Availability Zones (AZs).
• You can have your Instances distributed across multiple AZs i.e., with-in a single Region, so
that if an instance "fails", AWS automatically remaps the address to another AZ.
• Instances deployed in one AZ can be "migrated" to another AZ.
• To manage instances, images, and other EC2 resources, you can optionally assign your own
metadata to each resource in the form of tags.
• Tag is a label that you assign to an AWS resource. It contains a key and an optional value,
both of which are defined by you.
• Each AWS account comes with a set of "default limits" on the resources on a per-Region
basis.
• For any increase in the limit you need to contact AWS.
• To work with the created instances, we use Key Pairs.
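To tie these concepts together, here is a minimal AWS CLI sketch of launching an instance; the AMI ID is a placeholder, and the key and tag names are only examples:
# Create a key pair and save the private key locally
aws ec2 create-key-pair --key-name MyLabKey \
    --query 'KeyMaterial' --output text > MyLabKey.pem
chmod 400 MyLabKey.pem
# Launch one t2.micro instance from an AMI and tag it
aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t2.micro \
    --key-name MyLabKey --count 1 \
    --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=MyEC2Server}]'
# Check the instance state
aws ec2 describe-instances --filters "Name=tag:Name,Values=MyEC2Server" \
    --query 'Reservations[].Instances[].State.Name'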
Tasks:
1. Login to AWS Management Console.
2. Create an Amazon Linux Instance from an Amazon Linux AMI
3. Find your instance in the AWS Management Console.
4. SSH into your instance.
5. Install a Web server on the server
6. Create and publish a sample test.html file.
7. Test the page with public IP address of EC2 Instance created.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps:
the section.
2. Make sure you are in N.Virginia Region. Navigate to on the left panel
and Click on
3. Choose an Amazon Machine Image
(AMI):
the
5. Configure Instance Details: No need to change anything in this step, click
on
o Click on
8. Configure Security Group:
o To add SSH,
▪ Choose Type:
▪ Source: Custom(Allow specific IP address) or Anywhere (From ALL IP
addresses accessible).
o For HTTP,
▪ Click on
▪ Choose Type: HTTP
▪ Source: (Allow specific IP address)
▪ Click on
▪ Choose Type: HTTPS
12. Note down the IPv4 Public IP address of the EC2 instance. A sample is shown in the
screenshot below.
Tasks:
1. Login to AWS Management Console.
4. SSH into your instance and install a Web server on the server
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
on in the section.
2. Click on
(AMI):
the
5. Configure Instance Details: No need to change anything in this step, click
on
6. Add Storage: No need to change anything in this step, click
on
o Key : Name
o Value : MyEC2Server
o Click on
8. Configure Security Group:
o To add SSH,
▪ Choose Type:
▪ Click on
▪ Choose Type: HTTP
▪ Click on
▪ Choose Type: HTTPS
o sudo -s
2. Now run the updates using the following command:
o yum -y update
3. Once completed, let's install and run an apache server
o cd /var/www/html/
2. Create a sample test.html file using nano editor:
o nano test.html
3. Enter the sample HTML content provided below in the file and save the file:
press Ctrl+X, then Y to confirm the save, and then Enter to confirm the filename.
o <HTML>Hi Whizlabs, I am a public page</HTML>
4. Restart the web server by using the following command:
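The install and restart commands themselves appear only as screenshots in the original lab; on an Amazon Linux instance they typically look like the following (a sketch, assuming the Apache httpd package and the classic service tooling):
yum install -y httpd       # install the Apache web server
service httpd start        # start the service
chkconfig httpd on         # start the service automatically on boot
service httpd restart      # restart after publishing test.html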
on under section.
3. Click on
3. Now you can see that the instance is associated with the Elastic IP address.
4. Go to the EC2 Instance and check the IPv4 Public IP and it should be the same
as Elastic IP.
5. Now, we will check the web page by entering the Elastic IP address instead of
the previous Public IP.
o Sample URL: 3.208.115.72/test.html
2. You have logged into EC2 instance by SSH, installed Apache server and
published a page.
3. You have allocated an Elastic IP address and associated it to the running
instance.
4. You have checked the web page with Elastic IP address which works.
Creating and Subscribing to SNS Topics, Adding
SNS event for S3 bucket
Lab Details:
1. This lab walks you through the creation and subscription of an Amazon SNS Topic. Using
AWS S3 bucket you will test the subscription.
2. Duration: 00:30:00 Hrs
3. AWS Region: US East (N. Virginia)
Introduction:
What is SNS?
• Stands for Simple Notification Service.
• Provides a low-cost infrastructure for the mass delivery of messages, predominantly to
mobile users.
• SNS acts as a single message bus that can deliver messages to a variety of devices and
platforms.
• SNS uses the publish/subscribe model for push delivery of messages.
• SNS enables us to decouple microservices, distributed systems, and serverless
applications using fully managed pub/sub.
• Publishers communicate asynchronously with subscribers by producing and sending a
message to a topic, which is a logical access point and communication channel.
• Subscribers i.e., web servers, email addresses, SQS queues etc., consume or receive
the message or notification over one of the supported protocols when they are
subscribed to the topic.
• Recipients subscribe to one or more "topics" within SNS.
• Using SNS topics, the publisher systems can fan out messages to a large number of
subscriber endpoints for parallel processing, including Amazon SQS queues, AWS
Lambda functions, and HTTP/S webhooks.
• SNS is reliable in delivering messages with durability.
• SNS can automatically scale with the workload.
• Using topic policies, you can keep messages private and secure.
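For reference, the SNS flow used in this lab maps to a few AWS CLI calls. This is a minimal sketch; the account ID in the ARN and the email address are placeholders:
# Create a topic (the command returns the topic ARN)
aws sns create-topic --name mysnsnotification
# Subscribe an email endpoint (confirm via the email you receive)
aws sns subscribe --topic-arn arn:aws:sns:us-east-1:123456789012:mysnsnotification \
    --protocol email --notification-endpoint you@example.com
# Publish a test message to the topic
aws sns publish --topic-arn arn:aws:sns:us-east-1:123456789012:mysnsnotification \
    --subject "Test" --message "Hello from SNS"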
Task Details
1. Login to AWS Management Console.
2. Create SNS Topic
3. Subscribe to SNS Topic
4. Create S3 bucket
5. Update SNS Topic Access Policy
6. Create S3 Event
7. Testing the SNS Notification
on , this will open your AWS Console Account for this lab in a new tab. If
you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Create SNS Topic
the section.
2. Make sure you are in N.Virginia Region.
2. Click on .
3. Under Details:
o Protocol : Select Email
o Endpoint : Enter your <Mail Id>
o Note: Make sure you give a valid email address, as the SNS notification
mail will be sent to this address.
4. You will receive a subscription mail to your mail id.
Create S3 Bucket
1. Navigate to AWS S3 by clicking on Services on top left corner. S3 is available
under Storage.
2. Click on .
3. Under Name and region:
o Bucket name : Enter unique bucket name mys3buckettestingsns
4. Click on .
5. Copy the name of your S3 bucket in a notepad.
4. Click on at the top right corner to edit the Access Policy of the SNS topic.
5. Expand Access Policy.
6. Update the topic access policy as shown below.
{
    "Version": "2008-10-17",
    "Id": "__default_policy_ID",
    "Statement": [
        {
            "Sid": "__default_statement_ID",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": [
                "SNS:GetTopicAttributes",
                "SNS:SetTopicAttributes",
                "SNS:AddPermission",
                "SNS:RemovePermission",
                "SNS:DeleteTopic",
                "SNS:Subscribe",
                "SNS:ListSubscriptionsByTopic",
                "SNS:Publish",
                "SNS:Receive"
            ],
            "Resource": "arn:aws:sns:us-east-1:757712384777:mysnsnotification",
            "Condition": {
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:s3:*:*:mys3buckettestingsns"
                }
            }
        }
    ]
}
• Make sure you change the bucket name and ARN to the values you copied into your
notepad.
• Click on .
• Now the S3 bucket is allowed to publish notification events to your SNS topic.
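If you are scripting this step, the same access policy can be saved to a file and applied from the CLI with set-topic-attributes. This is a sketch; the file name is illustrative and the topic ARN is the one shown in the policy above:
aws sns set-topic-attributes \
    --topic-arn arn:aws:sns:us-east-1:757712384777:mysnsnotification \
    --attribute-name Policy \
    --attribute-value file://sns-access-policy.json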
Create S3 Event
1. Navigate back to S3 page.
2. Click on mys3buckettestingsns bucket.
3. Go to Properties tab.
4. Under Advanced settings, click on Events.
5. Click on .
o Name : Enter name for notification myemaileventforput.
o Events : Select and Check PUT.
o Send to : SNS Topic
o SNS : mysnsnotification
o Click on .
6. The S3 bucket now has event notifications enabled for PUT of new objects, delivered through the SNS
topic mysnsnotification.
Introduction:
What is Static Website?
• These are the most basic type of website and are easiest to create.
• Static web page is a web page that is delivered to the user's web browser exactly as stored.
• It holds fixed content, where each page is coded in HTML and displays the same information
to every visitor.
• No web programming or database design is required when working with them.
• They are a safe bet when it comes to security, since there is no interaction with
databases or reliance on plugins.
• They are reliable, i.e., if the server is attacked, traffic can be redirected to the nearest
safe node, which still serves the content.
• Accessing them is fast, because there are no databases or plugins to query.
• Hosting the website is cheap because there are no other components.
• Scaling the website is easy, and can be done by simply increasing the bandwidth.
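For reference, static website hosting can also be enabled from the AWS CLI. This is a minimal sketch; the bucket name is a placeholder, and the index and error documents match this lab:
# Enable static website hosting on the bucket
aws s3 website s3://mybucket/ --index-document index.html --error-document error.html
# Upload the pages and make them publicly readable
aws s3 cp index.html s3://mybucket/ --acl public-read
aws s3 cp error.html s3://mybucket/ --acl public-read
# The site is then served at the website endpoint, e.g.:
# http://mybucket.s3-website-us-east-1.amazonaws.com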
Tasks:
1. Login to AWS Management Console.
2. Create a S3 bucket and upload a sample HTML page to the bucket.
3. Enable static website settings to S3 bucket.
4. Make the bucket public.
5. Test the website URL.
active. Now click on the button; this will open your AWS
Console Account for this lab in a new tab. If you are asked to log out in the AWS
Management Console page, click on the here link and then click
on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Creating a Bucket
1. Navigate to S3 by clicking on the menu at the top. Search and click
on .
2. Create Bucket
3. Click on .
4. Copy the Endpoint to your clipboard and save it in a Text Editor for later use.
o It will look similar to: http://bucketname.s3-website-us-east-1.amazonaws.com
5. In the Static website hosting dialog box
• Click
• Index document : Type index.html
• Error document : Type error.html
• Click on .
6. Now download below two HTML files and upload them to your s3 bucket.
Download index.html
Download error.html
Introduction
IAM Policy
1. An IAM (Identity and access management) policy is an entity in AWS, which when attached
to an AWS resource, it defines their permissions.
2. Policies are stored in AWS in JSON format and are attached to resources as identity-
based policies in IAM.
3. You can attach an IAM policy to one of the AWS entities such as an IAM group, user, or role.
4. Thus an IAM policy gives us the advantage of restricting users or groups to only their
required privileges.
Policy Types
There are two main types of policies:
• Identity-Based-Policies
• Resource-Based-Policies
Identity-Based-Policy
1. Identity-based policies are policies that you can attach to an AWS identity such as a user, group of users, or role.
2. These policies control what actions an entity such as a user or group can perform, on which
resources, and under what conditions.
3. Identity-based policies are further classified as:
• AWS Managed Policies
• Customer Managed Policies
AWS Managed Policies
1. AWS Managed policies are those policies that are created and managed by AWS itself.
2. In case you are new to using policies, you can start using AWS managed policies first.
Customer Managed Policies
1. Customer managed policies are the ones that are created and managed by you in your AWS
account.
2. Customer managed policies provide us with more precise control than AWS managed
policies.
3. You can create and edit an IAM policy in the visual editor or by creating the JSON policy
document directly.
4. You can create your own IAM policy using the following
link https://awspolicygen.s3.amazonaws.com/policygen.html
Resource-Based-Policy
1. Resource-based policies are those policies that we attach to a resource such as an Amazon S3
bucket.
2. Resource-based policies grant the specified permission to perform specific actions on particular
resources and define under what conditions should these policies apply to them.
3. Resource-based policies are inline policies.
4. There are currently no managed resource-based policies.
5. Within the IAM service itself, the only type of resource-based policy is the role trust policy, which is
attached to an IAM role.
6. An IAM role is both an identity and a resource that supports resource-based
policies.
IAM Role
1. An IAM role is an AWS IAM identity that we can create in our AWS account that has specific
permissions.
2. It is similar to an IAM user, in that it has permission policies that determine what the identity can and cannot do in AWS.
3. Instead of being tied to one particular user or group, a role can be assumed by anyone who
needs it.
4. The advantage of having a role is that we do not have standard long-term credentials such
as a password or access keys associated with it.
5. Instead, when resources assume a particular role, it provides us with temporary security
credentials for our role session.
6. We can use roles to delegate access to users, applications, or services that don't normally have access to our AWS
resources.
7. We can attach one or more policies with roles depending on our requirements.
8. For example, we can create a role with S3 full access and attach it to an EC2 instance to access
S3 buckets.
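That example can be sketched with the AWS CLI as follows; the trust policy below (saved as trust-policy.json, an illustrative name) lets EC2 assume the role, and the role name matches the one created later in this lab.
Content of trust-policy.json:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "ec2.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}
CLI calls:
# Create the role with the EC2 trust policy and grant S3 full access
aws iam create-role --role-name S3Role \
    --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name S3Role \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
# The console creates an instance profile automatically; from the CLI it is explicit
aws iam create-instance-profile --instance-profile-name S3Role
aws iam add-role-to-instance-profile --instance-profile-name S3Role --role-name S3Role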
2. Launch lab environment by clicking on . This will create an AWS environment with
the resources required for this lab.
on , this will open your AWS Console Account for this lab in a new tab.
4. If you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
Creating IAM Role
choose the EC2 service for the role and then click on as shown
below in the screenshot.
choose .
5. Click on .
• Key : Name
• Value : ec2S3role
• Click on .
6. In Create Role Page,
• Role Name : Enter S3Role
Note : You can create Role in your desired name and attach to EC2 instance.
• Role description : Enter IAM Role to access S3
• Click on .
7. You have successfully created the role to access s3 shown in the below image
the section.
• Make sure you are in N.Virginia Region.
• Now under the left sub-menu click on 'Instances' and then Click
on
• Choose an Amazon Machine Image
(AMI):
• Choose an Instance Type: select and then click on
the
• Configure Instance Details:
o Scroll down to the IAM role and then select the role that we have created in the above
step.
• Click on
• Click on :
• Configure Security Group:
o Choose Create new security group
o Name : S3server-SG
o To add SSH
▪ Choose Type :
▪ Source : Custom - 0.0.0.0/0
o Click on
• Review and Launch: Review all settings and click on .
• Key Pair : This step is most important, Select Create a new key pair and click
on .
• Click on .
• Navigate to Instances. Once the Instance State changes from pending to running, the EC2
instance is ready.
• You can see the Instance is running as shown below
Viewing S3 bucket
8. Now repeat step 5 to create some more files, like new.txt and smile.txt, and upload them to the
S3 bucket using the commands below:
o touch new.txt smile.txt
o aws s3 mv new.txt s3://whizlabs7577123847772
o aws s3 mv smile.txt s3://whizlabs7577123847772
9. You can confirm the files were uploaded to the S3 bucket by navigating to the AWS console.
10. You can also list the files uploaded to the S3 bucket from the CLI on the EC2 instance using the command
below:
o aws s3 ls s3://whizlabs7577123847772
Completion and Conclusion
1. You have successfully created an IAM role to access S3 by granting S3 full access.
2. You have created an EC2 instance with the IAM role attached.
3. You have uploaded files to the S3 bucket using the CLI from the EC2 instance.
4. You have uploaded files to the S3 bucket from the AWS console.
AWS S3 Multipart Upload using AWS CLI
Lab Details:
1. This lab walks you through the steps to upload a file to an S3 bucket using
multipart upload.
2. Duration: 01:00:00 Hrs
Tasks:
1. Login to AWS Management Console.
2. Create an S3 bucket
5. Create a directory
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
the section.
on as shown.
5. Type S3fullaccess in the search bar and then
choose .
6. Click on Next:Tags.
• Click on
8. You have successfully created the role to access s3 shown in the below image
Create a S3 Bucket
1. Make sure you are in the N.Virginia Region.
the section.
on in the section.
(AMI):
the
5. Configure Instance Details:
o Scroll down to the IAM role and then select the role that we have created
in the above step.
6. Scroll down to
o Under User data: section, Enter the following script, which will copy a
video file from Default S3 bucket to EC2 instance.
#!/bin/bash
sudo su
yum update -y
mkdir /home/ec2-user/whizlabs/
aws s3 cp s3://labtask69/video.mp4 /home/ec2-user/whizlabs/
o Then Click on
on
o Click on
9. Configure Security Group:
▪ Choose Type:
12. Key Pair : This step is the most important part of EC2 creation.
12. In the Description tab, Copy the IPv4 Public IP Address of the EC2
instance Multipart_Server
o cd whizlabs/
o ls -l
Note: This file is 145.8 MB in size, so we use the multipart feature to
upload it to S3.
o ls -lh
Info: Here xaa, ..., xad are the chunk files, named alphabetically. Each
file is 40 MB in size except the last one. The number of chunk files depends on
the size of your original file and the byte size you specify.
Note: Please copy the UploadId into a text file, Like Notepad.
Note: Copy the ETag id and Part number to your Notepad in your local
machine.
• Now repeat the above CLI command for each chunk file [replace the --part-
number and --body values with the values from the table above].
• Press the Up Arrow key to recall the previous command; there is no need to re-enter the
Upload ID, just change the Part Number and Body value.
• Each time you upload a chunk/part, don't forget to save the ETag
value.
o nano list.json
2. Copy the below JSON Script and paste it in the list.json file.
Note: Replace the ETag ID according to the part number, which you have
got when uploading each part/ chunk.
{
    "Parts": [
        {
            "PartNumber": 1,
            "ETag": "\"2771bc3662b381da1259fdf39904045a\""
        },
        {
            "PartNumber": 2,
            "ETag": "\"9fdc79d796e33027565ac06358af966d\""
        },
        {
            "PartNumber": 3,
            "ETag": "\"eb9311b12d3c23b7543f08364bfe079b\""
        },
        {
            "PartNumber": 4,
            "ETag": "\"327c0ca55097aea8cb65c8bc8eee8b4f\""
        }
    ]
}
3. Save the File list.json
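For reference, the CLI calls used in this multipart flow typically look like the following sketch; the bucket name and key follow this lab, while the chunk file names and the UploadId placeholder must be replaced with your own values:
# Split the original file into 40 MB chunks (produces xaa, xab, ...)
split -b 40M video.mp4
# Start the multipart upload and note the UploadId it returns
aws s3api create-multipart-upload --bucket s3multipart-final --key video.mp4
# Upload each chunk, saving the ETag returned for every part
aws s3api upload-part --bucket s3multipart-final --key video.mp4 \
    --part-number 1 --body xaa --upload-id <UploadId>
# (repeat upload-part for xab, xac, ... with part numbers 2, 3, ...)
# Complete the upload using the parts listed in list.json
aws s3api complete-multipart-upload --bucket s3multipart-final --key video.mp4 \
    --upload-id <UploadId> --multipart-upload file://list.json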
the section.
3. On the S3 Page, Click on the Bucket name s3multipart-final
o Note: Choose the bucket name which you created in the beginning,
if s3multipart-final was not available.
2. You have successfully created an EC2 instance and copied the original file from S3
to EC2.
3. You have successfully split the file and uploaded the individual parts.
Using AWS S3 to Store ELB Access Logs
Lab Details
1. This lab walks you through the steps to create ELB and store ELB access logs in
S3 Bucket.
2. In this lab, you will create two EC2 instances and attach them to the Elastic load
balancer.
3. Enabling Access logs in ELB to store S3 Bucket.
Introduction
Elastic Load Balancer
1. Elastic Load Balancer is a service that allows you to distribute incoming
application or network traffic across multiple targets, such as
Amazon EC2 instances, containers, and IP addresses, in multiple Availability
Zones.
2. AWS currently offers three types of load balancers, namely the Application Load Balancer,
the Network Load Balancer, and the Classic Load Balancer.
6. There is no additional charge for access logs. You are charged storage costs for
Amazon S3, but not charged for the bandwidth used by Elastic Load Balancing
to send log files to Amazon S3.
7. In case you manage multiple environments, it is better to store the logs in
separate S3 buckets so that it is easy to find the logs for a specific
environment.
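Access logs can also be enabled on an existing Application Load Balancer from the AWS CLI by setting its attributes. This is a sketch; the load balancer ARN is a placeholder and the bucket name follows this lab:
aws elbv2 modify-load-balancer-attributes \
    --load-balancer-arn arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/app/my-alb/1234567890abcdef \
    --attributes Key=access_logs.s3.enabled,Value=true \
                 Key=access_logs.s3.bucket,Value=whizlabs34675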
Lab Tasks:
1. Launching two web servers installed with apache service
2. Launching Elastic load balancer by attaching the web servers by enabling the s3
access log feature at the time of creating load balancer
3. Testing the working of load balancer
5. You can see the generated log files and download them to your
local system to analyse them.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting
a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
and click on .
2. Configure the security group as follows
• Once you have entered the above details, click on , and a security group for the load
balancer will be created.
2. Click on .
3. Choose an Amazon Machine Image (AMI):
• Number of instances :1
• Auto-assign Public IP : Select Enable
• Click on .
• Under the User data section, Enter the following script, which creates an HTML
page served by Apache HTTPD web server.
#!/bin/bash
yum install httpd24 -y
service httpd start
chkconfig httpd on
echo "RESPONSE COMING FROM SERVER A" > /var/www/html/index.html
6. Now click on
on
• Click on :
9. Configure Security Group:
• Name : Enter webserver-SG
• Description : Type security group for webserver
• To add SSH
o Source : Choose
• To add HTTP
o Choose Type : Select HTTP
o Source : LoadBalancer-SG
10. After that click on .
11. Key Pair : This step is most important, Create a new key Pair
on .
6. In the next step ignore the warning and click
on .
7. Configure Security Settings:
• Select an existing security group and choose the security group LoadBalancer-
SG that we created in the above step, as shown below.
8. Configure Routing
• Target Group: Select New target group (default)
o Name : Enter web-server-TG
o Target Type : Select Instance
o Protocol : Choose HTTP
o Port : Enter 80
o Note: The target group is used to route requests to one or more
registered targets
• Health check:
o Protocol : HTTP
o Path : /index.html
o Note: The load balancer periodically sends pings, attempts connections, or
sends requests to test the EC2 instances. These tests are called health
checks
• Create an index.html in the default Apache document root /var/www/html of
web servers to pass the health check. It will be created in the future steps.
9. Registering Targets
click on .
2. Now click on and then click on Edit attributes and enable the
access log feature
3. Check the box next to Access log and enter the name of bucket where you need
to store the ELB access logs. For example here the bucket
name is whizlabs34675
4. Check the box Create this location for me to create the S3 bucket in the same
region of your ELB.
5. If you get the error that the bucket already exists, as shown below, try a
different name for the bucket.
6. Finally click on , Now navigate to S3 console and you can see the
new bucket created as shown below
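If you prefer the command line, the same access-log settings can also be applied with the AWS CLI. This is a hedged sketch: the load balancer ARN and prefix are placeholders, and enabling logs this way also requires the bucket policy that the console's Create this location for me option sets up for you.
# Enable ELB access logs from the CLI (placeholder ARN and prefix)
aws elbv2 modify-load-balancer-attributes \
  --load-balancer-arn <your-load-balancer-arn> \
  --attributes Key=access_logs.s3.enabled,Value=true \
               Key=access_logs.s3.bucket,Value=whizlabs34675 \
               Key=access_logs.s3.prefix,Value=elb-logs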
2. Refresh the browser a couple of times and you will see that requests are served from
both servers, i.e. you can see the output RESPONSE COMING FROM
SERVER A and RESPONSE COMING FROM SERVER B, which implies that the load is
shared between the two web servers via the Application Load Balancer.
3. Now navigate to the S3 console and enter the bucket that you created to store
the ELB access logs; you can find the access logs under the AWSLogs folder.
4. Now access the load balancer URL and check whether the access logs are registered
in the bucket. You can see the new folder created under the AWSLogs folder as
shown below.
5. You can download the generated access log files, which are in compressed format, to your
local system to review them. Select the file and click on the button
as shown below.
7. Your log file entries will look something like the example given below.
2. You have created an ELB, attached the web servers to it, and enabled access
logs to be stored in an S3 bucket.
3. You have downloaded and reviewed the log files.
Introduction to AWS Relational Database Service
Lab Details:
1. This lab walks you through the creation and testing of an Amazon Relational Database Service
(Amazon RDS) database. We will create an RDS MySql Database and test the connection using
MySQL Workbench.
2. Duration: 00:50:00 Hrs
3. AWS Region: US East (N. Virginia)
Task Details
1. Create RDS Database Instance
2. Connecting to RDS Database on a DB Instance using the MySQL Workbench
3. Test Connection.
Prerequisites:
1. For testing this lab, you need to download a MySQL GUI tool such as MySQL Workbench: go to
the Download MySQL Workbench page, select the option for your OS
under Generally Available (GA) Releases, then download and install it.
on , this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
Create RDS Database Instance
the section.
o
4. Let’s configure the database.
5. Select Engine:
o Select the checkbox in the bottom of the page to see only those settings available
the .
8. Navigate to .
9. On the RDS console, the details for new DB instance appear. The DB instance has a status
of creating until the DB instance is ready to use. When the state changes to Available, you
can connect to the DB instance. It can take up to 20 minutes before the new instance status
becomes Available.
• Click on to make sure that you are able to connect to the database
properly.
• Click on ok and ok again to save the connection.
4. Double click on it to open the database. Enter the database password if prompted.
5. After successfully connecting and opening the database, you can create tables and perform
various queries over the connected database.
6. Navigate to Schemas tab to see the available databases and you can start doing database
operations. More details on database operations are available here.
Lab Details:
1. This lab walks you through AWS Elastic Load Balancing. Elastic Load Balancing
automatically distributes incoming application traffic across multiple Amazon EC2 instances
in the cloud. In this lab, we will demonstrate elastic load balancing with 2 EC2 Instances.
2. Duration: 00:30:00 Hrs
3. AWS Region: US East (N. Virginia)
Introduction:
What is Elastic Load Balancing?
• ELB is a service that automatically distributes incoming application traffic and scales
resources to meet traffic demands.
• ELB helps in adjusting capacity according to incoming application and network traffic.
• ELB can be enabled within a single availability zone or across multiple availability zones
to maintain consistent application performance.
• ELB offers features like:
o Detection of unhealthy EC2 instances.
o Spreading traffic across healthy EC2 instances only.
o Centralized management of SSL certificates.
o Optional public key authentication.
o Support for both IPv4 and IPv6.
• ELB accepts incoming traffic from clients and routes requests to its registered targets.
• When an unhealthy target or instance is detected, ELB stops routing traffic to it and
resumes only when the instance is healthy again.
• ELB monitors the health of its registered targets and ensures that the traffic is routed
only to healthy instances.
• ELB's are configured to accept incoming traffic by specifying one or more listeners. A
listener is a process that checks for connection requests.
• A listener is configured with a protocol and port number for connections from the client to the
ELB, and likewise from the ELB back to the target.
• ELB supports 3 types of load balancers.
o Application Load Balancers
o Network Load Balancers
o Classic Load Balancers
• Each load balancer type is configured in a different way.
• With Application and Network Load Balancers, you register targets in target groups and route
traffic to the target groups.
• With Classic Load Balancers, you register instances directly with the load balancer.
• AWS recommends users working with Application Load Balancer to use multiple
Availability Zones.
• The reason for this recommendation is that even if one Availability Zone fails, the load balancer can
continue to route traffic to the remaining ones.
• A load balancer can be either internal or internet-facing.
• The nodes of an internet-facing load balancer have Public IP addresses, and the DNS
name is publicly resolvable to the Public IP addresses of the nodes.
• Due to this, internet-facing load balancers can route requests from clients over the
Internet.
• The nodes of an internal load balancer have only Private IP addresses, and the DNS
name is publicly resolvable to the Private IP addresses of the nodes.
• Due to this, internal load balancers can only route requests from clients with access to
the VPC for the load balancer.
• Both internet-facing and internal load balancers route requests to your targets using
Private IP addresses.
• Hence your targets do not need Public IP addresses to receive requests from an internal
or an internet-facing load balancer.
Tasks:
1. Login to AWS Management Console.
2. Launch two EC2 instances, using a Bash script to install Apache httpd and publish a sample
HTML page.
3. Register them with ELB.
4. Create an application ELB with public IP.
5. Simulate a shutdown of EC2 to test by using the public DNS of the ELB.
on , this will open your AWS Management Console Account for this lab
in a new tab.If you are asked to logout in AWS Management Console page, click on here link
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
Launching EC2 Instances 1
the
on
6. Configure Instance Details:
o Auto-assign Public IP : Select Enable
o Under User data: section, Enter the following script, which creates an HTML page
served by Apache httpd web server.
#!/bin/bash
sudo su
yum update -y
yum install httpd -y
systemctl start httpd
systemctl enable httpd
echo "<html><h1>Welcome to Whizlabs Server 1</h1><html>" >>
/var/www/html/index.html
o Click on .
9. Configure Security Group : Select Create a new security group,
o Security group name: Enter MyWebserverSG
o Description : Enter My EC2 Security Group
o To add SSH,
▪ Choose Type:
▪ Source: Anywhere (From ALL IP addresses accessible).
▪ Choose Type:
▪ Source: Anywhere (From ALL IP addresses accessible).
o Click on .
10. Review and Launch : Review all your selected settings and click on the .
11. Key Pair: This step is most important. Select Create a new key Pair from the dropdown list
and enter MyWebKey
13. Click on .
14. Your instances are now launching, navigate to EC2 instance page.
1. click on the
on
4. Configure Instance Details:
o Click on .
7. Configure Security Group : Select Select an existing security group,
• Click on .
8. Review and Launch : Review all your selected settings and click on the .
9. Key Pair: This step is most important. Select Choose an Existing Key pair from the
dropdown list and choose MyWebKey from the list.
2. Click on .
the .
4. The next five screens will require configuration modification from defaults. If a field is not
mentioned, leave it as default or empty.
o Configure Load Balancer:
▪ Name: Enter MyLoadBalancer
5. Configure Routing:
o Target group: Select New Target Group
o Target group name : Enter MyTargetGroup
o Leave other settings as default.
o Under Health check settings :
▪ Path : /index.html
o Under Advanced health check settings:
▪ Healthy threshold : Enter “3”
▪ Unhealthy threshold: 2 (Default)
▪ Timeout: 5 seconds (Default)
▪ Interval: Enter “6” seconds
▪ Success codes: 200 (Default)
o Click on
6. Register Targets:
We need to register the two EC2 instances in the target group of the load balancer.
• Under Instances, Select the Two EC2 instances (MyEC2Server1, MyEC2Server2) from the
list.
• Click on
8. You can see the message Successfully created load balancer. Click on .
Introduction
AWS Elastic Load Balancer
• Elastic Load Balancer is used to manage load balance between multiple EC2
instances running in multiple availability zones on AWS cloud
• It distributes the load across the targets to which the instances are associated
• It enables us to have the increased availability of the application in multiple
availability zones
• It’s a fully managed service which can distribute the incoming traffic to the AWS
resources in different availability zones
• It monitors the health of the targets and it routes traffic accordingly to the healthy
targets
• The load balancer can accept the incoming traffic by configuring listeners with a
protocol and port number
• The target group can be configured with a protocol and port number to route the
traffic to that particular target only if the target health is healthy
• Elastic load balancer supports Scaling which can be done automatically as the
traffic to the application changes.
• Targets can be added to or removed from the load balancer without disturbing
other requests at any point in time
Lab Tasks
1. Go to Console and Manually Create another 2 EC2 instances in default
VPC but in different availability Zone
2. Open the Console and Go to EC2 dashboard and SSH into the already available
EC2 instance with its public IP(an instance will be available initially at the time of
your lab launch)
3. From the SSH session, use an AWS CLI command to configure the instance for the US
East (N. Virginia) region (us-east-1)
4. Using AWS CLI command create an Application Load Balancer
5. Using AWS CLI command Create 2 Target groups in default VPC which routes
the traffic based on the application traffic
6. Using AWS CLI command Register each EC2 instance(which you have
launched) with each Target group separately
7. Using AWS CLI command Create a default listener rule
8. Using AWS CLI commands, create another 2 rules, each rule routing the traffic
to a separate target group based on paths
9. Using AWS CLI command verify the health of the targets
10. Copy the DNS URL of the load balancer, access the URL from the browser
and verify that the routing is done according to the rules and also verify the
contents of the target group.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps
2. Click on
the .
5. In Configure Instance Details Page:
• Click on .
• Under the User data section, enter the following script, which will configure the server
as a web server
#!/bin/bash
yum install httpd24 -y
yum install -y httpd24-devel
service httpd start
chkconfig httpd on
touch /var/www/html/index.html
echo "REQUEST SERVED FROM INSTANCE1" >> /var/www/html/index.html
chmod 777 /var/www/html/index.html
mkdir -p /var/www/html/images
touch /var/www/html/images/test.html
echo "REQUEST SERVED FROM IMAGES PATH OF INSTANCE1" >> /var/www/html/images/test.html
• Now click on
6. In the Add Storage Page : No need to change anything in this page it will have a
• Click on
• Key : Name
• Value : Instance1
• Click on
8. On the Configure Security Group page:
9. Review and Launch : Review all your selected settings and click on .
10. Key Pair - This step is most important, Create a new key Pair and click
2. Click on
the .
5. In Configure Instance Details Page:
• Click on .
• Under the User data section, enter the following script, which will configure the server
as a web server
#!/bin/bash
yum install httpd24 -y
yum install -y httpd24-devel
service httpd start
chkconfig httpd on
touch /var/www/html/index.html
echo "REQUEST SERVED FROM INSTANCE2" >> /var/www/html/index.html
chmod 777 /var/www/html/index.html
mkdir -p /var/www/html/work
touch /var/www/html/work/test.html
echo "REQUEST SERVED FROM WORK PATH OF INSTANCE2" >> /var/www/html/work/test.html
• Now click on
6. In the Add Storage Page : No need to change anything in this page it will have a
• Click on
• Key : Name
• Value : Instance2
• Click on
8. On the Configure Security Group page:
• Click on Select an Existing security Group
• Select the Security group name→loadbalancer_SG(which will have inbound port
22 and 80 open for all traffic)
9. Review and Launch : Review all your selected settings and click on .
10. Key Pair - Select Choose an Existing key pair (from the dropdown) and choose
the same key which you created for Instance1.
4. Mac/Linux users: SSH into the whizlabs_instance by opening the terminal
and executing the command below
o ssh whizlabs_user@54.174.250.43
o Enter password → Whizlabs@321
5. Windows users: SSH into the whizlabs_instance by downloading PuTTY from
the link https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html and
then entering the following in the Host Name (or IP address) section
• Host Name : whizlabs_user@54.174.250.43
• Enter password : Whizlabs@321
• Port : 22
6. Once you have SSHed into the whizlabs_instance, configure the AWS CLI by executing the
command below so that you do not need to add the region to each command
• aws configure
7. Press Enter for both the AWS Access Key and AWS Secret Key, and enter us-east-1
in the Default region name field.
• AWS Access key ID : Press Enter
• AWS Secret Key : Press Enter
• Default region name : us-east-1
• Default output format : Press Enter
Creating Load Balancer
1. Go to the EC2 dashboard, select the instance named Instance1, and copy
its Subnet ID from the description page to a text pad (these values are used by the
load balancer and target group commands sketched below).
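The commands below are a hedged sketch of the load balancer, target group, registration, and default listener steps that this extract references but does not show in full; the subnet, security group, VPC, instance, and ARN values are placeholders you would replace with the IDs noted above.
# 1. Create the Application Load Balancer in two subnets (placeholder IDs)
aws elbv2 create-load-balancer --name whizlabs-LB \
  --subnets <subnet-id-1> <subnet-id-2> --security-groups <loadbalancer-sg-id>
# 2. Create two target groups in the default VPC
aws elbv2 create-target-group --name TG1 --protocol HTTP --port 80 \
  --vpc-id <default-vpc-id> --target-type instance
aws elbv2 create-target-group --name TG2 --protocol HTTP --port 80 \
  --vpc-id <default-vpc-id> --target-type instance
# 3. Register each instance with its own target group
aws elbv2 register-targets --target-group-arn <TG1-arn> --targets Id=<instance1-id>
aws elbv2 register-targets --target-group-arn <TG2-arn> --targets Id=<instance2-id>
# 4. Create the default listener that forwards to TG1; its ARN is used by the rules below
aws elbv2 create-listener --load-balancer-arn <load-balancer-arn> \
  --protocol HTTP --port 80 \
  --default-actions Type=forward,TargetGroupArn=<TG1-arn>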
2. Use the create-rule command along with the default listener ARN to create
listener rule 2, which forwards requests to TG2 when the URL being served
has work in its path. Also make sure the priority for this rule is different
from that of listener rule 1.
aws elbv2 create-rule \
  --listener-arn arn:aws:elasticloadbalancing:us-east-1:757712384777:listener/app/whizlabs-LB/c70ba276f3e59d69/12d71a05863d10c1 \
  --priority 5 \
  --conditions Field=path-pattern,Values='/work/*' \
  --actions Type=forward,TargetGroupArn=arn:aws:elasticloadbalancing:us-east-1:757712384777:targetgroup/TG2/1dd2532226541d8c
2. Similarly, use the describe-target-health command to verify the health status of the target
group TG2
aws elbv2 describe-target-health \
  --target-group-arn arn:aws:elasticloadbalancing:us-east-1:757712384777:targetgroup/TG2/1dd2532226541d8c
3. Now try to access the DNS name with images in its path as given below
and verify that the request is served from TG1 via the images path, i.e.
if the DNS URL path contains images, then the traffic is routed
to the target group TG1.
Note→ Kindly use your own Load Balancer DNS name and append
/images/test.html at the end
4. Now try to access the DNS name with work in its path as given below and
verify that the request is served from TG2 via the work path, i.e. if
the DNS URL path contains work, then the traffic is routed to
the target group TG2.
DNS Name→ whizlabs-LB-379815337.us-east-
1.elb.amazonaws.com/work/test.html
Note→ Kindly use your own Load Balancer DNS name and append
/work/test.html at the end
will be active, Now click on the button, this will open your AWS
Console Account for this lab in a new tab.If you are asked to logout in AWS
Management Console page, click on the here link and then click
on again.
Note : If you have completed one lab, make sure to signout of the aws account before
starting new lab. If you face any issues, please go through FAQs and Troubleshooting
for Labs.
Steps:
3. Click on
Note : If you do not find the AMI, check "Shared with me" under Ownership in the
left panel.
Choose
5. For the Configure details step, do the following:
o Name : Enter whizlabs
o Choose .
6. In Add Storage, No need to change anything just click
on
7. Configure Security Group:
o SSH will be added by default when you select the create a new security
group. A sample screenshot added below for your reference. Next, we
have to add rules for HTTP and HTTPS.
o For HTTP,
▪ Click on
▪ Choose Type: HTTP
▪ Source:
o For HTTPS,
▪ Click on
▪ Choose Type: HTTPS
▪ Source:
o Click on Review.
8. Key pair: We won't need to connect to instances as part of this lab. Therefore,
you can select Proceed without a key pair.
9. Now Click on
the .
• Choose .
• Select
• Now choose the launch configuration which you created in previous steps.
• Click on
4. Configure Auto Scaling group details
• Choose .
5. Configure scaling policies:
o click on .
o click on .
7. Now click on
9. Add Tags : Enter tags in key-value pair for identification of your autoscaling group.
• Click on
4. Once your instance is stopped, after 1-2 minutes you can see that, as per the Auto
Scaling group policy, your stopped instance is terminated automatically and
a new instance is launched to fulfill the policy condition. A sample screenshot is
provided below.
3. You have stopped the EC2 instance to check that an Instance is created
automatically as per the requirement.
Using CloudWatch for Resource Monitoring,
Create CloudWatch Alarms and Dashboards
Lab Details:
1. This lab walks you through the various CloudWatch features available which are used for
resource monitoring.
2. Duration: 00:45:00 Hrs
3. AWS Region: US East (N. Virginia)
Task Details
1. Create EC2 Instance.
2. Create SNS Topic. Subscribe to your Mail Id.
3. Check EC2 CPUUtilization Metrics in CloudWatch Metrics.
4. Create CloudWatch Alarm.
5. Stress CPU to trigger SNS Notification Email from CloudWatch Alarm.
6. Create a CloudWatch Dashboard and add various widgets.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
the section.
2. Make Sure you are in N.Virginia Region.
3. Click on .
4. Choose an Amazon Machine Image (AMI):
the
8. Add Tags:Click on
o Key : Name
o Value : MyEC2Server
o Click on
9. Configure Security Group:
o To add SSH,
▪ Choose Type: SSH
▪ Source: Anywhere
13. Note down the Instance-ID of the EC2 instance. A sample is shown in below screenshot.
the section.
2. Make sure you are in N.Virginia Region.
3. Click on Topics in the left panel.
4. Under Details:
o Name : MyServerMonitor
o Display name : MyServerMonitor
2. Click on .
3. Under Details:
o Protocol : Select Email
o Endpoint : Enter your <Mail Id>
o Note: Make sure you give a valid mail ID, as you will receive an SNS notification mail
to this mail ID.
4. You will receive a subscription mail to your mail id.
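The same topic and subscription can also be created from the AWS CLI; this is a hedged sketch in which the account ID and email address are placeholders.
# Create the SNS topic and subscribe an email endpoint (confirm the mail you receive)
aws sns create-topic --name MyServerMonitor
aws sns subscribe \
  --topic-arn arn:aws:sns:us-east-1:<account-id>:MyServerMonitor \
  --protocol email --notification-endpoint <your-mail-id>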
Using CloudWatch
Check CPUUtilization Metrics
the section.
2. Make sure you are in N.Virginia Region.
3. Click on Metrics in the Left Panel.
4. You should be able to see EC2 under All Metrics. If EC2 is not visible, please wait for 5-10
minutes; CloudWatch usually takes around 5-10 minutes after the creation of EC2 to start
getting metric details.
5. Click on EC2. Select Per-Instance Metrics.
6. Here you can see various metrics. Select CPUUtilization metrics to see the graph.
7. Now on top you can see CPUUtilization graph which is at zero since we have not stressed
the CPU yet.
4.
o Click on .
5. In Next Page, Configure the following details:
o Under Metrics
▪ Period : 1 Minute
o Under Conditions
▪ Threshold type : Choose Static
▪ Whenever CPUUtilization is… : Choose Greater
▪ than :30
o Click on .
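For reference, the same alarm (CPUUtilization greater than 30 over a 1-minute period, notifying the MyServerMonitor topic) can be created with the AWS CLI; the instance ID, account ID, and alarm name below are placeholders.
# Create a CloudWatch alarm on CPUUtilization for the EC2 instance
aws cloudwatch put-metric-alarm --alarm-name MyServerCPUAlarm \
  --namespace AWS/EC2 --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=<instance-id> \
  --statistic Average --period 60 --evaluation-periods 1 \
  --threshold 30 --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:us-east-1:<account-id>:MyServerMonitor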
4. Open another terminal on your local machine and SSH back into the EC2 instance - MyEC2Server.
5. Run the below command to see the CPUUtilization:
o top
6. You can now see that %Cpu(s) is 100. By running the stress command (see the sketch below) we
have manually increased the CPUUtilization of the EC2 instance.
7. After 400 seconds, %Cpu will reduce back to 0.
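The stress command referenced in step 6 is not shown in this extract; one common way to install and run it (assuming an Amazon Linux 2 instance) is sketched below, using a 400-second run to match the timing above.
# Install the stress tool from EPEL and load one CPU worker for 400 seconds
sudo amazon-linux-extras install epel -y
sudo yum install -y stress
stress --cpu 1 --timeout 400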
2. Click on .
o Dashboard name: MyEC2ServerDashboard
3.
o Click on .
4. Depending on how many times you triggered stress, you will see the graph with
Percentage details over timeline.
4. You can also add multiple Widgets to the same Dashboard by clicking
on .
Tasks:
1. Login to AWS Management Console.
2. Create your first Elastic Beanstalk Application and test.
Launching Lab Environment
1. Make sure to signout of the existing AWS Account before you start new lab session (if you
have already logged into one). Check FAQs and Troubleshooting for Labs , if you face
any issues.
on , this will open your AWS Console Account for this lab in a new tab.If
you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
1. Navigate to Elastic Beanstalk by clicking on the menu in the top, then click
on in the section.
2. Make sure you are in N.Virginia Region.
3. Once in Elastic Beanstalk, you’ll be presented with a getting started screen. All you need to
Tasks:
1. Login to AWS Management Console.
2. Create simple java application in Elastic Beanstalk.
3. Configure and add new RDS using Beanstalk environment configuration.
4. Get access to RDS database and perform database operations.
Prerequisites:
MySQL Server Setup
• Windows users need to
• Download MySQL Workbench and install.
o MySQL Workbench will be used for connecting to database and execute SQL
commands.
• Linux users need to install MySQL. Run the following command to install MySQL locally:
o brew install mysql
o Note: If you do not have brew, please install brew or use another means to install MySQL
on , this will open your AWS Console Account for this lab in a new
tab. If you are asked to logout in AWS Management Console page, click on here link and
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
Create Elastic Beanstalk Environment
1. Navigate to Elastic Beanstalk by clicking on the menu in the top, then click
on in the section.
2. Make sure you are in N.Virginia Region.
3. Once in Elastic Beanstalk, you’ll be presented with a getting started screen. Click on
8. You can also change other important configuration options for your application
o Click on .
4. Database creation shall begin. There are several processes that need to be completed for
the creation of the RDS database.
o Now click on
NOTE: We are only editing the security group of the RDS for testing purposes. With the
security group created by Elastic Beanstalk, EC2 can communicate with RDS
internally. If you need to SSH into EC2 or connect to RDS from EC2 over SSH, then you
need to attach an extra security group to EC2 (see the AWS documentation).
3. Click on
4. Enter Following Details:
o Connection Name : Enter Beanstalk Database
o Connection Method: Select Standard (TCP/IP)
o Hostname : Enter Endpoint link Example: aahge2dzlgbj5v.cdegnvsebaim.us-east-
1.rds.amazonaws.com
o Port : Enter 3306
o Username : Enter WhizlabsAdmin
o Password : Click on Store in Keychain and enter password.
▪ Password: Whizlabs123
5. Click on .
Introduction
AWS Elastic Beanstalk
1. AWS Elastic Beanstalk is a Platform as a Service(PaaS) offered by Amazon
which is an easy to use service for deploying and scaling web applications and
services with full control over the resources.
2. It allows us to use a wide selection of application platforms.
8. Once the deployment is done, simply route the traffic from the blue
environment to the green environment by swapping their CNAMEs.
9. After the routing is complete, if you face any issues in the green environment,
Elastic Beanstalk gives you an option to easily roll back to the blue
environment.
Advantages
• Zero downtime while updating the environments and swapping.
• It’s Easy to roll back to the older version if you face any issues in the new
environment.
Lab Tasks
1. Create an Elastic BeanStalk application.
3. Access the blue environmet’s URL and verify whether you get PHP application
page.
4. Create a Green environment with Node.js application.
5. Access the green environmet’s URL and verify whether you get Node.js
application page.
6. From the Green environment initiate swap environment URL.
7. Now verify whether the green environment’s URL is swapped with blue
environment’s URL.
8. Now access the new URL of green environment and verify whether you are
getting the Node.js application page.
on again.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps
3. Provide the Application Name and Description as below and Click on Create
2. Select Environment Tier page Select Web server environment and then Click
on Select
3. In the Environment information section provide the Environment
name as whizlabs-green-env (Environment name should be unique, if you face
any errors while naming then provide some other environment name).
6. The environment will take some 5-10 Mins to provision the resources. Be patient
until the environment is setup and you can see the resources being provisioned
one-by-one.
7. Once the whizlabs-green-env is setup you will get a page with its Health
status as ok and with a URL.
8. Click on the URL on your top right corner and you will be navigated to the
Node.js application page as shown below.
Swapping the URLs from Blue to Green
1. Now we have two environments namely whizlabs_blue_environment with
PHP and another environment named whizlabs_green_environment with
Node.js
2. Here the two environments are different and you are going to swap the URLs.
4. Now we are going to swap the green environment with that of blue environment
for this Choose the Environment name as whizlabs-blue-environment from
the dropdown in the Select an Environment to Swap section and then Click
on Swap.
5. The swap will take a few seconds to complete and you can see
the Successfully completed status under Recent Events.
6. Once the Swap is completed kindly note the URL of
the whizlabs_green_env will be replaced with that of
the whizlabs_blue_environment.
NOTE: After swapping the URL’s if the page contents doesn’t change i.e if it
shows the same PHP page then clear the browser cache or try to access the
URL from some other browser.
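The console swap performed above can also be done with the AWS CLI; this is a hedged sketch in which the environment names are placeholders that must match your actual blue and green environment names.
# Swap the CNAMEs of the blue and green environments
aws elasticbeanstalk swap-environment-cnames \
  --source-environment-name <blue-environment-name> \
  --destination-environment-name <green-environment-name>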
3. You have successfully accessed the blue environmet’s URL and verified that
the content of the URL is a PHP application page .
4. You have successfully created a Green environment with Node.js application.
5. You have successfully accessed the green environmet’s URL and verified that
the content of the URL is Node.js application page.
6. From the Green environment you have successfully initiated swap
environment URL.
7. You have successfully verified that the green environment’s
URL is swapped with that of blue environment’s URL.
8. You have successfully accessed the new URL of green environment and
verified that the content of the URL is Node.js application page.
Introduction to AWS DynamoDB
Lab Details:
1. This lab walks you through Amazon DynamoDB features. In this lab, we will
create a table in Amazon DynamoDB to store information and then query that
information from the DynamoDB table.
2. Duration: 00:30:00 Hrs
Introduction
What is AWS DynamoDB?
Definition:
• DynamoDB is a fast and flexible NoSQL database for applications that need
consistent single-digit-millisecond latency at any scale. It is a fully managed
database that supports both document and key-value data models.
• It has a flexible data model, which means that you don't need to define your
database schema upfront, and it has reliable performance as well.
• All of these attributes make it a good fit for mobile gaming, ad tech, IoT and
many other applications.
DynamoDB Tables:
DynamoDB tables consist of
• Item (Think of a row of data in a table).
• Attributes (Think of a column of data in a table).
• Supports key-value and document data structures.
• Key= the name of the data. Value= the data itself.
• Document can be written in JSON, HTML or XML.
Tasks:
1. Login to AWS Management Console.
2. Create a DynamoDB table.
3. Insert data on that DynamoDB table.
4. Search for an item in the DynamoDB table.
on again.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps
Create DynamoDB Table
o Add sort key : check the box, enter a name in the respective field, and select the type.
o The combination of partition key and sort key uniquely identifies each item in a
DynamoDB table.
o Click on .
4. Your table will be created within 2-3 minutes.
3. In the query window, enter the partition key and sort key that you want to search for.
o Partition Key :4
o Sort Key : Sarah
o Click on .
4. You will be able to see the result table with your filtered record. A sample screenshot is given
below:
5. You can also search using only the partition key, or filter on the sort key. Try some test cases and play
around.
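The same insert and query can be done from the AWS CLI. The sketch below assumes the partition key attribute is named id (Number) and the sort key attribute is named name (String); these names and the table name are assumptions, since they are not shown in this extract.
# Insert the item used in the query above (assumed key names id and name)
aws dynamodb put-item --table-name <your-table-name> \
  --item '{"id": {"N": "4"}, "name": {"S": "Sarah"}}'
# Query by partition key 4 and sort key Sarah (the sort key is aliased in case "name" is a reserved word)
aws dynamodb query --table-name <your-table-name> \
  --key-condition-expression "id = :pk AND #nm = :sk" \
  --expression-attribute-names '{"#nm": "name"}' \
  --expression-attribute-values '{":pk": {"N": "4"}, ":sk": {"S": "Sarah"}}'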
Introduction
Definition:
• DynamoDB is a fast and flexible NoSQL database for applications that
need consistent single-digit-millisecond latency at any scale. It is a fully
managed database that supports both document and key-value data models.
• It has a flexible data model, which means that you don't need to define
your database schema upfront, and it has reliable performance as well.
• All of these attributes make it a good fit for mobile gaming, ad tech, IoT
and many other applications.
DynamoDB Tables.
DynamoDB tables consist of
• Item (Think of a row of data in a table).
• Attributes (Think of a column of data in a table).
• Supports key-value and document data structures.
• Key= the name of the data. Value= the data itself.
• Document can be written in JSON, HTML or XML.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in the AWS Management
Steps
Create DynamoDB Table
2. Once you select the create item option, you’ll see Username, OrderId but we
need 2 more attributes in our table. So click on then
on .
5. Add the parameters ReturnDate and UserAmount. Click
6. Once the GSI is active you can check in tab the parameter
attached to it. Check the index type.
Note: It will take 5-10 minutes to be active.
7. Move to tab and click on insert data into the table.
And click on .
Note: Refresh the console once if the newly added attributes are not
displayed in the field.
• UserName : HarryPotter
OrderId: 20160630-28176
ReturnDate: 20190513
UserAmount: 88.30
• UserName : Ron
OrderId: 20170609-25875
ReturnDate: 20190628
UserAmount: 116.86
• UserName : Ron
OrderId: 20170609-4177
ReturnDate: 20190731
UserAmount: 27.89
• UserName : Voldemort
OrderId: 20170609-17146
ReturnDate: 20190511
UserAmount: 114.00
• UserName : Voldemort
OrderId: 20170609-18618
ReturnDate: 20190615
UserAmount: 122.45
9. In case you added a wrong value, you can edit with the edit option of the column.
10. Once you have added all the data in the table, please review it.
Use Global Secondary Index to Fetch Data
1. Now, with the help of the GSI, we will try to fetch data from the table while avoiding a full
scan, which leads to better performance and saves resources. We will
add filter conditions on the return date and try to fetch the data.
2. Let's try the Scan option to search the data. We use a ReturnDate
(the partition key) to check which users returned items on that date, and with the sort
key we can qualify the amount as per the requirement.
• Select the Scan option, which is in the upper left corner.
• In this example we are trying to fetch the users who returned their orders in the
month of May.
• Select the Index option which we created and add a filter condition on
ReturnDate: select data type String (because our attributes are of String data
type) and select the clause Between with the values 20190501 and 20190531.
3. Let's try the Query option to search the data. We use a ReturnDate
(the partition key) to check which users returned items on that date, and with the sort
key we can qualify the amount as per the requirement.
This global secondary index could enable use cases, such as finding all the returns
entered on various dates, that would require full table scans without the index.
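For reference, the same GSI lookups can be run from the AWS CLI; the table and index names below are placeholders, and the attribute types follow the String values used above.
# Scan the GSI for returns made in May 2019 (filter on the ReturnDate attribute)
aws dynamodb scan --table-name <your-table-name> --index-name <your-gsi-name> \
  --filter-expression "ReturnDate BETWEEN :start AND :end" \
  --expression-attribute-values '{":start": {"S": "20190501"}, ":end": {"S": "20190531"}}'
# Query the GSI for a single return date (the GSI partition key)
aws dynamodb query --table-name <your-table-name> --index-name <your-gsi-name> \
  --key-condition-expression "ReturnDate = :d" \
  --expression-attribute-values '{":d": {"S": "20190513"}}'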
2. You will practice using Amazon DynamoDB, Amazon Lambda function and S3
bucket.
3. Duration: 01:00:00 Hrs
Introduction
Amazon DynamoDB
• Amazon DynamoDB is a fully managed NoSQL database service where
maintenance, administrative burden, operations and scaling are taken care of.
• We don't need to specify upfront how much data we are going to store.
• It provides single-digit-millisecond latency even for terabytes of data, and hence it is used for
applications where very fast reads are required.
• It is used in applications like gaming where data needs to be captured and
changes take place very quickly.
Lab Tasks
1. Create an Amazon DynamoDB table.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps
Create DynamoDB Table
under section of
on .
• Your table will be created within 2-3 minutes.
4. The DynamoDB table will be ready to use when the status becomes Active. You
can verify the status of the table by navigating to the Tables menu in the DynamoDB
dashboard.
5. Download the students.csv file to your local system by clicking here. Open the students.csv file
on your local system to see the data provided. This data will be imported into the
DynamoDB table.
6. This CSV file contains the comma-separated values of students.
8. Once the File is successfully uploaded, you will be able to see the file inside the
bucket.
Console .
2. Make sure you are in N.Virginia region.
3. Click on and
• Choose one of the following options to create your function. Select Author from
Scratch
• Function Name : Enter csv_s3_dynamodb
• Runtime : Select Python 3.7 (Choose from Dropdown)
• Click on Choose or Create an execution Role and then Select Use an existing
Role
o Choose whizlabs_import_to_dynamodb_role from the dropdown menu
• Click on
4. Once the function is created, it will open the main page of Lambda function.
4. Once the lambda function is successfully executed, you will be able to see
detailed success message.
5. Navigate to DynamoDB Table whizlabs_students_table to see the imported
data.
3. Click on the Properties Tab and scroll down to Events in Advanced Settings.
4. Click on Events.
o Name : csv_upload
o All Object create events : check
o Suffix : Enter .csv
o Send to : Lambda Function
o Lambda : Select csv_s3_dynamodb
o Click on Save.
6. Now, every time a CSV file is uploaded, it will trigger the Lambda function to import the CSV data
from the S3 bucket file into the DynamoDB table.
5. You can see that CSV data has been successfully imported to DynamoDB Table.
2. You will practice using Amazon DynamoDB, Amazon Lambda function and S3
bucket.
3. Duration: 01:00:00 Hrs
Introduction
Amazon DynamoDB
• Amazon DynamoDB is a fully managed NoSQL database service where
maintenance, administrative burden, operations and scaling are taken care of.
• We don't need to specify upfront how much data we are going to store.
• It provides single-digit-millisecond latency even for terabytes of data, and hence it is used for
applications where very fast reads are required.
• It is used in applications like gaming where data needs to be captured and
changes take place very quickly.
Lab Tasks
1. Create an Amazon DynamoDB table.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps
under section of
on .
• Your table will be created within 2-3 minutes.
4. The DynamoDB table will be ready to use when the status becomes Active. You
can verify the status of the table by navigating to the Tables menu in the DynamoDB
dashboard.
8. Once the File is successfully uploaded, you will be able to see the file inside the
bucket.
Console
2. Make sure you are in N.Virginia region.
3. Click on and
• Choose one of the following options to create your function. Select Author from
Scratch
• Function Name : Enter json_s3_dynamodb
• Runtime : Select Python 3.7 (Choose from Dropdown)
• Click on Choose or Create an execution Role and then Select Use an existing
Role
o Choose whizlabs_import_json_file_to_dynamodb_role from the
dropdown menu
• Click on
4. Once the function is created, it will open the main page of Lambda function.
4. Once the lambda function is successfully executed, you will be able to see
detailed success message.
5. Navigate to DynamoDB Table whizlabs_company_table to see the imported
data.
Adding Event Triggers in Lambda for S3 Bucket
1. Navigate back to Lambda Page.
o Bucket : jsons3dynamo
o Event type : All object create events
o Suffix : Enter .json
o Click on Add.
5. Now, every time a JSON file with the extension .json is uploaded to the S3
bucket jsons3dynamo, it will trigger the json_s3_dynamodb Lambda function and
upload the data to the DynamoDB table.
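The same trigger can be wired up from the AWS CLI; this is a hedged sketch in which the Lambda ARN and account ID are placeholders, and it includes the permission S3 needs in order to invoke the function.
# Allow the jsons3dynamo bucket to invoke the Lambda function
aws lambda add-permission --function-name json_s3_dynamodb \
  --statement-id AllowS3Invoke --action lambda:InvokeFunction \
  --principal s3.amazonaws.com --source-arn arn:aws:s3:::jsons3dynamo
# Notify the Lambda function for every new .json object in the bucket
aws s3api put-bucket-notification-configuration --bucket jsons3dynamo \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:<account-id>:function:json_s3_dynamodb",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".json"}]}}
    }]
  }'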
Task Details
1. Create EC2 Instance.
2. Create SNS Topic. Subscribe to your Mail Id.
3. Create CloudWatch Event Rule.
4. Event : Stop and start the EC2 server to simulate SNS Notification Email from
CloudWatch Event.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
Launching an EC2 Instance
This EC2 Instance will be used for checking Various features in CloudWatch.
the section.
2. Make Sure you are in N.Virginia Region.
3. Click on .
4. Choose an Amazon Machine Image (AMI):
the
8. Add Tags:Click on
o Key : Name
o Value : MyEC2Server
o Click on
9. Configure Security Group:
o To add SSH,
▪ Choose Type: SSH
▪ Source: Anywhere
the section.
2. Make sure you are in N.Virginia Region.
3. Click on Topics in the left panel.
4. Under Details:
o Name : MyServerMonitor
o Display name : MyServerMonitor
2. Click on .
3. Under Details:
o Protocol : Select Email
o Endpoint : Enter your <Mail Id>
o Note: Make sure you give a valid mail ID, as you will receive an SNS notification mail
to this mail ID.
4. You will receive a subscription mail to your mail id.
▪ Click on .
▪ Select SNS Topic from the target dropdown
▪ Topic : MyServerMonitor
o Click on .
4. In Step 2: Configure rule details Page, Under Rule definition,
o Name : MyEC2StateChangeEvent
o Description : MyEC2StateChangeEvent
o State : check(default)
o Click on .
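For reference, the same rule and SNS target can be created with the AWS CLI; the account ID is a placeholder, and the event pattern matches EC2 instance state-change notifications.
# Create the rule that fires on EC2 instance state changes
aws events put-rule --name MyEC2StateChangeEvent \
  --event-pattern '{"source": ["aws.ec2"], "detail-type": ["EC2 Instance State-change Notification"]}'
# Point the rule at the MyServerMonitor SNS topic
aws events put-targets --rule MyEC2StateChangeEvent \
  --targets Id=1,Arn=arn:aws:sns:us-east-1:<account-id>:MyServerMonitor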
Tasks:
1. Login to AWS Management Console.
2. Create an EC2 instance.
3. Create an Amazon RDS instance.
4. Create a connection to the Amazon RDS database on EC2 instance.
5. Create a Database and Add new tables and data to Database to test.
on , this will open your AWS ConsoleAccount for this lab in a new tab. If
you are asked to logout in AWS Management Console page, click on here link and then
click on again.
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Lab Steps:
Launch EC2 Instance
1. Click on
2. Choose an Amazon Machine Image
(AMI):
the .
4. In Configure Instance Details Page:
o Network : Select default available VPC
o Subnet : Default selected
o Auto-assign Public IP : Enable - It should be enabled as public IP is needed for
connecting to EC2 via SSH.
o Click on
o Key : Name
o Value : MyPublicServer
o Click on
▪ Choose Type:
▪ Source: Custom(Allow specific IP address) or Anywhere (From ALL IP
addresses accessible).
o For HTTP,
▪ Click on
▪ Choose Type: HTTP
▪ Click on “ ”
▪ Choose Type: HTTPS
8. Review and Launch : Review all your selected settings and click on .
9. Key Pair - This step is most important, Create a new key Pair and click
2. Click .
o Note: Make sure Only enable options eligible for RDS Free Usage
Tier (at the bottom of the page) is checked for this lab to work. Otherwise, some
configurations which are not part of the free tier will not work and you will face issues.
o Click .
o Note: Make sure you note down all the details you entered. DB Instance Identifier,
Username, Password etc.. They will be used while connecting from EC2.
• Under Configure advanced settings, In the Network Security section, configure the
following:
o Virtual Private Cloud (VPC) : Select same default VPC which was available while
creating EC2
o Subnet Group : default
o Public accessibility : No
o VPC security groups : Create new VPC security group
o Leave other parameters as default.
• Under Database Options,
o Database name : Enter a database name - myrdsdatabase
o Leave other parameters as default.
• In the Backup section,
o For Backup retention period, select 0 days
o Leave other parameters as default.
• Enable deletion protection : uncheck
• Leave other parameters as default.
•
o Select the PublicEC2_SG.
o Click on .
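If you prefer the CLI, a broadly equivalent free-tier MySQL instance can be created with the command below; this is a hedged sketch in which the security group ID and password are placeholders and the other values mirror the settings chosen above.
# Create a small MySQL RDS instance with no public access and no backups
aws rds create-db-instance --db-instance-identifier mydbinstance \
  --engine mysql --db-instance-class db.t2.micro --allocated-storage 20 \
  --master-username rdsuser --master-user-password <your-password> \
  --db-name myrdsdatabase --backup-retention-period 0 \
  --no-publicly-accessible --vpc-security-group-ids <rds-sg-id>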
SSH into EC2 and Connect to Your Database
1. SSH into EC2 instance. For more details go through SSH into EC2 instance from Mac or
Windows systems.
2. Once connected to the server:
o Change to root user: sudo su
o Install MySQL : yum install mysql
3. Connect to MySQL RDS Instance with following command
o Syntax: mysql -h <<mysql-instance-dns>> -P 3306 -u <<username>>-p
o In our Case: mysql -h mydbinstance.cdegnvsebaim.us-east-1.rds.amazonaws.com -
P 3306 -u rdsuser -p
4. Provide the password which was created during RDS instance creation.
5. You will enter the MYSQL Command line.
6. Let's create a simple database and table to see how it works.
o Create a Database
▪ CREATE DATABASE SchoolDB;
o You can see the created database with following command
▪ show databases;
o Switch the database SchoolDB.
▪ use SchoolDB;
o Create a sample Table of Subjects.
▪ CREATE TABLE IF NOT EXISTS subjects (
subject_id INT AUTO_INCREMENT,
subject_name VARCHAR(255) NOT NULL,
teacher VARCHAR(255),
start_date DATE,
lesson TEXT,
PRIMARY KEY (subject_id)
) ENGINE=INNODB;
•
o Enter show tables; to see the table created.
o Insert some details into the table
▪ INSERT INTO subjects(subject_name, teacher) VALUES ('English',
'John Taylor');
▪ INSERT INTO subjects(subject_name, teacher) VALUES ('Science',
'Mary Smith');
▪ INSERT INTO subjects(subject_name, teacher) VALUES ('Maths', 'Ted
Miller');
▪ INSERT INTO subjects(subject_name, teacher) VALUES ('Arts', 'Suzan
Carpenter');
o Let's check the items added in the Table
▪ select * from subjects;
•
o Try out some more SQL commands and play around to understand more.
o Once completed, run exit; to come out of the MySQL client.
• You have successfully completed the lab.
• Once you have completed the steps click on End Lab from your whizlabs dashboard.
Tasks:
1. Login to AWS Management Console.
2. Create two S3 buckets. One for the source and One for the destination.
3. Create a Lambda function to copy the object from one bucket to another bucket.
4. Test the Lambda Function.
on , this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
Create Two Amazon S3 Buckets
the section.
2. Create the 2 Amazon S3 Buckets
3. Create Source Bucket
o Click on .
o Bucket Name : mysourcebucket12345
▪ Note: Note that every S3 bucket name is unique globally. So create bucket
with available name.
o Region : US East (N. Virginia)
o Click on .
o Bucket Name : mydestinationbucket12345
▪ Note: Note that every S3 bucket name is unique globally. So create bucket
with available name.
o Region : US East (N. Virginia)
7. Now we have two S3 buckets(Source and Destination). We will make use of AWS Lambda
function to copy the content from source bucket to destination bucket.
2. Go to and select .
4. Click on the tab and copy paste the below policy statement in the editor:
o Policy JSON:
{
"Version":"2012-10-17",
"Statement":[
{
"Effect":"Allow",
"Action":[
"s3:GetObject"
],
"Resource":[
"arn:aws:s3:::mysourcebucket12345/*"
]
},
{
"Effect":"Allow",
"Action":[
"s3:PutObject"
],
"Resource":[
"arn:aws:s3:::mydestinationbucket12345/*"
]
}
]
}
• Edit only the source and destination bucket ARNs based on the buckets created by you. Make sure
you have /* after the ARN.
• Click on .
• In Create Policy Page:
o Policy Name : mypolicy.
o Click on .
o Filter Policies: Now you can see a list of policies, search for your policy by name.
Search for the name mypolicy created by you.
o Click on the
• Role Name:
o Role name : myrole
1. Go to menu, click on
2. Make sure you are in US East (N. Virginia) region.
o Choose .
o Function name : mylambdafunction
o Runtime : Select NodeJs
o Role : In the permissions section, click on
the to choose use an existing role.
o Existing role : Select myrole
o Click on
4. Configuration Page: Here we need to configure our lambda function.
5. If you scroll down a little bit, you can see the Function code section. Here we need to write
a Node.js function which copies the object from the source bucket and pastes it into the
destination bucket.
6. Remove the existing code in AWS lambda index.js. Copy the below code and paste it into
your lambda index.js file.
var AWS = require("aws-sdk");
exports.handler = (event, context, callback) => {
var s3 = new AWS.S3();
var sourceBucket = "mysourcebucket12345";
var destinationBucket = "mydestinationbucket12345";
var objectKey = event.Records[0].s3.object.key;
var copySource = encodeURI(sourceBucket + "/" + objectKey);
var copyParams = { Bucket: destinationBucket, CopySource: copySource, Key:
objectKey };
s3.copyObject(copyParams, function(err, data) {
if (err) {
console.log(err, err.stack);
} else {
console.log("S3 object copy successful.");
}
});
};
7. You need to change the source and destination bucket name in the code based on your
bucket names in index.js lambda function code.
o Click on the .
3. Click on if needed.
Test Lambda function
1. If you have an image on your local system, you can use that image for testing; otherwise, download
the image below to your computer: Download Me
Tasks:
1. Login to AWS Management Console.
2. Create IAM Policy and IAM Role.
3. Create a lambda function.
4. Configure test event.
5. Trigger the lambda function manually using test event.
6. Test the new EC2 instance launched.
on , this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Note : If you have completed one lab, make sure to signout of the aws
account before starting new lab. If you face any issues, please go
through FAQs and Troubleshooting for Labs.
Steps:
2. Go to and select .
4. Click on the tab and copy paste the below policy statement in the editor:
o Policy JSON:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:Describe*",
                "ec2:CreateKeyPair",
                "ec2:CreateSecurityGroup",
                "ec2:AuthorizeSecurityGroupIngress",
                "ec2:AuthorizeSecurityGroupEgress",
                "ec2:CreateTags",
                "ec2:DescribeTags",
                "ec2:RunInstances"
            ],
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "ec2:Region": "us-east-1"
                }
            }
        }
    ]
}
• Click on .
• In Create Policy Page:
o Policy Name : mypolicy.
o Click on .
o Filter Policies: Now you can see a list of policies, search for your policy by name.
Search for the name mypolicy created by you.
▪ Click on the .
o Role Name:
▪ Role name : myrole
1. Go to menu, click on .
2. Make sure you are in US East (N. Virginia) region.
o Choose .
o Function name : myEC2LambdaFunction
o Runtime : Select Python 3.6
o Role : In the permissions section, click on
the to choose use an existing role.
o Existing role : Select myrole
o Click on .
4. Configuration Page: Here we need to configure our lambda function. If you scroll down you
can see the Function code section. Here we need to write a Python code which will
provision an EC2 instance.
5. You will be using boto3 SDK for AWS to write the python code.
6. Remove the existing code in AWS lambda lambda_function.py. Copy the below code and
paste it into your lambda lambda_function.py file.
o Note: Explaining the python code is beyond the scope of this lab. It is simple boto3
python code which will provision EC2 instance on triggering.
import json
import boto3
import time
from botocore.exceptions import ClientError
def lambda_handler(event, context):
    # The rest of the code was truncated in this guide; the lines below are a minimal
    # sketch (with a placeholder AMI ID) that launches a single t2.micro instance.
    ec2 = boto3.client('ec2', region_name='us-east-1')
    response = ec2.run_instances(ImageId='<ami-id>', InstanceType='t2.micro', MinCount=1, MaxCount=1)
    return {'statusCode': 200, 'body': json.dumps(response['Instances'][0]['InstanceId'])}
o Click on .
3. Lambda function now gets executed and EC2 instance will be provisioned.
4. Once it's completed, you will be seeing a success message as shown below. It will display
the details such as
o Duration : Lambda execution time.
o Log Output: it contains details of EC2 instance provisioned.
o etc...
Check the EC2 instance launched
1. Navigate to EC2 page from services menu.
2. Go to Instances in left menu.
3. You can see the EC2 instance that has been provisioned by the Lambda function.
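Instead of the console test event, the function can also be invoked (and the result checked) from the AWS CLI, as in the hedged sketch below.
# Invoke the Lambda function and write its response to a local file
aws lambda invoke --function-name myEC2LambdaFunction response.json
# List the IDs of instances that are pending or running
aws ec2 describe-instances \
  --filters "Name=instance-state-name,Values=pending,running" \
  --query "Reservations[].Instances[].InstanceId"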
Introduction
Amazon DynamoDB
• Amazon DynamoDB is a fully managed NoSQL database service where
maintenance, administrative burden, operations and scaling are taken care of.
• We don't need to specify upfront how much data we are going to store.
• It provides single-digit-millisecond latency even for terabytes of data, and hence it is used
for applications where very fast reads are required.
• It is used in applications like gaming where data needs to be captured and
changes take place very quickly.
Lab Tasks
1. In this lab we are going to launch an Amazon DynamoDB table.
6. While changes take place in the DynamoDB table, DynamoDB
Streams will trigger the Lambda function, which will push the data to the S3 bucket
as a text file.
7. Download and Verify the contents of the S3 bucket.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
section of .
• Primary Key : id and Click the dropdown and choose String and click on .
4. The DynamoDB table will be ready to use when the status becomes Active. You
can verify the status of the table by navigating to the Tables menu in the DynamoDB
dashboard.
Creating Items and Inserting Data into DynamoDB Table
1. Now you need to Create Item and then insert data into the table which you
have created.
2. Navigate and Select the DynamoDB Table (whizlabs_dynamodb_table) which
you have created in the DynamoDB Dashboard.
3. Once you have selected the DynamoDB table, the screen you are working on
will split into two; in the right-hand pane, click on the tab and then
click on .
4. The primary key field (id) which you entered will be there, and you need to
create three other fields, such as firstname, lastname, and age. This can be
done by clicking the icon, choosing Append, choosing the
field type String from the dropdown, entering the appropriate values, and
then clicking on
5. Similarly, create another two or three items in the table with the same
fields and their corresponding values.
6. Finally you can verify the values for the appropriate fields from the DynamoDB
dashboard.
Creating Lambda Function
Console .
2. Make sure you are in N.Virginia region.
3. Click on and
• Choose one of the following options to create your function. Select Author from
Scratch
• Function Name : Enter whizlabs_dynamodb_function
• Runtime : Select Python 3.7 (Choose from Dropdown)
• Click on Choose or Create an execution Role and then Select Use an existing
Role
o Choose whizlabs_dynamodb_role from the dropdown menu
• Click on
4. Once the function is created, click the function (whizlabs_dynamodb_function)
which you have created from the Lambda dashboard, and then in the Function
Code section make sure you have the following details:
• Runtime : Python 3.7
• Code Entry Type: Code entry type
• Handler : lambda_function.lambda_handler
5. Now remove the existing codes in the function code environment window and
copy the below function code to your system notepad.
6. Download whizlabs_dynamodb_function.py. Copy and paste the code in
the Function Code Environment window and save the function
as lambda_function.py
7. Make sure to provide the correct dynamodb table name
ie whizlabs_dynamodb_table, if you are creating dynamodb table with some
other name make sure to provide the correct table name in the lambda function
code.
8. Navigate to the S3 page and copy the name of the new S3 bucket created for
this lab, which will be in the format whizlabs22222222. You will have a similar S3
bucket with different numerals.
9. Navigate back to the Lambda function. Change the S3 bucket name in the
Lambda code, i.e. whizlabs22222222, to your S3 bucket name.
10. After updating the code, scroll down to the Basic settings, change
the Timeout value to 1 min, leave the other values as default, and then
4. On the right side of screen click on and select the Triggers from the
dropdown.
o Click .
6. The DynamoDB Stream trigger will be ready once the State of the Trigger
is Enabled
5. Once the changes are made, go to the Triggers tab and press the refresh
button. Now DynamoDB Streams will trigger the Lambda function to dump
the items of the table into a text file named data.txt (it will take a minute to do this)
and it will upload the file to the S3 bucket whizlabs2222222.
6. Go to All Services → S3 → whizlabs2222222/data.txt. Navigate to your
bucket, enter the bucket, select the data.txt file, and click on Download.
Note: A bucket named whizlabs* will be present in the account you are
working in; * will be a 10-digit number.
7. Open the data.txt file; the contents of the text file will be in JSON format. Check
for the changes made.
9. Repeat the procedure of updating and adding new items to the table to see the
new changes.
Completion and Conclusion
• You have successfully used AWS management console to launch an Amazon
DynamoDB Table
• You have successfully Created Lambda function
• You have successfully inserted contents into the DynamoDb Table
• You have successfully verified whether the DynamoDB Streams has triggered the
Lambda function to dump the contents of the DynamoDB Table into S3 Bucket
AWS Lambda Versioning and alias from the CLI
Lab Details
1. This lab walks you through creating a Lambda function and creating versions
and aliases for your Lambda function in the CLI from an EC2 instance.
2. Duration: 01:00:00 Hours
Introduction
Lambda
1. AWS Lambda service allows you to run code without provisioning or managing
dedicated servers.
2. In other words, Lambda will be called Serverless Computing.
3. The interesting feature about lambda is you only need to pay for the compute
time you consume and no need to pay when your code is not running.
4. you can run code for virtually any type of application with zero administration with
the help of AWS Lambda functions.
5. Just upload your code to Lambda and it will take care of everything required to
run and scale your code with high availability
6. We can set triggering events for our lambda function when to run or when to
get triggered.
7. Lambda currently supports various languages such as java, python, node js, c,
etc using which you can write your lambda function.
3. The new version created is a copy of the unpublished version of the function.
4. Lambda allows us to change the function code and settings only on the unpublished version of a function.
5. Each version of your Lambda function has its own ARN.
6. Once a version is published, its code and most of its settings are locked to ensure a consistent experience for users of that version; you cannot edit or modify the code of that version.
7. A Lambda alias acts as a pointer to a specific Lambda function version.
8. AWS allows us to create one or more aliases for a particular Lambda function.
9. Each alias has its own unique ARN, like versions, and points to a specific version; one alias cannot point to another alias.
10. You can update an alias to point to a different version of the function (a short boto3 sketch of these operations follows this list).
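As a quick illustration of versions and aliases (the lab itself uses the AWS CLI from the EC2 instance), the same operations can be performed with boto3. The function name and version numbers below are placeholders.

import boto3

client = boto3.client("lambda")
FUNCTION = "lambdaclidemo"  # placeholder function name

# Publish the current unpublished ($LATEST) code as an immutable, numbered version.
version = client.publish_version(FunctionName=FUNCTION)["Version"]

# Create an alias that points to that version; each alias gets its own ARN.
client.create_alias(FunctionName=FUNCTION, Name="PROD", FunctionVersion=version)

# Later, repoint the alias to a newer version without changing the alias ARN.
client.update_alias(FunctionName=FUNCTION, Name="PROD", FunctionVersion="2")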
2. Creating the Lambda function and Alias from EC2 Server in CLI.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
Creating IAM Role
1. Click on and select IAM under
the section.
chose
5. Click on .
• Key : Enter Name
• Value : Enter Lambdaversion_Role
• Click on .
6. In Create Role Page,
• Click on .
7. You have successfully created the role to create lambda function.
8. Make a note of the Role ARN by clicking on the created IAM role as shown above; it will be used when creating the Lambda function in the CLI from the EC2 instance.
• ARN : arn:aws:iam::757712384777:role/Lambdaversion_Role
• Public IP : 3.84.84.40
3. Mac/Linux users can open a terminal and then execute the given command.
Windows users can follow step 4.
• ssh lambda_user@3.84.84.40
• Enter password : Whizlabs@321
4. Windows user can download putty from the
link https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html and then enter
the following command in the Host Name ( or IP address ) section
• Host Name : lambda_user@3.84.84.40
• Enter password : Whizlabs@321
• Port : 22
3. Now download the file s3bucket.py to your local system. It contains code for creating the S3 bucket whizlam17 and uploading a file to the S3 bucket stating that the file was uploaded by the latest version.
4. Open the s3bucket.py file from your local system using the preferred application
and then copy the text content.
5. Navigate to the S3 dashboard and note down the name of the bucket that starts with whizlabs. Here the bucket name is whizlabs79017537.
6. Navigate to the server and create a file named s3bucket.py using below
command
• vi s3bucket.py
7. Paste the content, replacing the bucket name whizlam17 with the one noted in the previous step, and save it by pressing shift+colon followed by wq! and then Enter to save your s3bucket.py file.
8. Create a Zip file of the s3bucket.py file which is used to create lambda function in
CLI using below command
• zip s3bucket.zip s3bucket.py
9. Create a lambda function from CLI using the following command
• aws lambda create-function --function-name lambdaclidemo --runtime
python3.7 --zip-file fileb://s3bucket.zip --handler s3bucket.handler --role
arn:aws:iam::757712384777:role/Lambdaversion_Role
o Function name : lambdaclidemo
o Runtime : python3.7
o Handler : s3bucket.handler
o Role
ARN : arn:aws:iam::757712384777:role/Lambdaversion_Role
• You can find the details of the created lambda function in CLI as shown in
the above screenshot with $LATEST version
• Now navigate to Lambda dashboard in AWS console to view the current
versions of the function, choose a function, --> Qualifiers.--> Versions tab
and the Versions panel will display the list of versions for the selected
function.
10. If you haven't published a version of the selected function, the Versions panel lists only the $LATEST version, as shown.
2. To invoke the Lambda function from the command line you can run the command below; it will invoke the $LATEST version of the function.
• aws lambda invoke --function-name lambdaclidemo --invocation-type
RequestResponse outputfile.txt
2. In the AWS console you can find the newly published version of our lambdaclidemo function as version 1. Navigate to the Lambda dashboard, choose the function, then go to Qualifiers --> Versions tab, and the Versions panel will display the list of versions.
3. Let us change the content of file and the name of the file and upload it to s3.
• First navigate to EC2 CLI and open the file using vi editor using below
command
o vi s3bucket.py
• Change the content to File uploaded by version 2 and file name
as version2.txt as shown below
• Save the file by pressing shift+colon followed by wq! and pressing enter.
• Now remove the existing zip file s3bucket.zip and create new zip file with
updated codes using below commands
o rm -f s3bucket.zip
o zip s3bucket.zip s3bucket.py
• You can update the new code for your lambda function using below
command
o aws lambda update-function-code --function-name
lambdaclidemo --zip-file fileb://s3bucket.zip
• Now invoke the $LATEST function with the updated codes
o aws lambda invoke --function-name lambdaclidemo --invocation-
type RequestResponse outputfile.txt
• From the AWS console, in the Lambda dashboard, click on the function name → Qualifiers → Versions; you can see that version 1 uses the file name version1.txt and the latest version uses version2.txt.
• You can also confirm this by navigating to the S3 console and opening the S3 bucket named whizlam17, which contains the files version1.txt and version2.txt, since we invoked the Lambda function twice with two different contents.
3. To delete an alias you can run the delete command as shown below. Let us delete the alias named DEV.
• aws lambda delete-alias --function-name lambdaclidemo --name DEV
• Once deleted, navigate to the Lambda dashboard and refresh; you can see that the DEV alias has been removed for function version 1 and only the PROD alias remains.
2. You have created a lambda function and published it to create a new version.
Tasks:
1. Login to AWS Management Console.
2. Create a new CloudFormation Stack using JSON file provided in S3 bucket.
3. Test the Environment created by CloudFormation Stack.
on ?, this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
1. Navigate to menu in the top, click on in the section.
2. Make sure you are in N.Virginia Region.
3. You can see a bucket present with a name similar to whizlabs90553761. In your case the name of the bucket might have different numerals.
the section.
o Click on .
3. Specify stack details (a boto3 sketch that creates the same stack programmatically follows these steps):
o Stack name: Enter a unique stack name - MyFirstCFStack
o Parameters
▪ DBName : Enter a database name - MyDatabase.
▪ DBPassword : Enter a database password - whizlabsdb123.
▪ DBRootPassword : Enter database root password - whizlabsdbroot123
▪ DBUser : Enter the database username - WhizlabsDBUser.
▪ InstanceType : Select t2.micro
▪ KeyName : Select the key from the list name whizlabs-key
▪ SSH Location : Enter 0.0.0.0/0
▪ Click on .
4. Configure stack options :
o Tags
▪ Key : Name
▪ Value : MyCF
o Permissions: No need to select anything for this lab; leave it blank.
o Leave all other configuration fields as default.
o Click on .
9. Click on the refresh button beside New events available to see the updates.
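Before moving on to testing, note that the same stack can also be created programmatically. The sketch below uses boto3 and assumes the parameter keys match the lab's JSON template (DBName, DBPassword, DBRootPassword, DBUser, InstanceType, KeyName, SSHLocation); the template URL is a placeholder for the object URL of the template in your whizlabs bucket.

import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")

cf.create_stack(
    StackName="MyFirstCFStack",
    TemplateURL="https://<your-whizlabs-bucket>.s3.amazonaws.com/<template>.json",  # placeholder
    Parameters=[
        {"ParameterKey": "DBName", "ParameterValue": "MyDatabase"},
        {"ParameterKey": "DBPassword", "ParameterValue": "whizlabsdb123"},
        {"ParameterKey": "DBRootPassword", "ParameterValue": "whizlabsdbroot123"},
        {"ParameterKey": "DBUser", "ParameterValue": "WhizlabsDBUser"},
        {"ParameterKey": "InstanceType", "ParameterValue": "t2.micro"},
        {"ParameterKey": "KeyName", "ParameterValue": "whizlabs-key"},
        {"ParameterKey": "SSHLocation", "ParameterValue": "0.0.0.0/0"},
    ],
    Tags=[{"Key": "Name", "Value": "MyCF"}],
)

# Block until the stack reaches CREATE_COMPLETE.
cf.get_waiter("stack_create_complete").wait(StackName="MyFirstCFStack")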
Testing
1. Navigate to tab and you will be able to see the URL as mentioned below. Click
on the URL. This will take you to your server's home page.
o http://ec2-18-212-56-170.compute-1.amazonaws.com/
2. If you see the PHP info page and your database connection message, you have completed a LAMP server setup with AWS CloudFormation. A sample screenshot is provided below:
Tasks:
1. Login to AWS Management Console.
2. Go through the Cloudformation template to understand all the terminologies.
3. Create a new CloudFormation Stack using JSON file provided in S3 bucket.
4. Test the Environment created by CloudFormation Stack.
on , this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Understand the Cloudformation Template
1. Navigate to menu in the top, click on in the section.
2. You can see a bucket present with a name similar to whizlabs44010075. In your case the name of the bucket might have different numerals.
the section.
2. Make sure you are in N.Virginia Region.
o Click on .
4. Specify stack Details :
o Stack name: Enter a unique stack name - MyEC2CFStack
o As you can see below details are already autoloaded. These details are loaded from
the Lab_AWS_EC2_Provisioning_Using_CF.template.json.
o Parameters
▪ HTTPLocation : 0.0.0.0/0
▪ ICMPLocation : 0.0.0.0/0
▪ InstanceType : t2.micro
▪ KeyName : whizlabs-key
▪ SSHLocation : 0.0.0.0/0
▪ You can update the details if you want to or leave them as it is.
▪ Click on .
5. Configure stack options :
o Tags
▪ Key : Name
▪ Value : MyEC2CF
o Permissions: No need to select anything for this lab; leave it blank.
o Leave all other configuration fields as default.
o Click on .
10. Click on the refresh button beside New events available to see the updates.
Lab Details:
1. This lab walks you through how to create a VPC using AWS CloudFormation
Stack. In this lab we will launch an AWS CloudFormation template to create a
four-subnet Amazon VPC that spans two Availability Zones and a NAT that
allows servers in the private subnets to communicate with the Internet in order to
download packages and updates.
2. Duration: 00:55:00 Hrs
Tasks:
1. Login to AWS Management Console.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Console page, click on the here link and then click on again.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Lab Steps:
on .
o You can see the bucket name starting with whizlabs and numeric digits
like whizlab1234564543.
o Open that bucket and click on object name VPC_template.json.
o Now copy the Object URL to the clipboard for use in CloudFormation
template.
3. Then Click on .
Select .
5. Choose
6. Click on .
Note: You need to wait till 5-10 min to complete the stack resource creation.
on .
Note: You need to wait till 5-10 min to complete the stack resource creation.
Tasks:
1. Login to AWS Management Console.
3. Create a VPC.
4. Create a Subnet.
8. Edit the route table and add all traffic routes to the internet gateway.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps:
Create an IAM Role
1. Make sure you are in the N.Virginia Region.
the section.
3. Select from the left side panel and click on the to create
a new IAM Role.
4. Under Create Role section
o Choose the service that will use this role: Select and then click
on as shown.
5. Type EC2fullaccess in the search bar and then choose
6. Click on .
o Key : Enter Name
o Value : Enter VPC-CLI-lab
o Click on .
7. In Create Role Page,
o Click on .
8. You have successfully created the role VPC-cli-lab.
Launching an EC2 Instance
1. Make sure you are in the N.Virginia Region.
on in the section.
on
4. Choose an Amazon Machine Image
(AMI):
the
6. Configure Instance Details:
o Select the IAM role which we created above from the list.
7. Click on
8. Add Storage: No need to change anything in this step, click
on
o Key : Name
o Value : MyEC2Instance
o Click on
10. Configure Security Group:
o Assign a security group: Select
o Security Group Name: Enter MyEC2-SG
o Description: Enter SSH into EC2 instance
o To add SSH,
▪ Choose Type:
14. In the tab, Copy the IPv4 Public IP Address of the EC2
instance ‘MyEC2Instance’
SSH into EC2 Instance
o Please follow the steps in SSH into EC2 Instance.
2.
o Output of this command is as shown in below:
o Note: Please note down the VPC Id from the output and keep it in a text editor.
o Note: Please note down the Internet gateway id in your text editor.
o Note: Please note down your new route table id in your text editor.
Create a public route in the Route table that points to the Internet gateway using AWS CLI
5. This command will create a subnet with CIDR block 10.1.1.0/24 in the VPC
created above.
o aws ec2 create-route --route-table-id rtb-c1c8faa6 --destination-cidr-block
0.0.0.0/0 --gateway-id igw-1ff7a07b --region us-east-1
Note: Please replace the route table id and internet gateway id with yours.
o Note: Please note down your new route table id in your text editor
Note: Please replace the route table id and subnet id with yours.
4. You can go to the Internet gateway page and see that it is attached.
5. Go to the Route table page, click on Routes, and you will be able to see the new public route (a boto3 sketch of the full VPC setup follows these steps).
8. You have successfully created a Route table and added a public route using
AWS CLI.
9. You have successfully associated the Subnet to route table using AWS CLI.
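For reference, the same VPC setup can be scripted with boto3 instead of individual CLI commands. This is only a sketch: the subnet CIDR matches the lab (10.1.1.0/24), while the VPC CIDR is an assumption for illustration.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the VPC and an Internet gateway, then attach them.
vpc_id = ec2.create_vpc(CidrBlock="10.1.0.0/16")["Vpc"]["VpcId"]  # assumed VPC CIDR
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

# Create the subnet used in the lab.
subnet_id = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.1.1.0/24")["Subnet"]["SubnetId"]

# Create a route table, add a public route to the Internet gateway and associate the subnet.
rtb_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=rtb_id, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=rtb_id, SubnetId=subnet_id)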
Introduction
Before going to nested stacks we need to be familiar with a few concepts such as CloudFormation, stacks and templates.
CloudFormation
1. CloudFormation is a service provided by AWS for designing our own infrastructure using code, i.e. CloudFormation provides us with IaC (Infrastructure as Code).
2. Currently, CloudFormation supports two languages, JSON and YAML. You can write your code in either of these languages.
3. CloudFormation comes with great features: you can update your infrastructure whenever you want and also delete the stack in case you don't need it.
4. A fascinating feature of CloudFormation is that it saves time when building infrastructure and helps you focus on development.
5. It is also possible to replicate our infrastructure within a minimal time period.
6. It eliminates human error and works exactly according to the code you have written. It consists of two main components, namely stacks and templates.
Template
1. A CloudFormation template is a YAML or JSON formatted text file that describes our infrastructure.
2. It consists of various sections.
Stack
1. A stack consists of a collection of resources.
3. The advantage of a stack is that it is easy to create, delete or update the collection of resources as a single unit.
4. More advanced setups use nested stacks, which are collections of stacks.
Nested Stack
1. As the name suggests, a nested stack consists of one or more stacks that reference each other.
2. As your infrastructure keeps growing, there may be places where you need to use a particular template a number of times.
3. In such cases, we isolate the common template and reference it from other templates wherever it is needed, forming a nested stack.
4. In other words, a nested stack can itself consist of one or more nested stacks, forming a hierarchy of stacks.
5. A nested stack has a parent stack with one or more child stacks (a short sketch of launching such a parent stack follows).
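To make the parent/child relationship concrete, the sketch below shows how a parent stack nests other stacks with the AWS::CloudFormation::Stack resource type and launches them with boto3. The child template URLs and the output name LoadBalancerName are hypothetical placeholders; the lab's actual Nested_LB.yaml and Nested_ASG.yaml may use different names.

import json
import boto3

LB_TEMPLATE_URL = "https://<bucket>.s3.amazonaws.com/Nested_LB.yaml"    # placeholder
ASG_TEMPLATE_URL = "https://<bucket>.s3.amazonaws.com/Nested_ASG.yaml"  # placeholder

# The parent (root) template declares each child stack as an AWS::CloudFormation::Stack resource.
parent_template = {
    "Resources": {
        "LBStack": {
            "Type": "AWS::CloudFormation::Stack",
            "Properties": {"TemplateURL": LB_TEMPLATE_URL},
        },
        "ASGStack": {
            "Type": "AWS::CloudFormation::Stack",
            "Properties": {
                "TemplateURL": ASG_TEMPLATE_URL,
                # An output of one child can be passed as a parameter to another child.
                "Parameters": {
                    "LoadBalancerName": {"Fn::GetAtt": ["LBStack", "Outputs.LoadBalancerName"]}
                },
            },
        },
    }
}

cf = boto3.client("cloudformation", region_name="us-east-1")
cf.create_stack(StackName="nested-stack-demo", TemplateBody=json.dumps(parent_template))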
Lab Tasks
1. Login to AWS Management Console.
2. Go through the Cloudformation template to understand all the terminologies.
3. Create the nested stack using the YAML file provided in the S3 bucket.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Case Study
In this lab, we are going to see an example of a nested stack by creating an Autoscaling group stack and a Load balancer stack, and attaching the Load balancer stack to the Autoscaling stack using a nested stack.
Steps
Understand the Cloudformation Template
the section.
2. You can see a bucket present with a name similar to whizlabs44010075. In your case, the name of the bucket might have different numerals.
Template for Autoscaling group
1. Open that bucket and select on the Nested_ASG.yaml file.
2. Nested_ASG.yaml file contains the YAML code for creating Autoscaling Group.
3. Download and open the Nested_ASG.yaml file. Go through the YAML code
provided for creating the Autoscaling group
• S3 File URL : https://whizlabs44010075.s3.amazonaws.com/Nested_ASG.yaml
4. You will be able to see the YAML code used for creating the Autoscaling group, together with the launch configuration and the security group attached to the launch configuration.
5. Below are some important details provided in the Cloudformation template for
creating the Autoscaling group.
• Parameters
o InstanceType: It is a WebServer EC2 instance type. It must be a valid EC2
instance type.
o KeyName: Name of an existing EC2 KeyPair to enable SSH access to the
instance. It must be the name of an existing EC2 KeyPair
o AMIid: It is the Id of an image present in the Northern Virginia region used
to launch your web server.
o LoadBalancerName: Name of the load balancer to which you have to
attach the Autoscaling group.
o User data: To install HTTPD service at the time of launching the instance and
putting a test page to check the working of the load balancer.
o SSHLocation: The IP address range that can be used to SSH to the EC2
instances. It must be a valid IP CIDR range of form x.x.x.x/x.
• Resources
o WebserverASG: Resource name for creating the Autoscaling group.
o LaunchConfig: launch configuration resource defined for the Autoscaling group..
o WebsecGroup: security group for the launch configuration.
2. Nested_LB.yaml file contains the YAML code for creating the Load balancer.
3. Download and open the Nested_LB.yaml file. Go through the YAML code
provided for creating the Load balancer.
• S3 File URL :
https://whizlabs44010075.s3.amazonaws.com/Nested_LB.yaml
4. Below are some important details provided in the Cloudformation template for
creating the Load Balancer.
• Resources
o ElasticLoadBalancer: Resource name for creating the load balancer.
o Elbsg: Resource name for creating a security group for the load
balancer.
o Outputs: Exposes the name of the load balancer so that it can be referenced by the Autoscaling group.
This template is used for creating the nested stack from the above two stacks, Nested_ASG.yaml and Nested_LB.yaml. Here we are attaching the Autoscaling group to the load balancer.
1. Open that bucket and select on the Nested_stack.yaml file
6. Finally, copy the S3 URL (Object URL) of the file Nested_stack.yaml, which we will use to create the nested stack in the steps below. To copy the S3 URL, click on the Nested_stack.yaml file and copy the URL as shown in the screenshot below.
• Your link will be similar to the one given below
o https://whizlabs-cloudformation-nested-
stack.s3.amazonaws.com/Nested_stack.yaml
Creating a web server with Autoscaling group and Load balancer
using Cloudformation Nested stack
the section.
2. Make sure you are in N.Virginia Region.
o Click on .
4. Specify stack Details :
• Tags:
• Click on
7. Once you click on the create button, you will be redirected to the CloudFormation
stack list. A sample screenshot is provided below.
8. Status: You can see its status CREATE_IN_PROGRESS.
9. You need to wait for 1-5 minutes to complete the stack resource creation.
10. Click on the refresh button beside New events available to see the
updates.
4. You will find the Autoscaling group created by cloudformation nested stack as
shown.
3. You will find the Launch configuration created by nested stack as shown below
• Name : LaunchConfig
• Key name : whizlabs-key
• InstanceType : t2.micro
• Security Group : WEBSERVER_SG
Checking for EC2 instance
3. You can find the Instance running created by nested stack as shown
4. You can find that EC2 instances are launched and running, since we set the minimum size of our Autoscaling group to 2.
• Closely note the timestamp of the server created. Refresh the URL a couple of times and you will get a response from the other server, created at a different timestamp.
5. Thus the above screenshots show that the load balancer routes traffic across two servers launched at different times.
6. We have successfully created web servers, an Autoscaling group and a Load balancer, and routed the traffic using a nested stack in CloudFormation.
Introduction
Amazon CloudFormation
• A complex application that requires multiple AWS resources can be managed by a single service called AWS CloudFormation. Managing multiple AWS resources by hand is often more time consuming than the time spent on developing the applications.
• The AWS CloudFormation service enables us to design the infrastructure and set up AWS resources so that they can be managed with less manual intervention in an orderly and predictable manner.
• It is a tool used to design and implement your applications quickly.
• The description of the infrastructure is called a template, which can be written as a JSON or YAML file.
• Templates can be created using the AWS CloudFormation Designer.
• Templates can also be created manually; the languages in which templates are written are JSON and YAML.
• Templates, once created, can be reused to replicate the design in multiple environments.
• The set of resources provisioned by a template is called a stack.
• A stack is updatable and can be modified at a later point in time (it can be used to extend the AWS resources too).
• CloudFormation will automatically configure and provision the resources based on the template, and it will automatically take care of handling the dependencies between the resources.
• AWS CloudFormation enables us to manage the complete infrastructure through a text file.
• If any errors occur during the execution of the template, CloudFormation will roll back and delete the resources it provisioned.
Amazon Lambda
• AWS Lambda is a serverless compute service, which can be thought of as an automated alternative to managing EC2 servers yourself.
• It works without you managing any servers and allows us to execute code for any type of application.
• The developer doesn't have to worry about which AWS resources to launch or the steps needed to manage those resources.
• The tasks are configured as code, implemented in Lambda, and performed when the function is executed.
• Provisioning and managing the compute resources are both taken care of by the Lambda service.
• The languages AWS Lambda supports include Node.js, Python, C#, Java and Go.
• Rather than deploying a full application, it lets us execute background tasks.
• It allows us to run code in response to events from other AWS services.
• Automatic scaling is done by AWS Lambda based on the size of the workload.
• Lambda code is executed by triggers that it receives from AWS resources.
• The cost of AWS Lambda is very low, as it depends on the amount of time the code runs; it is billed per 100 ms of execution time and per number of invocations.
• The Lambda function execution time ranges from 100 ms up to a configurable timeout (5 minutes at the time this lab was written; the current limit is 15 minutes).
• It offers memory configurations varying from 128 MB to 3 GB of memory.
Lab Tasks
1. In this lab we will use two CloudFormation templates to create two CloudFormation stacks, one template for the S3 stack and another for the EC2 stack.
2. Each CloudFormation stack, when launched, will create a Lambda function.
3. By triggering the Lambda functions created by CloudFormation, we will create an S3 bucket and an EC2 instance.
4. Trigger the Lambda function by configuring test events in the AWS Lambda service individually for each Lambda function, and then test the configured test event of the Lambda function.
5. Finally, navigate to the AWS S3 and AWS EC2 services and verify that the S3 bucket and the EC2 instance were created after testing the Lambda functions.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps
Cloudformation Template
4. Below are a few important details provided in the CloudFormation template for creating the S3 stack (a minimal boto3 sketch using these fields follows this list).
Resources:
• Whizs3bucket→ Resource name for creating the S3 stack
• Type→ The resource type the template uses; here it is the Lambda service
• Code→ Contains the location at which the Lambda code is present
o S3 Bucket→ Contains the name of the bucket where the Lambda code resides
o S3 Key→ Contains the name of the Lambda function package, which is a zip file
• Role→ Contains the ARN of the role used to provision the required stack
• Timeout→ The timeout value in seconds
• Handler→ Name of the handler
• RunTime→ Name of the runtime (e.g. Python, along with its version)
• Memory size→ Memory size in MB
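As a minimal illustration of these fields (not the lab's exact template), the sketch below builds a template with a single AWS::Lambda::Function resource and launches it with boto3. The bucket name, role ARN, handler and runtime are placeholders you would replace with the values noted from your lab account.

import json
import boto3

template = {
    "Resources": {
        "Whizs3bucket": {
            "Type": "AWS::Lambda::Function",
            "Properties": {
                "Code": {"S3Bucket": "whizlabs37958732", "S3Key": "lambda_function.zip"},  # placeholders
                "Role": "arn:aws:iam::<account-id>:role/whizlabs_cloudformation_lambda_role",  # placeholder
                "Handler": "lambda_function.lambda_handler",  # assumed handler name
                "Runtime": "python3.8",                       # assumed runtime
                "Timeout": 60,
                "MemorySize": 128,
            },
        }
    }
}

cf = boto3.client("cloudformation", region_name="us-east-1")
cf.create_stack(StackName="whizlabs-s3-stack", TemplateBody=json.dumps(template))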
Editing the s3_bucket.json template
1. Navigate to IAM Services→ Roles→ Locate the IAM role named whizlabs_cloudformation_lambda_role, click on the role, copy the ARN of the role and paste it into a notepad (this role has the privileges to provision the S3 stack).
2. Similarly, navigate to AWS S3 services→ Locate the bucket whose name starts with whizlabs followed by numerals, then copy the name of the bucket and paste it into a notepad (in my case it is whizlabs37958732). Make sure the bucket has the file named lambda_function.zip.
3. Open the s3_bucket.json file using your preferred application and then replace the S3 bucket name and Role ARN with the ones you copied into the notepad.
4. Save the s3_bucket.json file and close it (don't change the file name).
4. Below are a few important details provided in the CloudFormation template for creating the EC2 stack.
Resources:
• Whizec2instance→ Resource name for creating the EC2 stack
• Type→ The resource type the template uses; here it is the Lambda service
• Code→ Contains the location at which the Lambda code is present
o S3 Bucket→ Contains the name of the bucket where the Lambda code resides
o S3 Key→ Contains the name of the Lambda function package, which is a zip file
• Role→ Contains the ARN of the role used to provision the required stack
• Timeout→ The timeout value in seconds
• Handler→ Name of the handler
• RunTime→ Name of the runtime (e.g. Python, along with its version)
• Memory size→ Memory size in MB
Editing the ec2_instance.json template
1. Navigate to IAM Services→ Roles→ Locate the IAM role named whizlabs_cloudformation_lambda_role, click on the role, copy the ARN of the role and paste it into a notepad (this role has the privileges to provision the EC2 stack).
2. Similarly, navigate to AWS S3 services→ Locate the bucket whose name starts with whizlabs followed by numerals, then copy the name of the bucket and paste it into a notepad (in my case it is whizlabs37958732). Make sure the bucket has the file named ec2_function.zip.
3. Open the ec2_instance.json file using your preferred application and then replace the S3 bucket name and Role ARN with the ones you copied into the notepad.
4. Save the ec2_instance.json file and close it (don't change the file name).
Creating S3 Stack and testing the Lambda function
1. Make sure to choose the N.Virginia region in the AWS Management Console, which is shown in the top right corner.
2. Navigate and click on CloudFormation which will be available
under section of
on
• In the Specify stack details page, provide the Stack name as whizlabs-s3-stack.
5. In the Configure Stack options page don't change or enter anything, just click on
7. Once you click on Create Stack your Stack (whizlabs-s3-stack) will start
creating the lambda function. Initially the stack status will
be CREATE_IN_PROGRESS and the stack will be created once its status
is CREATE_COMPLETE.
8. Now Go to All Services→ Compute→ Lambda. In the Lambda Dashboard.
Click on Functions and then locate the function with the name whizlabs-s3-
stack
11. In the Configure Test page, provide the Event name as test1 and then click
on .
12. Now in the Lambda functions dashboard, Click on Test and wait for
the execution to complete and its result to Succeed.
13. If the execution result is Failed, click on Details and find the reason for the failure. If it is due to conflicting conditional operations, scroll down to the Function code section and change the bucket name. Save the function and then click on Test.
14. Now go to All Services→Storage→ S3 and check for the bucket name which you provided in the function code. In my case it is k6-bucket.
Creating EC2 Stack and testing the Lambda function
1. Navigate and click on which will be available
under section of
2. In the Cloudformation dashboard Click on Create Stack→ With new
resources(standard)
3. Now Click on Create Stack button present in top left corner, then click
on
• In the Specify stack details page, provide the Stack name as whizlabs-ec2-
on
9. In the Configure Test page provide the Event name as test2 and then click
on
10. Now in the Lambda functions dashboard, Click on Test and wait for the
execution to complete and its result to Succeed.
11. Now Go to All Services→Compute→ EC2. In the EC2 dashboard in
the Running Instances, Check for an instance created.
Task Details:
1. Create Aurora Database Instance.
2. Connecting to Amazon Aurora MySQL RDS Database on a DB Instance.
3. Connecting from a local Linux/macOS/Windows machine
4. Execute Database Operations
Prerequisites:
MySQL Server Setup
• Windows users need to download and install MySQL Workbench
o MySQL Workbench will be used for connecting to database and execute SQL commands.
• Linux and macOS users need to install mysql. Run the following command to install mysql on your local machine:
o brew install mysql
o Note: If you do not have brew, please install brew or use another means to install MySQL
on ?, this will open your AWS Management Console Account for this lab
in a new tab. If you are asked to logout in AWS Management Console page, click
Steps:
Create RDS Database Instance
the section.
2. Make sure you are in N.Virginia Region.
16. Navigate to .
17. On the RDS console, the details for new DB instance appear. The DB instance has a status
of creating until the DB instance is ready to use. When the state changes to Available, you
can connect to the DB instance. It can take up to 5 minutes before the new instance status
becomes Available.
3. Depending on whether your local system runs Linux, macOS or Windows, follow the steps below
3. Click on
o Enter Following Details:
▪ Connection Name : Enter Amazon Aurora
▪ Connection Method : Select Standard (TCP/IP)
▪ Hostname : Enter myauroracluster.cluster-cdegnvsebaim.us-east-
1.rds.amazonaws.com
▪ Port : 3306
▪ Username : Enter root
▪ Password : Click on Store in Keychain and enter a password.
▪ Password: Whizlabs123
4.
5. Click on
on , this will open your AWS Console Account for this lab in a new
tab. If you are asked to logout in AWS Management Console page, click on here link and
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps:
Launching an instance
1. Launch your lab environment by clicking on the button.
active, Now click on the button, this will open your AWS
Console Account for this lab in a new tab.
4. Click on
5. Choose an Amazon Machine Image
(AMI):
Note: There are 2 Amazon Linux AMIs. Make sure you select Amazon Linux 2 AMI
the
7. Configure Instance Details: No need to change anything in this step, just go to the next
step by clicking
8. Add Storage: No need to change anything in this step, just go to the next step by
clicking
9. Add Tags: No need to change anything in this step, just go to next step Configure Security
Group by clicking on
10. Configure Security Group:
o To add SSH,
▪ Choose Type:
▪ Source: Custom(Allow specific IP address) or Anywhere (From ALL IP
addresses accessible).
o For HTTP,
▪ Click on
▪ Choose Type: HTTP
▪ Click on “ ”
▪ Choose Type: HTTPS
11. Review and Launch- Review all your select settings and click on .
12. Key Pair - This step is most important, Create a new key Pair and click
13. Launch Status: Your instance is now launching, Click on the instance ID and wait for
14. Note down the sample IPv4 Public IP Address of the EC2 instance. A sample is shown in
below screenshot.
SSH into EC2 Instance
1. To SSH, please follow the steps in SSH into EC2 Instance.
Run a Test page in browser
1. To ensure that all the softwares are up to date, run below command:
o sudo yum update -y
2. Next step is to get the latest versions of MariaDB(a community-developed fork of MySQL)
and PHP. Run the following commands to install them both.
o sudo amazon-linux-extras install -y lamp-mariadb10.2-php7.2 php7.2
3. Now let's install the Apache server and MariaDB.
o sudo yum install -y httpd mariadb-server
4. Let's start the Apache server
o sudo systemctl start httpd
5. We can also make the apache server start automatically every time we boot the instance
with following command
o sudo systemctl enable httpd
o Test whether its enabled or not with below command
▪ sudo systemctl is-enabled httpd
6. Now it's time to test whether the sample test page of the Apache server is running or not.
o Copy your public IPv4 address, enter it in your browser and hit Enter. If you see the test page below, the Apache server has been successfully installed.
o If the test page does not open, then something went wrong while installing and starting the Apache server. Please check the above steps carefully and try again.
10. Let's delete the phpinfo.php file. We only used it for testing; for security reasons, these details should not be available on the internet.
o rm /var/www/html/phpinfo.php
10. Once you login with root credentials. It will look like shown below:
Install WordPress
1. Come back to Terminal, SSH back into the instance if exited.
2. Lets download and unzip the WordPress installation package, to download the latest
WordPress installation package with the wget command, use the following command which
will always download the latest release.
o wget https://wordpress.org/latest.tar.gz
3. Unzip and unarchive the installation package. The installation folder is unzipped to a folder
called wordpress.
o tar -xzf latest.tar.gz
4. Let's create a database user and a database for the WordPress installation.
5. The WordPress installation needs to store information, such as blog posts and user comments, in a database. The following procedure helps us to create the blog's database and a user that is authorized to read and save information to it.
6. Start the database server to make sure MySQL is running.
o sudo systemctl start mariadb
7. Log in to the database server as the root user. Enter the database root password when
prompted.
o mysql -u root -p
8. We will create a user and password for the MySQL database. WordPress installation uses
these values to communicate with the MySQL database. Enter the following command by
changing to a unique user name and password.
o CREATE USER 'whizlabs-wordpress-user'@'localhost' IDENTIFIED BY
'some_strong_password';
o Note: Make sure to note down the username and password as it will be used in the
future.
9. Let's create a database. Make sure you give the database a descriptive, meaningful name, such as wordpress-db.
o CREATE DATABASE `my-wordpress-db`;
o Note: The punctuation marks surrounding the database name in the command above are called backticks. The backtick (`) key is usually located above the Tab key on a standard keyboard.
o Make sure to note down the database name.
o You can see the created database in phpMyAdmin
▪ http://3.87.51.36/phpMyAdmin/index.php
10. We have to grant full privileges for the database to the WordPress user that you created
earlier.
o GRANT ALL PRIVILEGES ON `my-wordpress-db`.* TO "whizlabs-wordpress-
user"@"localhost";
11. We have to flush the database privileges to pick up all of your changes.
o FLUSH PRIVILEGES;
12. Now let's exit the mysql client.
o exit
Note: There are multiple AllowOverride lines in this file; be sure you change the line in
the <Directory "/var/www/html"> section.
4. Save the file and exit your text editor. Press ctrl+o and enter to save. Press ctrl+x to exit.
5. Next we will provide file permissions for the Apache web server. Apply the following group
memberships and permissions (as described in Setting up permissions and LAMP
server).
6. We have to grant file ownership of /var/www and its contents to the apache user:
o sudo chown -R apache /var/www
7. Next grant group ownership of /var/www and its contents to the apache group:
o sudo chgrp -R apache /var/www
8. We have to change the directory permissions of /var/www and its subdirectories to add
group write permissions and to set the group ID on future subdirectories.
o sudo chmod 2775 /var/www
o find /var/www -type d -exec sudo chmod 2775 {} \;
9. Recursively change the file permissions of /var/www and its subdirectories to add group
write permissions.
o find /var/www -type f -exec sudo chmod 0664 {} \;
10. Restart the Apache web server to pick up the new group and permissions.
o sudo systemctl restart httpd
8. Enter the username as John Doe and the login password which you have set.
Introduction
• A mobile company is holding a flash sale for their new model, which has great features at the best price. A huge number of buyers are expected to place their orders. The company is holding limited stock for a limited period, so it's important to track which order arrived first. Your flash sale receives a huge response, and only the buyers who place their orders first will receive the product; the remaining users get to try in the next sale. Once the requests are received, they are sent to a FIFO queue before they are processed.
• Let's understand how messages get in and out of the queue. Assume the consumer asks for a batch of up to 10 messages (the maximum for a single receive call); AWS SQS starts filling the batch with the oldest message (REQ A1). SQS keeps filling the batch until it is full. In our case, assume the batch contains only three requests and the queue is now empty. Once the message batch has left the queue, SQS considers the batch to be in flight until the batch is processed completely and deleted, or the visibility timeout expires.
When you have a single consumer, this is easy to process. The consumer gets a batch of messages, does its processing and deletes the messages. The consumer is then ready to take up the next batch of messages (a minimal boto3 receive-and-delete sketch follows this note). You can also add an Auto Scaling group to scale your processing power depending upon the requirement.
Note: SQS won't release the next batch of messages until the first batch has been deleted.
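The consumer side of this flow can be sketched with boto3 as shown below; the queue URL is a placeholder for the URL shown in your queue's details section.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/<account-id>/MyWhizQueue.fifo"  # placeholder

# Receive a batch of messages; they stay "in flight" until deleted or until the
# visibility timeout expires.
resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10)

for msg in resp.get("Messages", []):
    print("processing:", msg["Body"])
    # Deleting the message tells SQS it was processed successfully.
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])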
Tasks
1. Labs on types of Queues.
2. What is Long Polling in SQS & Configure long polling for a queue.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
3. Give a name to your queue; in this example we are giving the queue name as MyWhizQueue.fifo.
Note: The name of a FIFO queue must end with the .fifo suffix.
4. Standard is selected by default. Choose FIFO. In case you want to create a Standard queue, select the Quick-Create Queue option. In this example we are creating a FIFO queue first with all default options, so click on Quick-Create Queue.
5. Once you click on Quick-Create Queue, a FIFO queue gets created as shown in the image below.
6. We'll also create a Standard queue, with all the default options. The only difference is that we don't provide the suffix .fifo while creating the queue. Please find the image below.
The Queue Type column helps you distinguish standard queues from FIFO
queues at a glance. For a FIFO queue, the Content-Based
Deduplication column displays whether you have enabled exactly once
processing.
7. The details section provides all the important parameters, including the ARN, name and URL of the queue.
Let's try to make changes for Long Polling in our existing queue
1. Select any queue (standard or FIFO) from the list and click on Configure to
make changes in our queue. We’ll be selecting FIFO queue as an example.
2. Once you have selected Configure Queue, update the Receive Message Wait Time parameter. It can be any value between 0 and 20 seconds. In our example we have changed it to 10 seconds; this makes long polling come into effect. If we keep the value at 0 (the default), it is considered short polling. Once you have made the change, click on Save changes (a boto3 sketch of this setting follows).
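The same setting can be applied with boto3; a non-zero ReceiveMessageWaitTimeSeconds (up to 20) enables long polling for the queue. The queue URL below is a placeholder.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/<account-id>/MyWhizQueue.fifo"  # placeholder

# Wait up to 10 seconds for messages to arrive instead of returning immediately.
sqs.set_queue_attributes(QueueUrl=QUEUE_URL,
                         Attributes={"ReceiveMessageWaitTimeSeconds": "10"})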
6. Let's try to make this work in our existing queue by configuring the visibility timeout. Select any queue (standard or FIFO) from the list and click on Configure to make changes to the queue. We'll be selecting the FIFO queue as an example.
7. Once you have selected Configure, update the Default Visibility Timeout parameter. It can be any value between 0 seconds and 12 hours. In our example we have changed it to 5 minutes. Once you have made the change, click on Save changes at the bottom to bring the changes into effect.
4. Once you have selected Configure, update the Delivery Delay parameter. It can be any value between 0 seconds and 15 minutes. In our example we have changed it to 60 seconds. Once you have made the change, click on Save changes at the bottom to bring the changes into effect.
Purge Queue & Delete Queue
1. In this topic we'll purge the queue, but let's first understand what happens when we purge a queue.
• The Purge Queue option allows us to delete the messages in the queue.
• The message deletion process can take up to 60 seconds, depending upon the size of the queue.
• Note: Once you call the Purge Queues action, messages cannot be retrieved from the queue.
2. Let's try to make this change to our existing queue. Click on Queue Option and select Purge Queues.
3. Once you select the purge queues option it will ask for a confirmation. Click Yes and purge the queue.
4. Similarly, we can delete the queue once it has fulfilled our requirement or is no longer used. Click on Queue Option and select Delete Queue.
5. Once you select the delete queue option it will ask for a confirmation. Click Yes and delete the queue.
SQS points to remember
• The basic difference between a delay queue and the visibility timeout is that a delay queue hides a message when it is first added to the queue, whereas the visibility timeout hides a message only after that message has been retrieved from the queue.
• In flight: a message that has been received by a consumer but not yet deleted is considered in flight.
• A maximum of 120,000 messages can be in flight for a standard queue.
• The maximum message size is 256 KB.
• Messages have a default retention period of 4 days (configurable up to 14 days).
• SQS is pull based, not push based.
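For completeness, the visibility timeout, delivery delay, purge and delete operations described above can also be performed with boto3; the queue URL is a placeholder.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/<account-id>/MyWhizQueue.fifo"  # placeholder

# Visibility timeout of 5 minutes and delivery delay of 60 seconds, as in the steps above.
sqs.set_queue_attributes(QueueUrl=QUEUE_URL,
                         Attributes={"VisibilityTimeout": "300", "DelaySeconds": "60"})

# Purge removes all messages from the queue; they cannot be retrieved afterwards.
sqs.purge_queue(QueueUrl=QUEUE_URL)

# Delete the queue itself once it is no longer needed.
sqs.delete_queue(QueueUrl=QUEUE_URL)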
4. What is Long Polling in SQS & Configure long polling for a queue.
Lab Tasks
1. Login to AWS Management Console.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
4. You can choose to have users sign in with an email address, phone number,
username or preferred username plus their password.
5. Here we choose Email address or Phone number, where Users can use an
email address or phone number as their username to sign up and sign in. Here,
choose Allow email addresses.
6. We can choose the Standard Attributes, which will be required while performing
a sign up. Here, we choose Email, Name, Preferred Username, Phone Number
which are required to perform a signup.
7. We can also customize our attributes that are required while signup by
clicking Add another attribute.
8. Click on
Policies
1. We give the Minimum Password Strength and can add the required
parameters like numbers, lowercase, uppercase and special characters. Here,
we select all the parameters.
2. You can choose to only allow administrators to create users or allow users
to sign themselves up.
3. We choose the allow users to sign themselves up where the users can sign up
themselves without administrator interference.
4. You can choose how long a temporary password set by an administrator remains valid if it is not used. This applies to accounts created by administrators, i.e. if you chose to only allow administrators to create users. Here, we leave the option as it is, since we did not select that setting.
5. Click on
5. Click on
Message Customizations
1. You can send emails from an SES verified identity. Before you can send an email
using Amazon SES, you must verify each identity that you're going to use as a
From, Source, Sender, or Return-Path address to prove that you own it. For now,
we leave it as blank.
2. Amazon SES Configuration: Cognito will send emails through your Amazon
SES configuration. Select Yes if you require higher daily email limits otherwise
select No. Here, we select No - Use Cognito(Default).
3. Verification Type: You can choose to send a code or a clickable link and
customize the message to verify email addresses. We keep it default as code.
4. User Invitation messages: We can customize SMS message, Email subject and
Email message as how you want the text to be delivered to the user.
5. Click on
Tags:
1. You can create new tags by entering tag keys and tag values.
2. Click on
Devices
• We can choose to remember our User’s devices. Here, we choose No and click
on
App Client
1. The app clients that we add will be given a unique ID and an optional secret key
to access this user pool. We are not using any App Client here, so we proceed to
the
Customize Workflows
1. You can make advanced customizations with AWS Lambda functions. Pick AWS
Lambda functions to trigger with different events if you want to customize
workflows and user experience.
2. You can go through all the Events. We skip this and proceed to
Review:
• Review all the settings and click on Create Pool as shown below.
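For reference, a user pool with similar settings can also be created with boto3. This is only a sketch: the pool name is a placeholder and the options mirror the console choices described above (sign in with email or phone number, a strict password policy, self sign-up allowed).

import boto3

cognito = boto3.client("cognito-idp", region_name="us-east-1")

cognito.create_user_pool(
    PoolName="whizlabs-user-pool",                 # placeholder name
    UsernameAttributes=["email", "phone_number"],  # sign in with email address or phone number
    AutoVerifiedAttributes=["email"],              # verify email addresses with a code
    Policies={"PasswordPolicy": {
        "MinimumLength": 8,
        "RequireUppercase": True,
        "RequireLowercase": True,
        "RequireNumbers": True,
        "RequireSymbols": True,
    }},
    AdminCreateUserConfig={"AllowAdminCreateUserOnly": False},  # allow users to sign themselves up
)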
• Navigate to Cognito, click on Users and groups to navigate to the Users page
as shown below.
3. You have learnt how to do settings for Policies, MFA and Verifications.
API Gateway - Creating Resources and Methods
Lab Details
1. This lab walks you through the steps to Create Resources and Methods in API
Gateway.
2. You will practice using Amazon API Gateway.
Introduction
Amazon API Gateway
• Amazon API Gateway is a fully managed service that makes it easy for
developers to create, publish, maintain, monitor, and secure APIs at any scale.
• APIs act as the front door for applications to access data, business logic, or
functionality from your backend services.
• API Gateway handles all the tasks involved in accepting and processing up to
hundreds of thousands of concurrent API calls, including traffic management,
CORS support, authorization and access control, throttling, monitoring, and API
version management.
• Using API Gateway, you can create RESTful APIs and WebSocket APIs that
enable real-time two-way communication applications. API Gateway supports
containerized and serverless workloads, as well as web applications.
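As a rough illustration of what the console steps in this lab do behind the scenes, the boto3 sketch below creates a REST API, adds a /whizlabs resource and attaches a GET method to it. The API name is a placeholder.

import boto3

apigw = boto3.client("apigateway", region_name="us-east-1")

# Create the REST API and look up its root ("/") resource id.
api = apigw.create_rest_api(name="whizlabs-api")  # placeholder name
root_id = apigw.get_resources(restApiId=api["id"])["items"][0]["id"]

# Create the /whizlabs resource and an open GET method on it.
resource = apigw.create_resource(restApiId=api["id"], parentId=root_id, pathPart="whizlabs")
apigw.put_method(restApiId=api["id"], resourceId=resource["id"],
                 httpMethod="GET", authorizationType="NONE")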
Lab Tasks
1. Login to AWS Management Console.
2. Choose an API.
3. Create a new API.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps
Create an API
the section.
3. Then choose Create new API and, under Settings, choose an API name.
2. Select in actions.
• Resource Name: Enter whizlabs
Introduction
Amazon API Gateway
• Amazon API Gateway is a fully managed service that makes it easy for
developers to create, publish, maintain, monitor, and secure APIs at any scale.
• APIs act as the front door for applications to access data, business logic, or
functionality from your backend services.
• API Gateway handles all the tasks involved in accepting and processing up to
hundreds of thousands of concurrent API calls, including traffic management,
CORS support, authorization and access control, throttling, monitoring, and API
version management.
• Using API Gateway, you can create RESTful APIs and WebSocket APIs that
enable real-time two-way communication applications. API Gateway supports
containerized and serverless workloads, as well as web applications.
• AWS Lambda lets you run code without provisioning or managing servers. You
pay only for the compute time you consume.
• With Lambda, you can run code for virtually any type of application or backend
service - all with zero administration. Just upload your code and Lambda takes
care of everything required to run and scale your code with high availability. You
can set up your code to automatically trigger from other AWS services or call it
directly from any web or mobile app.
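One common way to wire such a GET method to a Lambda function is a proxy integration; the boto3 sketch below assumes an existing REST API, resource and function, all of which are placeholders here.

import boto3

apigw = boto3.client("apigateway", region_name="us-east-1")
lam = boto3.client("lambda", region_name="us-east-1")

API_ID, RESOURCE_ID = "<rest-api-id>", "<resource-id>"  # placeholders
LAMBDA_ARN = "arn:aws:lambda:us-east-1:<account-id>:function:<function-name>"  # placeholder

# Point the GET method at the Lambda function using a proxy integration.
apigw.put_integration(
    restApiId=API_ID, resourceId=RESOURCE_ID, httpMethod="GET",
    type="AWS_PROXY", integrationHttpMethod="POST",
    uri="arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/"
        + LAMBDA_ARN + "/invocations",
)

# Allow API Gateway to invoke the function, then deploy the API to a stage.
lam.add_permission(FunctionName="<function-name>", StatementId="apigw-invoke",
                   Action="lambda:InvokeFunction", Principal="apigateway.amazonaws.com")
apigw.create_deployment(restApiId=API_ID, stageName="dev")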
Lab Tasks
1. Login to AWS Management Console.
5. Create a Method.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
the section.
4. Once the Lambda Function created successfully it will display like below.
Create an API
the section.
3. Then choose Create new API and, under Settings, choose an API name.
Creating a Resource
Creating Method
1. Once you have created the Resource, click on Actions and select , then select GET from the drop-down list.
1 then click on
o Note: If any pop up arises ignore it.
5. Then Click on
6. Once the API deployment is successful, navigate to Stages. You will be able to see the following.
7. Copy and paste the Invoke URL followed by your Resource name in a new tab to make the first GET request.
8. Now you will get the GET response from the API as shown below.
9. You have now successfully completed the lab Build API Gateway with Lambda Integration.
3. You will practice mounting the EFS to both instances by logging into your instances using SSH authentication.
4. You will practice the file sharing that happens between the two instances.
Tasks:
1. Login to AWS Management Console.
active. Click on , this will open your AWS Console Account for
this lab in a new tab.
Note: If you have completed one lab, make sure to sign out of the AWS account before starting a new lab. If you face any issues, please go through FAQs and Troubleshooting for Labs.
Steps
the section.
3. Click on
(AMI):
the
6. Configure Instance Details:
on
o Click on
9. Configure Security Group:
o Security Group Name: Enter EFS-SG
o To add SSH,
▪ Choose Type:
▪ Click on
▪ Choose Type: NFS
11. Key Pair : This step is most important, Create a new key Pair and click
13. Note down the sample IPv4 Public IP Addresses of the EC2 instances.
the section.
2. Click on Create File System.
3. Configure Network Access:
• VPC
o An Amazon EFS file system is accessed by EC2 instances running inside one of
your VPCs.
o Choose the VPC selected while launching the EC2 instance. In this case leave it
as Default.
• Mount Targets
o Instances connect to a file system by using a network interface called a mount
target. Each mount target has an IP address, which we assign automatically or
you can specify.
o We will select all the Availability Zones(AZ’s), so that EC2 instances across
your VPC can access the file system.
o Select all the Availability Zones and in the Security Groups cancel default and
select EFS-SG, created earlier.
o Click on
o Add tags:
▪ Key : Enter Name
▪ Value : Enter MyFirstEFS
o Enable Lifecycle Management : Choose None
o Choose throughput mode : Choose Bursting
o Choose Performance mode: Choose General Purpose
o Enable encryption : Leave it as default
o Click on
6. Review and Create: Review the configuration below before proceeding to create
your file system.
7. Click on
9. Scroll the page down. You can see the Mount target state is Creating. Wait for the status to become Available.
10. Now open the EC2 page in a separate tab (a boto3 sketch of this file-system setup follows this step).
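For reference, the file system and its mount targets can also be created with boto3; this is a sketch, with the security group id as a placeholder for the EFS-SG group created earlier.

import boto3

efs = boto3.client("efs", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# General Purpose, bursting file system (CreationToken is any unique string).
fs = efs.create_file_system(CreationToken="my-first-efs",
                            PerformanceMode="generalPurpose",
                            ThroughputMode="bursting",
                            Tags=[{"Key": "Name", "Value": "MyFirstEFS"}])

# One mount target per Availability Zone: one per subnet, secured with the EFS-SG group.
for subnet in ec2.describe_subnets()["Subnets"]:
    efs.create_mount_target(FileSystemId=fs["FileSystemId"],
                            SubnetId=subnet["SubnetId"],
                            SecurityGroups=["<EFS-SG security group id>"])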
o sudo -s
4. Now run the updates using the following command:
o yum -y update
5. Install the NFS client.
o mkdir efs
7. Let us mount our file system in this directory. To do so, navigate to EFS and copy
the DNS Name in the file system.
o mount -t nfs DNS Name:/ efs/
o Note: Enter your EFS DNS Name in the place of DNS Name above. efs is the
directory that we created earlier.
8. To display information for all currently mounted file systems we use the
command
o df -h
o mkdir aws
o sudo -s
4. Now run the updates using the following command:
o yum -y update
5. Install the NFS client.
o mkdir efs
7. Let us mount our file system in this directory. To do so, navigate to EFS and copy
the DNS Name in the file system.
8. To display information for all currently mounted file systems we use the command
o df -h
o sudo -s
3. Navigate to efs directory in both the servers using command
o cd efs
4. Create a file in any one server.
o touch hello.txt
5. Check the file using command
o ls -ltr
6. Now go to the other server and give command
o ls -ltr
7. You can see the file created on this server as well.
3. You have successfully installed the NFS client and mounted the EFS to the instances.
4. You have successfully tested the file share between the 2 instances.
Create AWS EC2 Instance and run AWS CLI
Commands
Lab Details
1. This lab walks you through the steps to create an EC2 instance and run a few AWS CLI commands.
2. Duration: 00:45:00 Hrs
Tasks:
1. Login to AWS Management Console.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
the section.
o Choose the service that will use this role: Select and then click
on as shown.
• Click on .
7. In Create Role Page,
• Click on .
8. You have successfully created the role.
on in the section.
on
4. Choose an Amazon Machine Image
(AMI):
the
6. Configure Instance Details:
o Select the IAM role which we created above from the list.
7. Click on
8. Add Storage: No need to change anything in this step, click
on
o Key : Name
o Value : MyEC2Instance
o Click on
10. Configure Security Group:
▪ Choose Type:
14. In the tab, Copy the IPv4 Public IP Address of the EC2
instance ‘MyEC2Instance’
• Once you get the below output, you have successfully created a Keypair.
AWS CLI command to create Security Group
The below command will create a Security group in the us-east-1 region.
• aws ec2 create-security-group --group-name my-sg --description "My security
group" --region us-east-1
• Once you get the below output, you have successfully created a Security Group.
on in the section.
Enter
5. You will be able to see the EC2 instance.
• Once you see the below output, Your EC2 instance has started to terminate.
• Now navigate to your EC2 dashboard and you will be able to see the EC2
instance state Shutting-down.
Tasks
1. Login to AWS Management Console.
7. View the instance getting shut down and terminate in the AWS management
console.
active. Click on , this will open your AWS Console Account for
this lab in a new tab. If you are asked to logout in AWS Management
Steps
on in the section.
on
4. Choose an Amazon Machine Image
(AMI):
the
6. Configure Instance Details:
on
1. Go to and select .
o Click on .
o Type EC2fullaccess in the search bar and then choose
o click on the .
o Click on the
4.Role Name:
1. Go to menu, click on
2. Make sure you are in the US East (N. Virginia) region.
o Choose .
o Function name : myEC2LambdaFunction
o Runtime : Select Python 3.8
o Click on
4. Configuration Page: Here we need to configure our Lambda function. If you scroll down you can see the Function code section. Here we need to write Python code which will shut down and terminate EC2 instances.
5. You will be using the boto3 SDK for AWS to write the Python code.
6. Remove the existing code in AWS Lambda's lambda_function.py. Copy the code below and paste it into your lambda_function.py file.
import json
import boto3

def lambda_handler(event, context):
    region = 'us-east-1'
    client = boto3.client("ec2", region_name=region)
    status = client.describe_instance_status(IncludeAllInstances=True)
    for i in status["InstanceStatuses"]:
        instance_id = i["InstanceId"]
        state = i["InstanceState"]["Name"]
        # As described in this lab: stop running instances and terminate stopped instances.
        if state == "running":
            client.stop_instances(InstanceIds=[instance_id])
            print("Stopping instance: " + instance_id)
        elif state == "stopped":
            client.terminate_instances(InstanceIds=[instance_id])
            print("Terminating instance: " + instance_id)
        else:
            print("Please wait for the instance to be stopped or running state")
            print("\n")
    return {
        'statusCode': 200,
        'body': json.dumps('EC2 instances processed')
    }
o Click on .
3. Lambda function will be executed, the running EC2 instance will be stopped &
the stopped instance will be terminated.
4. Once it's completed, you will be seeing a success message as shown below. It
will display the details:
Check the EC2 instances Status
1. Navigate to EC2 page from services menu.
3. You can see that the running instance is stopped and the stopped instance is
terminated.
3. Lambda function will be executed again, the stopped EC2 instance will be
Terminated.
4. Once it's completed, you will be seeing a success message as shown below. It
will display the details:
3. You can see that the running instance is stopped and the stopped instance is
terminated.
5. You have successfully shut down and terminated the EC2 instance.
S3 Bucket event trigger lambda function to send
Email notification
Lab Details
1. This lab walks you through creating an S3 bucket and a Lambda function trigger that will send an email notification to the user when we upload or delete an S3 object, and testing it in the AWS Management Console.
2. Duration: 01:00:00 Hrs
Architecture Diagram:
Tasks:
1. Login to AWS Management Console.
3. Create a S3 bucket.
Flow Chart
Launching Lab Environment
1. Make sure to sign out of the existing AWS account before you start a new lab session (if you have already logged into one). Check FAQs and Troubleshooting for Labs if you face any issues.
2. Once the lab environment is active, click on the console link; this will open your AWS Console Account for this lab in a new tab. If you are asked to log out of an existing AWS Management Console session, log out and continue with the lab.
Steps:
Create an IAM Role
1. Go to Services and select IAM.
2. Click on Roles in the left panel and then click on Create role.
3. Attach permissions and add a tag:
o Type sesfullaccess in the search bar and then choose the AmazonSESFullAccess policy.
o Click on Next: Tags.
o Key : name
o Value : lambda_ses_role
o Click on Next: Review.
4. Role Name: enter a role name and click on Create role.
Create a S3 Bucket
1. Make sure you are in the N.Virginia Region.
2. Navigate to S3 under the Storage section of the Services menu.
3. Create the bucket and upload a sample object:
o Click on Create bucket, enter the Bucket name as myseslambdawhizlabs and create the bucket.
o Click on the bucket name to open it, then click on Upload.
o Browse any local image or the image downloaded by name smiley.jpg.
4. Click on Upload.
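The same bucket setup can be scripted with boto3. A minimal sketch, assuming the bucket name used later in this lab (myseslambdawhizlabs) and a local copy of smiley.jpg:

import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Create the bucket (no LocationConstraint is needed in us-east-1)
s3.create_bucket(Bucket="myseslambdawhizlabs")

# Upload the sample image used by this lab
s3.upload_file("smiley.jpg", "myseslambdawhizlabs", "smiley.jpg")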
• Note: Your email ID is used in the Lambda function to receive the notification, and this subscription will end when the lab time ends or when you click on the End Lab button.
6. Click on Verify This Email Address and you will see a message that a verification email has been sent successfully.
7. Now you will be able to see that the Verification Status of the email is pending verification.
3. Once you click on the link, you will be redirected to another page saying successfully verified.
4. Now go to the SES Email Address page and refresh the page. You will be able
to see the Verification status as Active.
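Email verification can also be started and checked through the SES API. A hedged boto3 sketch, where you@example.com is only a placeholder for the address you actually verify:

import boto3

ses = boto3.client("ses", region_name="us-east-1")
email = "you@example.com"  # placeholder: replace with your own address

# Send the verification mail (same effect as the console button)
ses.verify_email_identity(EmailAddress=email)

# Check whether the address has been verified yet
attrs = ses.get_identity_verification_attributes(Identities=[email])
print(attrs["VerificationAttributes"][email]["VerificationStatus"])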
Create a Lambda Function
3. Navigate to Lambda from the Services menu and click on Create function:
o Choose Author from scratch.
o Function name : my_ses_s3_Lambda
o Runtime : Select Python 3.8
o Click on Create function.
4. Configuration Page: Here we need to configure our lambda function. If you
scroll down you can see the Function code section. Remove the existing code
in AWS lambda lambda_function.py. Copy the below code and paste it into
your lambda_function.py file.
import boto3
import json

def lambda_handler(event, context):
    for e in event["Records"]:
        bucketName = e["s3"]["bucket"]["name"]
        objectName = e["s3"]["object"]["key"]
        eventName = e["eventName"]
        bClient = boto3.client("ses")
        eSubject = 'AWS Lab ' + str(eventName) + ' Event'
        eBody = """
        <br>
        Hi User,<br>
        Welcome to Whizlabs Lab<br>
        We are here to notify you that {} an event was triggered.<br>
        Bucket name : {} <br>
        Object name : {}
        <br>
        """.format(eventName, bucketName, objectName)
        # Send the notification email through SES
        # (replace the placeholder address below with the email you verified in SES)
        result = bClient.send_email(
            Source="you@example.com",
            Destination={"ToAddresses": ["you@example.com"]},
            Message={
                "Subject": {"Data": eSubject},
                "Body": {"Html": {"Data": eBody}},
            },
        )
    return {
        'statusCode': 200,
        'body': json.dumps(result)
    }
Add an Event Trigger to the S3 Bucket
3. Enter the S3 bucket by clicking on your bucket name myseslambdawhizlabs.
6. Now click on Properties and add an event notification:
o Events :
▪ Select All object create events and All object delete events.
o Send to : Lambda Function, and choose my_ses_s3_Lambda.
7. Click on Save.
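The same event notification can also be configured through the S3 API. The sketch below is hedged: the Lambda ARN is a placeholder, and note that the console grants the required invoke permission automatically, whereas with the API you must add it yourself (for example with the Lambda add_permission call) before the notification will fire.

import boto3

s3 = boto3.client("s3", region_name="us-east-1")
lambda_arn = "arn:aws:lambda:us-east-1:123456789012:function:my_ses_s3_Lambda"  # placeholder ARN

# Trigger my_ses_s3_Lambda on every object create and delete event
s3.put_bucket_notification_configuration(
    Bucket="myseslambdawhizlabs",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": lambda_arn,
                "Events": ["s3:ObjectCreated:*", "s3:ObjectRemoved:*"],
            }
        ]
    },
)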
Test the Trigger
3. Enter the bucket by clicking on your bucket name myseslambdawhizlabs and open it.
7. Ignore the warning.
8. Now again try to upload a file to the S3 bucket and you will get another mail.
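You can also drive the uploads and deletes from code while watching your inbox. A small sketch reusing the bucket and object names from this lab:

import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "myseslambdawhizlabs"

# Each of these operations should result in one notification email
s3.upload_file("smiley.jpg", bucket, "smiley.jpg")   # ObjectCreated event
s3.delete_object(Bucket=bucket, Key="smiley.jpg")    # ObjectRemoved event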
Completion and Conclusion
1. You have successfully logged in to AWS Management console.
8. You have successfully tested the lab and got the Email.
Running Lambda on a Schedule
Lab Details
1. This lab walks you through the steps to create a schedule for a Lambda function.
Introduction
AWS Lambda
• AWS Lambda is a compute service that lets you run code without provisioning or
managing servers. AWS Lambda executes your code only when needed and
scales automatically, from a few requests per day to thousands per second. You
pay only for the compute time you consume - there is no charge when your code
is not running. With AWS Lambda, you can run code for virtually any type of
application or backend service - all with zero administration.
• AWS Lambda runs your code on a high-availability compute infrastructure and
performs all of the administration of the compute resources, including server and
operating system maintenance, capacity provisioning and automatic scaling,
code monitoring and logging.
• You can use AWS Lambda to run your code in response to events, such as
changes to data in an Amazon S3 bucket or an Amazon DynamoDB table; to run
your code in response to HTTP requests using Amazon API Gateway; or invoke
your code using API calls made using AWS SDKs. With these capabilities, you
can use Lambda to easily build data processing triggers for AWS services like
Amazon S3 and Amazon DynamoDB, process streaming data stored in Kinesis,
or create your own back end that operates at AWS scale, performance, and
security.
• AWS Lambda is an ideal compute platform for many application scenarios,
provided that you can write your application code in languages supported by
AWS Lambda, and run within the AWS Lambda standard runtime environment
and resources provided by Lambda.
• When using AWS Lambda, you are responsible only for your code. AWS Lambda
manages the compute fleet that offers a balance of memory, CPU, network, and
other resources. This is in exchange for flexibility, which means you cannot log in
to compute instances, or customize the operating system on provided runtimes.
These constraints enable AWS Lambda to perform operational and
administrative activities on your behalf, including provisioning capacity,
monitoring fleet health, applying security patches, deploying your code, and
monitoring and logging your Lambda functions.
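As the introduction notes, a function can also be invoked directly through the AWS SDK rather than only by an event source. A minimal boto3 sketch, where the function name is just a placeholder:

import boto3
import json

lam = boto3.client("lambda", region_name="us-east-1")

# Synchronously invoke a function and print its response payload
response = lam.invoke(
    FunctionName="my-function",  # placeholder: use the name of your own function
    InvocationType="RequestResponse",
    Payload=json.dumps({}),
)
print(json.loads(response["Payload"].read()))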
Lab Tasks
1. Login to AWS Management Console.
4. Create a Lambda.
Once the lab environment is active, click on the console link; this will open your AWS Console Account for this lab in a new tab. If you are asked to log out of an existing AWS Management Console session, log out and continue with the lab.
Steps:
1. Navigate to EC2 by clicking on the Services menu at the top, then click on EC2 in the Compute section.
2. Click on Instances in the left side panel and click on Launch Instance.
3. Choose an Amazon Machine Image (AMI):
1. Key : Name
2. Value : whizserver
3. Click on Next: Configure Security Group.
10. Key Pair : Select Key Pair as Proceed without a key pair, then acknowledge and click on Launch Instances.
11. Once the instance has launched successfully, you can see the below.
Note: It will take up to 5 minutes for the instance state to change to Running.
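If you prefer to script this launch, a boto3 sketch along these lines should work. The AMI ID and t2.micro instance type are assumptions for illustration; replace them with the image and type you chose in the launch wizard.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one instance tagged with the Name whizserver (ami-xxxxxxxxxxxxxxxxx is a placeholder AMI ID)
ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[
        {"ResourceType": "instance", "Tags": [{"Key": "Name", "Value": "whizserver"}]}
    ],
)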
Create an IAM Role
1. Navigate to Services at the top and Select IAM under “Security, Identity and
Compliance”.
2. Choose Roles in the left side panel and click on Create Role.
3. Select the “select type of trusted entity” : AWS Service and Choose use case
as Lambda then click on Next: Permissions.
4. Select Create Policy; it will redirect to a new tab. Copy and paste the below code into the JSON field and click on Review Policy.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": "ec2:*",
            "Effect": "Allow",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "elasticloadbalancing:*",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "cloudwatch:*",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "autoscaling:*",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "iam:CreateServiceLinkedRole",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "iam:AWSServiceName": [
                        "autoscaling.amazonaws.com",
                        "ec2scheduled.amazonaws.com",
                        "elasticloadbalancing.amazonaws.com",
                        "spot.amazonaws.com",
                        "spotfleet.amazonaws.com",
                        "transitgateway.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
5. Enter the Name: whizpolicy and click on Create Policy.
6. Once the policy is created successfully, go back to the role creation tab and refresh it, then search for your policy under "Filter Policies", attach whizpolicy, and click on Next: Tags.
7. Leave others as default by clicking on Next and enter the role name
as whizrole then click on Create Role.
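If you want to script the IAM setup instead, a hedged boto3 sketch is below; it assumes the policy JSON shown above has been saved locally as whizpolicy.json (a file name chosen here only for illustration).

import boto3
import json

iam = boto3.client("iam")

# Load the policy document shown above (assumed saved locally as whizpolicy.json)
with open("whizpolicy.json") as f:
    policy_document = f.read()

policy = iam.create_policy(PolicyName="whizpolicy", PolicyDocument=policy_document)

# Trust policy letting Lambda assume the role
trust = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

iam.create_role(RoleName="whizrole", AssumeRolePolicyDocument=json.dumps(trust))
iam.attach_role_policy(RoleName="whizrole", PolicyArn=policy["Policy"]["Arn"])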
4. Click on Add target and choose Lambda Function in the drop down list.
7. Enter the “Rule Name” as whizrule and leave others as default and click on Create
Rule.
8. Once the rule is created successfully, you will see the below.
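The whizrule schedule can also be created programmatically. A hedged boto3 sketch follows, where the Lambda function ARN is a placeholder; note that outside the console you must also grant CloudWatch Events permission to invoke the function.

import boto3

events = boto3.client("events", region_name="us-east-1")
lam = boto3.client("lambda", region_name="us-east-1")
function_arn = "arn:aws:lambda:us-east-1:123456789012:function:myScheduledFunction"  # placeholder ARN

# Rule that fires once every minute, matching the schedule used in this lab
rule = events.put_rule(Name="whizrule", ScheduleExpression="rate(1 minute)", State="ENABLED")

# Point the rule at the Lambda function
events.put_targets(Rule="whizrule", Targets=[{"Id": "1", "Arn": function_arn}])

# Allow CloudWatch Events to invoke the function
lam.add_permission(
    FunctionName=function_arn,
    StatementId="whizrule-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)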
for i in status["InstanceStatuses"]:
    instance_id = i["InstanceId"]
    state = i["InstanceState"]["Name"]
    if state == "running":
        client.stop_instances(InstanceIds=[instance_id])
    elif state == "stopped":
        client.start_instances(InstanceIds=[instance_id])
    else:
        print("Please wait for the instance to be in stopped or running state")
    print("\n")
return {
    'statusCode': 200,
}
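The fragment above is only the tail of the handler. A complete version, assuming the same us-east-1 region and boto3 EC2 client setup as the earlier termination lab, could look like this:

import json
import boto3

def lambda_handler(event, context):
    region = 'us-east-1'
    client = boto3.client("ec2", region_name=region)
    status = client.describe_instance_status(IncludeAllInstances=True)

    for i in status["InstanceStatuses"]:
        instance_id = i["InstanceId"]
        state = i["InstanceState"]["Name"]
        if state == "running":
            # A running instance is stopped on this scheduled run
            client.stop_instances(InstanceIds=[instance_id])
        elif state == "stopped":
            # A stopped instance is started on the next scheduled run
            client.start_instances(InstanceIds=[instance_id])
        else:
            print("Please wait for the instance to be in stopped or running state")
        print("\n")

    return {
        'statusCode': 200,
    }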
3. Go back to the EC2 instance and refresh it to see the scheduled activity.
4. You have now completed the Lambda scheduling; the Lambda function will trigger once every 1 minute. If the instance is stopped it will be started, and vice versa.
5. After one minute, refresh the instance and it will show the instance state as Running.
Completion and Conclusion
1. You have successfully created the Instance.