Automation with Shell Scripting & Python in DevOps
In DevOps, automation is key for improving efficiency, reducing errors, and
speeding up deployments. Python and Shell scripting are both widely used, each
offering unique advantages depending on the task.
● Shell Scripting: Best for system-level tasks and rapid automation; tied to
Unix-like environments.
● Python: More versatile, with a vast ecosystem of libraries, making it
suitable for complex, cross-platform automation.
Conclusion:
Shell scripting is great for quick, system-level tasks, while Python excels in
complex, scalable automation. Using both together can create a flexible and
efficient DevOps automation strategy.
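For example, a Python wrapper can orchestrate existing shell commands while keeping cross-platform logic in Python. A minimal sketch (the disk-check command is an example):
python
import subprocess

# Reuse a shell one-liner from Python and act on its output
result = subprocess.run(['df', '-h', '/'], capture_output=True, text=True)
print(result.stdout)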
Shell Scripting Examples
This script provisions an EC2 instance with the AWS CLI.
#!/bin/bash
# Variables
INSTANCE_TYPE="t2.micro"
AMI_ID="ami-xxxxxxxx"  # placeholder AMI ID (assumed)
# Launch the instance
aws ec2 run-instances --image-id $AMI_ID --instance-type $INSTANCE_TYPE --count 1
This script raises an alert when CPU usage exceeds a threshold.
#!/bin/bash
CPU_THRESHOLD=80
# Current CPU usage as an integer (user + system time)
CPU_USAGE=$(top -bn1 | grep "Cpu(s)" | awk '{print int($2 + $4)}')
if [ "$CPU_USAGE" -gt "$CPU_THRESHOLD" ]; then
  echo "ALERT: CPU usage is ${CPU_USAGE}%"
fi
This script backs up a MySQL database and compresses the dump.
#!/bin/bash
# Variables
DB_USER="root"
DB_PASSWORD="password"
DB_NAME="my_database"
BACKUP_DIR="/backup"
DATE=$(date +%F)
mkdir -p $BACKUP_DIR
# Backup command
mysqldump -u $DB_USER -p$DB_PASSWORD $DB_NAME > $BACKUP_DIR/backup_$DATE.sql
gzip $BACKUP_DIR/backup_$DATE.sql
This script archives old application logs and prunes archives past the retention period.
#!/bin/bash
# Variables
LOG_DIR="/var/log/myapp"
ARCHIVE_DIR="/var/log/myapp/archive"
DAYS_TO_KEEP=30
mkdir -p $ARCHIVE_DIR
# Find and compress logs older than 7 days, then move them to the archive
find $LOG_DIR -maxdepth 1 -name "*.log" -mtime +7 -exec gzip {} \; -exec mv {}.gz $ARCHIVE_DIR/ \;
# Delete archives older than the retention period
find $ARCHIVE_DIR -name "*.gz" -mtime +$DAYS_TO_KEEP -delete
This script triggers a Jenkins pipeline job through its REST API.
#!/bin/bash
# Jenkins details
JENKINS_URL="http://jenkins.example.com"
JOB_NAME="my-pipeline-job"
USER="your-username"
API_TOKEN="your-api-token"
# Trigger the job
curl -X POST "$JENKINS_URL/job/$JOB_NAME/build" --user "$USER:$API_TOKEN"
This script rolls out a new image to a Kubernetes deployment.
#!/bin/bash
# Variables
NAMESPACE="default"
DEPLOYMENT_NAME="my-app"
IMAGE="my-app:v1.0"
# Deploy to Kubernetes (assumes the container is named after the deployment)
kubectl set image deployment/$DEPLOYMENT_NAME $DEPLOYMENT_NAME=$IMAGE -n $NAMESPACE
kubectl rollout status deployment/$DEPLOYMENT_NAME -n $NAMESPACE
This script applies a Terraform configuration.
#!/bin/bash
# Variables
TF_DIR="/path/to/terraform/config"
cd $TF_DIR
# Initialize and apply the configuration
terraform init
terraform apply -auto-approve
This script applies a SQL migration to a PostgreSQL database.
#!/bin/bash
# Variables
DB_USER="postgres"
DB_PASSWORD="password"
DB_NAME="my_database"
MIGRATION_FILE="/path/to/migration.sql"
# Apply the migration
PGPASSWORD=$DB_PASSWORD psql -U $DB_USER -d $DB_NAME -f $MIGRATION_FILE
This script creates a user and adds it to a group.
#!/bin/bash
# Variables
USER_NAME="newuser"
GROUP_NAME="devops"
# Create the group if it does not exist, then the user
groupadd -f $GROUP_NAME
useradd -m -G $GROUP_NAME $USER_NAME
This script lists listening ports and checks whether a given port is open.
#!/bin/bash
OPEN_PORTS=$(netstat -tuln)
# Port 22 is an assumed example
if echo "$OPEN_PORTS" | grep -q ":22 "; then
  echo "Port 22 is open"
else
  echo "Port 22 is closed"
fi
11. Performance Tuning
This script clears memory caches and restarts services to free up system resources.
#!/bin/bash
# Flush filesystem buffers, then drop the page cache
sync
echo 3 > /proc/sys/vm/drop_caches
# Restart a service to release its resources (service name is an example)
systemctl restart myapp
12. Running Automated Tests
This script runs the test suites for Python and Java projects.
#!/bin/bash
# Run the Python test suite
pytest tests/
# Run the Java test suite
mvn test
13. Scaling Infrastructure
This script automatically scales EC2 instances in an Auto Scaling group based on
CPU usage.
#!/bin/bash
ASG_NAME="my-asg"  # assumed Auto Scaling group name
CPU_THRESHOLD=80
CPU_USAGE=85  # placeholder; in practice, fetch this from CloudWatch
if [ "$CPU_USAGE" -gt "$CPU_THRESHOLD" ]; then
  aws autoscaling set-desired-capacity --auto-scaling-group-name $ASG_NAME --desired-capacity 3
fi
export DB_HOST="prod-db.example.com"
export API_KEY="prod-api-key"
export DB_HOST="staging-db.example.com"
export API_KEY="staging-api-key"
else
export DB_HOST="dev-db.example.com"
export API_KEY="dev-api-key"
fi
This script installs Docker if it is not already present.
#!/bin/bash
# Install Docker
if ! command -v docker > /dev/null; then
  curl -fsSL https://get.docker.com -o get-docker.sh
  sudo sh get-docker.sh
fi
This script checks a list of servers for a healthy HTTP response.
#!/bin/bash
SERVERS="web1.example.com web2.example.com"  # assumed server list
for server in $SERVERS; do
  curl -s --head http://$server | head -n 1 | grep "HTTP/1.1 200 OK" > /dev/null
  if [ $? -ne 0 ]; then
    echo "$server is DOWN"
  else
    echo "$server is up"
  fi
done
This script reboots the server during off-hours.
#!/bin/bash
# Reboot server during off-hours
HOUR=$(date +%H)
if [ "$HOUR" -ge 2 ] && [ "$HOUR" -lt 4 ]; then  # assumed 2-4 AM window
  sudo reboot
fi
This script renews Let's Encrypt SSL certificates.
#!/bin/bash
certbot renew
This script pulls the latest changes and tags a release.
#!/bin/bash
# Pull latest changes from Git repository and create a release tag
git pull origin main
TAG="release-$(date +%Y%m%d%H%M)"  # assumed tag naming scheme
git tag $TAG
git push origin $TAG
This script restarts the application stack with Docker Compose.
#!/bin/bash
if [ -f docker-compose.yml ]; then
  docker-compose down
  docker-compose up -d
fi
This script checks for and applies security patches (Debian/Ubuntu assumed).
#!/bin/bash
# Check and apply security patches
sudo apt-get update
sudo apt-get upgrade -y
ZONE_ID="your-hosted-zone-id"
DOMAIN_NAME="your-domain.com"
NEW_IP="your-new-ip-address"
"Changes": [
"Action": "UPSERT",
"ResourceRecordSet": {
"Name": "'$DOMAIN_NAME'",
"Type": "A",
"TTL": 60,
"ResourceRecords": [
"Value": "'$NEW_IP'"
}
}
}'
# Run ESLint
# Run Prettier
# API URL
API_URL="https://your-api-endpoint.com/endpoint"
else
fi
# Image to scan
IMAGE_NAME="your-docker-image:latest"
exit 1
else
fi
THRESHOLD=80
# Target URL
URL="https://your-application-url.com"
ab -n 1000 -c 10 $URL
Introduction: This script automates the process of updating DNS records in AWS
Route 53 when the IP address of a server changes. It ensures that DNS records are
updated dynamically when new servers are provisioned.
#!/bin/bash
# Variables
ZONE_ID="your-hosted-zone-id"
DOMAIN_NAME="your-domain.com"
NEW_IP="your-new-ip-address"
# Upsert the A record
aws route53 change-resource-record-sets --hosted-zone-id $ZONE_ID --change-batch '{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "'$DOMAIN_NAME'",
      "Type": "A",
      "TTL": 60,
      "ResourceRecords": [{ "Value": "'$NEW_IP'" }]
    }
  }]
}'
This script runs code-quality checks with ESLint and Prettier.
#!/bin/bash
# Run ESLint
npx eslint .
# Run Prettier
npx prettier --write .
This script checks an API endpoint and reports its HTTP status.
#!/bin/bash
# API URL
API_URL="https://your-api-endpoint.com/endpoint"
STATUS=$(curl -s -o /dev/null -w "%{http_code}" $API_URL)
if [ "$STATUS" -eq 200 ]; then
  echo "API is healthy"
else
  echo "API returned status $STATUS"
fi
This script scans a Docker image for vulnerabilities (Trivy is an assumed scanner).
#!/bin/bash
# Image to scan
IMAGE_NAME="your-docker-image:latest"
trivy image --exit-code 1 $IMAGE_NAME
if [ $? -eq 1 ]; then
  echo "Vulnerabilities found in $IMAGE_NAME"
  exit 1
else
  echo "No vulnerabilities found"
fi
42. Disk Usage Monitoring and Alerts (Email Notification)
Introduction: This script monitors disk usage and sends an alert via email if the
disk usage exceeds a specified threshold. It helps in proactive monitoring of disk
space.
#!/bin/bash
THRESHOLD=80
USAGE=$(df / | awk 'NR==2 {print $5}' | tr -d '%')
if [ "$USAGE" -gt "$THRESHOLD" ]; then
  # Recipient address is an example
  echo "Disk usage is at ${USAGE}%" | mail -s "Disk usage alert" admin@example.com
fi
This script runs a simple load test with Apache Bench (1000 requests, 10 concurrent).
#!/bin/bash
# Target URL (ab requires a trailing slash or explicit path)
URL="https://your-application-url.com/"
ab -n 1000 -c 10 $URL
Python Scripting for DevOps
1. File Operations
Read a file:
python
with open('file.txt', 'r') as file:
    content = file.read()
print(content)
Write to a file:
python
with open('file.txt', 'w') as file:
    file.write('Hello, DevOps!')
2. Environment Variables
Get an environment variable:
python
import os
db_user = os.getenv('DB_USER')
print(db_user)
Set an environment variable:
python
import os
os.environ['NEW_VAR'] = 'value'
3. Subprocess Management
Run shell commands:
python
import subprocess
result = subprocess.run(['ls', '-l'], capture_output=True, text=True)
print(result.stdout)
4. API Requests
python
import requests
response = requests.get('https://api.example.com/data')
print(response.json())
5. JSON Handling
Read JSON from a file:
python
import json
with open('data.json', 'r') as file:
    data = json.load(file)
print(data)
Write JSON to a file:
python
import json
data = {'name': 'DevOps', 'type': 'Workflow'}
with open('data.json', 'w') as file:
    json.dump(data, file)
6. Logging
Logging improves visibility, helps resolve issues faster, and optimizes system performance.
python
import logging
logging.basicConfig(level=logging.INFO)
logging.info('This is an info message')
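A slightly fuller configuration writes timestamped records to a file, which is what makes logs useful for troubleshooting later (the filename is an example):
python
import logging
logging.basicConfig(
    filename='app.log',
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s'
)
logging.info('Service deployed successfully')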
7. Database Operations
Run a query against a SQLite database:
python
import sqlite3
conn = sqlite3.connect('example.db')
cursor = conn.cursor()
cursor.execute('CREATE TABLE IF NOT EXISTS deployments (id INTEGER PRIMARY KEY, name TEXT)')
conn.commit()
conn.close()
8. Remote Execution over SSH
Run a command on a remote host with paramiko:
python
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('host.example.com', username='user', password='password')  # example host
stdin, stdout, stderr = ssh.exec_command('uptime')
print(stdout.read().decode())
ssh.close()
9. Error Handling
Try-except block:
python
try:
    risky_code()
except Exception as e:
    print(f'Error: {e}')
10. Docker Management
List running containers with the docker SDK:
python
import docker
client = docker.from_env()
containers = client.containers.list()
for container in containers:
    print(container.name)
11. YAML Handling
Read a YAML file:
python
import yaml
with open('config.yaml', 'r') as file:
    config = yaml.safe_load(file)
print(config)
Write a YAML file:
python
import yaml
data = {'name': 'DevOps'}
with open('config.yaml', 'w') as file:
    yaml.dump(data, file)
12. Parsing Command-Line Arguments
Using argparse:
python
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('--num', type=int, default=1)
args = parser.parse_args()
print(args.num)
13. System Monitoring
Check CPU and memory usage with psutil:
python
import psutil
print(psutil.cpu_percent(interval=1))
print(psutil.virtual_memory().percent)
14. Health Check Endpoint
A minimal Flask health check service:
python
from flask import Flask

app = Flask(__name__)

@app.route('/health', methods=['GET'])
def health_check():
    return 'OK', 200

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
15. Container Logs
Fetch logs from a running container:
python
import docker
client = docker.from_env()
container = client.containers.get('container_id')
print(container.logs())
16. Task Scheduling
Run a recurring job with the schedule library:
python
import schedule
import time

def job():
    print('Running scheduled task')

schedule.every(1).minutes.do(job)
while True:
    schedule.run_pending()
    time.sleep(1)
17. Version Control with Git
Using GitPython to interact with Git repositories:
python
import git
repo = git.Repo('/path/to/repo')
repo.git.add('file.txt')
repo.index.commit('Added file.txt')
18. Email Notifications
Send an alert email with smtplib:
python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg['Subject'] = 'Alert'
msg['From'] = 'you@example.com'
msg['To'] = 'recipient@example.com'
msg.set_content('Something needs attention.')
with smtplib.SMTP('smtp.example.com', 587) as server:
    server.starttls()
    server.login('your_username', 'your_password')
    server.send_message(msg)
19. Managing Virtual Environments
Create a virtual environment and run tools inside it. Note that "activating" a venv with os.system has no effect, because each call runs in its own subshell; invoke the venv's interpreter directly instead:
python
import subprocess
import sys

# Create the environment
subprocess.run([sys.executable, '-m', 'venv', 'myenv'])
# Use the venv's interpreter directly (on Windows: myenv\Scripts\python.exe)
subprocess.run(['myenv/bin/python', '-m', 'pip', 'install', 'requests'])
20. Integrating with CI/CD Tools
Using the requests library to trigger a Jenkins job:
python
import requests
url = 'http://your-jenkins-url/job/your-job-name/build'
response = requests.post(url, auth=('your-username', 'your-api-token'))
print(response.status_code)
21. File System Monitoring
Watch a directory for changes with watchdog:
python
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class MyHandler(FileSystemEventHandler):
    def on_modified(self, event):
        print(f'{event.src_path} was modified')

event_handler = MyHandler()
observer = Observer()
observer.schedule(event_handler, path='.', recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
22. Unit Testing
A simple unittest example:
python
import unittest

def add(a, b):
    return a + b

class TestMathFunctions(unittest.TestCase):
    def test_add(self):
        self.assertEqual(add(2, 3), 5)

if __name__ == '__main__':
    unittest.main()
23. Data Processing
Transform CSV data with pandas:
python
import pandas as pd
df = pd.read_csv('data.csv')
df['new_column'] = df['existing_column'] * 2
df.to_csv('output.csv', index=False)
24. Using Python for Infrastructure as Code
Using boto3 for AWS operations:
python
import boto3
ec2 = boto3.resource('ec2')
for instance in ec2.instances.all():
    print(instance.id, instance.state)
Web scraping with BeautifulSoup:
python
import requests
from bs4 import BeautifulSoup
response = requests.get('http://example.com')
soup = BeautifulSoup(response.text, 'html.parser')
print(soup.title.string)
Running remote commands with fabric:
python
from fabric import Connection
conn = Connection('user@host.example.com')  # example host
conn.run('uname -s')
Working with S3 objects:
python
import boto3
s3 = boto3.client('s3')
# Upload a file
s3.upload_file('local.txt', 'my-bucket', 'remote.txt')
# Download a file
s3.download_file('my-bucket', 'remote.txt', 'local_copy.txt')
Following a log file, like tail -f:
python
import time

def tail_f(file):
    while True:
        line = file.readline()
        if not line:
            time.sleep(0.1)  # Sleep briefly
            continue
        print(line, end='')

log_file = open('/var/log/myapp/app.log', 'r')  # assumed log path
tail_f(log_file)
Checking a container's health status:
python
import docker
client = docker.from_env()
container = client.containers.get('container_id')
print(container.attrs['State']['Health']['Status'])
Polling an API until it returns successfully:
python
import requests
import time

url = 'https://api.example.com/data'
while True:
    response = requests.get(url)
    if response.status_code == 200:
        print(response.json())
        break
    else:
        print('Error:', response.status_code)
        time.sleep(5)  # wait before retrying
Managing Docker Compose services from Python:
python
import subprocess

# Start services defined in docker-compose.yml
subprocess.run(['docker-compose', 'up', '-d'])
# Stop services
subprocess.run(['docker-compose', 'down'])
A minimal REST API with Flask-RESTful:
python
from flask import Flask
from flask_restful import Resource, Api

app = Flask(__name__)
api = Api(app)

class HelloWorld(Resource):
    def get(self):
        return {'hello': 'world'}

api.add_resource(HelloWorld, '/')

if __name__ == '__main__':
    app.run(debug=True)
Running asynchronous tasks with asyncio:
python
import asyncio

async def main():
    print('Hello')
    await asyncio.sleep(1)
    print('World')

asyncio.run(main())
48. Network Monitoring with scapy
Packet sniffing using scapy.
python
from scapy.all import sniff

def packet_callback(packet):
    print(packet.summary())

sniff(prn=packet_callback, count=10)
Reading and updating INI files with configparser:
python
import configparser
config = configparser.ConfigParser()
config.read('config.ini')
print(config['DEFAULT']['SomeSetting'])
config['DEFAULT']['NewSetting'] = 'Value'
with open('config.ini', 'w') as configfile:
    config.write(configfile)
A WebSocket client with websocket-client:
python
import websocket

def on_message(ws, message):
    print(message)

ws = websocket.WebSocketApp("ws://echo.websocket.org",
                            on_message=on_message)
ws.run_forever()
51. Creating a Docker Image with Python
Using docker library to build an image.
python
import docker
client = docker.from_env()
# Dockerfile content
dockerfile_content = """
FROM python:3.9-slim
WORKDIR /app
COPY . /app
"""
# Write the Dockerfile, then build the image and stream the build log
with open('Dockerfile', 'w') as f:
    f.write(dockerfile_content)
image, logs = client.images.build(path='.', tag='my-python-app')
for line in logs:
    print(line)
Listing running processes with psutil:
python
import psutil
for proc in psutil.process_iter(['pid', 'name']):
    print(proc.info)
Running database migrations with alembic:
python
from alembic import command, config

alembic_cfg = config.Config("alembic.ini")
command.upgrade(alembic_cfg, "head")
Paramiko helps you connect to remote servers securely, run commands, and
automate tasks using Python. It simplifies managing remote systems by ensuring
encrypted connections.
python
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('host.example.com', username='user', password='password')  # example host
stdin, stdout, stderr = client.exec_command('df -h')
print(stdout.read().decode())
client.close()
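Paramiko can also copy files over SFTP on the same encrypted connection; a minimal sketch (host and paths are examples):
python
import paramiko
transport = paramiko.Transport(('host.example.com', 22))
transport.connect(username='user', password='password')
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put('local.txt', '/remote/path/local.txt')
sftp.close()
transport.close()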
Creating a CloudFormation stack with boto3:
python
import boto3

cloudformation = boto3.client('cloudformation')
with open('template.yaml') as template_file:
    template_body = template_file.read()
response = cloudformation.create_stack(
    StackName='MyStack',
    TemplateBody=template_body,
    TimeoutInMinutes=5,
    Capabilities=['CAPABILITY_NAMED_IAM'],
)
print(response)
Starting and stopping EC2 instances:
python
import boto3
ec2 = boto3.resource('ec2')
# Start an instance
instance = ec2.Instance('instance_id')
instance.start()
# Stop an instance
instance.stop()
Backing up a directory with shutil:
python
import shutil

source_dir = '/path/to/source'
backup_dir = '/path/to/backup'
shutil.copytree(source_dir, backup_dir, dirs_exist_ok=True)