
Python using AI - Prompts & Codes

Steps to open Colab

1. Search for “Colab” → Open the first link

2. File → New Notebook

Prompt 1

Assume I'm a complete beginner in Python who has never written any code; tell me a simple line to run

print("Hello, World!")

Prompt 2

I have 10 days of holidays and want to use them to learn Python. I have 2 hours of free time every day. Prepare a daily and hourly planner so that I'm at least good with the Python basics.

Prompt 3

Please give me a list of 10 MCQs on Python basics; make the options very tricky to choose from

Prompt 4

Now give me 30 more of these and increase the level to medium

Prompt 5

Can you explain this error to me: SyntaxError: incomplete input
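This error typically means Python reached the end of your code while something was still open, such as an unclosed parenthesis, quote, or block. A minimal snippet that can trigger it in Colab (an illustrative example, not from the original document):

print("Hello, World!"  # the closing parenthesis is missing, so the input is incomplete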

Prompt 6

Can you fix this code: <paste your code>

Error code

wallet_amount = 100
ticket_price = 120

if wallet_amount > ticket_price:
    print("Go for a movie")
else:
    print("Go home"

Corrected code

wallet_amount = 100
ticket_price = 120

if wallet_amount > ticket_price:
    print("Go for a movie")
else:
    print("Go home")

Prompt 7

I started learning IF ELSE in Python. Help me understand this concept from scratch, with some real-life examples, assuming you are explaining it to a 5-year-old kid.

toy_color = input("What color is the toy? ")

if toy_color == "red":
    print("Yay! I found a red toy!")
else:
    print("Oh no! No red toys here.")

Prompt 8

I want to create a world map visual in Python; if I hover over a country, it should display that country's name.

import plotly.graph_objects as go
import pandas as pd

# Load the built-in dataset of country names and codes
df = pd.read_csv('https://raw.githubusercontent.com/plotly/datasets/master/2014_world_gdp_with_codes.csv')

# Create the choropleth map
fig = go.Figure(data=go.Choropleth(
    locations=df['CODE'],
    z=df['GDP (BILLIONS)'],
    text=df['COUNTRY'],
    colorscale='Blues',
    autocolorscale=False,
    reversescale=True,
    marker_line_color='darkgray',
    marker_line_width=0.5,
    colorbar_title='GDP Billions USD',
))

# Update the layout
fig.update_layout(
    title_text='World Map',
    geo=dict(
        showframe=False,
        showcoastlines=True,
        projection_type='equirectangular'
    ),
    annotations=[dict(
        x=0.55,
        y=0.1,
        xref='paper',
        yref='paper',
        text='Source: <a href="https://www.cia.gov/library/publications/the-world-factbook/fields/2195.html">CIA World Factbook</a>',
        showarrow=False
    )]
)

# Show the figure
fig.show()
How to set up Visual Studio Code?

1. Download “Visual Studio Code” - latest version for Windows / Mac / Linux

2. Download “Python” - latest version for Windows / Mac / Linux

3. Create a new file “app.py”

4. Paste your AI-generated code below into this “app.py” file

5. Run this command: streamlit run app.py (a minimal test app to verify the setup is sketched after these steps)

6. In case you get any errors → run this command: pip install streamlit
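Before pasting in a larger example such as the calculator from Prompt 9, you can confirm the setup works with a minimal test app; the snippet below is an illustrative sketch, not part of the original prompts:

import streamlit as st

# Minimal test app: if a page titled "Setup check" opens in the
# browser after running "streamlit run app.py", the setup works
st.title("Setup check")
st.write("Streamlit is installed and running!")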

Prompt 9

Can you help me build a website in Python using Streamlit? It has to do the following: it should be a calculator with basic add, minus, multiply, and divide operations, and it should have buttons to click as well.

import streamlit as st

def calculate(num1, num2, operation):
    if operation == '+':
        return num1 + num2
    elif operation == '-':
        return num1 - num2
    elif operation == '*':
        return num1 * num2
    elif operation == '/':
        return num1 / num2 if num2 != 0 else "Error: Division by zero"

st.title("Simple Calculator")

# Input fields
num1 = st.number_input("Enter first number", value=0.0)
num2 = st.number_input("Enter second number", value=0.0)

# Operation buttons with Unicode symbols
col1, col2, col3, col4 = st.columns(4)

with col1:
    if st.button("\u2795"):  # Heavy Plus Sign
        result = calculate(num1, num2, '+')
with col2:
    if st.button("\u2796"):  # Heavy Minus Sign
        result = calculate(num1, num2, '-')
with col3:
    if st.button("\u2716"):  # Multiplication X
        result = calculate(num1, num2, '*')
with col4:
    if st.button("\u2797"):  # Heavy Division Sign
        result = calculate(num1, num2, '/')

# Display result
if 'result' in locals():
    st.success(f"Result: {result}")

# Add a note about the symbols
st.write("Button symbols: ➕ (add), ➖ (subtract), ✖️ (multiply), ➗ (divide)")

Prompt 10

Write a Python script to scrape static content from a website. Use the URL http://books.toscrape.com/ to
scrape book titles and their prices. Parse the webpage using BeautifulSoup and print the data in a clean,
readable format. Ensure the script works for static, non-time-sensitive pages, and include error handling
for HTTP requests. Additionally, provide an option to save the scraped data to a CSV file.

import requests
import csv
from bs4 import BeautifulSoup

# URL of the website to scrape
url = "http://books.toscrape.com/"

# Send an HTTP GET request to the URL
response = requests.get(url)
response.raise_for_status()  # basic error handling for failed HTTP requests

# Parse the page content using BeautifulSoup
soup = BeautifulSoup(response.content, 'html.parser')

# Find all book containers
books = soup.find_all('article', class_='product_pod')

# List to store scraped data
book_data = []

# Loop through each book container to extract details
for book in books:
    # Extract book title
    title = book.h3.a['title']
    # Extract book price
    price = book.find('p', class_='price_color').text
    # Add details to the list
    book_data.append({'title': title, 'price': price})

# Display the scraped data
for i, book in enumerate(book_data, start=1):
    print(f"{i}. {book['title']} - {book['price']}")

# Save the data to a CSV file
with open('books.csv', 'w', newline='', encoding='utf-8') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=['title', 'price'])
    writer.writeheader()
    writer.writerows(book_data)

print("Data saved to books.csv")

Prompt 11

Can you help me scrape the T20I overall runs from this URL:

https://www.cricbuzz.com/profiles/1413/virat-kohli

import requests
from bs4 import BeautifulSoup

def scrape_t20i_runs(url):
    # Send a GET request to the URL
    response = requests.get(url)

    # Check if the request was successful
    if response.status_code == 200:
        # Parse the HTML content
        soup = BeautifulSoup(response.text, 'html.parser')

        # Find the table containing the player stats
        table = soup.find('table', class_='table cb-col-100 cb-plyr-thead')

        if table:
            # Find all rows in the table
            rows = table.find_all('tr')

            # Look for the T20I row
            for row in rows:
                cols = row.find_all('td')
                if cols and cols[0].text.strip() == 'T20I':
                    # The runs are in the fifth column (index 4)
                    t20i_runs = cols[4].text.strip()
                    return t20i_runs

        print("Couldn't find T20I stats in the table.")
        return None
    else:
        print(f"Failed to retrieve the webpage. Status code: {response.status_code}")
        return None

# URL of Virat Kohli's profile
url = "https://www.cricbuzz.com/profiles/1413/virat-kohli"

# Scrape T20I runs
t20i_runs = scrape_t20i_runs(url)

if t20i_runs:
    print(f"Virat Kohli's T20I overall runs: {t20i_runs}")
else:
    print("Failed to scrape T20I runs.")

Prompt 11 (follow-up)

Decode the entire content of the page
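As a rough sketch of what this follow-up usually produces (assuming the same requests-based approach as the code above; the variable names here are illustrative), the raw response bytes can be decoded into readable HTML text:

import requests

# Fetch the same profile page used in Prompt 11
url = "https://www.cricbuzz.com/profiles/1413/virat-kohli"
response = requests.get(url)

# response.content holds raw bytes; decode them into a string.
# requests can also do this automatically via response.text,
# which uses the encoding reported by the server.
html_text = response.content.decode(response.encoding or "utf-8")

# Print only the first 500 characters to keep the output readable
print(html_text[:500])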

Prompt 12
https://bolt.new/

Create a Python app to manage my to-do list
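bolt.new generates the app for you from this prompt, but as a point of reference, a to-do manager along these lines could be sketched in Streamlit (the layout and widget choices below are illustrative assumptions, not bolt.new's actual output):

import streamlit as st

# Keep the to-do items in session state so they survive reruns
if "todos" not in st.session_state:
    st.session_state.todos = []

st.title("My To-Do List")

# Add a new task
new_task = st.text_input("New task")
if st.button("Add") and new_task:
    st.session_state.todos.append(new_task)

# Show current tasks with a checkbox to mark each one done
for i, task in enumerate(st.session_state.todos):
    st.checkbox(task, key=f"task_{i}")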
