18 Insanely Useful Python Automation Scripts I Use Every Day
Stop wasting time on repetitive tasks. These Python scripts do the boring stuff — so you don’t have to.

Automate Your Life — One Python Script at a Time
Every day, you open tabs, rename files, clean spreadsheets, check emails, or organize folders — the kind of repetitive grunt work that chips away at your energy and focus.
Here’s the truth:
Most of those tasks can be done by Python. Automatically. Silently. Reliably.
I’m not talking about theoretical scripts. These are 18 real Python automations I’ve personally built and use every single day — to save time, stay organized, and focus on deep work.
You don’t need to be a coding wizard to follow along. If you know basic Python and have a few libraries installed, you’re ready to automate your way to productivity bliss.
1. Auto-Organize Download Folder by File Type
Organizing your download folder manually is tedious.
This script automatically sorts your files into subfolders based on their extensions — like Images, Documents, Videos, etc.
What It Does:
Scans your Downloads folder.
Checks each file’s extension (e.g. .jpg, .mp4).
Moves the file into the corresponding subfolder (creating it if it doesn’t exist).
Skips already sorted files or folders.
Python Script: organize_downloads.py
# organize_downloads.py
import os
import shutil

# ✅ Set your downloads directory path
DOWNLOAD_DIR = os.path.expanduser("~/Downloads")

# ✅ Define how files should be categorized
FILE_CATEGORIES = {
    "Images": [".jpg", ".jpeg", ".png", ".gif", ".bmp", ".svg"],
    "Documents": [".pdf", ".docx", ".doc", ".txt", ".xlsx", ".pptx"],
    "Videos": [".mp4", ".mkv", ".mov", ".avi"],
    "Audio": [".mp3", ".wav", ".aac"],
    "Archives": [".zip", ".rar", ".7z", ".tar", ".gz"],
    "Executables": [".exe", ".dmg", ".pkg", ".sh"],
    "Scripts": [".py", ".js", ".sh", ".bat"],
    "Others": [],  # fallback
}

# ✅ Scan and sort files
def organize_downloads():
    for filename in os.listdir(DOWNLOAD_DIR):
        file_path = os.path.join(DOWNLOAD_DIR, filename)

        # Skip directories
        if os.path.isdir(file_path):
            continue

        file_ext = os.path.splitext(filename)[1].lower()
        moved = False
        for category, extensions in FILE_CATEGORIES.items():
            if file_ext in extensions:
                move_file(file_path, os.path.join(DOWNLOAD_DIR, category))
                moved = True
                break

        # Put unknown files in the 'Others' folder
        if not moved:
            move_file(file_path, os.path.join(DOWNLOAD_DIR, "Others"))

def move_file(src, dest_dir):
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, os.path.basename(src))

    # Avoid overwriting files
    if os.path.exists(dest):
        base, ext = os.path.splitext(dest)
        count = 1
        while os.path.exists(dest):
            dest = f"{base}_{count}{ext}"
            count += 1

    shutil.move(src, dest)

if __name__ == "__main__":
    organize_downloads()
    print("✅ Downloads folder organized.")
Before:
Downloads/
├── pic1.jpg
├── resume.pdf
├── movie.mp4
After:
Downloads/
├── Images/
│ └── pic1.jpg
├── Documents/
│ └── resume.pdf
├── Videos/
│ └── movie.mp4
2. Auto-Send Birthday Emails
Using smtplib and a simple CSV of names and dates, I send personalized birthday greetings (and look like the thoughtful friend I pretend to be).
What It Does:
Reads a list of people and their birthdays from a CSV file
Checks if today is anyone’s birthday
Sends them a personalized email automatically using Gmail (or any SMTP server)
CSV File Format (birthdays.csv):
name,email,birthday,message
Aashish Kumar,aashish@example.com,1998-07-23,Hope you have an amazing day!
Jane Doe,jane@example.com,1995-08-10,Wishing you joy and success this year!
Format: YYYY-MM-DD
Python Script: birthday_email.py
# birthday_email.py
import smtplib
import pandas as pd
from datetime import datetime
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

# ✅ Load birthday data
BIRTHDAYS_FILE = "birthdays.csv"
data = pd.read_csv(BIRTHDAYS_FILE)

# ✅ SMTP Settings
EMAIL = "youremail@gmail.com"
PASSWORD = "your-app-password"  # Use a Gmail App Password (not your normal password)
SMTP_SERVER = "smtp.gmail.com"
SMTP_PORT = 587

def send_email(to_email, subject, body):
    msg = MIMEMultipart()
    msg['From'] = EMAIL
    msg['To'] = to_email
    msg['Subject'] = subject
    msg.attach(MIMEText(body, 'plain'))
    try:
        with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as server:
            server.starttls()
            server.login(EMAIL, PASSWORD)
            server.send_message(msg)
        print(f"✅ Sent to {to_email}")
    except Exception as e:
        print(f"❌ Failed to send to {to_email}: {e}")

def check_and_send_birthday_wishes():
    today = datetime.today().strftime("%m-%d")
    for _, row in data.iterrows():
        bday = datetime.strptime(row['birthday'], "%Y-%m-%d").strftime("%m-%d")
        if bday == today:
            name = row['name']
            email = row['email']
            message = row['message']
            subject = f"🎉 Happy Birthday, {name.split()[0]}!"
            body = f"Hi {name},\n\n{message}\n\n— Sent by your friendly Python script 🐍"
            send_email(email, subject, body)

if __name__ == "__main__":
    check_and_send_birthday_wishes()
How to Set Up Gmail for Python Scripts
- Go to Google Account Settings > Security
- Enable 2-Step Verification
- Create an App Password for “Mail”
- Use that password in the script instead of your actual Gmail password
How to Run It Automatically
Mac/Linux (with cron)
crontab -e
Add this line to check every day at 8 AM:
0 8 * * * /usr/bin/python3 /path/to/birthday_email.py
Windows (Task Scheduler)
- Create a task that runs the script daily
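If you'd rather not touch cron or Task Scheduler at all, a plain Python process can compute how long to sleep until the next 8 AM run and loop forever. A minimal stdlib sketch (the 8 AM hour is just an example; the function is a hypothetical helper, not part of the script above):

```python
from datetime import datetime, timedelta

def seconds_until(hour, now=None):
    """Seconds from `now` until the next occurrence of `hour`:00."""
    now = now or datetime.now()
    target = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # already past today's slot, so wait for tomorrow
    return (target - now).total_seconds()

# Usage sketch: in a loop, time.sleep(seconds_until(8)) and then call
# check_and_send_birthday_wishes() from the script above.
```

The trade-off: cron survives reboots, while a sleeping Python process does not, so this is best for machines that stay on.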
3. Convert Image Folders into PDFs
A script that batch-converts scanned docs or notes into a clean, compact PDF using Pillow.
What It Does:
Takes a folder full of images (.jpg, .png, etc.)
Sorts them in filename order
Converts them into a single PDF document
Python Script:
from PIL import Image
import os

# ✅ Path to your image folder
FOLDER_PATH = "my_images"
OUTPUT_FILE = os.path.join(FOLDER_PATH, "output.pdf")

# ✅ Get list of image files sorted by filename
image_files = sorted([
    f for f in os.listdir(FOLDER_PATH)
    if f.lower().endswith(('.png', '.jpg', '.jpeg'))
])

# ✅ Convert to absolute paths
image_paths = [os.path.join(FOLDER_PATH, f) for f in image_files]

# ✅ Open images and convert to RGB (for compatibility)
images = [Image.open(path).convert('RGB') for path in image_paths]

# ✅ Save as PDF
if images:
    images[0].save(OUTPUT_FILE, save_all=True, append_images=images[1:])
    print(f"✅ PDF saved to {OUTPUT_FILE}")
else:
    print("❌ No images found to convert.")
Perfect for scanning notes, merging receipts, or turning sketches into shareable PDFs.
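One caveat: `sorted()` orders names lexically, so `page10.jpg` lands before `page2.jpg`. If your scans are numbered, a natural-sort key keeps the pages in order. A small stdlib sketch (`natural_key` is a hypothetical helper you would pass to the `sorted(...)` call above):

```python
import re

def natural_key(filename):
    """Split 'page10.jpg' into ['page', 10, '.jpg'] so numbers sort numerically."""
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", filename)]

files = ["page10.jpg", "page2.jpg", "page1.jpg"]
print(sorted(files, key=natural_key))
# ['page1.jpg', 'page2.jpg', 'page10.jpg']
```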
4. Daily Time Tracker Logger
Log what I’m working on every hour into a CSV.
At the end of the week, I can visualize where my time went. Productivity accountability = activated.
What It Does:
This script helps you log what you’re working on throughout the day. It:
Asks you what you’re doing every hour (or custom interval)
Saves each log entry with a timestamp to a .csv file
Helps you visualize how you spend your time across days/weeks
Python Script: time_tracker.py
# time_tracker.py
import csv
import os
import time
from datetime import datetime

# ✅ Customize
LOG_INTERVAL_MINUTES = 60  # How often to prompt for input
CSV_FILE = "time_log.csv"

def create_file_if_not_exists():
    if not os.path.exists(CSV_FILE):
        with open(CSV_FILE, mode='w', newline='') as file:
            writer = csv.writer(file)
            writer.writerow(['timestamp', 'activity'])

def log_activity():
    now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    print(f"\n🕐 {now} — What are you doing right now?")
    activity = input("📝 Activity: ").strip()
    if activity:
        with open(CSV_FILE, mode='a', newline='') as file:
            writer = csv.writer(file)
            writer.writerow([now, activity])
        print("✅ Logged!")
    else:
        print("⚠️ No activity entered. Skipped.")

def main():
    print("📋 Daily Time Tracker Started (Press Ctrl+C to stop)...")
    create_file_if_not_exists()
    try:
        while True:
            log_activity()
            print(f"⏳ Waiting {LOG_INTERVAL_MINUTES} minutes...\n")
            time.sleep(LOG_INTERVAL_MINUTES * 60)
    except KeyboardInterrupt:
        print("\n👋 Exiting Time Tracker. Goodbye!")

if __name__ == "__main__":
    main()
CSV Format Output (Example):
timestamp,activity
2025-07-23 09:00:00,Writing Medium article
2025-07-23 10:00:00,Code review
2025-07-23 11:00:00,Team meeting
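For the weekly review, the log rolls up nicely with the stdlib alone. A minimal sketch that counts entries per activity (each entry represents roughly one LOG_INTERVAL_MINUTES block; `summarize_log` is a hypothetical helper, not part of the tracker script):

```python
import csv
from collections import Counter

def summarize_log(path="time_log.csv"):
    """Count logged entries per activity (1 entry ≈ 1 interval of work)."""
    with open(path, newline="") as f:
        counts = Counter(row["activity"] for row in csv.DictReader(f))
    for activity, n in counts.most_common():
        print(f"{activity}: {n} entries")
    return counts
```

Swap in pandas and a bar chart if you want something prettier; the CSV format above makes that trivial.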
5. Auto-Google Anything You Highlight
A simple script using pyperclip and webbrowser that watches your clipboard and Googles anything you copy.
What It Does:
This clever script:
Watches your clipboard in real-time
Detects new text you highlight and copy (like with Ctrl+C or Cmd+C)
Automatically opens a Google search tab in your browser with that text
You copy it — Python Googles it. Like magic.
Requirements:
pyperclip (for clipboard access)
webbrowser (standard library)
Install dependencies:
pip install pyperclip
Python Script: auto_google.py
# auto_google.py
import pyperclip
import webbrowser
import time

# ✅ Customize interval and filter
CHECK_INTERVAL = 1  # seconds
GOOGLE_SEARCH_URL = "https://www.google.com/search?q="

def main():
    print("🔍 Auto-Google is running. Copy text to search automatically (Ctrl+C)...")
    recent_value = ""
    while True:
        try:
            tmp_value = pyperclip.paste()
            # Only Google new copied text
            if tmp_value != recent_value and tmp_value.strip():
                recent_value = tmp_value
                print(f"📋 Copied: {tmp_value}")
                search_query = tmp_value.strip().replace(" ", "+")
                webbrowser.open(GOOGLE_SEARCH_URL + search_query)
                print("🌐 Google search opened!\n")
            time.sleep(CHECK_INTERVAL)
        except KeyboardInterrupt:
            print("\n👋 Exiting Auto-Google.")
            break
        except Exception as e:
            print(f"❌ Error: {e}")
            time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    main()
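One note on the query building: `replace(" ", "+")` handles spaces but not characters like `&` or `#`, which would break the search URL. The stdlib's `urllib.parse.quote_plus` encodes everything safely. A sketch you could swap in (`build_search_url` is a hypothetical helper):

```python
from urllib.parse import quote_plus

GOOGLE_SEARCH_URL = "https://www.google.com/search?q="

def build_search_url(text):
    """Encode copied text so '&', '#', '?' and friends survive the query string."""
    return GOOGLE_SEARCH_URL + quote_plus(text.strip())

print(build_search_url("fish & chips #recipe"))
# https://www.google.com/search?q=fish+%26+chips+%23recipe
```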
6. Password Generator & Manager
Using secrets and cryptography, I built a minimalist, encrypted password vault that lives on my machine — no subscription required.
What It Does:
This script:
Generates strong, secure passwords
Stores them locally in an encrypted vault (vault.json)
Retrieves saved credentials by service name
Uses Fernet encryption (from the cryptography module) to ensure security
Requirements:
Install the encryption library:
pip install cryptography
Folder Structure:
password_manager/
├── password_manager.py
├── vault.json
├── secret.key
├── generate_key.py
Step 1: Generate the Secret Encryption Key (once)
# generate_key.py
from cryptography.fernet import Fernet

key = Fernet.generate_key()
with open("secret.key", "wb") as key_file:
    key_file.write(key)
print("🔑 Encryption key generated and saved to secret.key")
Run this script once:
python generate_key.py
Never share your secret.key — it unlocks your vault.
Step 2: Password Manager Script
import json
import os
import string
import secrets
from cryptography.fernet import Fernet

KEY_FILE = "secret.key"
VAULT_FILE = "vault.json"

# ✅ Load encryption key
def load_key():
    if not os.path.exists(KEY_FILE):
        raise FileNotFoundError("❌ secret.key file not found. Generate one first.")
    with open(KEY_FILE, "rb") as f:
        return f.read()

# ✅ Initialize
fernet = Fernet(load_key())

# ✅ Load vault
def load_vault():
    if os.path.exists(VAULT_FILE):
        with open(VAULT_FILE, "rb") as f:
            encrypted_data = f.read()
        if encrypted_data:
            return json.loads(fernet.decrypt(encrypted_data).decode())
    return {}

# ✅ Save vault
def save_vault(vault_data):
    encrypted_data = fernet.encrypt(json.dumps(vault_data).encode())
    with open(VAULT_FILE, "wb") as f:
        f.write(encrypted_data)

# ✅ Generate password
def generate_password(length=16):
    characters = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(characters) for _ in range(length))

# ✅ Add entry
def add_entry():
    vault = load_vault()
    service = input("🔧 Service name: ").strip()
    username = input("👤 Username: ").strip()
    length = int(input("🔑 Password length (default 16): ") or 16)
    password = generate_password(length)
    vault[service] = {"username": username, "password": password}
    save_vault(vault)
    print(f"✅ Added '{service}' with password: {password}")

# ✅ Retrieve entry
def get_entry():
    vault = load_vault()
    service = input("🔍 Service name: ").strip()
    if service in vault:
        print(f"👤 Username: {vault[service]['username']}")
        print(f"🔑 Password: {vault[service]['password']}")
    else:
        print("❌ Service not found.")

# ✅ List all saved services
def list_services():
    vault = load_vault()
    print("🗂 Saved services:")
    for service in vault:
        print(f"• {service}")

# ✅ CLI Menu
def menu():
    print("\n🔐 Python Password Manager")
    print("[1] Add new password")
    print("[2] Retrieve password")
    print("[3] List saved services")
    print("[4] Exit")
    choice = input("Choose an option: ")
    if choice == "1":
        add_entry()
    elif choice == "2":
        get_entry()
    elif choice == "3":
        list_services()
    elif choice == "4":
        exit()
    else:
        print("❌ Invalid choice.")

if __name__ == "__main__":
    while True:
        menu()
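One caveat with `generate_password`: `secrets.choice` gives no guarantee that every character class appears, so a short password can occasionally come out all letters. A sketch that resamples until the result mixes letters, digits, and punctuation (`generate_strong_password` is a hypothetical variant, not the function above):

```python
import secrets
import string

def generate_strong_password(length=16):
    """Resample until the password contains a letter, a digit, and a symbol."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        pwd = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.isalpha() for c in pwd)
                and any(c.isdigit() for c in pwd)
                and any(c in string.punctuation for c in pwd)):
            return pwd
```

For a 16-character password the resampling loop almost never runs more than once or twice.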
7. Weekly Folder Cleaner
Every Sunday night, this script nukes files older than 7 days from /tmp, /screenshots, and /old-recordings.
What It Does:
This script automatically:
Scans a folder (or multiple folders)
Identifies files older than X days (e.g., 7 days)
Deletes them safely
It’s perfect for cleaning up:
Temp folders
Screenshots
Old recordings or downloads
Log files, cache, build folders, etc.
Python Script: weekly_cleaner.py
# weekly_cleaner.py
import os
import time

# ✅ Customize your folders and retention days
FOLDERS_TO_CLEAN = [
    os.path.expanduser("~/Downloads"),
    os.path.expanduser("~/Desktop/screenshots"),
    os.path.expanduser("~/Videos/recordings"),
]
DAYS_OLD = 7  # Files older than this will be deleted

def is_old(file_path, days_threshold):
    file_time = os.path.getmtime(file_path)
    age_seconds = time.time() - file_time
    return age_seconds > days_threshold * 86400  # Convert days to seconds

def clean_folder(folder_path, days_old):
    if not os.path.exists(folder_path):
        print(f"❌ Folder not found: {folder_path}")
        return
    print(f"\n🧹 Cleaning folder: {folder_path}")
    deleted_count = 0
    for root, _, files in os.walk(folder_path):
        for file in files:
            file_path = os.path.join(root, file)
            try:
                if is_old(file_path, days_old):
                    os.remove(file_path)
                    print(f"🗑 Deleted: {file_path}")
                    deleted_count += 1
            except Exception as e:
                print(f"⚠️ Error deleting {file_path}: {e}")
    if deleted_count == 0:
        print("✅ Nothing old enough to delete.")
    else:
        print(f"✅ Deleted {deleted_count} files.")

def run_cleaner():
    print(f"📅 Weekly Cleaner Started — Threshold: {DAYS_OLD} days\n")
    for folder in FOLDERS_TO_CLEAN:
        clean_folder(folder, DAYS_OLD)
    print("\n🧼 All done!")

if __name__ == "__main__":
    run_cleaner()
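Deleting is permanent, so it's worth doing a dry run the first time. A stdlib sketch that lists what would be removed and how much space it frees, using the same age check as the cleaner (`preview_old_files` is a hypothetical helper, not part of the script above):

```python
import os
import time

def preview_old_files(folder, days_old):
    """Return (paths, total_bytes) of files older than `days_old`, deleting nothing."""
    cutoff = time.time() - days_old * 86400
    old, total = [], 0
    for root, _, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) < cutoff:
                old.append(path)
                total += os.path.getsize(path)
    return old, total
```

Run it on each entry in FOLDERS_TO_CLEAN and eyeball the list before letting clean_folder loose.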
8. Auto-Notes from Voice (Speech to Text)
Using SpeechRecognition and Google’s speech API, I capture thoughts while walking. They get saved to a markdown file in Obsidian.
What It Does:
This script listens to your microphone input, converts your spoken words into text, and automatically:
Saves it to a .txt or .md file
Appends with timestamps
Useful for quick journaling, brainstorming, or capturing ideas on the go
Requirements:
Install these Python packages:
pip install SpeechRecognition pyaudio
On Windows, if pip can’t build PyAudio, install a prebuilt PyAudio wheel instead.
Python Script: voice_to_notes.py
# voice_to_notes.py
import os
from datetime import datetime
import speech_recognition as sr

# ✅ Customize your notes directory and filename
NOTES_DIR = "voice_notes"
os.makedirs(NOTES_DIR, exist_ok=True)
NOTE_FILE = os.path.join(NOTES_DIR, f"{datetime.today().strftime('%Y-%m-%d')}.txt")

def append_to_file(text):
    timestamp = datetime.now().strftime("%H:%M:%S")
    with open(NOTE_FILE, "a", encoding="utf-8") as file:
        file.write(f"[{timestamp}] {text}\n")
    print(f"📝 Saved: [{timestamp}] {text}")

def record_voice_note():
    recognizer = sr.Recognizer()
    mic = sr.Microphone()
    print("🎙 Speak clearly. Say 'stop recording' to end.")
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)
        while True:
            try:
                print("🎧 Listening...")
                audio = recognizer.listen(source)
                text = recognizer.recognize_google(audio)
                print(f"🗣 You said: {text}")
                if "stop recording" in text.lower():
                    print("🛑 Stopping recording.")
                    break
                append_to_file(text)
            except sr.UnknownValueError:
                print("❓ Couldn’t understand. Try again.")
            except sr.RequestError as e:
                print(f"❌ API error: {e}")
                break

if __name__ == "__main__":
    record_voice_note()
Sample Output (2025-07-23.txt)
[10:12:03] Remember to call Sarah about the design feedback.
[10:13:55] New blog post idea: How to use FastAPI with LangChain.
9. Bulk Rename Files Intelligently
Pattern-based renaming using regex. Saved me hours renaming lecture files, client videos, and downloaded course materials.
What It Does:
This script lets you rename multiple files in a folder using smart patterns like:
Replacing spaces with underscores
Adding prefixes/suffixes
Changing filename case (uppercase/lowercase/titlecase)
Removing special characters
Renaming based on file metadata (optional)
Python Script: bulk_rename.py
import os
import re

# ✅ Customize this
FOLDER_PATH = "Downloads"
REPLACE_SPACES = True
TO_LOWERCASE = True
ADD_PREFIX = "cleaned_"  # leave empty "" to disable
ADD_SUFFIX = ""          # leave empty "" to disable
REMOVE_SPECIAL_CHARS = True
PREVIEW_ONLY = False     # If True, only show what would happen

def clean_filename(name):
    name, ext = os.path.splitext(name)
    if REPLACE_SPACES:
        name = name.replace(" ", "_")
    if REMOVE_SPECIAL_CHARS:
        name = re.sub(r"[^\w\-]", "", name)
    if TO_LOWERCASE:
        name = name.lower()
        ext = ext.lower()  # lowercase the extension too (.JPG -> .jpg)
    name = f"{ADD_PREFIX}{name}{ADD_SUFFIX}"
    return name + ext

def bulk_rename(folder):
    if not os.path.exists(folder):
        print("❌ Folder not found.")
        return
    files = os.listdir(folder)
    renamed_count = 0
    for filename in files:
        src = os.path.join(folder, filename)
        if os.path.isfile(src):
            new_name = clean_filename(filename)
            dst = os.path.join(folder, new_name)
            if src == dst:
                continue  # Skip if name doesn't change
            print(f"🔁 {filename} → {new_name}")
            if not PREVIEW_ONLY:
                os.rename(src, dst)
            renamed_count += 1
    print(f"\n✅ {renamed_count} file(s) renamed.")

if __name__ == "__main__":
    bulk_rename(FOLDER_PATH)
Suppose your folder has files like:
Photo 01!.JPG
Screenshot 2023-12-01 @2PM.png
Document - Final (Copy).pdf
The result will be:
cleaned_photo_01.jpg
cleaned_screenshot_2023-12-01_2pm.png
cleaned_document_-_final_copy.pdf
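One edge case: two different files can clean to the same name, and `os.rename` will then overwrite on some platforms. A tiny helper that appends a counter when a target name is already taken, which you could call before renaming (`dedupe_name` is a hypothetical addition, not part of the script above):

```python
import os

def dedupe_name(name, taken):
    """Return `name`, or name_1 / name_2 / ... if it's already in `taken`."""
    if name not in taken:
        return name
    base, ext = os.path.splitext(name)
    count = 1
    while f"{base}_{count}{ext}" in taken:
        count += 1
    return f"{base}_{count}{ext}"
```

In bulk_rename, keep a set of names produced so far and pass it as `taken`.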
10. Daily Weather Notification via Telegram
Pulls weather data using an API and sends me a daily message on Telegram. No more guessing whether I need an umbrella.
What It Does:
This script fetches the current weather for your city every morning and sends it to your Telegram chat using a bot.
You’ll get a neat weather summary like:
📍 Delhi
🌤️ Clear Sky
🌡 Temp: 33°C (Feels like 36°C)
💧 Humidity: 48%
💨 Wind: 12 km/h
Have a great day! ☀️
Requirements:
- OpenWeatherMap API Key (Free plan works)
- A Telegram bot and your chat_id
Install Dependencies:
pip install requests
Python Script: daily_weather_telegram.py
# daily_weather_telegram.py
import requests

# === CONFIGURATION ===
CITY = "Delhi"
COUNTRY_CODE = "IN"
API_KEY = "your_openweathermap_api_key"
TELEGRAM_BOT_TOKEN = "your_telegram_bot_token"
TELEGRAM_CHAT_ID = "your_chat_id"
# ======================

def get_weather():
    url = f"https://api.openweathermap.org/data/2.5/weather?q={CITY},{COUNTRY_CODE}&appid={API_KEY}&units=metric"
    response = requests.get(url)
    data = response.json()
    if response.status_code != 200:
        return f"⚠️ Weather error: {data.get('message', 'Unknown error')}"

    weather = data["weather"][0]["description"].capitalize()
    temp = data["main"]["temp"]
    feels_like = data["main"]["feels_like"]
    humidity = data["main"]["humidity"]
    wind_speed = data["wind"]["speed"]  # m/s in metric mode
    city_name = data["name"]

    message = (
        f"📍 {city_name}\n"
        f"🌤️ {weather}\n"
        f"🌡 Temp: {temp:.0f}°C (Feels like {feels_like:.0f}°C)\n"
        f"💧 Humidity: {humidity}%\n"
        f"💨 Wind: {wind_speed * 3.6:.0f} km/h\n\n"
        f"Have a great day! ☀️"
    )
    return message

def send_telegram_message(message):
    url = f"https://api.telegram.org/bot{TELEGRAM_BOT_TOKEN}/sendMessage"
    payload = {"chat_id": TELEGRAM_CHAT_ID, "text": message}
    response = requests.post(url, json=payload)
    if response.status_code != 200:
        print("❌ Failed to send message:", response.json())
    else:
        print("✅ Weather sent via Telegram!")

if __name__ == "__main__":
    weather_message = get_weather()
    send_telegram_message(weather_message)
Optional: Automate Daily with cron (Linux/macOS)
crontab -e
Add this to run at 8:00 AM daily:
0 8 * * * /usr/bin/python3 /path/to/daily_weather_telegram.py
11. Sync Calendar Events with a Notion Database
Every morning, pulls upcoming Google Calendar events and logs them into my Notion dashboard using their API.
What It Does:
This script syncs your upcoming Google Calendar events directly into a Notion database — so you can manage your meetings, schedules, and reminders all in one place.
Each calendar event gets added as a row in Notion with:
Event Title
Start & End Time
Description
Calendar Name
Requirements:
- Google Calendar API access (OAuth credentials)
- A Notion integration with database access
- Python 3
Install Dependencies:
pip install google-api-python-client google-auth-httplib2 google-auth-oauthlib requests
Python Script: sync_calendar_to_notion.py
# sync_calendar_to_notion.py
import datetime
import json
import os
import requests
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# === CONFIG ===
NOTION_DATABASE_ID = "your_notion_database_id"
NOTION_TOKEN = "secret_your_notion_token"
SCOPES = ["https://www.googleapis.com/auth/calendar.readonly"]
# ==============

def get_google_events():
    creds = None
    if os.path.exists("token.json"):
        creds = Credentials.from_authorized_user_file("token.json", SCOPES)
    # Refresh an expired token, or run the OAuth flow on first use
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
            creds = flow.run_local_server(port=0)
        with open("token.json", "w") as token:
            token.write(creds.to_json())

    service = build("calendar", "v3", credentials=creds)
    now = datetime.datetime.utcnow().isoformat() + "Z"
    end = (datetime.datetime.utcnow() + datetime.timedelta(days=1)).isoformat() + "Z"
    events_result = service.events().list(
        calendarId="primary", timeMin=now, timeMax=end,
        maxResults=10, singleEvents=True, orderBy="startTime"
    ).execute()
    return events_result.get("items", [])

def create_notion_event(event):
    url = "https://api.notion.com/v1/pages"
    headers = {
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json"
    }
    title = event.get("summary", "No Title")
    description = event.get("description", "")
    start = event["start"].get("dateTime", event["start"].get("date"))
    end = event["end"].get("dateTime", event["end"].get("date"))
    calendar = event.get("organizer", {}).get("email", "Unknown")
    payload = {
        "parent": {"database_id": NOTION_DATABASE_ID},
        "properties": {
            "Name": {"title": [{"text": {"content": title}}]},
            "Start": {"date": {"start": start}},
            "End": {"date": {"start": end}},
            "Description": {"rich_text": [{"text": {"content": description}}]},
            "Calendar": {"select": {"name": calendar}}
        }
    }
    response = requests.post(url, headers=headers, data=json.dumps(payload))
    if response.status_code != 200:
        print("❌ Failed to add to Notion:", response.json())
    else:
        print(f"✅ Added: {title}")

def main():
    events = get_google_events()
    if not events:
        print("No upcoming events found.")
        return
    for event in events:
        create_notion_event(event)

if __name__ == "__main__":
    main()
Sample Output in Notion:
| Name | Start Time | End Time | Calendar | Description |
| ---------------- | ------------------- | ------------------- | --------------------------------------- | --------------- |
| Weekly Sync Call | 2025-07-24T10:00:00 | 2025-07-24T11:00:00 | your@email.com | Sprint planning |
12. Clean Messy CSV Files
What It Does:
This script automatically cleans messy CSV files by:
Removing empty rows and columns
Stripping whitespace from headers and values
Filling or dropping missing values
Converting column types if possible
Saving a cleaned version of the CSV
Great for quick data pre-processing before analysis or uploading to a database.
Install Required Package:
pip install pandas
Python Script: clean_csv.py
# clean_csv.py
import pandas as pd
import os

def clean_csv(input_path, output_path=None, fillna_strategy="ffill"):
    # Load CSV, falling back to Latin-1 if UTF-8 fails
    try:
        df = pd.read_csv(input_path, encoding='utf-8')
    except UnicodeDecodeError:
        df = pd.read_csv(input_path, encoding='ISO-8859-1')
    print("🔍 Original shape:", df.shape)

    # Strip whitespace from column names and string values
    df.columns = df.columns.str.strip()
    df = df.applymap(lambda x: x.strip() if isinstance(x, str) else x)

    # Treat whitespace-only cells as missing, then drop empty rows/columns
    df = df.replace("", pd.NA)
    df.dropna(how="all", inplace=True)
    df.dropna(axis=1, how="all", inplace=True)

    # Handle missing values (can be customized)
    if fillna_strategy == "ffill":
        df = df.ffill()
    elif fillna_strategy == "bfill":
        df = df.bfill()
    elif fillna_strategy == "drop":
        df.dropna(inplace=True)
    else:
        df.fillna(fillna_strategy, inplace=True)  # string or number

    # Try converting text columns to numeric first, then datetime
    # (to_datetime happily turns plain numbers into timestamps, so order matters)
    for col in df.columns:
        if df[col].dtype == "object":
            try:
                df[col] = pd.to_numeric(df[col])
            except Exception:
                try:
                    df[col] = pd.to_datetime(df[col])
                except Exception:
                    pass

    print("✅ Cleaned shape:", df.shape)

    # Save cleaned file
    if not output_path:
        base, ext = os.path.splitext(input_path)
        output_path = f"{base}_cleaned{ext}"
    df.to_csv(output_path, index=False)
    print(f"📁 Cleaned CSV saved to: {output_path}")

# Example usage
if __name__ == "__main__":
    clean_csv("your_messy_file.csv")  # Replace with your file
Sample Before/After
Before:
Name , Age , Email
Alice , 25 , alice@example.com
, ,
Bob , , bob@example.com
After:
Name,Age,Email
Alice,25,alice@example.com
Bob,25,bob@example.com
13. Check for Leaked Passwords Using HaveIBeenPwned API
Every month, this script checks my passwords and notifies me if any appear in data breaches.
What It Does:
This script checks if a given password has appeared in a known data breach using the Have I Been Pwned Pwned Passwords API (k-anonymity model).
It does NOT send the full password — only a partial hash — so your secrets stay secure.
Dependencies:
pip install requests
Python Script: check_password_leak.py
# check_password_leak.py
import hashlib
import requests

def get_password_hash(password):
    """Return the SHA-1 hash of the password (uppercase hex)."""
    return hashlib.sha1(password.encode('utf-8')).hexdigest().upper()

def check_password_pwned(password):
    """
    Check if the given password has been exposed in data breaches using
    the HaveIBeenPwned range API (only the hash prefix is ever sent).
    """
    sha1_hash = get_password_hash(password)
    prefix, suffix = sha1_hash[:5], sha1_hash[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    response = requests.get(url)
    if response.status_code != 200:
        raise RuntimeError(f"API error: {response.status_code}")
    # Look for our hash suffix among the returned candidates
    hashes = (line.split(":") for line in response.text.splitlines())
    for returned_suffix, count in hashes:
        if returned_suffix == suffix:
            return int(count)
    return 0  # Not found

# Example usage
if __name__ == "__main__":
    import getpass
    print("🔐 Check if your password has been exposed in a data breach")
    pwd = getpass.getpass("Enter a password to check (input hidden): ")
    count = check_password_pwned(pwd)
    if count:
        print(f"🚨 This password has been seen {count:,} times in data breaches!")
    else:
        print("✅ Your password was NOT found in any known breaches. (Good job!)")
Example Output:
🔐 Check if your password has been exposed in a data breach
Enter a password to check (input hidden): ***********
🚨 This password has been seen 543,200 times in data breaches!
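To see the k-anonymity trick concretely: only the first five hex characters of the SHA-1 hash ever leave your machine, and the match against the suffix happens locally. Hashing is offline, so this sketch runs without touching the network:

```python
import hashlib

# SHA-1 of the (terrible) password "password"
sha1 = hashlib.sha1("password".encode("utf-8")).hexdigest().upper()
prefix, suffix = sha1[:5], sha1[5:]

print(prefix)  # 5BAA6 (the only thing sent to the API)
print(suffix)  # compared locally against the API's returned suffix list
```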
14. Auto-Open Daily Tabs on Browser Startup
My daily dashboard: email, analytics, Slack, notes — all open with a click thanks to webbrowser and a simple list.
What It Does:
This script automatically opens your favorite or essential websites in your default browser every morning (or on system startup).
It’s like a morning routine in code: news, email, weather, workspace — all loaded in one go.
Python Script: daily_tabs_launcher.py
# daily_tabs_launcher.py
import webbrowser
from datetime import datetime

# Define your daily tab routine
daily_tabs = {
    "Monday": [
        "https://calendar.google.com",
        "https://mail.google.com",
        "https://notion.so",
        "https://slack.com"
    ],
    "Tuesday": [
        "https://calendar.google.com",
        "https://mail.google.com",
        "https://chat.openai.com",
        "https://github.com"
    ],
    "Wednesday": [
        "https://calendar.google.com",
        "https://stackoverflow.com",
        "https://medium.com",
        "https://notion.so"
    ],
    "Thursday": [
        "https://chat.openai.com",
        "https://figma.com",
        "https://mail.google.com",
        "https://notion.so"
    ],
    "Friday": [
        "https://gmail.com",
        "https://calendar.google.com",
        "https://github.com",
        "https://youtube.com"
    ],
    "Saturday": [
        "https://youtube.com",
        "https://netflix.com",
        "https://reddit.com"
    ],
    "Sunday": [
        "https://youtube.com",
        "https://open.spotify.com",
        "https://medium.com"
    ]
}

def open_daily_tabs():
    today = datetime.now().strftime("%A")
    urls = daily_tabs.get(today, [])
    if not urls:
        print(f"No tabs set for {today}.")
        return
    print(f"🚀 Opening tabs for {today}:")
    for url in urls:
        print(f"→ {url}")
        webbrowser.open_new_tab(url)

if __name__ == "__main__":
    open_daily_tabs()
How to Run Automatically on Startup (Optional):
On Windows:
- Press Win + R, type shell:startup, and hit Enter.
- Paste a shortcut to the Python script in that folder.
On macOS:
- Use Automator or add it as a Login Item under System Preferences > Users & Groups > Login Items.
On Linux:
- Add it to your desktop environment’s startup applications list, or use ~/.bashrc.
15. Backup Key Files to Google Drive Automatically
Monitors a folder and backs up key files to Google Drive on a schedule, built with the official Google Drive API client.
What It Does:
This script automatically uploads important files or folders to your Google Drive, ensuring you always have a backup in the cloud.
Think of it as your personal backup bot — silently keeping your data safe every day.
Requirements:
- Google Drive API credentials
- The google-api-python-client, google-auth-httplib2, and google-auth-oauthlib packages
Install them via:
pip install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib
Step-by-Step Setup
- Go to the Google Developers Console
- Create a new project → Enable Google Drive API
- Create OAuth 2.0 credentials
- Download the credentials.json file and place it in the same directory as your script
Python Script: backup_to_drive.py
# backup_to_drive.py
import os
import pickle

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from googleapiclient.http import MediaFileUpload

# Define the folder you want to back up
FOLDER_TO_BACKUP = "/path/to/your/folder"  # e.g. "C:/Users/you/Documents/Important"
FILE_TYPES = ['.pdf', '.docx', '.xlsx', '.txt', '.jpg']  # Adjust to your needs

# Google Drive folder ID where files will be uploaded
DRIVE_FOLDER_ID = 'YOUR_GOOGLE_DRIVE_FOLDER_ID'

# Scope limited to files created by this app (not full Drive access)
SCOPES = ['https://www.googleapis.com/auth/drive.file']

def authenticate():
    creds = None
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file('credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    return build('drive', 'v3', credentials=creds)

def upload_file(service, file_path, folder_id):
    file_name = os.path.basename(file_path)
    media = MediaFileUpload(file_path, mimetype='application/octet-stream')
    file_metadata = {
        'name': file_name,
        'parents': [folder_id]
    }
    uploaded_file = service.files().create(
        body=file_metadata,
        media_body=media,
        fields='id'
    ).execute()
    print(f"✅ Uploaded: {file_name} (ID: {uploaded_file.get('id')})")

def backup_files():
    service = authenticate()
    for root, _, files in os.walk(FOLDER_TO_BACKUP):
        for file in files:
            if any(file.endswith(ext) for ext in FILE_TYPES):
                full_path = os.path.join(root, file)
                upload_file(service, full_path, DRIVE_FOLDER_ID)

if __name__ == '__main__':
    backup_files()
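As written, the script re-uploads every matching file on each run, creating duplicates in Drive. A hedged sketch of a dedupe check you could layer on top: `drive_query` builds a Drive v3 search expression, and the hypothetical `file_exists` helper (not part of the script above) uses it to skip files already present in the target folder.

```python
# Hypothetical dedupe helpers -- an assumption layered on top of the
# backup script, not part of it.

def drive_query(file_name: str, folder_id: str) -> str:
    """Build a Drive v3 'q' expression matching a file by name inside a folder."""
    escaped = file_name.replace("'", "\\'")  # single quotes must be escaped in the query
    return f"name = '{escaped}' and '{folder_id}' in parents and trashed = false"

def file_exists(service, file_name: str, folder_id: str) -> bool:
    """Return True if a non-trashed file with this name is already in the folder."""
    result = service.files().list(
        q=drive_query(file_name, folder_id),
        fields="files(id)",
        pageSize=1,
    ).execute()
    return bool(result.get("files"))
```

Inside `backup_files()`, you would then guard the upload with `if not file_exists(service, file, DRIVE_FOLDER_ID):` before calling `upload_file`.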
How to Automate:
- Windows: Use Task Scheduler to run it daily/weekly
- macOS/Linux: Add it to a cron job, e.g.
0 10 * * * /usr/bin/python3 /path/to/backup_to_drive.py
16. Auto-Lock Screen If Idle for Too Long
If no mouse/keyboard activity is detected for 5 minutes, this script locks the screen. Productivity + security win.
What It Does:
This script monitors user activity (keyboard & mouse) and automatically locks the screen if there’s no activity for a set time (e.g., 5 minutes).
Great for protecting your system when you walk away and forget to lock it.
Requirements:
Install the required package:
pip install pyautogui
On Windows: no extra setup.
On Linux: make sure gnome-screensaver is installed.
On macOS: the script uses the built-in CGSession utility.
Python Script: auto_lock_if_idle.py
# auto_lock_if_idle.py
import pyautogui
import time
import os
import platform

# Idle time threshold in seconds (e.g., 300 sec = 5 min)
IDLE_THRESHOLD = 300
# Check interval in seconds
CHECK_INTERVAL = 10

def get_platform_lock_command():
    system = platform.system()
    if system == "Windows":
        return "rundll32.exe user32.dll,LockWorkStation"
    elif system == "Darwin":  # macOS
        return "/System/Library/CoreServices/Menu\\ Extras/User.menu/Contents/Resources/CGSession -suspend"
    elif system == "Linux":
        return "gnome-screensaver-command -l"
    else:
        raise NotImplementedError("Unsupported OS")

def lock_screen():
    os.system(get_platform_lock_command())

def is_idle():
    # Simple heuristic: if the mouse hasn't moved within one second,
    # treat the user as idle. (Keyboard activity is not detected here.)
    x, y = pyautogui.position()
    time.sleep(1)
    return (x, y) == pyautogui.position()

def main():
    idle_seconds = 0
    print("🔒 Auto-lock monitor started...")
    while True:
        if is_idle():
            idle_seconds += CHECK_INTERVAL
            print(f"🕒 Idle for {idle_seconds} seconds")
        else:
            idle_seconds = 0
        if idle_seconds >= IDLE_THRESHOLD:
            print("⏳ Inactivity threshold reached. Locking screen...")
            lock_screen()
            idle_seconds = 0  # Reset after locking
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    main()
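The mouse-position heuristic above misses keyboard-only activity, so typing for ten minutes straight could still trigger a lock. A sketch of a more robust approach, assuming the third-party pynput library (`pip install pynput`) is installed: every mouse or keyboard event stamps a shared monitor, and the main loop asks how long it has been since the last stamp.

```python
import time
import threading

class ActivityMonitor:
    """Tracks the timestamp of the most recent input event."""

    def __init__(self):
        self._last = time.monotonic()
        self._lock = threading.Lock()

    def touch(self, *args, **kwargs):
        # Called by any input listener callback; args are ignored.
        with self._lock:
            self._last = time.monotonic()

    def idle_for(self) -> float:
        """Seconds elapsed since the last recorded input event."""
        with self._lock:
            return time.monotonic() - self._last

if __name__ == "__main__":
    # Hypothetical wiring with pynput -- both keyboard and mouse
    # feed the same monitor. Requires a running desktop session.
    from pynput import mouse, keyboard

    monitor = ActivityMonitor()
    mouse.Listener(on_move=monitor.touch, on_click=monitor.touch,
                   on_scroll=monitor.touch).start()
    keyboard.Listener(on_press=monitor.touch).start()

    while True:
        time.sleep(10)
        if monitor.idle_for() >= 300:
            print("Idle threshold reached")
            break
```

You could drop `ActivityMonitor` into the script above and replace `is_idle()` with a check on `monitor.idle_for()`.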
17. AI Chatbot for Personal Notes
Built on my own notes (using the OpenAI API) to answer questions like “What did I learn from Book X?” or “What’s the React hook for…?”
What It Does:
This script turns your personal notes into a searchable chatbot using OpenAI’s GPT API. You can ask questions like:
“What did I write about machine learning?”
“Summarize my notes on productivity.”
The bot understands natural language and responds contextually — like having your second brain on call.
Requirements:
Install dependencies:
pip install openai python-dotenv
Directory Structure Example
personal_notes/
├── productivity.md
├── python_tips.txt
├── mental_models.md
Setup Your API Key
Create a .env file:
OPENAI_API_KEY=your_openai_api_key
Python Script: personal_notes_chatbot.py
# personal_notes_chatbot.py
import os

from dotenv import load_dotenv
from openai import OpenAI  # uses the openai>=1.0 client interface

load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

NOTES_DIR = "personal_notes"

def load_notes():
    content = []
    for file in os.listdir(NOTES_DIR):
        if file.endswith(".txt") or file.endswith(".md"):
            with open(os.path.join(NOTES_DIR, file), "r", encoding="utf-8") as f:
                content.append(f"## {file} ##\n" + f.read())
    return "\n\n".join(content)

def ask_gpt(query, notes_context):
    prompt = f"""
You are a helpful AI assistant. I will provide you with my personal notes and a question. Use the notes to answer.
NOTES:
{notes_context}
QUESTION:
{query}
"""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.4,
    )
    return response.choices[0].message.content.strip()

def main():
    print("🤖 Personal Notes Chatbot Ready!")
    notes = load_notes()
    while True:
        query = input("\nAsk me something about your notes (or type 'exit'): ")
        if query.lower() == "exit":
            break
        answer = ask_gpt(query, notes)
        print(f"\n🧠 GPT: {answer}")

if __name__ == "__main__":
    main()
Example Usage
You ask:
What are my notes on building habits?
Bot replies:
Based on your notes, building habits involves starting small, setting daily reminders, and avoiding perfectionism...
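One caveat: the script stuffs every note into a single prompt, which will blow past the model’s context window as your notes grow. A naive workaround is to rank note chunks by keyword overlap with the question and send only the top few; `select_relevant` below is a hypothetical helper sketching that idea, not a substitute for proper embedding-based retrieval.

```python
def select_relevant(chunks, query, k=3):
    """Rank note chunks by naive keyword overlap with the query and keep the top k."""
    query_words = set(query.lower().split())

    def score(chunk):
        return len(query_words & set(chunk.lower().split()))

    # Highest-overlap chunks first; ties keep their original order (sorted is stable).
    return sorted(chunks, key=score, reverse=True)[:k]
```

In `main()`, you would build `content` as a list (as `load_notes` already does internally) and pass `"\n\n".join(select_relevant(content, query))` to `ask_gpt` instead of the full notes string.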
18. Invoice Generator from Google Sheets
At the end of every month, it pulls project data from Google Sheets, calculates billing, and generates a professional PDF invoice using ReportLab.
What It Does:
This Python script reads data from a Google Sheet, extracts invoice info, and generates a beautiful PDF invoice for each row. Perfect for freelancers, small businesses, or side hustlers.
Requirements
Install the dependencies:
pip install gspread oauth2client reportlab
Also:
- Create a Google Cloud project
- Enable the Google Sheets API
- Create service account credentials and share the Sheet with its email
- Save the credentials as creds.json in your project
Google Sheet Format (Example)
| Invoice # | Date | Client Name | Description | Amount |
| --------- | ---------- | ----------- | ---------------- | ------ |
| INV001 | 2025-07-01 | Acme Corp | Website Redesign | 1500 |
| INV002 | 2025-07-03 | Beta LLC | Monthly Retainer | 800 |
Python Script: generate_invoices.py
# generate_invoices.py
import gspread
from oauth2client.service_account import ServiceAccountCredentials
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

# Setup Google Sheets access
def get_sheet():
    scope = ['https://spreadsheets.google.com/feeds',
             'https://www.googleapis.com/auth/drive']
    creds = ServiceAccountCredentials.from_json_keyfile_name('creds.json', scope)
    client = gspread.authorize(creds)
    sheet = client.open("Invoices").sheet1  # Change the spreadsheet name if needed
    return sheet.get_all_records()

# Create invoice PDF
def generate_invoice(data):
    invoice_num = data['Invoice #']
    client = data['Client Name']
    amount = float(data['Amount'])  # the sheet may return the amount as a string
    filename = f"{invoice_num}_{client.replace(' ', '_')}.pdf"
    c = canvas.Canvas(filename, pagesize=A4)
    c.setFont("Helvetica-Bold", 18)
    c.drawString(50, 800, f"Invoice: {invoice_num}")
    c.setFont("Helvetica", 12)
    c.drawString(50, 770, f"Date: {data['Date']}")
    c.drawString(50, 750, f"Client: {client}")
    c.drawString(50, 730, f"Description: {data['Description']}")
    c.drawString(50, 710, f"Amount: ${amount:.2f}")
    c.line(50, 700, 550, 700)
    c.drawString(50, 680, "Thank you for your business!")
    c.save()
    print(f"✅ Saved invoice: {filename}")

def main():
    print("📤 Fetching invoice data from Google Sheets...")
    records = get_sheet()
    print(f"Found {len(records)} rows. Generating invoices...\n")
    for row in records:
        generate_invoice(row)

if __name__ == "__main__":
    main()
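One practical wrinkle: spreadsheet cells often arrive as formatted strings like "$1,500.00" rather than plain numbers, and the `:.2f` formatting in `generate_invoice` assumes a numeric value. A hypothetical `parse_amount` helper (an assumption, not part of the script above) makes the coercion robust:

```python
def parse_amount(value) -> float:
    """Coerce a sheet cell ('$1,500.00', '1500', 1500) into a float."""
    if isinstance(value, (int, float)):
        return float(value)
    # Strip currency symbol, thousands separators, and surrounding whitespace.
    cleaned = str(value).replace("$", "").replace(",", "").strip()
    return float(cleaned)
```

Inside `generate_invoice`, you would format with `parse_amount(data['Amount'])` instead of using the raw cell value.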
The Power of Small Scripts
These scripts don’t look flashy. But they free up hours of mental load every week.
The beauty of Python is in these small, quiet wins that add up to huge gains.
If you want to get better at automation, don’t wait. Start building your own toolbelt — one script at a time.
Call to Action
Want the full source code for these automations? Drop a comment or message — if enough readers ask, I’ll create a GitHub repo + tutorial series.
Follow me for more Python automation, productivity hacks, and practical tech tips every week.
