Learning to code by generating scripts using LLMs (chatGPT and Claude) - part two

This post follows up on the previous one, Learning to code by generating scripts using LLMs (chatGPT and Claude).

Thanks for your responses and feedback. I am working with some interns on our team, and below is a list we identified of scripts we could generate; thoughts welcome. The list was generated using chatGPT. I also share code (also LLM-generated) for running the script from the previous post in Flask.

There are many challenging aspects to these scripts (e.g. learning PowerShell, Git, etc.), but in my view it is a great learning experience.

1. Utilities

  1. Backup Script: Automate file and folder backups with versioning and optional compression.
  2. Web Scraper: Extract headlines or data tables from websites and save them to CSV or JSON.
  3. Disk Usage Monitor: Script to check disk usage and alert when a threshold is crossed.
  4. File Renamer: Batch rename files based on a specific pattern or date.
  5. Log Rotation: Automatically archive and compress old log files.
  6. Clipboard Manager: Save clipboard history to a file for later use.
  7. Duplicate File Finder: Identify duplicate files in a folder and prompt for deletion.
  8. System Cleaner: Remove temporary files and clear caches.
  9. Password Generator: Generate random secure passwords with specified length and complexity.
  10. URL Shortener: Use a service API to shorten URLs from a file or input.
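As a taste of what the LLMs produce for this category, here is a minimal sketch of the Duplicate File Finder (item 1.7). It groups files by a SHA-256 hash of their contents; the deletion prompt is left out, so it only reports the duplicate groups.

```python
import hashlib
import os

def find_duplicates(folder):
    """Group files under `folder` by content hash; return groups of duplicates."""
    hashes = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            hashes.setdefault(digest, []).append(path)
    # Only keep hashes that map to more than one file
    return [paths for paths in hashes.values() if len(paths) > 1]
```

Hashing the full file is simple but slow for large folders; a common refinement is to group by file size first and only hash size collisions.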


2. Invoking REST APIs

  1. Weather Information: Fetch and display weather details using an open weather API.
  2. Stock Price Tracker: Fetch real-time stock data and save it locally.
  3. Translation Tool: Use a translation API to translate text from one language to another.
  4. Currency Converter: Fetch exchange rates and convert amounts dynamically.
  5. GitHub Repo Info: Use the GitHub API to fetch details about a repository.
  6. News Fetcher: Retrieve top headlines using a news API.
  7. Flight Information: Fetch flight status using an airline API.
  8. Movie Database: Query movie details using IMDb or TMDb APIs.
  9. Authentication Example: Fetch a token using an OAuth2-based API.
  10. Todo App Integration: Add tasks to a to-do list service like Todoist via their API.
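For the Weather Information idea (item 2.1), a sketch using only the standard library against the free Open-Meteo API (no API key needed); the coordinates you pass in are up to you, and the one-line summary format is just an example.

```python
import json
import urllib.request

def fetch_current_weather(lat, lon):
    """Fetch the current weather for a latitude/longitude from Open-Meteo."""
    url = ("https://api.open-meteo.com/v1/forecast"
           f"?latitude={lat}&longitude={lon}&current_weather=true")
    with urllib.request.urlopen(url, timeout=10) as resp:
        payload = json.load(resp)
    return payload["current_weather"]

def summarize(current):
    """Turn the API payload into a one-line summary string."""
    return f"{current['temperature']} C, wind {current['windspeed']} km/h"
```

Splitting the fetch from the formatting makes the formatting testable without a network call, which is a useful habit when working with any REST API.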


3. Running Scripts Automatically

  1. Backup Scheduler: Automate backups daily using CRON or Windows Task Scheduler.
  2. Log Analyzer: Schedule periodic analysis of logs for errors or anomalies.
  3. API Poller: Periodically check for updates from an API and store results.
  4. Email Notifier: Send a summary email of system events daily.
  5. Database Cleanup: Schedule regular cleanup of old records in a database.
  6. Reminder Script: Send reminders (via email or notification) for upcoming tasks.
  7. Social Media Scheduler: Post tweets or other social media updates automatically.
  8. Script Validator: Check the syntax of code repositories regularly.
  9. Data Sync: Schedule sync between two locations (local/remote).
  10. Performance Monitor: Periodically log system resource usage.
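The Performance Monitor and Disk Usage Monitor ideas overlap; here is a minimal sketch that logs disk usage and flags when a threshold is crossed (the 90% threshold is an assumption). A script like this is what you would then schedule via cron or Windows Task Scheduler.

```python
import shutil
import time

def disk_usage_percent(path="/"):
    """Return the percentage of the disk at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total * 100

def monitor(path="/", threshold=90.0, interval=60, iterations=1):
    """Check usage every `interval` seconds; print an alert above `threshold`."""
    for i in range(iterations):
        pct = disk_usage_percent(path)
        status = "ALERT" if pct > threshold else "ok"
        print(f"{time.strftime('%H:%M:%S')} {path} {pct:.1f}% {status}")
        if i + 1 < iterations:
            time.sleep(interval)
```

With cron you would run it once per schedule (e.g. every five minutes) rather than looping inside the script, which keeps the script itself stateless.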


4. Git (Command Line)

  1. Auto-Commit: Commit all changes with a pre-defined message.
  2. Branch Switcher: List all branches and easily switch between them.
  3. Repo Cloner: Clone multiple repositories from a list.
  4. Merge Conflict Checker: Detect and report merge conflicts automatically.
  5. Git Stats: Analyze commit history and contributors for a repo.
  6. Git Cleaner: Remove old or unused branches from local and remote.
  7. Pre-Commit Hook: Validate code formatting or run tests before committing.
  8. Git Searcher: Search for a keyword across all branches of a repo.
  9. Git Log Formatter: Format and output git logs in a custom way (e.g., JSON).
  10. Automated Pull: Pull updates from multiple repositories regularly.
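Most of the Git ideas reduce to driving the `git` command line from a script. A minimal sketch of the Auto-Commit idea (item 4.1), using a small wrapper around `subprocess`; the commit message default is a placeholder.

```python
import subprocess

def git(args, cwd="."):
    """Run a git command in `cwd` and return its stdout (raises on failure)."""
    result = subprocess.run(
        ["git"] + args, cwd=cwd, capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

def auto_commit(message="Auto-commit: periodic snapshot", cwd="."):
    """Stage everything and commit, but only if there are actual changes."""
    git(["add", "-A"], cwd=cwd)
    if git(["status", "--porcelain"], cwd=cwd):
        git(["commit", "-m", message], cwd=cwd)
        return True
    return False
```

The same `git()` helper can back the other ideas in this section (branch listing with `git branch`, log formatting with `git log --pretty`, and so on).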


5. JSON (Analyzing Event Logs)

  1. Log Parser: Extract and filter specific events from a JSON log file.
  2. Error Aggregator: Identify and group errors from JSON logs.
  3. Event Frequency Analyzer: Calculate the frequency of specific events in logs.
  4. Response Time Analyzer: Analyze API response times from JSON logs.
  5. Log Merger: Merge multiple JSON log files into one.
  6. Log Formatter: Pretty-print or compact JSON logs for readability or storage.
  7. Alert Generator: Trigger alerts based on specific patterns in JSON logs.
  8. JSON Diff Checker: Compare two JSON files and highlight differences.
  9. Event Timeline Generator: Create a timeline of events from JSON logs.
  10. Top N Events: Identify the top N most frequent events in logs.
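The Top N Events idea (item 5.10) is a nice first JSON exercise because `collections.Counter` does most of the work. This sketch assumes JSON-lines input (one JSON object per line) with an "event" field; both are assumptions about the log format.

```python
import json
from collections import Counter

def top_events(lines, n=3, key="event"):
    """Count `key` values across JSON-lines records and return the top n."""
    counts = Counter()
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        record = json.loads(line)
        if key in record:
            counts[record[key]] += 1
    return counts.most_common(n)
```

Because it takes an iterable of lines, the same function works on a file handle, a list in a test, or a stream from `tail -f`.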


6. Database Management

  1. CRUD Operations: Create, Read, Update, and Delete records in a database.
  2. Data Importer: Import data from a CSV or JSON file into a database.
  3. Schema Validator: Check if a database matches a predefined schema.
  4. Database Backup: Backup a database to a file.
  5. Query Automator: Run pre-defined queries and save results regularly.
  6. Connection Tester: Test and validate database connections.
  7. Index Analyzer: Suggest missing or unused indexes in a database.
  8. Data Archiver: Move old data to an archive table.
  9. Database Monitor: Monitor database size and performance metrics.
  10. User Access Auditor: Report user permissions and access logs.


7. Cloud (Azure PowerShell and AWS CLI)

  1. VM Deployer: Deploy a virtual machine on Azure or AWS.
  2. Storage Management: Automate the creation, listing, and deletion of storage containers or buckets.
  3. Cost Analyzer: Retrieve and analyze cloud billing data.
  4. Resource Cleanup: Identify and remove unused cloud resources.
  5. Backup to Cloud: Upload local files to Azure Blob or AWS S3.
  6. Serverless Deployment: Deploy a Lambda function or Azure Function.
  7. Service Status Checker: Monitor the health of specific cloud services.
  8. Cloud Watch Alerts: Create and manage monitoring alerts for AWS or Azure resources.
  9. IAM Role Creator: Automate the creation of user roles with specific permissions.
  10. Cloud Config Validator: Check compliance of resources against defined policies.
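For Backup to Cloud (item 7.5), one pattern is to shell out to the AWS CLI from Python. This sketch separates building the `aws s3 sync` command from running it, so the command can be inspected before anything is uploaded; the bucket name anywhere you use this would be your own, and the CLI must be installed and configured for the run step to work.

```python
import subprocess

def build_s3_sync_command(local_dir, bucket, prefix=""):
    """Build (but do not run) an `aws s3 sync` command line."""
    dest = f"s3://{bucket}/{prefix}" if prefix else f"s3://{bucket}"
    return ["aws", "s3", "sync", local_dir, dest, "--only-show-errors"]

def backup_to_s3(local_dir, bucket, prefix=""):
    """Run the sync; requires the AWS CLI to be installed and configured."""
    subprocess.run(build_s3_sync_command(local_dir, bucket, prefix), check=True)
```

The same build/run split works for Azure PowerShell commands, and makes "dry run" modes trivial to add.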


8. Virtual Machines and DevOps

  1. VM Snapshot Script: Automate snapshots of virtual machines.
  2. VM Status Checker: Periodically check the status of VMs and alert if down.
  3. CI/CD Pipeline Automation: Trigger builds or deployments based on a condition.
  4. VM Resource Monitor: Log CPU, memory, and disk usage of a VM.
  5. Deployment Script: Automate deployment of applications to a VM.
  6. Container Manager: Automate the creation, management, and removal of containers.
  7. Rollback Script: Automate rollback to a previous VM snapshot.
  8. Infrastructure as Code: Automate infrastructure setup with Terraform or similar tools.
  9. DevOps Alert Script: Notify on failed builds or deployments.
  10. Code Quality Checker: Automate running linting tools and code coverage reports.
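A starting point for the Code Quality Checker (item 8.10), using only the standard library: it compiles every `.py` file under a folder and collects syntax errors. Real projects would layer a linter (flake8, pylint) and coverage on top, but this shows the shape of the script.

```python
import pathlib
import py_compile

def check_syntax(root):
    """Compile every .py file under `root`; return a list of (path, error)."""
    failures = []
    for path in pathlib.Path(root).rglob("*.py"):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError as exc:
            failures.append((str(path), str(exc)))
    return failures
```

Run from a CI job or a pre-commit hook, an empty return list means the tree at least parses.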

Implementing the script in Flask

#!/usr/bin/env python3

"""
This script processes sales data and provides a REST API for invoking the processing.
"""

from flask import Flask, jsonify, request
import os
import pandas as pd

app = Flask(__name__)

DATA_FILE = "data/sales_data.csv"
OUTPUT_DIR = "output/"

def load_data(file_path):
    """Loads data from a CSV file."""
    if not os.path.exists(file_path):
        raise FileNotFoundError(f"The file {file_path} does not exist.")
    return pd.read_csv(file_path)

class DataProcessor:
    def __init__(self, data):
        self.data = data
    
    def clean_data(self):
        """Drop rows with missing values and return summary statistics."""
        self.data.dropna(inplace=True)
        return self.data.describe()  # Summary statistics as an example output.

@app.route('/process', methods=['POST'])
def process_data():
    """
    Endpoint to process sales data.
    Expects the request body to specify 'file_path' if using a custom file.
    """
    try:
        # Use a custom file if one is provided; tolerate a missing JSON body
        payload = request.get_json(silent=True) or {}
        file_path = payload.get('file_path', DATA_FILE)
        
        # Load the data
        data = load_data(file_path)
        
        # Process the data
        processor = DataProcessor(data)
        summary_stats = processor.clean_data()
        
        # Convert DataFrame summary to JSON
        summary_json = summary_stats.to_dict()
        
        return jsonify({
            "status": "success",
            "summary_statistics": summary_json
        }), 200
    except Exception as e:
        return jsonify({
            "status": "error",
            "message": str(e)
        }), 400

if __name__ == "__main__":
    app.run(debug=True)
