A robust, production-ready command-line tool for converting CSV files to SQL databases with intelligent data type inference, multiple database support, and comprehensive error handling.
- Multiple Database Support: MySQL, PostgreSQL, SQLite
- Intelligent Data Type Inference: Automatically detects optimal SQL data types
- Batch Processing: Handles large files with configurable chunk sizes
- Data Validation: Comprehensive CSV file validation and analysis
- Error Handling: Robust error handling with detailed logging
- Command Line Interface: Professional CLI with colored output
- Interactive Mode: Menu-driven interface for ease of use
- Progress Tracking: Visual progress bars for long operations
- Detailed Analysis: Preview data structure before import
- Flexible Configuration: YAML files, environment variables, CLI options
- Security: No hardcoded credentials, environment variable support
- Logging: Comprehensive logging with configurable levels
- Python 3.8 or higher
- Virtual environment (recommended)
```bash
# Clone the repository
git clone https://github.com/SLASH217/CSV-to-SQL-Menu-Driven-.git
cd CSV-to-SQL-Menu-Driven-

# Create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
Install the appropriate database driver:

```bash
# For MySQL
pip install pymysql

# For PostgreSQL
pip install psycopg2-binary

# SQLite is included with Python
```
```bash
# Set up PostgreSQL container
./setup_postgres.sh

# Or manually
cp .env_postgres.example .env_postgres
# Edit .env_postgres with your credentials
docker-compose up -d postgres
```
Copy and edit the configuration file:

```bash
cp config.yaml my_config.yaml
# Edit my_config.yaml with your settings
```
```bash
# For general use
cp .env.example .env
# Edit .env with your database credentials

# For PostgreSQL with Docker
cp .env_postgres.example .env_postgres
# Edit .env_postgres with your PostgreSQL credentials
```

Alternatively, pass configuration directly via CLI options (every command supports `-c/--config`).
```bash
# Analyze a CSV file
python csv_converter_pro.py import-csv sample.csv --analyze-only

# Import CSV to database
python csv_converter_pro.py import-csv sample.csv --table my_table

# Interactive mode
python csv_converter_pro.py interactive
```
```bash
python csv_converter_pro.py import-csv FILE [OPTIONS]
```

Options:

```text
  -t, --table TEXT                    Target table name
  --if-exists [fail|replace|append]   What to do if table exists (default: append)
  --chunk-size INTEGER                Rows to process at once (default: 10000)
  --analyze-only                      Only analyze the CSV file
```
```bash
# List databases
python csv_converter_pro.py list-databases

# List tables
python csv_converter_pro.py list-tables

# Create database
python csv_converter_pro.py create-database DB_NAME

# Drop database
python csv_converter_pro.py drop-database DB_NAME [--force]
```
```bash
python csv_converter_pro.py interactive
```

```bash
# Use custom config file
python csv_converter_pro.py -c my_config.yaml import-csv file.csv

# All commands support the -c/--config option
```

The tool automatically infers SQL data types based on your CSV content:
| CSV Content | SQL Type | Example |
|---|---|---|
| Integer numbers | TINYINT/SMALLINT/INT/BIGINT | 42, 1000 |
| Decimal numbers | DECIMAL(10,2) | 3.14, 99.99 |
| Boolean values | BOOLEAN | true, false, 1, 0 |
| Date strings | DATE | 2023-12-25, 25/12/2023 |
| DateTime strings | DATETIME | 2023-12-25 10:30:00 |
| Text content | VARCHAR(n) or TEXT | Names, descriptions |
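The table above boils down to scanning each column's values and trying progressively stricter types until one fits. The sketch below is a simplified, standalone rendition of that idea, not the tool's actual implementation; `infer_sql_type`, its boolean tokens, and its date-format list are all illustrative assumptions:

```python
# Simplified sketch of column type inference -- illustrative, not the tool's code.
from datetime import datetime

def infer_sql_type(values, max_varchar_length=255):
    """Pick an SQL type for a list of raw string values from one CSV column."""
    non_empty = [v.strip() for v in values if v and v.strip()]
    if not non_empty:
        return "TEXT"

    # Booleans: every value is a recognized true/false token.
    if all(v.lower() in {"true", "false", "1", "0"} for v in non_empty):
        return "BOOLEAN"

    # Integers: choose the smallest type that covers the observed range.
    try:
        nums = [int(v) for v in non_empty]
        lo, hi = min(nums), max(nums)
        if -128 <= lo and hi <= 127:
            return "TINYINT"
        if -32768 <= lo and hi <= 32767:
            return "SMALLINT"
        if -2**31 <= lo and hi <= 2**31 - 1:
            return "INT"
        return "BIGINT"
    except ValueError:
        pass

    # Decimals.
    try:
        [float(v) for v in non_empty]
        return "DECIMAL(10,2)"
    except ValueError:
        pass

    # Dates and datetimes, trying a few common formats.
    for fmt, sql_type in (("%Y-%m-%d %H:%M:%S", "DATETIME"),
                          ("%Y-%m-%d", "DATE"),
                          ("%d/%m/%Y", "DATE")):
        try:
            for v in non_empty:
                datetime.strptime(v, fmt)
            return sql_type
        except ValueError:
            continue

    # Fallback: VARCHAR sized to the longest value, else TEXT.
    longest = max(len(v) for v in non_empty)
    return f"VARCHAR({longest})" if longest <= max_varchar_length else "TEXT"

print(infer_sql_type(["42", "1000"]))     # SMALLINT
print(infer_sql_type(["3.14", "99.99"]))  # DECIMAL(10,2)
print(infer_sql_type(["2023-12-25"]))     # DATE
```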
```bash
# Process large files with smaller chunks
python csv_converter_pro.py import-csv large_file.csv --chunk-size 5000

# Replace existing table for fresh import
python csv_converter_pro.py import-csv data.csv --if-exists replace
```
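The `--chunk-size` option exists because reading a multi-gigabyte CSV in one pass can exhaust memory. Conceptually the pattern looks like the sketch below, assuming pandas and SQLAlchemy (common choices for this job, not necessarily the tool's actual internals); peak memory then scales with the chunk size rather than the file size:

```python
# Sketch of chunked CSV import -- illustrative, not the tool's code.
import pandas as pd
from sqlalchemy import create_engine

def import_in_chunks(csv_path, table, engine_url, chunk_size=10_000):
    engine = create_engine(engine_url)
    # read_csv with chunksize yields DataFrames of at most chunk_size rows.
    for i, chunk in enumerate(pd.read_csv(csv_path, chunksize=chunk_size)):
        # The first chunk creates/replaces the table; later chunks append.
        chunk.to_sql(table, engine,
                     if_exists="replace" if i == 0 else "append",
                     index=False)

import_in_chunks("large_file.csv", "my_table", "sqlite:///csv_converter.db",
                 chunk_size=5_000)
```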
```bash
# MySQL (default)
DB_TYPE=mysql python csv_converter_pro.py import-csv data.csv

# PostgreSQL
DB_TYPE=postgresql python csv_converter_pro.py import-csv data.csv

# SQLite
DB_TYPE=sqlite python csv_converter_pro.py import-csv data.csv
```
```bash
# Analyze first
python csv_converter_pro.py import-csv sales_data.csv --analyze-only

# Import after reviewing
python csv_converter_pro.py import-csv sales_data.csv --table sales
```
```bash
python csv_converter_pro.py interactive

# Follow the menu prompts:
# 1. Import CSV file
# 2. Analyze CSV file
# 3. List databases
# 4. List tables
# 5. Create database
# 6. Drop database
# 7. Exit
```
```bash
# Process multiple files
for file in *.csv; do
    python csv_converter_pro.py import-csv "$file" --table "table_$(basename "$file" .csv)"
done
```
type: "mysql" # mysql, postgresql, sqlite
host: "localhost"
port: 3306
username: "root"
password: ""
database: "csv_converter"csv:
encoding: "utf-8" # File encoding
chunk_size: 10000 # Rows per batch
max_varchar_length: 255 # Maximum VARCHAR lengthlogging:
level: "INFO" # DEBUG, INFO, WARNING, ERROR
file: "csv_converter.log"Connection Errors
**Connection Errors**
- Check database credentials in `config.yaml` or `.env`
- Ensure the database server is running
- Verify network connectivity
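When a connection error message is unhelpful, testing the credentials outside the tool narrows the problem down. A quick standalone check with the PyMySQL driver from the installation step (the values mirror the sample config; substitute your own):

```python
# Standalone connectivity check using PyMySQL.
import pymysql

try:
    conn = pymysql.connect(
        host="localhost", port=3306,
        user="root", password="",
        database="csv_converter",
    )
    print("Connection OK, server version:", conn.get_server_info())
    conn.close()
except pymysql.MySQLError as exc:
    print("Connection failed:", exc)
```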
**Import Errors**
- Check CSV file encoding (try UTF-8)
- Verify file permissions
- Ensure sufficient disk space
**Memory Issues**
- Reduce `chunk_size` for large files
- Close other applications
- Consider upgrading system memory
```bash
# Enable verbose logging
python csv_converter_pro.py -c config.yaml import-csv file.csv --verbose
```
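For reference, the configurable file logging described in the configuration section can be wired up with the standard library alone; this is a generic sketch, not the tool's actual logger:

```python
# Generic sketch: file logging with a configurable level, mirroring the
# logging block of config.yaml. Illustrative, not the tool's code.
import logging

def setup_logging(level="INFO", file="csv_converter.log"):
    logging.basicConfig(
        filename=file,
        level=getattr(logging, level),
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    )

setup_logging(level="DEBUG")  # roughly what a --verbose flag would select
logging.getLogger(__name__).debug("debug logging enabled")
```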
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Repository: https://github.com/SLASH217/CSV-to-SQL-Menu-Driven-
- Issues: https://github.com/SLASH217/CSV-to-SQL-Menu-Driven-/issues
- Documentation: This README
- Complete rewrite with modern Python practices
- Added support for PostgreSQL and SQLite
- Intelligent data type inference
- Professional CLI interface
- Configuration management system
- Comprehensive error handling
- Interactive mode
- Progress tracking
- Detailed logging
- Basic CSV to MySQL conversion
- Menu-driven interface
- Basic data type detection