README.md

…If you're planning to do a cloud exit from Heroku and need to test your app with…
2. Copy `schema.sql` into the same directory as `migrate.sh`
3. Copy your SQLite database file, e.g. `development.sqlite3` (referred to below as `<sqlite_database_file>`), into the same directory as `migrate.sh`
4. Log in to your Heroku account and navigate to your App > Add-ons, where your Heroku Postgres DB is listed. Under Datastores > Settings, copy your PostgreSQL connection URL; it will be used in the next step. The URI starts with `postgresql://...` (referred to below as `[postgresql_connection_string]`) ![heroku_db](/images/heroku_db_admin.jpg)
5. Run the script: `./migrate.sh <sqlite_database_file> [postgres_connection_string]`. Example: `./migrate.sh development.sqlite3 postgres://...`
6. Wait for the script to finish; once done, your `development.sqlite3` will be populated with the data from the PostgreSQL database. You can then copy `development.sqlite3` into your app and test it locally.
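
As a quick sanity check after the migration (a sketch; it assumes the `sqlite3` CLI is installed and that at least one table was imported):

```shell
# Count the tables that made it into the SQLite database file.
sqlite3 development.sqlite3 \
  "SELECT count(*) FROM sqlite_master WHERE type='table';"
```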

## Migrating from PostgreSQL to Cloudflare D1 (Redwood)

For Redwood applications moving from PostgreSQL to the [Redwood SDK](https://github.com/redwoodjs/sdk) backed by Cloudflare D1,
run `migrate.sh` with the `--d1` flag. Supply the database name via `--d1 <name>`
or set the `D1_DATABASE_NAME` environment variable. The script dumps your PostgreSQL
data, converts it to a D1-compatible format, runs the Prisma migrations, and then
imports the data with `wrangler`.
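
For example, assuming a hypothetical D1 database named `my-app-db` (connection string elided), either invocation works:

```shell
# Pass the database name with the flag:
./migrate.sh development.sqlite3 postgres://... --d1 my-app-db

# Or supply it via the environment variable:
D1_DATABASE_NAME=my-app-db ./migrate.sh development.sqlite3 postgres://... --d1
```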

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=hirefrank/pg-to-sqlite&type=Date)](https://www.star-history.com/#hirefrank/pg-to-sqlite&Date)
migrate.sh
#!/bin/bash
set -euo pipefail

usage() {
  cat >&2 <<EOF
Usage: $0 <sqlite_database_file> [postgres_connection_string] [--reset] [--d1 [name]]

  --reset      Remove existing dump and database files before running
  --d1 [name]  Run the Cloudflare D1 flow. Optionally supply the D1 database
               name or provide it via the D1_DATABASE_NAME environment variable.
EOF
  exit 1
}

if [ "$#" -lt 1 ]; then
  usage
fi

START_TIME=$(date +%s)
LOG_FILE="migrate_postgres_to_sqlite3.log"
SQLITE_SCHEMA_FILE="./schema.sql"
echo "Starting migration process..." > "$LOG_FILE" # Create or overwrite log file

# Define an array of tables to exclude for the SQLite flow
EXCLUDE_TABLES=("__drizzle_migrations" "ar_internal_metadata" "schema_migrations" "pg_catalog")

# Required argument
SQLITE_DATABASE_FILE="$1"
shift

# Option defaults
POSTGRES_CONN_STRING=""
RESET_FLAG=false
D1_MODE=false
D1_DATABASE_NAME="${D1_DATABASE_NAME:-}"

# Parse remaining arguments
while [ "$#" -gt 0 ]; do
  case "$1" in
    --reset)
      RESET_FLAG=true
      shift
      ;;
    --d1)
      D1_MODE=true
      if [ -n "${2:-}" ] && [[ "${2}" != --* ]]; then
        D1_DATABASE_NAME="$2"
        shift 2
      else
        shift
      fi
      ;;
    *)
      if [ -z "$POSTGRES_CONN_STRING" ]; then
        POSTGRES_CONN_STRING="$1"
        shift
      else
        usage
      fi
      ;;
  esac
done

POSTGRES_DUMP_FILE="$(basename "$SQLITE_DATABASE_FILE" .sqlite).dump"
if $RESET_FLAG; then
  echo "Resetting environment (removing existing files)..." >> "$LOG_FILE"
  rm -f "$POSTGRES_DUMP_FILE" "$SQLITE_DATABASE_FILE" >> "$LOG_FILE" 2>&1
  touch "$SQLITE_DATABASE_FILE" >> "$LOG_FILE" 2>&1
fi

# Determine required commands
CMDS=(pg_dump sed sqlite3)
if $D1_MODE; then
  CMDS+=(npx pnpm)
fi
for cmd in "${CMDS[@]}"; do
  if ! command -v "$cmd" >/dev/null 2>&1; then
    echo "Error: $cmd is not installed." >&2
    exit 1
  fi
done

if $D1_MODE; then
  # --- Cloudflare D1 flow ---
  if [ -z "$POSTGRES_CONN_STRING" ]; then
    echo "Error: the D1 flow requires a PostgreSQL connection string." >&2
    usage
  fi

  POSTGRES_DUMP_FILE="pgdb.dump.sql"
  TEMP_SQL_FILE="converted_to_sqlite.sql"
  rm -f "$POSTGRES_DUMP_FILE" "$TEMP_SQL_FILE"
  touch "$TEMP_SQL_FILE"
  rm -rf ../.wrangler
  echo "Removed ../.wrangler to ensure a clean start."

  EXCLUDE_TABLES_OPTIONS=""
  for TABLE in public._prisma_migrations; do
    EXCLUDE_TABLES_OPTIONS+="--exclude-table=$TABLE --exclude-table-data=$TABLE "
  done

  echo "Creating PostgreSQL dump file '$POSTGRES_DUMP_FILE'..."
  pg_dump -n public --data-only --attribute-inserts $EXCLUDE_TABLES_OPTIONS \
    "$POSTGRES_CONN_STRING" > "$POSTGRES_DUMP_FILE"

  echo "Converting PostgreSQL dump file to SQLite3 compatible SQL..."
  sed -E \
    -e 's/\\\\:/\:/g' \
    -e 's/\\\\//g' \
    -e 's/\\\\;/;/g' \
    -e '/^SET /d' \
    -e '/setval/d' \
    -e "s/'true'/1/g" \
    -e "s/'false'/0/g" \
    -e 's/public\.//' \
    -e '/^[[:space:]]*SELECT/d' \
    -e "s/'([0-9]{4}-[0-9]{2}-[0-9]{2}) ([0-9]{2}:[0-9]{2}:[0-9]{2}\\.[0-9]+)\\+[0-9]{2}'/'\1T\2Z'/g" \
    "$POSTGRES_DUMP_FILE" > "$TEMP_SQL_FILE"

  # Defer foreign key enforcement for the duration of the import
  # (the in-place edit requires GNU sed)
  sed -i '1i\
PRAGMA foreign_keys = OFF;\
PRAGMA defer_foreign_keys = on;\
' "$TEMP_SQL_FILE"
  echo "PRAGMA defer_foreign_keys = off;" >> "$TEMP_SQL_FILE"
  echo "Conversion to SQLite3 compatible SQL completed."

  if [ -z "$D1_DATABASE_NAME" ]; then
    echo "Error: D1 database name not provided." >&2
    exit 1
  fi

  DB_EXISTS=$(npx wrangler d1 list | grep -c "$D1_DATABASE_NAME" || true)
  if [[ "$DB_EXISTS" -eq 0 ]]; then
    echo "D1 database '$D1_DATABASE_NAME' does not exist. Creating it now..."
    npx wrangler d1 create "$D1_DATABASE_NAME"
    echo "D1 database '$D1_DATABASE_NAME' created successfully."
    echo -e "\033[1;31m\033[1m\033[48;5;15m\n======================================================================\n Before continuing, update wrangler config file with the DB binding.\n======================================================================\n\033[0m"
    read -p "Update your wrangler config file with the DB binding, then press [Enter] to continue..."
  else
    echo "D1 database '$D1_DATABASE_NAME' already exists."
    echo "Checking if D1 database '$D1_DATABASE_NAME' has any tables..."
    TABLE_NAMES=$(npx wrangler d1 execute "$D1_DATABASE_NAME" --remote --command='PRAGMA table_list' \
      | grep -o '"name": *"[^"\n]*"' | grep -o '"[^"\n]*"$' | tr -d '"' \
      | grep -Ev '(_cf_KV|sqlite_schema|sqlite_sequence|d1_migrations|sqlite_temp_schema)')
    TABLE_COUNT=$(echo "$TABLE_NAMES" | grep -c .)
    if [[ "$TABLE_COUNT" -gt 0 ]]; then
      echo -e "\033[1;33mWARNING: D1 database '$D1_DATABASE_NAME' already contains tables ($TABLE_COUNT found).\033[0m"
      read -p $'\033[1;33mWill need to delete and recreate your DB. Ok to proceed? > (Y/n): \033[0m' USER_CHOICE
      USER_CHOICE=${USER_CHOICE:-y}
      USER_CHOICE=$(echo "$USER_CHOICE" | tr '[:upper:]' '[:lower:]')
      if [[ "$USER_CHOICE" == "n" ]]; then
        echo "Aborting migration as requested by user."
        exit 0
      else
        echo "Resetting D1 database: $D1_DATABASE_NAME"
        npx wrangler d1 delete "$D1_DATABASE_NAME" -y
        npx wrangler d1 create "$D1_DATABASE_NAME"
        echo -e "\033[1;31m\033[1m\033[48;5;15m\n===========================================================================\nBefore continuing, update wrangler config file with the new database ID.\n===========================================================================\n\033[0m"
        read -p "Update your wrangler config file with the new database ID, then press [Enter] to continue..."
      fi
    fi
  fi

  echo "Creating and then running an init Prisma migration to create tables in D1 database..."
  mkdir -p ../migrations
  find ../migrations -mindepth 1 -delete
  echo y | pnpm --dir .. run migrate:dev
  pnpm --dir .. run migrate:new init --no-apply
  echo y | pnpm --dir .. run migrate:dev
  echo y | pnpm --dir .. run migrate:prd

  echo "Importing SQL statements into D1 database..."
  npx wrangler d1 execute "$D1_DATABASE_NAME" --remote --file "$TEMP_SQL_FILE" -y
  npx wrangler d1 execute "$D1_DATABASE_NAME" --local --file "$TEMP_SQL_FILE" -y
else
  # --- SQLite flow ---
  EXCLUDE_TABLES_OPTIONS=""
  for TABLE in "${EXCLUDE_TABLES[@]}"; do
    EXCLUDE_TABLES_OPTIONS+="--exclude-table=$TABLE "
  done

  if [ -f "$POSTGRES_DUMP_FILE" ]; then
    echo "Using existing PostgreSQL dump file: $POSTGRES_DUMP_FILE" >> "$LOG_FILE"
  else
    if [ -n "$POSTGRES_CONN_STRING" ]; then
      echo "PostgreSQL dump file '$POSTGRES_DUMP_FILE' not found, creating a new one." >> "$LOG_FILE"
      pg_dump --data-only --attribute-inserts $EXCLUDE_TABLES_OPTIONS "$POSTGRES_CONN_STRING" > "$POSTGRES_DUMP_FILE" 2>> "$LOG_FILE"
    else
      echo "Skipping PostgreSQL dump file creation (no connection string provided)." >> "$LOG_FILE"
    fi
  fi

  TEMP_SQL_FILE="$(mktemp)"
  trap 'rm -f "$TEMP_SQL_FILE"' EXIT

  echo "Converting PostgreSQL dump file to SQLite3 compatible SQL..."
  sed \
    -e 's/\\\\:/\:/g' \
    -e 's/\\\\//g' \
    -e 's/\\\\;/;/g' \
    -e '/^SET /d' \
    -e '/setval/d' \
    -e "s/'true'/1/g" \
    -e "s/'false'/0/g" \
    -e 's/public\.//' \
    -e '/^[[:space:]]*SELECT/d' \
    "$POSTGRES_DUMP_FILE" > "$TEMP_SQL_FILE"
  echo "Conversion to SQLite3 compatible SQL completed."

  # Wrap the statements in a transaction. PRAGMAs only affect the connection
  # that issues them, so foreign key checks are disabled in the same SQL
  # stream that performs the import.
  {
    echo "PRAGMA foreign_keys=OFF;"
    echo "BEGIN TRANSACTION;"
    cat "$TEMP_SQL_FILE"
    echo "COMMIT;"
  } > "$TEMP_SQL_FILE.converted"
  mv "$TEMP_SQL_FILE.converted" "$TEMP_SQL_FILE"

  # Fail before touching the database if the schema file is missing
  if [ ! -f "$SQLITE_SCHEMA_FILE" ]; then
    echo "SQLite schema file '$SQLITE_SCHEMA_FILE' not found." >&2
    exit 1
  fi

  echo "Creating SQLite3 database: $SQLITE_DATABASE_FILE" >> "$LOG_FILE"
  rm -f "$SQLITE_DATABASE_FILE" >> "$LOG_FILE" 2>&1
  touch "$SQLITE_DATABASE_FILE" >> "$LOG_FILE" 2>&1

  echo "Creating schema in SQLite3 database from file: $SQLITE_SCHEMA_FILE"
  sqlite3 "$SQLITE_DATABASE_FILE" < "$SQLITE_SCHEMA_FILE"
  echo "Importing SQL statements into SQLite3 database..."
  sqlite3 "$SQLITE_DATABASE_FILE" < "$TEMP_SQL_FILE"
fi


END_TIME=$(date +%s)
DURATION=$((END_TIME - START_TIME))
echo "Import completed successfully in $((DURATION / 60)) minutes and $((DURATION % 60)) seconds."