API refactor

2025-10-07 16:25:52 +09:00
parent 76d0d86211
commit 91c7e04474
1171 changed files with 81940 additions and 44117 deletions

docs/MOBILE_CALENDAR_API.md Normal file

@@ -0,0 +1,102 @@
# Mobile Calendar Service API Documentation
## Endpoint `/api/v1/calendar/entries/mobile`
### POST
Creates or updates a calendar entry from the mobile app.
**URL**: `/api/v1/calendar/entries/mobile`
**Method**: `POST`
**Authentication**: Required
**Request body**:
```json
{
"date": "2025-09-28",
"type": "MENSTRUATION",
"flow_intensity": 5,
"symptoms": ["FATIGUE", "HEADACHE"],
"mood": "NORMAL",
"notes": "Запись из мобильного приложения"
}
```
**Parameters**:
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| date | string (YYYY-MM-DD) | Yes | Entry date |
| type | string | Yes | Entry type. Possible values: `MENSTRUATION`, `OVULATION`, `SPOTTING`, `DISCHARGE`, `PAIN`, `MOOD` |
| flow_intensity | number | No | Intensity (1-5). Only for type `MENSTRUATION` |
| symptoms | array of strings | No | List of symptoms. Possible values: `FATIGUE`, `HEADACHE`, `CRAMPS`, etc. |
| mood | string | No | Mood. Possible values: `NORMAL`, `HAPPY`, `SAD`, `ANXIOUS`, `IRRITATED`, `ENERGETIC`, `TIRED` |
| notes | string | No | User notes |
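For illustration, a client call might look like the following sketch; the base URL and token are placeholders (assumptions, not part of this API contract):
```python
# Hypothetical usage sketch: POST a calendar entry through the gateway.
import httpx

BASE_URL = "http://localhost:8000"  # assumed API Gateway address
TOKEN = "<access token>"            # obtained from the auth endpoint

entry = {
    "date": "2025-09-28",
    "type": "MENSTRUATION",
    "flow_intensity": 5,
    "symptoms": ["FATIGUE", "HEADACHE"],
    "mood": "NORMAL",
    "notes": "Entry from the mobile app",
}

response = httpx.post(
    f"{BASE_URL}/api/v1/calendar/entries/mobile",
    json=entry,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()
print(response.json()["uuid"])
```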
**Successful response**:
- **Code**: `201 Created`
- **Content**:
```json
{
"id": 1,
"uuid": "123e4567-e89b-12d3-a456-426614174000",
"user_id": 25,
"entry_date": "2025-09-28",
"entry_type": "period",
"flow_intensity": "heavy",
"period_symptoms": "",
"mood": "neutral",
"energy_level": 1,
"sleep_hours": 0,
"symptoms": "FATIGUE, HEADACHE",
"medications": "",
"notes": "Запись из мобильного приложения",
"is_predicted": false,
"confidence_score": null,
"created_at": "2025-09-28T16:21:30.123456",
"updated_at": "2025-09-28T16:21:30.123456",
"is_active": true
}
```
**Errors**:
- **Code**: `401 Unauthorized`
- **Content**: `{"detail": "Authentication credentials were not provided"}`
- **Code**: `500 Internal Server Error`
- **Content**: `{"detail": "Server error: [error description]"}`
## Conversion between mobile and server types
### Entry types
| Mobile app | Server |
|------------|--------|
| MENSTRUATION | period |
| OVULATION | ovulation |
| SPOTTING | symptoms |
| DISCHARGE | symptoms |
| PAIN | symptoms |
| MOOD | mood |
### Flow intensity
| Mobile app (1-5) | Server |
|------------------|--------|
| 1 | spotting |
| 2 | light |
| 3 | medium |
| 4-5 | heavy |
### Mood
| Mobile app | Server |
|------------|--------|
| HAPPY | happy |
| SAD | sad |
| NORMAL | happy |
| ANXIOUS | anxious |
| IRRITATED | irritated |
| ENERGETIC | energetic |
| TIRED | tired |
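The conversions above are plain lookups. A minimal sketch of the mobile-to-server direction (table and function names here are illustrative, not the service's actual identifiers):
```python
# Illustrative mobile-to-server conversion tables; names are hypothetical.
ENTRY_TYPE_MAP = {
    "MENSTRUATION": "period",
    "OVULATION": "ovulation",
    "SPOTTING": "symptoms",
    "DISCHARGE": "symptoms",
    "PAIN": "symptoms",
    "MOOD": "mood",
}

FLOW_INTENSITY_MAP = {1: "spotting", 2: "light", 3: "medium", 4: "heavy", 5: "heavy"}

def to_server_entry_type(mobile_type: str) -> str:
    """Map a mobile entry type to its server-side equivalent."""
    return ENTRY_TYPE_MAP[mobile_type]

def to_server_flow_intensity(value: int) -> str:
    """Map the mobile 1-5 intensity scale to the server's named levels."""
    return FLOW_INTENSITY_MAP[value]
```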

migrate_db.py Normal file

@@ -0,0 +1,10 @@
import asyncio

async def run_migrations():
    from shared.database import init_db
    print("✅ Database migrations started")
    await init_db()
    print("✅ Database migrations completed")

if __name__ == "__main__":
    asyncio.run(run_migrations())


@@ -0,0 +1,48 @@
#!/bin/bash
# Script to restart the calendar service after making changes

# Colors for output
GREEN="\033[0;32m"
YELLOW="\033[0;33m"
RED="\033[0;31m"
NC="\033[0m" # No Color

# Change to the project directory
cd "$(dirname "$0")" || { echo -e "${RED}Failed to change to the project directory${NC}"; exit 1; }

# Activate the virtual environment
if [ -d "venv" ]; then
    echo -e "${YELLOW}Activating the virtual environment...${NC}"
    source venv/bin/activate
fi

# Set the PYTHONPATH variable
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Stop the currently running service instance
echo -e "${YELLOW}Stopping the calendar service...${NC}"
pid=$(lsof -t -i:8004 2>/dev/null)
if [ -n "$pid" ]; then
    kill $pid
    sleep 2
fi

# Restart the service
echo -e "${GREEN}Restarting the calendar service on port 8004...${NC}"
python -m uvicorn services.calendar_service.main:app --host 0.0.0.0 --port 8004 &
echo -e "${GREEN}✅ Calendar service restarted${NC}"
echo -e "${YELLOW}Wait a few seconds for initialization...${NC}"
sleep 3

# Health check
echo -e "${YELLOW}Checking service health...${NC}"
if curl -s http://localhost:8004/health > /dev/null; then
    echo -e "${GREEN}✅ Calendar service started successfully${NC}"
else
    echo -e "${RED}❌ Problems starting the calendar service${NC}"
fi

echo -e "${YELLOW}Saving the PID for later use...${NC}"
lsof -t -i:8004 > /tmp/calendar_service.pid

run_api_gateway_for_emulator.sh Executable file

@@ -0,0 +1,32 @@
#!/bin/bash
# Script to start the API Gateway for the Android emulator
# Starts the API Gateway on all available interfaces (0.0.0.0)

echo "🚀 Starting the API Gateway on all interfaces for the Android emulator..."

# Change to the project directory
cd "$(dirname "$0")" || { echo "Failed to change to the project directory"; exit 1; }

# Activate the virtual environment
source .venv/bin/activate || { echo "Failed to activate the virtual environment"; exit 1; }

# Get the current IP address
IP=$(hostname -I | awk '{print $1}')
echo "🔗 Local IP address: $IP"
echo "📱 The Android emulator can connect at: http://$IP:8000"

# Check whether a process is already running on port 8000
if lsof -ti:8000 > /dev/null; then
    echo "⚠️ Port 8000 is already in use, stopping the process..."
    lsof -ti:8000 | xargs kill -9
    sleep 1
fi

# Start the API Gateway bound to all interfaces
cd services/api_gateway || { echo "Failed to change to the API Gateway directory"; exit 1; }
echo "✅ Starting the API Gateway..."
PYTHONPATH="${PWD}/../.." python -m uvicorn main:app --host 0.0.0.0 --port 8000 --reload
echo "API Gateway started!"


@@ -2,7 +2,7 @@
 # Start the API Gateway for the Women Safety app
-echo -e "\033[1;34m🚀 Starting the API Gateway on port 8000...\033[0m"
+echo -e "\033[1;34m🚀 Starting the API Gateway on port 8000 for the mobile app...\033[0m"
 # Change to the project directory
 cd "$(dirname "$0")" || { echo "Failed to change to the project directory"; exit 1; }
@@ -19,6 +19,32 @@ fi
 # Set the PYTHONPATH variable
 export PYTHONPATH="${PWD}:${PYTHONPATH}"
+# Create a Python script for the database migration
+echo -e "\033[1;33m🔄 Creating a temporary database migration script...\033[0m"
+cat > migrate_db.py << 'EOL'
+import asyncio
+import sys
+
+async def run_migrations():
+    from shared.database import init_db
+    print("🔄 Running database migrations...")
+    await init_db()
+    print("✅ Migrations completed successfully!")
+
+if __name__ == "__main__":
+    asyncio.run(run_migrations())
+EOL
+
+# Run the database migration
+echo -e "\033[1;33m🔄 Running the database migration...\033[0m"
+python migrate_db.py
+MIGRATION_STATUS=$?
+if [ $MIGRATION_STATUS -ne 0 ]; then
+    echo -e "\033[1;31m❌ Database migration failed. Check the logs.\033[0m"
+    exit 1
+fi
 # Start the API Gateway
 echo -e "\033[1;32m✅ Starting the API Gateway...\033[0m"
 cd services/api_gateway || { echo "Failed to change to the API Gateway directory"; exit 1; }

run_gateway_mobile.sh Executable file

@@ -0,0 +1,52 @@
#!/bin/bash
# Start the API Gateway for the Women Safety mobile app

echo -e "\033[1;34m🚀 Starting the API Gateway for the mobile app on port 8000...\033[0m"

# Change to the project directory
cd "$(dirname "$0")" || { echo "Failed to change to the project directory"; exit 1; }

# Activate the virtual environment
echo -e "\033[1;33m🔄 Activating the virtual environment...\033[0m"
source venv/bin/activate

# Set the PYTHONPATH variable
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Determine the external IP for access from the Android emulator
EXTERNAL_IP=$(hostname -I | awk '{print $1}')
echo -e "\033[1;33m📱 IP address for access from the Android emulator: ${EXTERNAL_IP}:8000\033[0m"

# Create a Python script for the database migration
echo -e "\033[1;33m🔄 Creating a temporary database migration script...\033[0m"
cat > migrate_db.py << 'EOL'
import asyncio
import sys

async def run_migrations():
    from shared.database import init_db
    print("🔄 Running database migrations...")
    await init_db()
    print("✅ Migrations completed successfully!")

if __name__ == "__main__":
    asyncio.run(run_migrations())
EOL

# Run the database migration
echo -e "\033[1;33m🔄 Running the database migration...\033[0m"
python migrate_db.py
MIGRATION_STATUS=$?
if [ $MIGRATION_STATUS -ne 0 ]; then
    echo -e "\033[1;31m❌ Database migration failed. Check the logs.\033[0m"
    exit 1
fi
echo -e "\033[1;32m✅ Database migrated successfully.\033[0m"

# Start the API Gateway
echo -e "\033[1;32m✅ Starting the API Gateway for the mobile app...\033[0m"
cd services/api_gateway || { echo "Failed to change to the API Gateway directory"; exit 1; }
python -m uvicorn main:app --host 0.0.0.0 --port 8000 --log-level debug

run_gateway_with_dependencies.sh Executable file

@@ -0,0 +1,139 @@
#!/bin/bash
# Script to start the API Gateway and its dependencies for the Women Safety app

# Colored output
GREEN="\033[1;32m"
BLUE="\033[1;34m"
YELLOW="\033[1;33m"
RED="\033[1;31m"
RESET="\033[0m"

echo -e "${BLUE}🚀 Starting the API Gateway and required services...${RESET}"

# Change to the project directory
cd "$(dirname "$0")" || { echo -e "${RED}Failed to change to the project directory${RESET}"; exit 1; }
# Remember the project root so service functions can return to it reliably
ROOT_DIR="$PWD"

# Activate the virtual environment if one exists
if [ -d "venv" ]; then
    echo -e "${YELLOW}🔄 Activating virtual environment venv...${RESET}"
    source venv/bin/activate
elif [ -d ".venv" ]; then
    echo -e "${YELLOW}🔄 Activating virtual environment .venv...${RESET}"
    source .venv/bin/activate
else
    echo -e "${YELLOW}⚠️ No virtual environment found, continuing without one...${RESET}"
fi

# Set the PYTHONPATH variable
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Check whether a service is running on the given port
check_service() {
    local port=$1
    local service_name=$2
    if lsof -ti:$port > /dev/null 2>&1; then
        echo -e "${GREEN}✅ $service_name is already running on port $port${RESET}"
        return 0
    else
        echo -e "${YELLOW}⚠️ $service_name is not running on port $port${RESET}"
        return 1
    fi
}

# Start a service
start_service() {
    local port=$1
    local service_path=$2
    local service_name=$3
    echo -e "${BLUE}🚀 Starting $service_name on port $port...${RESET}"
    cd "$service_path" || { echo -e "${RED}Failed to change to directory $service_path${RESET}"; return 1; }
    # Start the service in the background
    python -m uvicorn main:app --host 0.0.0.0 --port $port &
    local pid=$!
    # Wait 3 seconds and check whether the service came up
    sleep 3
    if kill -0 $pid 2>/dev/null; then
        echo -e "${GREEN}✅ $service_name started successfully (PID: $pid)${RESET}"
        # Return to the project root ($(dirname "$0") would resolve relative to the service dir)
        cd "$ROOT_DIR" || { echo -e "${RED}Failed to return to the project root${RESET}"; exit 1; }
        return 0
    else
        echo -e "${RED}Failed to start $service_name${RESET}"
        # Return to the project root
        cd "$ROOT_DIR" || { echo -e "${RED}Failed to return to the project root${RESET}"; exit 1; }
        return 1
    fi
}

# Check and start the user service (creates the user tables)
if ! check_service 8001 "User Service"; then
    start_service 8001 "./services/user_service" "User Service"
fi

# Check and start the calendar service
if ! check_service 8004 "Calendar Service"; then
    start_service 8004 "./services/calendar_service" "Calendar Service"
fi

# Check and start the emergency alert service
if ! check_service 8002 "Emergency Service"; then
    start_service 8002 "./services/emergency_service" "Emergency Service"
fi

# Check and start the location service
if ! check_service 8003 "Location Service"; then
    start_service 8003 "./services/location_service" "Location Service"
fi

# Check and start the notification service
if ! check_service 8005 "Notification Service"; then
    start_service 8005 "./services/notification_service" "Notification Service"
fi

# Give the services a few seconds to initialize
echo -e "${YELLOW}⏳ Waiting for services to initialize (5 s)...${RESET}"
sleep 5

# Health-check a service
check_health() {
    local url=$1
    local service_name=$2
    if curl -s "$url" > /dev/null 2>&1; then
        echo -e "${GREEN}✅ $service_name health check passed${RESET}"
        return 0
    else
        echo -e "${RED}❌ $service_name health check failed${RESET}"
        return 1
    fi
}

# Check the health of the running services
check_health "http://localhost:8001/health" "User Service"
check_health "http://localhost:8002/health" "Emergency Service"
check_health "http://localhost:8003/health" "Location Service"
check_health "http://localhost:8004/health" "Calendar Service"
check_health "http://localhost:8005/health" "Notification Service"

# Now start the API Gateway
echo -e "${BLUE}🔄 Starting the API Gateway on port 8000...${RESET}"
# Port for the API Gateway
API_GATEWAY_PORT=8000

# Check whether the port is already taken
if lsof -ti:$API_GATEWAY_PORT > /dev/null 2>&1; then
    echo -e "${YELLOW}⚠️ Port $API_GATEWAY_PORT is already in use, freeing it...${RESET}"
    lsof -ti:$API_GATEWAY_PORT | xargs kill -9
    sleep 1
fi

# Start the API Gateway in the current terminal (not in the background)
echo -e "${GREEN}✅ Starting the API Gateway...${RESET}"
cd services/api_gateway || { echo -e "${RED}Failed to change to the API Gateway directory${RESET}"; exit 1; }
python -m uvicorn main:app --host 0.0.0.0 --port $API_GATEWAY_PORT --reload

run_microservices.sh Normal file

@@ -0,0 +1,86 @@
#!/bin/bash
# Script to quickly start all required Women Safety App microservices

# Colors for output
GREEN="\033[0;32m"
YELLOW="\033[0;33m"
BLUE="\033[0;34m"
RED="\033[0;31m"
NC="\033[0m" # No Color

echo -e "${BLUE}🚀 Starting Women Safety App Microservices${NC}"

# Change to the project directory
cd "$(dirname "$0")" || { echo -e "${RED}❌ Error: failed to change to the project directory${NC}"; exit 1; }

# Check for the virtual environment
if [ ! -d "venv" ]; then
    echo -e "${YELLOW}⚠️ Virtual environment not found. Creating a new one...${NC}"
    python3 -m venv venv
    source venv/bin/activate
    python -m pip install -r requirements.txt
else
    echo -e "${GREEN}✅ Virtual environment found${NC}"
    source venv/bin/activate
fi

# Set PYTHONPATH
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Start a service
start_service() {
    local name=$1
    local port=$2
    echo -e "${YELLOW}🔄 Starting $name on port $port...${NC}"
    # Run in the background
    python -m uvicorn services.${name}.main:app --host 0.0.0.0 --port $port &
    # Save the PID
    echo $! > /tmp/${name}.pid
    # Short delay for startup
    sleep 2
    # Verify startup
    if curl -s http://localhost:$port/health > /dev/null; then
        echo -e "${GREEN}$name started successfully on port $port${NC}"
    else
        echo -e "${RED}❌ Error: $name is not running on port $port${NC}"
    fi
}

# Create the DB migration script
cat > migrate_db.py << 'EOF'
import asyncio

async def run_migrations():
    from shared.database import init_db
    print("🔄 Running database migrations...")
    await init_db()
    print("✅ Migrations completed successfully!")

if __name__ == "__main__":
    asyncio.run(run_migrations())
EOF

# Run the migrations
echo -e "${YELLOW}🔄 Applying database migrations...${NC}"
python migrate_db.py

# Start all microservices
start_service "user_service" 8001
start_service "emergency_service" 8002
start_service "location_service" 8003
start_service "calendar_service" 8004
start_service "notification_service" 8005

# IP for mobile access
EXTERNAL_IP=$(hostname -I | awk '{print $1}')
echo -e "${BLUE}📱 IP for the mobile app: ${EXTERNAL_IP}:8000${NC}"

# Start the API Gateway
echo -e "${GREEN}🚪 Starting the API Gateway on port 8000...${NC}"
python -m uvicorn services.api_gateway.main:app --host 0.0.0.0 --port 8000

run_services_for_emulator.sh Executable file

@@ -0,0 +1,79 @@
#!/bin/bash
# Script to start all microservices on all interfaces for the Android emulator

echo "🚀 Starting all microservices for the Android emulator..."

# Change to the project directory
cd "$(dirname "$0")" || { echo "Failed to change to the project directory"; exit 1; }

# Activate the virtual environment
source .venv/bin/activate || { echo "Failed to activate the virtual environment"; exit 1; }

# Get the current IP address
IP=$(hostname -I | awk '{print $1}')
echo "🔗 Local IP address: $IP"
echo "📱 The Android emulator can connect at: http://$IP:8000"

# Stop any process on the given port
stop_service() {
    local port=$1
    if lsof -ti:$port > /dev/null; then
        echo "⚠️ Stopping the process on port $port..."
        lsof -ti:$port | xargs kill -9
        sleep 1
    fi
}

# Stop all services if they are running
echo "🛑 Stopping all services if they are running..."
stop_service 8000 # API Gateway
stop_service 8001 # User Service
stop_service 8002 # Emergency Service
stop_service 8003 # Location Service
stop_service 8004 # Calendar Service
stop_service 8005 # Notification Service

# Export PYTHONPATH
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Start a service
run_service() {
    local name=$1
    local port=$2
    local path=$3
    echo "✅ Starting $name on port $port..."
    cd "$path" || { echo "Failed to change to directory $path"; return; }
    python -m uvicorn main:app --host 0.0.0.0 --port $port --reload &
    cd "$OLDPWD" || { echo "Failed to return to the original directory"; return; }
    sleep 2
}

# Start all services
echo "🚀 Starting all services..."
run_service "User Service" 8001 "services/user_service"
run_service "Emergency Service" 8002 "services/emergency_service"
run_service "Location Service" 8003 "services/location_service"
run_service "Calendar Service" 8004 "services/calendar_service"
run_service "Notification Service" 8005 "services/notification_service"
run_service "API Gateway" 8000 "services/api_gateway"

echo ""
echo "📋 Services are available at the following addresses:"
echo " 📡 API Gateway: http://$IP:8000"
echo " 👤 User Service: http://$IP:8001"
echo " 🚨 Emergency Service: http://$IP:8002"
echo " 📍 Location Service: http://$IP:8003"
echo " 📅 Calendar Service: http://$IP:8004"
echo " 🔔 Notification Service: http://$IP:8005"
echo ""
echo "📱 For the Android emulator, use: http://$IP:8000"
echo ""
echo "⚠️ To stop all services, run: pkill -f uvicorn"
echo ""
echo "🔍 Log monitoring started. Press Ctrl+C to exit (the services will keep running)"

# Monitor the logs
tail -f services/*.log


@@ -263,7 +263,12 @@ async def login_user(user_login: UserLogin, request: Request):
     async with httpx.AsyncClient(timeout=30.0) as client:
         try:
-            login_data = user_login.model_dump()
+            # Convert the data format for compatibility with the user service
+            login_data = {
+                "email": user_login.email,
+                "username": user_login.username,
+                "password": user_login.password
+            }
             print(f"Sending login data to user service: {login_data}")
             response = await client.post(
@@ -618,6 +623,7 @@ async def location_service_proxy(request: Request):
 @app.api_route("/api/v1/calendar/settings", methods=["GET"], operation_id="calendar_settings_get")
 @app.api_route("/api/v1/calendar/settings", methods=["PUT"], operation_id="calendar_settings_put")
 # Mobile calendar API
+@app.api_route("/api/v1/calendar/entries/mobile", methods=["POST"], operation_id="calendar_entries_mobile_post")
 @app.api_route("/api/v1/entry", methods=["POST"], operation_id="mobile_calendar_entry_post")
 @app.api_route("/api/v1/entries", methods=["GET"], operation_id="mobile_calendar_entries_get")
 @app.api_route("/api/v1/entries", methods=["POST"], operation_id="mobile_calendar_entries_post")


@@ -1,10 +1,10 @@
 from datetime import date, datetime, timedelta
-from typing import Dict, List, Optional
+from typing import Dict, List, Optional, Any
 from fastapi import Depends, FastAPI, HTTPException, Query
 from fastapi.middleware.cors import CORSMiddleware
 from pydantic import BaseModel
-from sqlalchemy import and_, desc, select
+from sqlalchemy import and_, desc, select, func
 from sqlalchemy.ext.asyncio import AsyncSession
 from services.calendar_service.models import CalendarEntry, CycleData, HealthInsights
@@ -14,6 +14,10 @@ from services.calendar_service.schemas import (CalendarEntryCreate, CalendarEntr
                                                FlowIntensity, HealthInsightResponse, MoodType,
                                                CalendarEventCreate)
 from services.calendar_service.mobile_endpoint import MobileCalendarEntryCreate, mobile_create_calendar_entry
+from services.calendar_service.mobile_responses import (MobileCalendarEntryResponse,
+                                                        MobileCalendarPeriodInfo,
+                                                        MobilePredictionInfo,
+                                                        MobileCalendarResponse)
 from shared.auth import get_current_user_from_token as get_current_user
 from shared.config import settings
 from shared.database import get_db
@@ -349,7 +353,7 @@ async def get_all_calendar_entries(
     if end_date:
         query = query.filter(CalendarEntry.entry_date <= end_date)
     if entry_type:
-        query = query.filter(CalendarEntry.entry_type == entry_type)
+        query = query.filter(CalendarEntry.entry_type == entry_type.value)
     query = query.order_by(CalendarEntry.entry_date.desc()).limit(limit)
@@ -679,7 +683,191 @@ async def create_mobile_calendar_entry(
         raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
+
+from .utils import safe_int, safe_str, safe_bool, safe_date, safe_datetime, safe_get_column_value
+
+# New mobile endpoint for fetching calendar entries with date filtering
+@app.get("/api/v1/calendar/entries/mobile", response_model=MobileCalendarResponse)
+async def get_mobile_calendar_entries(
+    start_date: Optional[date] = Query(None, description="Start date for filtering (inclusive)"),
+    end_date: Optional[date] = Query(None, description="End date for filtering (inclusive)"),
+    entry_type: Optional[str] = Query(None, description="Entry type (MENSTRUATION, OVULATION, etc.)"),
+    current_user: Dict = Depends(get_current_user),
+    db: AsyncSession = Depends(get_db),
+    limit: int = Query(100, ge=1, le=500, description="Maximum number of entries"),
+):
+    """Fetch calendar entries for the mobile app with date filtering"""
+    import logging
+    logging.info(f"Mobile data request. start_date={start_date}, end_date={end_date}, entry_type={entry_type}")
+    try:
+        # Fetch the entries from the database
+        query = select(CalendarEntry).filter(CalendarEntry.user_id == current_user["user_id"])
+        # Date filtering
+        if start_date:
+            query = query.filter(CalendarEntry.entry_date >= start_date)
+        if end_date:
+            query = query.filter(CalendarEntry.entry_date <= end_date)
+        # Entry type filtering
+        if entry_type:
+            # Convert the type from the mobile format to the server format
+            server_entry_type = None
+            if entry_type == "MENSTRUATION":
+                server_entry_type = EntryType.PERIOD.value
+            elif entry_type == "OVULATION":
+                server_entry_type = EntryType.OVULATION.value
+            elif entry_type == "SYMPTOMS":
+                server_entry_type = EntryType.SYMPTOMS.value
+            if server_entry_type:
+                query = query.filter(CalendarEntry.entry_type == server_entry_type)
+        # Sort and limit the number of entries
+        query = query.order_by(desc(CalendarEntry.entry_date)).limit(limit)
+        # Execute the query
+        result = await db.execute(query)
+        entries = result.scalars().all()
+        # Convert the entries to the mobile app format
+        mobile_entries = []
+        for entry in entries:
+            # Convert the entry type from the server format to the mobile format
+            mobile_type = "SYMPTOMS"  # Default
+            entry_type_value = safe_str(entry.entry_type, "")
+            if entry_type_value == EntryType.PERIOD.value:
+                mobile_type = "MENSTRUATION"
+            elif entry_type_value == EntryType.OVULATION.value:
+                mobile_type = "OVULATION"
+            # Convert the symptoms from a string to a list
+            symptoms_list = []
+            entry_symptoms = safe_str(entry.symptoms, "")
+            if entry_symptoms:
+                symptoms_list = [s.strip() for s in entry_symptoms.split(",")]
+            # Convert flow_intensity, if present
+            flow_intensity_value = None
+            entry_flow = safe_str(entry.flow_intensity, "")
+            if entry_flow in ['1', '2', '3', '4', '5']:
+                flow_intensity_value = int(entry_flow)
+            # Convert mood, if present
+            mood_value = None
+            entry_mood = safe_str(entry.mood, "")
+            if entry_mood:
+                mood_value = entry_mood.upper()
+            # Convert notes, if present
+            notes_value = safe_str(entry.notes, "")
+            # Get created_at
+            created_at_value = safe_datetime(
+                getattr(entry, 'created_at', None),
+                datetime.now()
+            ).isoformat()
+            # Get is_predicted
+            is_predicted_value = safe_bool(
+                getattr(entry, 'is_predicted', None),
+                False
+            )
+            # Build the mobile entry
+            mobile_entry = MobileCalendarEntryResponse(
+                id=safe_int(entry.id),
+                uuid=str(safe_int(entry.id)),  # Use the ID as the UUID
+                date=safe_date(entry.entry_date, date.today()).isoformat(),
+                type=mobile_type,
+                flow_intensity=flow_intensity_value,
+                mood=mood_value,
+                symptoms=symptoms_list,
+                notes=notes_value,
+                created_at=created_at_value,
+                is_predicted=is_predicted_value
+            )
+            mobile_entries.append(mobile_entry)
+        # Fetch information about the current cycle
+        current_cycle = await db.execute(
+            select(CycleData)
+            .filter(CycleData.user_id == current_user["user_id"])
+            .order_by(desc(CycleData.cycle_start_date))
+            .limit(1)
+        )
+        cycle_data = current_cycle.scalars().first()
+        # Build the period information
+        period_info = MobileCalendarPeriodInfo()
+        prediction_info = MobilePredictionInfo()
+        if cycle_data:
+            # Fill in information about the current cycle
+            cycle_start_date = safe_date(cycle_data.cycle_start_date)
+            if cycle_start_date:
+                period_info.current_cycle_start = cycle_start_date.isoformat()
+            cycle_length = safe_int(cycle_data.cycle_length)
+            if cycle_length > 0:
+                period_info.cycle_length = cycle_length
+            avg_cycle_length = safe_int(cycle_data.avg_cycle_length)
+            if avg_cycle_length > 0:
+                period_info.average_cycle_length = avg_cycle_length
+            # If period information exists, compute the end date
+            period_length = safe_int(cycle_data.period_length)
+            if period_length > 0 and cycle_start_date:
+                end_date_value = cycle_start_date + timedelta(days=period_length)
+                period_info.expected_period_end = end_date_value.isoformat()
+            # Fill in the fertile window and ovulation information
+            if cycle_start_date:
+                ovulation_day = avg_cycle_length // 2
+                ovulation_date = cycle_start_date + timedelta(days=ovulation_day)
+                period_info.ovulation_date = ovulation_date.isoformat()
+                # The fertile window starts 5 days before ovulation and ends 1 day after
+                period_info.fertility_window_start = (ovulation_date - timedelta(days=5)).isoformat()
+                period_info.fertility_window_end = (ovulation_date + timedelta(days=1)).isoformat()
+            # Fill in the prediction
+            next_period = safe_date(cycle_data.next_period_predicted)
+            if next_period:
+                prediction_info.next_period_date = next_period.isoformat()
+                prediction_info.confidence_level = 80  # Approximate confidence value
+                # Compute the next fertile window and ovulation
+                if avg_cycle_length > 0:
+                    next_ovulation = next_period - timedelta(days=avg_cycle_length // 2)
+                    prediction_info.next_ovulation_date = next_ovulation.isoformat()
+                    prediction_info.next_fertile_window_start = (next_ovulation - timedelta(days=5)).isoformat()
+                    prediction_info.next_fertile_window_end = (next_ovulation + timedelta(days=1)).isoformat()
+        # Assemble the full response
+        response = MobileCalendarResponse(
+            entries=mobile_entries,
+            period_info=period_info,
+            prediction=prediction_info
+        )
+        return response
+    except Exception as e:
+        logging.error(f"Error fetching calendar entries: {str(e)}")
+        raise HTTPException(status_code=500, detail=f"Server error: {str(e)}")
 if __name__ == "__main__":
     import uvicorn
+    from .mobile_responses import (
+        MobileCalendarEntryResponse,
+        MobileCalendarPeriodInfo,
+        MobilePredictionInfo,
+        MobileCalendarResponse
+    )
+    from .schemas_mobile import MobileFlowIntensity, MobileMood
     uvicorn.run(app, host="0.0.0.0", port=8004)


@@ -0,0 +1,47 @@
from datetime import date, datetime
from typing import List, Dict, Optional, Any
from uuid import UUID
from pydantic import BaseModel, Field

class MobileCalendarEntryResponse(BaseModel):
    """Response format for the mobile app"""
    id: int
    uuid: str
    date: str  # ISO date format YYYY-MM-DD
    type: str  # Entry type in the mobile app format (MENSTRUATION, OVULATION, etc.)
    flow_intensity: Optional[int] = None  # Intensity scale 1-5
    mood: Optional[str] = None
    symptoms: List[str] = Field(default_factory=list)
    notes: Optional[str] = None
    created_at: str  # ISO date-time format
    is_predicted: bool = False

class MobileCalendarPeriodInfo(BaseModel):
    """Current cycle information for the mobile app"""
    current_cycle_start: Optional[str] = None  # ISO date format YYYY-MM-DD
    expected_period_end: Optional[str] = None  # ISO date format YYYY-MM-DD
    cycle_length: Optional[int] = None
    period_length: Optional[int] = None
    average_cycle_length: Optional[int] = None
    fertility_window_start: Optional[str] = None  # ISO date format YYYY-MM-DD
    fertility_window_end: Optional[str] = None  # ISO date format YYYY-MM-DD
    ovulation_date: Optional[str] = None  # ISO date format YYYY-MM-DD

class MobilePredictionInfo(BaseModel):
    """Prediction information for the mobile app"""
    next_period_date: Optional[str] = None  # ISO date format YYYY-MM-DD
    confidence_level: Optional[int] = None  # Confidence level 0-100
    next_fertile_window_start: Optional[str] = None  # ISO date format YYYY-MM-DD
    next_fertile_window_end: Optional[str] = None  # ISO date format YYYY-MM-DD
    next_ovulation_date: Optional[str] = None  # ISO date format YYYY-MM-DD

class MobileCalendarResponse(BaseModel):
    """Full response for the mobile app"""
    entries: List[MobileCalendarEntryResponse]
    period_info: MobileCalendarPeriodInfo = Field(default_factory=MobileCalendarPeriodInfo)
    prediction: MobilePredictionInfo = Field(default_factory=MobilePredictionInfo)
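As a quick illustration of how these models compose (values invented; assumes the classes above are in scope):
```python
# Hypothetical usage: build and serialize a response (values are invented).
entry = MobileCalendarEntryResponse(
    id=1,
    uuid="1",
    date="2025-09-28",
    type="MENSTRUATION",
    flow_intensity=5,
    symptoms=["FATIGUE", "HEADACHE"],
    created_at="2025-09-28T16:21:30",
)
payload = MobileCalendarResponse(entries=[entry])
# period_info and prediction default to empty objects
print(payload.model_dump_json(indent=2))
```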


@@ -1,6 +1,6 @@
 import uuid
-from sqlalchemy import Boolean, Column, Date, ForeignKey, Integer, String, Text
+from sqlalchemy import Boolean, Column, Date, Integer, String, Text
 from sqlalchemy.dialects.postgresql import UUID
 from shared.database import BaseModel
@@ -10,7 +10,7 @@ class CalendarEntry(BaseModel):
     __tablename__ = "calendar_entries"
     uuid = Column(UUID(as_uuid=True), default=uuid.uuid4, unique=True, index=True)
-    user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)
+    user_id = Column(Integer, nullable=False, index=True)  # ForeignKey removed
     entry_date = Column(Date, nullable=False, index=True)
     entry_type = Column(
@@ -42,7 +42,7 @@ class CalendarEntry(BaseModel):
 class CycleData(BaseModel):
     __tablename__ = "cycle_data"
-    user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)
+    user_id = Column(Integer, nullable=False, index=True)  # ForeignKey removed
     cycle_start_date = Column(Date, nullable=False)
     cycle_length = Column(Integer)  # Length of this cycle
     period_length = Column(Integer)  # Length of period in this cycle
@@ -65,7 +65,7 @@ class CycleData(BaseModel):
 class HealthInsights(BaseModel):
     __tablename__ = "health_insights"
-    user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)
+    user_id = Column(Integer, nullable=False, index=True)  # ForeignKey removed
     insight_type = Column(
         String(50), nullable=False
     )  # cycle_pattern, symptom_pattern, etc.


@@ -1,8 +1,10 @@
 from datetime import date, datetime
 from enum import Enum
-from typing import List, Optional, Union
+from typing import List, Optional, Union, Any
+import uuid
+from uuid import UUID
-from pydantic import BaseModel, Field
+from pydantic import BaseModel, Field, field_serializer, ConfigDict

 class EntryType(str, Enum):
@@ -175,29 +177,36 @@ class CalendarEntryCreate(CalendarEntryBase):

 class CalendarEntryResponse(BaseModel):
     id: int
-    uuid: str
+    uuid: UUID  # Use UUID from the uuid module
     entry_date: date
     entry_type: str
-    flow_intensity: Optional[str]
-    period_symptoms: Optional[str]
-    mood: Optional[str]
-    energy_level: Optional[int]
-    sleep_hours: Optional[int]
-    symptoms: Optional[str]
-    medications: Optional[str]
-    notes: Optional[str]
+    flow_intensity: Optional[str] = None
+    period_symptoms: Optional[str] = None
+    mood: Optional[str] = None
+    energy_level: Optional[int] = None
+    sleep_hours: Optional[int] = None
+    symptoms: Optional[str] = None
+    medications: Optional[str] = None
+    notes: Optional[str] = None
     is_predicted: bool
-    confidence_score: Optional[int]
+    confidence_score: Optional[int] = None
     created_at: datetime
+    updated_at: Optional[datetime] = None
+    is_active: Optional[bool] = None
+    user_id: Optional[int] = None

-    class Config:
-        from_attributes = True
+    model_config = ConfigDict(from_attributes=True)
+
+    @field_serializer('uuid')
+    def serialize_uuid(self, uuid_val: UUID) -> str:
+        """Convert the UUID to a string for JSON serialization"""
+        return str(uuid_val)

 # Response model for the mobile app
 class CalendarEvent(BaseModel):
     id: int
-    uuid: str
+    uuid: UUID  # Use the UUID type from the uuid module
     date: date
     type: str
     flow_intensity: Optional[int] = None
@@ -206,6 +215,8 @@ class CalendarEvent(BaseModel):
     notes: Optional[str] = None
     created_at: datetime
+
+    model_config = ConfigDict(from_attributes=True)

     @classmethod
     def from_server_response(cls, entry: CalendarEntryResponse) -> 'CalendarEvent':
         """Convert from the server model to the mobile app model"""


@@ -0,0 +1,103 @@
"""Утилиты для работы с SQLAlchemy моделями"""
from datetime import date, datetime
from typing import Any, Optional, TypeVar, Type, cast
T = TypeVar('T')
def safe_get_column_value(obj: Any, column_name: str, default_value: Optional[T] = None) -> Optional[T]:
"""
Безопасное получение значения колонки из SQLAlchemy модели.
Args:
obj: Объект SQLAlchemy модели
column_name: Имя колонки
default_value: Значение по умолчанию, если колонка отсутствует или значение None
Returns:
Значение колонки или значение по умолчанию
"""
if not hasattr(obj, column_name):
return default_value
value = getattr(obj, column_name)
if value is None:
return default_value
# Если значение - дата, преобразуем его в Python date
if hasattr(value, 'isoformat'): # Дата или дата-время
return cast(T, value)
# Для других типов просто возвращаем значение
return cast(T, value)
def safe_int(value: Any, default: int = 0) -> int:
"""Безопасно преобразует значение в целое число"""
if value is None:
return default
try:
return int(value)
except (ValueError, TypeError):
return default
def safe_str(value: Any, default: str = "") -> str:
"""Безопасно преобразует значение в строку"""
if value is None:
return default
try:
return str(value)
except (ValueError, TypeError):
return default
def safe_bool(value: Any, default: bool = False) -> bool:
"""Безопасно преобразует значение в булево значение"""
if value is None:
return default
if isinstance(value, bool):
return value
try:
return bool(value)
except (ValueError, TypeError):
return default
def safe_date(value: Any, default: Optional[date] = None) -> date:
"""Безопасно преобразует значение в дату"""
if default is None:
default = date.today()
if value is None:
return default
if isinstance(value, date) and not isinstance(value, datetime):
return value
if isinstance(value, datetime):
return value.date()
try:
return date.fromisoformat(str(value))
except (ValueError, TypeError):
return default
def safe_datetime(value: Any, default: Optional[datetime] = None) -> datetime:
"""Безопасно преобразует значение в дату-время"""
if default is None:
default = datetime.now()
if value is None:
return default
if isinstance(value, datetime):
return value
if isinstance(value, date):
return datetime.combine(value, datetime.min.time())
try:
return datetime.fromisoformat(str(value))
except (ValueError, TypeError):
return default
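A quick illustration of the intended coercion behavior (assumes the helpers above are in scope):
```python
# Illustrative checks of the coercion helpers; values are invented.
from datetime import datetime

assert safe_int("3") == 3
assert safe_int(None, default=-1) == -1
assert safe_str(None) == ""
assert safe_bool(1) is True
assert safe_date("2025-09-28").isoformat() == "2025-09-28"
# An unparseable value falls back to the supplied default:
assert safe_datetime("not a timestamp", default=datetime(2025, 1, 1)) == datetime(2025, 1, 1)
```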


@@ -44,7 +44,8 @@ def get_password_hash(password: str) -> str:
         # bcrypt has a 72-byte limit, so truncate if necessary
         password_bytes = password.encode('utf-8')
         if len(password_bytes) > 72:
-            password = password_bytes[:72].decode('utf-8', errors='ignore')
+            logging.warning("Password exceeds bcrypt limit of 72 bytes. Truncating.")
+            password = password_bytes[:70].decode('utf-8', errors='ignore')
         return pwd_context.hash(password)
     except Exception as e:
         # Handle bcrypt compatibility issues
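For context on the truncation: a byte-level cut can land in the middle of a multibyte UTF-8 character, and `errors='ignore'` then silently drops the partial sequence. A standalone sketch of the effect (not project code):
```python
# Standalone illustration: truncating UTF-8 bytes can split a character.
s = "a" * 71 + "é"               # 'é' encodes to 2 bytes, so 73 bytes total
b = s.encode("utf-8")
print(len(b))                    # 73 — over bcrypt's 72-byte limit
print(b[:72].decode("utf-8", errors="ignore"))  # trailing partial 'é' is dropped
```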

start_all_services.sh Executable file

@@ -0,0 +1,103 @@
#!/bin/bash
# Script to start the Women Safety App microservice architecture
# Starts all required microservices and the API Gateway

# Colors for output
GREEN="\033[0;32m"
YELLOW="\033[0;33m"
BLUE="\033[0;34m"
RED="\033[0;31m"
NC="\033[0m" # No Color

# Change to the project directory
cd "$(dirname "$0")" || { echo -e "${RED}Failed to change to the project directory${NC}"; exit 1; }

# Activate the virtual environment
echo -e "${YELLOW}Activating the virtual environment...${NC}"
source venv/bin/activate

# Set the PYTHONPATH variable
export PYTHONPATH="${PWD}:${PYTHONPATH}"

# Check whether a port is in use
check_port() {
    local port=$1
    if lsof -Pi :$port -sTCP:LISTEN -t >/dev/null ; then
        return 0 # port is in use
    else
        return 1 # port is free
    fi
}

# Create the migration script
echo -e "${YELLOW}Creating the database migration script...${NC}"
cat > migrate_db.py << 'EOF'
import asyncio

async def run_migrations():
    from shared.database import init_db
    print("✅ Database migrations started")
    await init_db()
    print("✅ Database migrations completed")

if __name__ == "__main__":
    asyncio.run(run_migrations())
EOF

# Run the migration
echo -e "${YELLOW}Running database migrations...${NC}"
python migrate_db.py

# Start the microservices in the background
echo -e "${YELLOW}Starting microservices...${NC}"

# List of services to start
services=(
    "user_service:8001"
    "emergency_service:8002"
    "location_service:8003"
    "calendar_service:8004"
    "notification_service:8005"
)

# Start the services in the background
for service in "${services[@]}"; do
    name=${service%%:*}
    port=${service#*:}
    if check_port $port; then
        echo -e "${YELLOW}Port $port is already in use, skipping $name${NC}"
        continue
    fi
    echo -e "${BLUE}Starting $name on port $port...${NC}"
    python -m uvicorn services.${name}.main:app --host 0.0.0.0 --port $port &
    # Save the process PID
    echo $! > /tmp/${name}.pid
    # Wait a moment so the service can start
    sleep 2
done

# Verify that all services are running
echo -e "${YELLOW}Checking service status...${NC}"
for service in "${services[@]}"; do
    name=${service%%:*}
    port=${service#*:}
    if check_port $port; then
        echo -e "${GREEN}✅ $name is running on port $port${NC}"
    else
        echo -e "${RED}❌ $name is NOT running on port $port${NC}"
    fi
done

# Get the IP address for access from the mobile app
EXTERNAL_IP=$(hostname -I | awk '{print $1}')
echo -e "${GREEN}📱 IP address for access from the mobile app: ${EXTERNAL_IP}:8000${NC}"

# Start the API Gateway
echo -e "${GREEN}Starting the API Gateway on port 8000...${NC}"
python -m uvicorn services.api_gateway.main:app --host 0.0.0.0 --port 8000

stop_all_services.sh Executable file

@@ -0,0 +1,39 @@
#!/bin/bash
# Script to stop all Women Safety App microservices

# Colors for output
GREEN="\033[0;32m"
YELLOW="\033[0;33m"
RED="\033[0;31m"
NC="\033[0m" # No Color

# Check whether the services are running
services=("user_service" "emergency_service" "location_service" "calendar_service" "notification_service")
for service in "${services[@]}"; do
    pid_file="/tmp/${service}.pid"
    if [ -f "$pid_file" ]; then
        pid=$(cat "$pid_file")
        if ps -p "$pid" > /dev/null; then
            echo -e "${YELLOW}Stopping ${service}...${NC}"
            kill "$pid"
            rm "$pid_file"
        else
            echo -e "${YELLOW}${service} is not running or its PID was not found${NC}"
            rm "$pid_file"
        fi
    else
        echo -e "${YELLOW}No PID file found for ${service}${NC}"
    fi
done

# Stop the API Gateway
gateway_pid=$(lsof -t -i:8000 2>/dev/null)
if [ -n "$gateway_pid" ]; then
    echo -e "${YELLOW}Stopping the API Gateway...${NC}"
    kill "$gateway_pid"
fi

echo -e "${GREEN}All services stopped${NC}"


@@ -1,4 +1,4 @@
-#!/home/trevor/dev/chat/venv/bin/python3.12
+#!/home/trevor/dev/chat/venv/bin/python
 # -*- coding: utf-8 -*-
 import re
 import sys


@@ -1,4 +1,4 @@
-#!/home/trevor/dev/chat/venv/bin/python3.12
+#!/home/trevor/dev/chat/venv/bin/python
 # -*- coding: utf-8 -*-
 import re
 import sys


@@ -1,4 +1,4 @@
-#!/home/trevor/dev/chat/venv/bin/python3.12
+#!/home/trevor/dev/chat/venv/bin/python
 # -*- coding: utf-8 -*-
 import re
 import sys

venv/bin/fastapi Executable file

@@ -0,0 +1,8 @@
#!/home/trevor/dev/chat/venv/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from fastapi.cli import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())


@@ -1,4 +1,4 @@
-#!/home/trevor/dev/chat/venv/bin/python3.12
+#!/home/trevor/dev/chat/venv/bin/python
 # -*- coding: utf-8 -*-
 import re
 import sys


@@ -1,4 +1,4 @@
-#!/home/trevor/dev/chat/venv/bin/python3.12
+#!/home/trevor/dev/chat/venv/bin/python
 # -*- coding: utf-8 -*-
 import re
 import sys


@@ -1,47 +1,45 @@
 Metadata-Version: 2.1
 Name: PyJWT
-Version: 2.8.0
+Version: 2.10.1
 Summary: JSON Web Token implementation in Python
-Home-page: https://github.com/jpadilla/pyjwt
-Author: Jose Padilla
-Author-email: hello@jpadilla.com
+Author-email: Jose Padilla <hello@jpadilla.com>
 License: MIT
+Project-URL: Homepage, https://github.com/jpadilla/pyjwt
 Keywords: json,jwt,security,signing,token,web
 Classifier: Development Status :: 5 - Production/Stable
 Classifier: Intended Audience :: Developers
-Classifier: Natural Language :: English
 Classifier: License :: OSI Approved :: MIT License
+Classifier: Natural Language :: English
 Classifier: Programming Language :: Python
 Classifier: Programming Language :: Python :: 3
 Classifier: Programming Language :: Python :: 3 :: Only
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
 Classifier: Programming Language :: Python :: 3.9
 Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
 Classifier: Topic :: Utilities
-Requires-Python: >=3.7
+Requires-Python: >=3.9
 Description-Content-Type: text/x-rst
 License-File: LICENSE
 License-File: AUTHORS.rst
-Requires-Dist: typing-extensions ; python_version <= "3.7"
 Provides-Extra: crypto
-Requires-Dist: cryptography (>=3.4.0) ; extra == 'crypto'
+Requires-Dist: cryptography>=3.4.0; extra == "crypto"
 Provides-Extra: dev
-Requires-Dist: sphinx (<5.0.0,>=4.5.0) ; extra == 'dev'
-Requires-Dist: sphinx-rtd-theme ; extra == 'dev'
-Requires-Dist: zope.interface ; extra == 'dev'
-Requires-Dist: cryptography (>=3.4.0) ; extra == 'dev'
-Requires-Dist: pytest (<7.0.0,>=6.0.0) ; extra == 'dev'
-Requires-Dist: coverage[toml] (==5.0.4) ; extra == 'dev'
-Requires-Dist: pre-commit ; extra == 'dev'
+Requires-Dist: coverage[toml]==5.0.4; extra == "dev"
+Requires-Dist: cryptography>=3.4.0; extra == "dev"
+Requires-Dist: pre-commit; extra == "dev"
+Requires-Dist: pytest<7.0.0,>=6.0.0; extra == "dev"
+Requires-Dist: sphinx; extra == "dev"
+Requires-Dist: sphinx-rtd-theme; extra == "dev"
+Requires-Dist: zope.interface; extra == "dev"
 Provides-Extra: docs
-Requires-Dist: sphinx (<5.0.0,>=4.5.0) ; extra == 'docs'
-Requires-Dist: sphinx-rtd-theme ; extra == 'docs'
-Requires-Dist: zope.interface ; extra == 'docs'
+Requires-Dist: sphinx; extra == "docs"
+Requires-Dist: sphinx-rtd-theme; extra == "docs"
+Requires-Dist: zope.interface; extra == "docs"
 Provides-Extra: tests
-Requires-Dist: pytest (<7.0.0,>=6.0.0) ; extra == 'tests'
-Requires-Dist: coverage[toml] (==5.0.4) ; extra == 'tests'
+Requires-Dist: coverage[toml]==5.0.4; extra == "tests"
+Requires-Dist: pytest<7.0.0,>=6.0.0; extra == "tests"
 PyJWT
 =====
@@ -63,11 +61,12 @@ A Python implementation of `RFC 7519 <https://tools.ietf.org/html/rfc7519>`_. Or
 Sponsor
 -------
-+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------+
-| |auth0-logo| | If you want to quickly add secure token-based authentication to Python projects, feel free to check Auth0's Python SDK and free plan at `auth0.com/developers <https://auth0.com/developers?utm_source=GHsponsor&utm_medium=GHsponsor&utm_campaign=pyjwt&utm_content=auth>`_. |
-+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------+
-.. |auth0-logo| image:: https://user-images.githubusercontent.com/83319/31722733-de95bbde-b3ea-11e7-96bf-4f4e8f915588.png
+.. |auth0-logo| image:: https://github.com/user-attachments/assets/ee98379e-ee76-4bcb-943a-e25c4ea6d174
+   :width: 160px
++--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------+
+| |auth0-logo| | If you want to quickly add secure token-based authentication to Python projects, feel free to check Auth0's Python SDK and free plan at `auth0.com/signup <https://auth0.com/signup?utm_source=external_sites&utm_medium=pyjwt&utm_campaign=devn_signup>`_. |
++--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------+
 Installing
 ----------

@@ -0,0 +1,33 @@
PyJWT-2.10.1.dist-info/AUTHORS.rst,sha256=klzkNGECnu2_VY7At89_xLBF3vUSDruXk3xwgUBxzwc,322
PyJWT-2.10.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
PyJWT-2.10.1.dist-info/LICENSE,sha256=eXp6ICMdTEM-nxkR2xcx0GtYKLmPSZgZoDT3wPVvXOU,1085
PyJWT-2.10.1.dist-info/METADATA,sha256=EkewF6D6KU8SGaaQzVYfxUUU1P_gs_dp1pYTkoYvAx8,3990
PyJWT-2.10.1.dist-info/RECORD,,
PyJWT-2.10.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
PyJWT-2.10.1.dist-info/WHEEL,sha256=PZUExdf71Ui_so67QXpySuHtCi3-J3wvF4ORK6k_S8U,91
PyJWT-2.10.1.dist-info/top_level.txt,sha256=RP5DHNyJbMq2ka0FmfTgoSaQzh7e3r5XuCWCO8a00k8,4
jwt/__init__.py,sha256=VB2vFKuboTjcDGeZ8r-UqK_dz3NsQSQEqySSICby8Xg,1711
jwt/__pycache__/__init__.cpython-312.pyc,,
jwt/__pycache__/algorithms.cpython-312.pyc,,
jwt/__pycache__/api_jwk.cpython-312.pyc,,
jwt/__pycache__/api_jws.cpython-312.pyc,,
jwt/__pycache__/api_jwt.cpython-312.pyc,,
jwt/__pycache__/exceptions.cpython-312.pyc,,
jwt/__pycache__/help.cpython-312.pyc,,
jwt/__pycache__/jwk_set_cache.cpython-312.pyc,,
jwt/__pycache__/jwks_client.cpython-312.pyc,,
jwt/__pycache__/types.cpython-312.pyc,,
jwt/__pycache__/utils.cpython-312.pyc,,
jwt/__pycache__/warnings.cpython-312.pyc,,
jwt/algorithms.py,sha256=cKr-XEioe0mBtqJMCaHEswqVOA1Z8Purt5Sb3Bi-5BE,30409
jwt/api_jwk.py,sha256=6F1r7rmm8V5qEnBKA_xMjS9R7VoANe1_BL1oD2FrAjE,4451
jwt/api_jws.py,sha256=aM8vzqQf6mRrAw7bRy-Moj_pjWsKSVQyYK896AfMjJU,11762
jwt/api_jwt.py,sha256=OGT4hok1l5A6FH_KdcrU5g6u6EQ8B7em0r9kGM9SYgA,14512
jwt/exceptions.py,sha256=bUIOJ-v9tjopTLS-FYOTc3kFx5WP5IZt7ksN_HE1G9Q,1211
jwt/help.py,sha256=vFdNzjQoAch04XCMYpCkyB2blaqHAGAqQrtf9nSPkdk,1808
jwt/jwk_set_cache.py,sha256=hBKmN-giU7-G37L_XKgc_OZu2ah4wdbj1ZNG_GkoSE8,959
jwt/jwks_client.py,sha256=p9b-IbQqo2tEge9Zit3oSPBFNePqwho96VLbnUrHUWs,4259
jwt/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
jwt/types.py,sha256=VnhGv_VFu5a7_mrPoSCB7HaNLrJdhM8Sq1sSfEg0gLU,99
jwt/utils.py,sha256=hxOjvDBheBYhz-RIPiEz7Q88dSUSTMzEdKE_Ww2VdJw,3640
jwt/warnings.py,sha256=50XWOnyNsIaqzUJTk6XHNiIDykiL763GYA92MjTKmok,59


@@ -1,5 +1,5 @@
 Wheel-Version: 1.0
-Generator: bdist_wheel (0.40.0)
+Generator: setuptools (75.6.0)
 Root-Is-Purelib: true
 Tag: py3-none-any


@@ -1,33 +0,0 @@
PyJWT-2.8.0.dist-info/AUTHORS.rst,sha256=klzkNGECnu2_VY7At89_xLBF3vUSDruXk3xwgUBxzwc,322
PyJWT-2.8.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
PyJWT-2.8.0.dist-info/LICENSE,sha256=eXp6ICMdTEM-nxkR2xcx0GtYKLmPSZgZoDT3wPVvXOU,1085
PyJWT-2.8.0.dist-info/METADATA,sha256=pV2XZjvithGcVesLHWAv0J4T5t8Qc66fip2sbxwoz1o,4160
PyJWT-2.8.0.dist-info/RECORD,,
PyJWT-2.8.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
PyJWT-2.8.0.dist-info/WHEEL,sha256=pkctZYzUS4AYVn6dJ-7367OJZivF2e8RA9b_ZBjif18,92
PyJWT-2.8.0.dist-info/top_level.txt,sha256=RP5DHNyJbMq2ka0FmfTgoSaQzh7e3r5XuCWCO8a00k8,4
jwt/__init__.py,sha256=mV9lg6n4-0xiqCKaE1eEPC9a4j6sEkEYQcKghULE7kU,1670
jwt/__pycache__/__init__.cpython-312.pyc,,
jwt/__pycache__/algorithms.cpython-312.pyc,,
jwt/__pycache__/api_jwk.cpython-312.pyc,,
jwt/__pycache__/api_jws.cpython-312.pyc,,
jwt/__pycache__/api_jwt.cpython-312.pyc,,
jwt/__pycache__/exceptions.cpython-312.pyc,,
jwt/__pycache__/help.cpython-312.pyc,,
jwt/__pycache__/jwk_set_cache.cpython-312.pyc,,
jwt/__pycache__/jwks_client.cpython-312.pyc,,
jwt/__pycache__/types.cpython-312.pyc,,
jwt/__pycache__/utils.cpython-312.pyc,,
jwt/__pycache__/warnings.cpython-312.pyc,,
jwt/algorithms.py,sha256=RDsv5Lm3bzwsiWT3TynT7JR41R6H6s_fWUGOIqd9x_I,29800
jwt/api_jwk.py,sha256=HPxVqgBZm7RTaEXydciNBCuYNKDYOC_prTdaN9toGbo,4196
jwt/api_jws.py,sha256=da17RrDe0PDccTbx3rx2lLezEG_c_YGw_vVHa335IOk,11099
jwt/api_jwt.py,sha256=yF9DwF1kt3PA5n_TiU0OmHd0LtPHfe4JCE1XOfKPjw0,12638
jwt/exceptions.py,sha256=KDC3M7cTrpR4OQXVURlVMThem0pfANSgBxRz-ttivmo,1046
jwt/help.py,sha256=Jrp84fG43sCwmSIaDtY08I6ZR2VE7NhrTff89tYSE40,1749
jwt/jwk_set_cache.py,sha256=hBKmN-giU7-G37L_XKgc_OZu2ah4wdbj1ZNG_GkoSE8,959
jwt/jwks_client.py,sha256=9W8JVyGByQgoLbBN1u5iY1_jlgfnnukeOBTpqaM_9SE,4222
jwt/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
jwt/types.py,sha256=VnhGv_VFu5a7_mrPoSCB7HaNLrJdhM8Sq1sSfEg0gLU,99
jwt/utils.py,sha256=PAI05_8MHQCxWQTDlwN0hTtTIT2DTTZ28mm1x6-26UY,3903
jwt/warnings.py,sha256=50XWOnyNsIaqzUJTk6XHNiIDykiL763GYA92MjTKmok,59


@@ -1,530 +0,0 @@
SQLAlchemy-2.0.23.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
SQLAlchemy-2.0.23.dist-info/LICENSE,sha256=2lSTeluT1aC-5eJXO8vhkzf93qCSeV_mFXLrv3tNdIU,1100
SQLAlchemy-2.0.23.dist-info/METADATA,sha256=znDChLueFNPCOPuNix-FfY7FG6aQOCM-lQwwN-cPLQs,9551
SQLAlchemy-2.0.23.dist-info/RECORD,,
SQLAlchemy-2.0.23.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
SQLAlchemy-2.0.23.dist-info/WHEEL,sha256=JmQLNqDEfvnYMfsIaVeSP3fmUcYDwmF12m3QYW0c7QQ,152
SQLAlchemy-2.0.23.dist-info/top_level.txt,sha256=rp-ZgB7D8G11ivXON5VGPjupT1voYmWqkciDt5Uaw_Q,11
sqlalchemy/__init__.py,sha256=DjKCAltzrHGfaVdXVeFJpBmTaX6JmyloHANzewBUWo4,12708
sqlalchemy/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/__pycache__/events.cpython-312.pyc,,
sqlalchemy/__pycache__/exc.cpython-312.pyc,,
sqlalchemy/__pycache__/inspection.cpython-312.pyc,,
sqlalchemy/__pycache__/log.cpython-312.pyc,,
sqlalchemy/__pycache__/schema.cpython-312.pyc,,
sqlalchemy/__pycache__/types.cpython-312.pyc,,
sqlalchemy/connectors/__init__.py,sha256=uKUYWQoXyleIyjWBuh7gzgnazJokx3DaasKJbFOfQGA,476
sqlalchemy/connectors/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/connectors/__pycache__/aioodbc.cpython-312.pyc,,
sqlalchemy/connectors/__pycache__/asyncio.cpython-312.pyc,,
sqlalchemy/connectors/__pycache__/pyodbc.cpython-312.pyc,,
sqlalchemy/connectors/aioodbc.py,sha256=QiafuN9bx_wcIs8tByLftTmGAegXPoFPwUaxCDU_ZQA,5737
sqlalchemy/connectors/asyncio.py,sha256=ZZmJSFT50u-GEjZzytQOdB_tkBFxi3XPWRrNhs_nASc,6139
sqlalchemy/connectors/pyodbc.py,sha256=NskMydn26ZkHL8aQ1V3L4WIAWin3zwJ5VEnlHvAD1DE,8453
sqlalchemy/cyextension/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/cyextension/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/cyextension/collections.cpython-312-x86_64-linux-gnu.so,sha256=qPSMnyXVSLYHMr_ot_ZK7yEYadhTuT8ryb6eTMFFWrM,1947440
sqlalchemy/cyextension/collections.pyx,sha256=KDI5QTOyYz9gDl-3d7MbGMA0Kc-wxpJqnLmCaUmQy2U,12323
sqlalchemy/cyextension/immutabledict.cpython-312-x86_64-linux-gnu.so,sha256=J9m0gK6R8PGR36jxAKx415VxX0-0fqvbQAP9-DDU1qA,811232
sqlalchemy/cyextension/immutabledict.pxd,sha256=oc8BbnQwDg7pWAdThB-fzu8s9_ViOe1Ds-8T0r0POjI,41
sqlalchemy/cyextension/immutabledict.pyx,sha256=aQJPZKjcqbO8jHDqpC9F-v-ew2qAjUscc5CntaheZUk,3285
sqlalchemy/cyextension/processors.cpython-312-x86_64-linux-gnu.so,sha256=WOLcEWRNXn4UtJGhzF5B1h7JpPPfn-ziQMT0lkhobQE,533968
sqlalchemy/cyextension/processors.pyx,sha256=0swFIBdR19x1kPRe-dijBaLW898AhH6QJizbv4ho9pk,1545
sqlalchemy/cyextension/resultproxy.cpython-312-x86_64-linux-gnu.so,sha256=bte73oURZXuV7YvkjyGo-OjRCnSgYukqDp5KM9-Z8xY,626112
sqlalchemy/cyextension/resultproxy.pyx,sha256=cDtMjLTdC47g7cME369NSOCck3JwG2jwZ6j25no3_gw,2477
sqlalchemy/cyextension/util.cpython-312-x86_64-linux-gnu.so,sha256=8yMbb069NQN1b6yAsCBCMpbX94sH4iLs61vPNxd0bOg,958760
sqlalchemy/cyextension/util.pyx,sha256=lv03p63oVn23jLhMI4_RYGewUnJfh-4FkrNMEFL7A3Y,2289
sqlalchemy/dialects/__init__.py,sha256=hLsgIEomunlp4mNLnvjCQTLOnBVva8N7IT2-RYrN2_4,1770
sqlalchemy/dialects/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/__pycache__/_typing.cpython-312.pyc,,
sqlalchemy/dialects/_typing.py,sha256=P2ML2o4b_bWAAy3zbdoUjx3vXsMNwpiOblef8ThCxlM,648
sqlalchemy/dialects/mssql/__init__.py,sha256=CYbbydyMSLjUq8vY1siNStd4lvjVXod8ddeDS6ELHLk,1871
sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/aioodbc.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/base.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/json.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-312.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-312.pyc,,
sqlalchemy/dialects/mssql/aioodbc.py,sha256=ncj3yyfvW91o3g19GB5s1I0oaZKUO_P-R2nwnLF0t9E,2013
sqlalchemy/dialects/mssql/base.py,sha256=l9vX6fK6DJEYA00N9uDnvSbqfgvxXfYUn2C4AF5T920,133649
sqlalchemy/dialects/mssql/information_schema.py,sha256=ll0zAupJ4cPvhi9v5hTi7PQLU1lae4o6eQ5Vg7gykXQ,8074
sqlalchemy/dialects/mssql/json.py,sha256=B0m6H08CKuk-yomDHcCwfQbVuVN2WLufuVueA_qb1NQ,4573
sqlalchemy/dialects/mssql/provision.py,sha256=x7XRSQDxz4jz2uIpqwhuIXpL9bic0Vw7Mhy39HOkyqY,5013
sqlalchemy/dialects/mssql/pymssql.py,sha256=BfJp9t-IQabqWXySJBmP9pwNTWnJqbjA2jJM9M4XeWc,4029
sqlalchemy/dialects/mssql/pyodbc.py,sha256=qwZ8ByOTZ1WObjxeOravoJBSBX-s4RJ_PZ5VJ_Ch5Ws,27048
sqlalchemy/dialects/mysql/__init__.py,sha256=btLABiNnmbWt9ziW-XgVWEB1qHWQcSFz7zxZNw4m_LY,2144
sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/aiomysql.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/asyncmy.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/base.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/dml.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/enumerated.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/expression.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/json.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mariadb.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mariadbconnector.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/reflection.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/reserved_words.cpython-312.pyc,,
sqlalchemy/dialects/mysql/__pycache__/types.cpython-312.pyc,,
sqlalchemy/dialects/mysql/aiomysql.py,sha256=Zb-_F9Pzl0t-fT1bZwbNNne6jjCUqBXxeizbhMFPqls,9750
sqlalchemy/dialects/mysql/asyncmy.py,sha256=zqupDz7AJihjv3E8w_4XAtq95d8stdrETNx60MLNVr0,9819
sqlalchemy/dialects/mysql/base.py,sha256=q-DzkR_txwDTeWTEByzHAoIArYU3Bb5HT2Bnmuw7WIM,120688
sqlalchemy/dialects/mysql/cymysql.py,sha256=5CQVJAlqQ3pT4IDGSQJH2hCzj-EWjUitA21MLqJwEEs,2291
sqlalchemy/dialects/mysql/dml.py,sha256=qw0ZweHbMsbNyVSfC17HqylCnf7XAuIjtgofiWABT8k,7636
sqlalchemy/dialects/mysql/enumerated.py,sha256=1L2J2wT6nQEmRS4z-jzZpoi44IqIaHgBRZZB9m55czo,8439
sqlalchemy/dialects/mysql/expression.py,sha256=WW5G2XPwqJfXjuzHBt4BRP0pCLcPJkPD1mvZX1g0JL0,4066
sqlalchemy/dialects/mysql/json.py,sha256=JlSFBAHhJ9JmV-3azH80xkLgeh7g6A6DVyNVCNZiKPU,2260
sqlalchemy/dialects/mysql/mariadb.py,sha256=Sugyngvo6j6SfFFuJ23rYeFWEPdZ9Ji9guElsk_1WSQ,844
sqlalchemy/dialects/mysql/mariadbconnector.py,sha256=F1VPosecC1hDZqjzZI29j4GUduyU4ewPwb-ekBQva5w,8725
sqlalchemy/dialects/mysql/mysqlconnector.py,sha256=5glmkPhD_KP-Mci8ZXBr4yzqH1MDfzCJ9F_kZNyXcGo,5666
sqlalchemy/dialects/mysql/mysqldb.py,sha256=R5BDiXiHX5oFuAOzyxZ6TYUTGzly-dulMeQLkeia6kk,9649
sqlalchemy/dialects/mysql/provision.py,sha256=uPT6-BIoP_12XLmWAza1TDFNhOVVJ3rmQoMH7nvh-Vg,3226
sqlalchemy/dialects/mysql/pymysql.py,sha256=d2-00IPoyEDkR9REQTE-DGEQrGshUq_0G5liZ5FiSEM,4032
sqlalchemy/dialects/mysql/pyodbc.py,sha256=mkOvumrxpmAi6noZlkaTVKz2F7G5vLh2vx0cZSn9VTA,4288
sqlalchemy/dialects/mysql/reflection.py,sha256=ak6E-eCP9346ixnILYNJcrRYblWbIT0sjXf4EqmfBsY,22556
sqlalchemy/dialects/mysql/reserved_words.py,sha256=DsPHsW3vwOrvU7bv3Nbfact2Z_jyZ9xUTT-mdeQvqxo,9145
sqlalchemy/dialects/mysql/types.py,sha256=i8DpRkOL1QhPErZ25AmCQOuFLciWhdjNL3I0CeHEhdY,24258
sqlalchemy/dialects/oracle/__init__.py,sha256=pjk1aWi9XFCAHWNSJzSzmoIcL32-AkU_1J9IV4PtwpA,1318
sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/base.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/dictionary.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/oracledb.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/dialects/oracle/__pycache__/types.cpython-312.pyc,,
sqlalchemy/dialects/oracle/base.py,sha256=u55_R9NrCRijud7ioHMxT-r0MSW0gMFjOwbrDdPgFsc,118036
sqlalchemy/dialects/oracle/cx_oracle.py,sha256=L0GvcB6xb0-zyv5dx3bpQCeptp0KSqH6g9FUQ4y-d-g,55108
sqlalchemy/dialects/oracle/dictionary.py,sha256=iUoyFEFM8z0sfVWR2n_nnre14kaQkV_syKO0R5Dos4M,19487
sqlalchemy/dialects/oracle/oracledb.py,sha256=_-fUQ94xai80B7v9WLVGoGDIv8u54nVspBdyGEyI76g,3457
sqlalchemy/dialects/oracle/provision.py,sha256=5cvIc3yTWxz4AIRYxcesbRJ1Ft-zT9GauQ911yPnN2o,8055
sqlalchemy/dialects/oracle/types.py,sha256=TeOhUW5W9qZC8SaJ-9b3u6OvOPOarNq4MmCQ7l3wWX0,8204
sqlalchemy/dialects/postgresql/__init__.py,sha256=bZEPsLbRtB7s6TMQAHCIzKBgkxUa3eDXvCkeARua37E,3734
sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/_psycopg_common.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/array.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/asyncpg.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/base.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/dml.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/ext.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/json.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/named_types.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/operators.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pg_catalog.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/types.cpython-312.pyc,,
sqlalchemy/dialects/postgresql/_psycopg_common.py,sha256=U3aWzbKD3VOj6Z6r-4IsIQmtjGGIB4RDZH6NXfd8Xz0,5655
sqlalchemy/dialects/postgresql/array.py,sha256=tLyU9GDAeIypNhjTuFQUYbaTeijVM1VVJS6UdzzXXn4,13682
sqlalchemy/dialects/postgresql/asyncpg.py,sha256=XNaoOZ5Da4-jUTaES1zEOTEW3WG8UKyVCoIS3LsFhzE,39967
sqlalchemy/dialects/postgresql/base.py,sha256=DGhaquFJWDQL7wIvQ2EE57LxD7zGR06BKQxvNZHFLgY,175634
sqlalchemy/dialects/postgresql/dml.py,sha256=_He69efdpDA5gGmBsE7Lo4ViSi3QnR38BiFmrR1tw6k,11203
sqlalchemy/dialects/postgresql/ext.py,sha256=oPP22Pq-n2lMmQ8ahifYmsmzRhSiSv1RV-xrTT0gycw,16253
sqlalchemy/dialects/postgresql/hstore.py,sha256=q5x0npbAMI8cdRFGTMwLoWFj9P1G9DUkw5OEUCfTXpI,11532
sqlalchemy/dialects/postgresql/json.py,sha256=panGtnEbcirQDy4yR2huWydFqa_Kmv8xhpLyf-SSRWE,11203
sqlalchemy/dialects/postgresql/named_types.py,sha256=zNoHsP3nVq5xxA7SOQ6LLDwYZEHFciZ-nDjw_I9f_G0,17092
sqlalchemy/dialects/postgresql/operators.py,sha256=MB40xq1124OnhUzkvtbnTmxEiey0VxMOYyznF96wwhI,2799
sqlalchemy/dialects/postgresql/pg8000.py,sha256=w6pJ3LaIKWmnwvB0Pr1aTJX5OKNtG5RNClVfkE019vU,18620
sqlalchemy/dialects/postgresql/pg_catalog.py,sha256=0lLnIgxfCrqkx_LNijMxo0trNLsodcd8KwretZIj4uM,8875
sqlalchemy/dialects/postgresql/provision.py,sha256=oxyAzs8_PhuK0ChivXC3l2Nldih3_HKffvGsZqD8XWI,5509
sqlalchemy/dialects/postgresql/psycopg.py,sha256=YMubzQHMYN1By8QJScIPb_PwNiACv6srddQ6nX6WltQ,22238
sqlalchemy/dialects/postgresql/psycopg2.py,sha256=3Xci4bTA2BvhrZAQa727uFWdaXEZmvfD-Z-upE3NyQE,31592
sqlalchemy/dialects/postgresql/psycopg2cffi.py,sha256=2EOuDwBetfvelcPoTzSwOHe6X8lTwaYH7znNzXJt9eM,1739
sqlalchemy/dialects/postgresql/ranges.py,sha256=yHB1BRlUreQPZB3VEn0KMMLf02zjf5jjYdmg4N4S2Sw,30220
sqlalchemy/dialects/postgresql/types.py,sha256=l24rs8_nK4vqLyQC0aUkf4S7ecw6T_7Pgq50Icc5CBs,7292
sqlalchemy/dialects/sqlite/__init__.py,sha256=wnZ9vtfm0QXmth1jiGiubFgRiKxIoQoNthb1bp4FhCs,1173
sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/aiosqlite.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/base.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/dml.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/json.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-312.pyc,,
sqlalchemy/dialects/sqlite/aiosqlite.py,sha256=GZJioZLot0D5CQ6ovPQoqv2iV8FAFm3G75lEFCzopoE,12296
sqlalchemy/dialects/sqlite/base.py,sha256=YYEB5BeuemLC3FAR7EB8vA0zoUOwHTKoF_srvnAStps,96785
sqlalchemy/dialects/sqlite/dml.py,sha256=PYESBj8Ip7bGs_Fi7QjbWLXLnU9a-SbP96JZiUoZNHg,8434
sqlalchemy/dialects/sqlite/json.py,sha256=XFPwSdNx0DxDfxDZn7rmGGqsAgL4vpJbjjGaA73WruQ,2533
sqlalchemy/dialects/sqlite/provision.py,sha256=O4JDoybdb2RBblXErEVPE2P_5xHab927BQItJa203zU,5383
sqlalchemy/dialects/sqlite/pysqlcipher.py,sha256=_JuOCoic--ehAGkCgnwUUKKTs6xYoBGag4Y_WkQUDwU,5347
sqlalchemy/dialects/sqlite/pysqlite.py,sha256=xBg6DKqvml5cCGxVSAQxR1dcMvso8q4uyXs2m4WLzz0,27891
sqlalchemy/dialects/type_migration_guidelines.txt,sha256=-uHNdmYFGB7bzUNT6i8M5nb4j6j9YUKAtW4lcBZqsMg,8239
sqlalchemy/engine/__init__.py,sha256=fJCAl5P7JH9iwjuWo72_3LOIzWWhTnvXqzpAmm_T0fY,2818
sqlalchemy/engine/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/_py_processors.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/_py_row.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/_py_util.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/base.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/characteristics.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/create.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/cursor.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/default.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/events.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/interfaces.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/mock.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/processors.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/reflection.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/result.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/row.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/strategies.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/url.cpython-312.pyc,,
sqlalchemy/engine/__pycache__/util.cpython-312.pyc,,
sqlalchemy/engine/_py_processors.py,sha256=RSVKm9YppSBDSCEi8xvbZdRCP9EsCYfbyEg9iDCMCiI,3744
sqlalchemy/engine/_py_row.py,sha256=Zdta0JGa7V2aV04L7nzXUEp-H1gpresKyBlneQu60pk,3549
sqlalchemy/engine/_py_util.py,sha256=5m3MZbEqnUwP5kK_ghisFpzcXgBwSxTSkBEFB6afiD8,2245
sqlalchemy/engine/base.py,sha256=RbIfWZ1Otyb4VzMYjDpK5BiDIE8QZwa4vQgRX0yCa28,122246
sqlalchemy/engine/characteristics.py,sha256=YvMgrUVAt3wsSiQ0K8l44yBjFlMK3MGajxhg50t5yFM,2344
sqlalchemy/engine/create.py,sha256=8372TLpy4FOAIZ9WmuNkx1v9DPgwpoCAH9P7LNXZCwY,32629
sqlalchemy/engine/cursor.py,sha256=6e1Tp63r0Kt-P4pEaYR7wUew2aClTdKAEI-FoAAxJxE,74405
sqlalchemy/engine/default.py,sha256=bi--ytxYJ0EtsCudl38owGtytnwTHX-PjlsYTFe8LpA,84065
sqlalchemy/engine/events.py,sha256=PQyc_sbmqks6pqyN7xitO658KdKzzJWfW1TKYwEd5vo,37392
sqlalchemy/engine/interfaces.py,sha256=pAFYR15f1Z_-qdzTYI4mAm8IYbD6maLBKbG3pBaJ8Us,112824
sqlalchemy/engine/mock.py,sha256=ki4ud7YrUrzP2katdkxlJGFUKB2kS7cZZAHK5xWsNF8,4179
sqlalchemy/engine/processors.py,sha256=ENN6XwndxJPW-aXPu_3NzAZsy5SvNznHoa1Qn29ERAw,2383
sqlalchemy/engine/reflection.py,sha256=2aakNheQJNMUXZbhY8s1NtqGoGWTxM2THkJlMMfiX_s,75125
sqlalchemy/engine/result.py,sha256=shRAsboHPTvKR38ryGgC4KLcUeVTbABSlWzAfOUKVZs,77841
sqlalchemy/engine/row.py,sha256=doiXKaUI6s6OkfqPIwNyTPLllxJfR8HYgEI8ve9VYe0,11955
sqlalchemy/engine/strategies.py,sha256=HjCj_FHQOgkkhhtnVmcOEuHI_cftNo3P0hN5zkhZvDc,442
sqlalchemy/engine/url.py,sha256=_WNE7ia0JIPRc1PLY_jSA3F7bB5kp1gzuzkc5eoKviA,30694
sqlalchemy/engine/util.py,sha256=3-ENI9S-3KLWr0GW27uWQfsvCJwMBGTKbykkKPUgiAE,5667
sqlalchemy/event/__init__.py,sha256=CSBMp0yu5joTC6tWvx40B4p87N7oGKxC-ZLx2ULKUnQ,997
sqlalchemy/event/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/event/__pycache__/api.cpython-312.pyc,,
sqlalchemy/event/__pycache__/attr.cpython-312.pyc,,
sqlalchemy/event/__pycache__/base.cpython-312.pyc,,
sqlalchemy/event/__pycache__/legacy.cpython-312.pyc,,
sqlalchemy/event/__pycache__/registry.cpython-312.pyc,,
sqlalchemy/event/api.py,sha256=nQAvPK1jrLpmu8aKCUtc-vYWcIuG-1FgAtp3GRkfIiI,8227
sqlalchemy/event/attr.py,sha256=NMe_sPQTju2PE-f68C8TcKJGW-Gxyi1CLXumAmE368Y,20438
sqlalchemy/event/base.py,sha256=Cr_PNJlCYJSU3rtT8DkplyjBRb-E2Wa3OAeK9woFJkk,14980
sqlalchemy/event/legacy.py,sha256=OpPqE64xk1OYjLW1scvc6iijhoa5GZJt5f7-beWhgOc,8211
sqlalchemy/event/registry.py,sha256=Zig9q2Galo8kO2aqr7a2rNAhmIkdJ-ntHSEcM5MfSgw,10833
sqlalchemy/events.py,sha256=pRcPKKsPQHGPH_pvTtKRmzuEIy-QHCtkUiZl4MUbxKs,536
sqlalchemy/exc.py,sha256=4SMKOJtz7_SWt5vskCSeXSi4ZlFyL4jh53Q8sk4-ODQ,24011
sqlalchemy/ext/__init__.py,sha256=w4h7EpXjKPr0LD4yHa0pDCfrvleU3rrX7mgyb8RuDYQ,322
sqlalchemy/ext/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/associationproxy.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/automap.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/baked.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/compiler.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/horizontal_shard.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/hybrid.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/indexable.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/instrumentation.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/mutable.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/orderinglist.cpython-312.pyc,,
sqlalchemy/ext/__pycache__/serializer.cpython-312.pyc,,
sqlalchemy/ext/associationproxy.py,sha256=5voNXWIJYGt6c8mwuSA6alm3SmEHOZ-CVK8ikgfzk8s,65960
sqlalchemy/ext/asyncio/__init__.py,sha256=iG_0TmBO1pCB316WS-p17AImwqRtUoaKo7UphYZ7bYw,1317
sqlalchemy/ext/asyncio/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/base.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/engine.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/exc.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/result.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/scoping.cpython-312.pyc,,
sqlalchemy/ext/asyncio/__pycache__/session.cpython-312.pyc,,
sqlalchemy/ext/asyncio/base.py,sha256=PXF4YqfRi2-mADAtaL2_-Uv7CzoBVojPbzyA5phJ9To,8959
sqlalchemy/ext/asyncio/engine.py,sha256=h4pe3ixuX6YfI97B5QWo2V4_CCCnOvM_EHPZhX19Mgc,47796
sqlalchemy/ext/asyncio/exc.py,sha256=1hCdOKzvSryc_YE4jgj0l9JASOmZXutdzShEYPiLbGI,639
sqlalchemy/ext/asyncio/result.py,sha256=zETerVB53gql1DL6tkO_JiqeU-m1OM-8kX0ULxmoL_I,30554
sqlalchemy/ext/asyncio/scoping.py,sha256=cBNluB7n_lwdAAo6pySbvNRqPN7UBzwQHZ6XhRDyWgA,52685
sqlalchemy/ext/asyncio/session.py,sha256=yWwhI5i_yVWjykxmxkcP3-xmw3UpoGYNhHZL8sYXQMA,62998
sqlalchemy/ext/automap.py,sha256=7p13-VpN0MOM525r7pmEnftedya9l5G-Ei_cFXZfpTc,61431
sqlalchemy/ext/baked.py,sha256=R8ZAxiVN6eH50AJu0O3TtFXNE1tnRkMlSj3AvkcWFhY,17818
sqlalchemy/ext/compiler.py,sha256=h7eR0NcPJ4F_k8YGRP3R9YX75Y9pgiVxoCjRyvceF7g,20391
sqlalchemy/ext/declarative/__init__.py,sha256=VJu8S1efxil20W48fJlpDn6gHorOudn5p3-lF72WcJ8,1818
sqlalchemy/ext/declarative/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/ext/declarative/__pycache__/extensions.cpython-312.pyc,,
sqlalchemy/ext/declarative/extensions.py,sha256=vwZjudPFA_mao1U04-RZCaU_tvPMBgQa5OTmSI7K7SU,19547
sqlalchemy/ext/horizontal_shard.py,sha256=eh14W8QWHYH22PL1l5qF_ad9Fyh1WAFjKi_vNfsme94,16766
sqlalchemy/ext/hybrid.py,sha256=98D72WBmlileYBtEKMSNF9l-bwRavThSV8-LyB2gjo0,52499
sqlalchemy/ext/indexable.py,sha256=RkG9BKwil-TqDjVBM14ML9c-geUrHxtRKpYkSJEwGHA,11028
sqlalchemy/ext/instrumentation.py,sha256=rjjSbTGilYeGLdyEWV932TfTaGxiVP44_RajinANk54,15723
sqlalchemy/ext/mutable.py,sha256=d3Pp8PcAVN4pHN9rhc1ReXBWe0Q70Q5S1klFoYGyDPA,37393
sqlalchemy/ext/mypy/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/ext/mypy/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/apply.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/decl_class.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/infer.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/names.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/plugin.cpython-312.pyc,,
sqlalchemy/ext/mypy/__pycache__/util.cpython-312.pyc,,
sqlalchemy/ext/mypy/apply.py,sha256=uUES4grydYtKykLKlxzJeBXeGe8kfWou9_rzEyEkfp0,10503
sqlalchemy/ext/mypy/decl_class.py,sha256=Ls2Efh4kEhle6Z4VMz0GRBgGQTYs2fHr5b4DfuDj44c,17377
sqlalchemy/ext/mypy/infer.py,sha256=si720RW6iGxMRZNP5tcaIxA1_ehFp215TzxVXaLjglU,19364
sqlalchemy/ext/mypy/names.py,sha256=tch4f5fDmdv4AWWFzXgGZdCpxmae59XRPT02KyMvrEI,10625
sqlalchemy/ext/mypy/plugin.py,sha256=fLXDukvZqbJ0JJCOoyZAuOniYZ_F1YT-l9gKppu8SEs,9750
sqlalchemy/ext/mypy/util.py,sha256=TlEQq4bcs8ARLL3PoFS8Qw6oYFeMqcGnWTeJ7NsPPFk,9408
sqlalchemy/ext/orderinglist.py,sha256=8Vcg7UUkLg-QbYAbLVDSqu-5REkR6L-FLLhCYsHYxCQ,14384
sqlalchemy/ext/serializer.py,sha256=ox6dbMOBmFR0H2RQFt17mcYBOGKgn1cNVFfqY8-jpgQ,6178
sqlalchemy/future/__init__.py,sha256=79DZx3v7TQZpkS_qThlmuCOm1a9UK2ObNZhyMmjfNB0,516
sqlalchemy/future/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/future/__pycache__/engine.cpython-312.pyc,,
sqlalchemy/future/engine.py,sha256=6uOpOedIqiT1-3qJSJIlv9_raMJU8NTkhQwN_Ngg8kI,499
sqlalchemy/inspection.py,sha256=i3aR-IV101YU8D9TA8Pxb2wi08QZuJ34sMy6L5M__rY,5145
sqlalchemy/log.py,sha256=aSlZ8DFHkOuI-AMmaOUUYtS9zGPadi_7tAo98QpUOiY,8634
sqlalchemy/orm/__init__.py,sha256=cBn0aPWyDFY4ya-cHRshQBcuThk1smTUCTrlp6LHdlE,8463
sqlalchemy/orm/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/_orm_constructors.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/_typing.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/attributes.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/base.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/bulk_persistence.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/clsregistry.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/collections.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/context.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/decl_api.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/decl_base.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/dependency.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/descriptor_props.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/dynamic.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/evaluator.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/events.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/exc.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/identity.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/instrumentation.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/interfaces.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/loading.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/mapped_collection.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/mapper.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/path_registry.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/persistence.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/properties.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/query.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/relationships.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/scoping.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/session.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/state.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/state_changes.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/strategies.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/strategy_options.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/sync.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/unitofwork.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/util.cpython-312.pyc,,
sqlalchemy/orm/__pycache__/writeonly.cpython-312.pyc,,
sqlalchemy/orm/_orm_constructors.py,sha256=_7_GY6qw2sA-GG_WXLz1GOO-0qC-SCBeA43GhVuS2Qw,99803
sqlalchemy/orm/_typing.py,sha256=oRUJVAGpU3_DhSkIb1anXgneweVIARjB51HlPhMNfcM,5015
sqlalchemy/orm/attributes.py,sha256=NFhYheqqu2VcXmKTdcvQKiRR_6qo0rHLK7nda7rpviA,92578
sqlalchemy/orm/base.py,sha256=iZXsygk4fn8wd7wx1iXn_PfnGDY7d41YRfS0mC_q5vE,27700
sqlalchemy/orm/bulk_persistence.py,sha256=S9VK5a6GSqnw3z7O5UG5OOnc9WxzmS_ooDkA5JmCIsY,69878
sqlalchemy/orm/clsregistry.py,sha256=4J-kKshmLOEyx3VBqREm2k_XY0cer4zwUoHJT3n5Xmw,17949
sqlalchemy/orm/collections.py,sha256=0AZFr9us9MiHo_Xcyi7DUsN02jSBERUOd-jIK8qQ1DA,52159
sqlalchemy/orm/context.py,sha256=VyJl1ZJ5OnJUACKlM-bPLyyoqu4tyaKKdxeC-QF4EuU,111698
sqlalchemy/orm/decl_api.py,sha256=a2Cyvjh6j5BlXJQ2i0jpQx7xkeI_6xo5MMxr0d2ndQY,63589
sqlalchemy/orm/decl_base.py,sha256=g9xW9G-n9iStMI0i3i-9Rt4LDRW8--3iCCRPlWF6Cko,81660
sqlalchemy/orm/dependency.py,sha256=g3R_1H_OGzagXFeen3Irm3c1lO3yeXGdGa0muUZgZAk,47583
sqlalchemy/orm/descriptor_props.py,sha256=SdrfVu05zhWLGe_DnBlgbU6e5sWkkfBTirH9Nrr1MLk,37176
sqlalchemy/orm/dynamic.py,sha256=pYlMIrpp80Ex4KByqdyhx0x0kIrl_cIADwkeVxvYu4s,9798
sqlalchemy/orm/evaluator.py,sha256=jPjVrP7XbVOG6aXTCBREq0rF3oNHLqB4XAT-gt_cpaA,11925
sqlalchemy/orm/events.py,sha256=fGnUHwDTV9FTiifB2mmIJispwPbIT4mZongRJD7uiw4,127258
sqlalchemy/orm/exc.py,sha256=A3wvZVs5sC5XCef4LoTUBG-UfhmliFpU9rYMdS2t_To,7356
sqlalchemy/orm/identity.py,sha256=gRiuQSrurHGEAJXH9QGYioXL49Im5EGcYQ-IKUEpHmQ,9249
sqlalchemy/orm/instrumentation.py,sha256=o1mTv5gCgl9d-SRvEXXjl8rzl8uBasRL3bpDgWg9P58,24337
sqlalchemy/orm/interfaces.py,sha256=RW7bBXGWtZHY2wXFOSqtvYm6UDl7yHZUyRX_6Yd3GfQ,48395
sqlalchemy/orm/loading.py,sha256=F1ZEHTPBglmznST2nGj_0ARccoFgTyaOOwjcqpYeuvM,57366
sqlalchemy/orm/mapped_collection.py,sha256=ZgYHaF37yo6-gZ7Da1Gg25rMgG2GynAy-RJoDhljV5g,19698
sqlalchemy/orm/mapper.py,sha256=kyq4pBkTvvEqlW4H4XK_ktP1sOiALNAycgvF5f-xtqw,170969
sqlalchemy/orm/path_registry.py,sha256=olyutgn0uNB7Wi32YNQx9ZHV6sUgV3TbyGplfSxfZ6g,25938
sqlalchemy/orm/persistence.py,sha256=qr1jUgo-NZ0tLa5eIis2271QDt4KNJwYlYU_9CaKNhQ,60545
sqlalchemy/orm/properties.py,sha256=dt1Gy06pbRY6zgm4QGR9nU6z2WCyoTZWBJYKpUhLq_c,29095
sqlalchemy/orm/query.py,sha256=VBSD0k15xU_XykggvLGAwGdwNglBAoBKbOk8qAoMKdI,117714
sqlalchemy/orm/relationships.py,sha256=wrHyICb8A5qPoyxf-nITQVJ13kCNr2MedDqEY8QMSt8,127816
sqlalchemy/orm/scoping.py,sha256=75iPEWDFhPcIXgl8EUd_sPTCL6punfegEaTRE5mP3e8,78835
sqlalchemy/orm/session.py,sha256=TeBcZNdY4HWQFdXNCIqbsQTtkvfJkBweMzvA9p3BiPA,193279
sqlalchemy/orm/state.py,sha256=EaWkVNWHaDeJ_FZGXHakSamUk51BXmtMWLGdFhlJmh8,37536
sqlalchemy/orm/state_changes.py,sha256=pqkjSDOR6H5BufMKdzFUIatDp3DY90SovOJiJ1k6Ayw,6815
sqlalchemy/orm/strategies.py,sha256=V0o-1kB1IVTxhOGqGtRyjddZqAbPdsl_h-k0N3MKCGo,114052
sqlalchemy/orm/strategy_options.py,sha256=EmgH28uMQhwwBCDVcXmywLk_Q8AbpnK02seMsMV4nmc,84102
sqlalchemy/orm/sync.py,sha256=5Nt_OqP4IfhAtHwFRar4dw-YjLENRLvp4d3jDC4wpnw,5749
sqlalchemy/orm/unitofwork.py,sha256=Wk5YZocBbxe4m1wU2aFQ7gY1Cp5CROi13kDEM1iOSz4,27033
sqlalchemy/orm/util.py,sha256=7hCRYbQjqhWJTkrPf_NXY9zF_18VWTpyguu-nfYfc6c,80340
sqlalchemy/orm/writeonly.py,sha256=WCPXCAwHqVCfhVWXQEFCP3OocIiHgqNJ5KnuJwSgGq4,22329
sqlalchemy/pool/__init__.py,sha256=CIv4b6ctueY7w3sML_LxyLKAdl59esYOhz3O7W5w7WE,1815
sqlalchemy/pool/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/pool/__pycache__/base.cpython-312.pyc,,
sqlalchemy/pool/__pycache__/events.cpython-312.pyc,,
sqlalchemy/pool/__pycache__/impl.cpython-312.pyc,,
sqlalchemy/pool/base.py,sha256=wuwKIak5d_4-TqKI2RFN8OYMEyOvV0djnoSVR8gbxAQ,52249
sqlalchemy/pool/events.py,sha256=IcWfORKbHM69Z9FdPJlXI7-NIhQrR9O_lg59tiUdTRU,13148
sqlalchemy/pool/impl.py,sha256=vU0n82a7uxdE34p3hU7cvUDA5QDy9MkIv1COT4kYFP8,17724
sqlalchemy/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/schema.py,sha256=mt74CGCBtfv_qI1_6zzNFMexYGyWDj2Jkh-XdH4kEWI,3194
sqlalchemy/sql/__init__.py,sha256=jAQx9rwhyPhoSjntM1BZSElJiMRmLowGThJVDGvExSU,5820
sqlalchemy/sql/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_dml_constructors.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_elements_constructors.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_orm_types.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_py_util.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_selectable_constructors.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/_typing.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/annotation.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/base.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/cache_key.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/coercions.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/compiler.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/crud.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/ddl.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/default_comparator.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/dml.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/elements.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/events.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/expression.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/functions.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/lambdas.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/naming.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/operators.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/roles.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/schema.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/selectable.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/sqltypes.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/traversals.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/type_api.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/util.cpython-312.pyc,,
sqlalchemy/sql/__pycache__/visitors.cpython-312.pyc,,
sqlalchemy/sql/_dml_constructors.py,sha256=hoNyINY3FNi1ZQajR6lbcRN7oYsNghM1wuzzVWxIv3c,3867
sqlalchemy/sql/_elements_constructors.py,sha256=-qksx59Gqhmzxo1xByPtZZboNvL8uYcCN14pjHYHxL8,62914
sqlalchemy/sql/_orm_types.py,sha256=_vR3_HQYgZR_of6_ZpTQByie2gaVScxQjVAVWAP3Ztg,620
sqlalchemy/sql/_py_util.py,sha256=iiwgX3dQhOjdB5-10jtgHPIdibUqGk49bC1qdZMBpYI,2173
sqlalchemy/sql/_selectable_constructors.py,sha256=RDqgejqiUuU12Be1jBpMIx_YdJho8fhKfnMoJLPFTFE,18812
sqlalchemy/sql/_typing.py,sha256=C8kNZQ3TIpM-Q12Of3tTaESB1UxIfRME_lXouqgwMT8,12252
sqlalchemy/sql/annotation.py,sha256=pTNidcQatCar6H1I9YAoPP1e6sOewaJ15B7_-7ykZOE,18271
sqlalchemy/sql/base.py,sha256=dVvZoPoa3pb6iuwTU4QoCvVWQPyHZthaekl5J2zV_SU,73928
sqlalchemy/sql/cache_key.py,sha256=Dl163qHjTkMCa5LTipZud8X3w0d8DvdIvGvv4AqriHE,32823
sqlalchemy/sql/coercions.py,sha256=ju8xEi7b9G_GzxaQ6Nwu0cFIWFZ--ottIVfdiuhHY7Y,40553
sqlalchemy/sql/compiler.py,sha256=9Wx423H72Yq7NHR8cmMAH6GpMCJmghs1L85YJqs_Lng,268763
sqlalchemy/sql/crud.py,sha256=nyAPlmvuyWxMqSBdWPffC5P3CGXTQKK0bJoDbNgB3iQ,56457
sqlalchemy/sql/ddl.py,sha256=XuUhulJLvvPjU4nYD6N42QLg8rEgquD6Jwn_yIHZejk,45542
sqlalchemy/sql/default_comparator.py,sha256=SE0OaK1BlY0RinQ21ZXJOUGkO00oGv6GMMmAH-4iNTQ,16663
sqlalchemy/sql/dml.py,sha256=eftbzdFJgMk7NV0BHKfK4dQ2R7XsyyJn6fCgYFJ0KNQ,65728
sqlalchemy/sql/elements.py,sha256=dsNa2K57RygsGoaWuTMPp2QQ6SU3uZXSMW6CLGBbcIY,171208
sqlalchemy/sql/events.py,sha256=xe3vJ6pQJau3dJWBAY0zU7Lz52UKuMrpLycriLm3AWA,18301
sqlalchemy/sql/expression.py,sha256=baMnCH04jeE8E3tA2TovXlsREocA2j3fdHKnzOB8H4U,7586
sqlalchemy/sql/functions.py,sha256=AcI_KstJxeLw6rEXx6QnIgR2rq4Ru6RXMbq4EIIUURA,55319
sqlalchemy/sql/lambdas.py,sha256=EfDdUBi5cSmkjz8pQCSRo858UWQCFNZxXkM-1qS0CgU,49281
sqlalchemy/sql/naming.py,sha256=l8udFP2wvXLgehIB0uF2KXwpkXSVSREDk6fLCH9F-XY,6865
sqlalchemy/sql/operators.py,sha256=BYATjkBQLJAmwHAlGUSV-dv9RLtGw_ziAvFbKDrN4YU,76107
sqlalchemy/sql/roles.py,sha256=71zm_xpRkUdnu-WzG6lxQVnFHwvUjf6X6e3kRIkbzAs,7686
sqlalchemy/sql/schema.py,sha256=TOBTbcRY6ehosJEcpYn2NX0_UGZP9lfFs-o8lJVc5tI,228104
sqlalchemy/sql/selectable.py,sha256=9dO2yhN83zjna7nPjOE1hcvGyJGjc_lj5SAz7SP5CBQ,233041
sqlalchemy/sql/sqltypes.py,sha256=_0FpFLH0AFueb3TIB5Vcx9nXWDNj31XFQTP0u8OXnSo,126540
sqlalchemy/sql/traversals.py,sha256=7b98JSeLxqecmGHhhLXT_2M4QMke6W-xCci5RXndhxI,33521
sqlalchemy/sql/type_api.py,sha256=D9Kq-ppwZvlNmxaHqvVmM8IVg4n6_erzJpVioye9WKE,83823
sqlalchemy/sql/util.py,sha256=lBEAf_-eRepTErOBCp1PbEMZDYdJqAiK1GemQtgojYo,48175
sqlalchemy/sql/visitors.py,sha256=KD1qOYm6RdftCufVGB8q6jFTIZIQKS3zPCg78cVV0mQ,36427
sqlalchemy/testing/__init__.py,sha256=9M2SMxBBLJ8xLUWXNCWDzkcvOqFznWcJzrSd712vATU,3126
sqlalchemy/testing/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/assertions.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/assertsql.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/asyncio.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/config.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/engines.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/entities.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/exclusions.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/pickleable.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/profiling.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/provision.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/requirements.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/schema.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/util.cpython-312.pyc,,
sqlalchemy/testing/__pycache__/warnings.cpython-312.pyc,,
sqlalchemy/testing/assertions.py,sha256=lNNZ-gfF4TDRXmB7hZDdch7JYZRb_qWGeqWDFKtopx0,31439
sqlalchemy/testing/assertsql.py,sha256=EIVk3i5qjiSI63c1ikTPoGhulZl88SSeOS2VNo1LJvM,16817
sqlalchemy/testing/asyncio.py,sha256=cAw68tzu3h5wjdIKfOqhFATcbMb38XeK0ThjIalUHuQ,3728
sqlalchemy/testing/config.py,sha256=MZOWz7wqzc1pbwHWSAR0RJkt2C-SD6ox-nYY7VHdi_U,12030
sqlalchemy/testing/engines.py,sha256=w5-0FbanItRsOt6x4n7wM_OnToCzJnrvZZ2hk5Yzng8,13355
sqlalchemy/testing/entities.py,sha256=rysywsnjXHlIIC-uv0L7-fLmTAuNpHJvcSd1HeAdY5M,3354
sqlalchemy/testing/exclusions.py,sha256=uoYLEwyNOK1eR8rpfOZ2Q3dxgY0akM-RtsIFML-FPrY,12444
sqlalchemy/testing/fixtures/__init__.py,sha256=9snVns5A7g28LqC6gqQuO4xRBoJzdnf068GQ6Cae75I,1198
sqlalchemy/testing/fixtures/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/testing/fixtures/__pycache__/base.cpython-312.pyc,,
sqlalchemy/testing/fixtures/__pycache__/mypy.cpython-312.pyc,,
sqlalchemy/testing/fixtures/__pycache__/orm.cpython-312.pyc,,
sqlalchemy/testing/fixtures/__pycache__/sql.cpython-312.pyc,,
sqlalchemy/testing/fixtures/base.py,sha256=OayRr25soCqj1_yc665D5XbWWzFCm7Xl9Txtps953p4,12256
sqlalchemy/testing/fixtures/mypy.py,sha256=7fWVZzYzNjqmLIoFa-MmXSGDPS3eZYFXlH-WxaxBDDY,11845
sqlalchemy/testing/fixtures/orm.py,sha256=x27qjpK54JETATcYuiphtW-HXRy8ej8h3aCDkeQXPfY,6095
sqlalchemy/testing/fixtures/sql.py,sha256=Q7Qq0n4qTT681nWt5DqjThopgjv5BB2KmSmrmAxUqHM,15704
sqlalchemy/testing/pickleable.py,sha256=B9dXGF7E2PywB67SngHPjSMIBDTFhyAV4rkDUcyMulk,2833
sqlalchemy/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/testing/plugin/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-312.pyc,,
sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-312.pyc,,
sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-312.pyc,,
sqlalchemy/testing/plugin/bootstrap.py,sha256=GrBB27KbswjE3Tt-zJlj6uSqGh9N-_CXkonnJSSBz84,1437
sqlalchemy/testing/plugin/plugin_base.py,sha256=4SizjghFdDddt5o5gQ16Nw0bJHrtuBa4smxJcea-ti8,21573
sqlalchemy/testing/plugin/pytestplugin.py,sha256=yh4PP406O0TwPMDzpJHpcNdU2WHXCLYI10F3oOLePjE,27295
sqlalchemy/testing/profiling.py,sha256=HPjYvRLT1nD90FCZ7AA8j9ygkMtf1SGA47Xze2QPueo,10148
sqlalchemy/testing/provision.py,sha256=w4F_ceGHPpWHUeh6cVcE5ktCC-ISrGc2yOSnXauOd5U,14200
sqlalchemy/testing/requirements.py,sha256=gkviA8f5p4qdoDwAK791I4oGvnEqlm0ZZwJZpJzobFY,51393
sqlalchemy/testing/schema.py,sha256=OSfMoIJ7ORbevGkeJdrKcTrQ0s7wXebuCU08mC1Y9jA,6513
sqlalchemy/testing/suite/__init__.py,sha256=_firVc2uS3TMZ3vH2baQzNb17ubM78RHtb9kniSybmk,476
sqlalchemy/testing/suite/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_cte.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_deprecations.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_insert.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_results.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_rowcount.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_select.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_types.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_unicode_ddl.cpython-312.pyc,,
sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-312.pyc,,
sqlalchemy/testing/suite/test_cte.py,sha256=O5idVeBnHm9zdiG3tuCBUn4hYU_TA63-6LNnRygr8g0,6205
sqlalchemy/testing/suite/test_ddl.py,sha256=xWimTjggpTe3S1Xfmt_IPofTXkUUcKuVSVCIfIyGMbA,11785
sqlalchemy/testing/suite/test_deprecations.py,sha256=XI8ZU1NxC-6uvPDImaaq9O7Ov6MF5gmy-yk3TfesLAo,5082
sqlalchemy/testing/suite/test_dialect.py,sha256=HUpHZb7pnHbsoRpDLONpsCO_oWhBgjglU9pBO-EOUw4,22673
sqlalchemy/testing/suite/test_insert.py,sha256=Wm_pW0qqUNV1Fs7mXoxtmaTHMQGmaVDgDsYgZs1jlxM,18308
sqlalchemy/testing/suite/test_reflection.py,sha256=Nd4Ao_J3Sr-VeAeWbUe3gs6STPvik9DC37WkyJc-PVg,106205
sqlalchemy/testing/suite/test_results.py,sha256=Hd6R4jhBNNQSp0xGa8wwTgpw-XUrCEZ3dWXpoZ4_DKs,15687
sqlalchemy/testing/suite/test_rowcount.py,sha256=zhKVv0ibFSQmnE5luLwgHAn840zOJ6HxtkR3oL995cs,7652
sqlalchemy/testing/suite/test_select.py,sha256=QHsBX16EZpxlEZZLM0pMNcwayPU0dig39McKwiiith0,58325
sqlalchemy/testing/suite/test_sequence.py,sha256=c80CBWrU930GPnPfr9TCRbTTuITR7BpIactncLIj2XU,9672
sqlalchemy/testing/suite/test_types.py,sha256=QjV48MqR7dB8UVzt56UL2z7Nt28-IhywX3DKuQeLYsY,65429
sqlalchemy/testing/suite/test_unicode_ddl.py,sha256=7obItCpFt4qlWaDqe25HWgQT6FoUhgz1W7_Xycfz9Xk,5887
sqlalchemy/testing/suite/test_update_delete.py,sha256=1hT0BTxB4SNipd6hnVlMnq25dLtQQoXov7z7UR0Sgi8,3658
sqlalchemy/testing/util.py,sha256=Wsu4GZgCW6wX9mmxfiffhDz1cZm3778OB3LtiWNgb3Y,14080
sqlalchemy/testing/warnings.py,sha256=pmfT33PF1q1PI7DdHOsup3LxHq1AC4-aYl1oL8HmrYo,1546
sqlalchemy/types.py,sha256=DgBpPaT-vtsn6_glx5wocrIhR2A1vy56SQNRY3NiPUw,3168
sqlalchemy/util/__init__.py,sha256=Bh0SkfkeCsz6-rbDmC41lAWOuCvKCiXVZthN2cWJEXk,8245
sqlalchemy/util/__pycache__/__init__.cpython-312.pyc,,
sqlalchemy/util/__pycache__/_collections.cpython-312.pyc,,
sqlalchemy/util/__pycache__/_concurrency_py3k.cpython-312.pyc,,
sqlalchemy/util/__pycache__/_has_cy.cpython-312.pyc,,
sqlalchemy/util/__pycache__/_py_collections.cpython-312.pyc,,
sqlalchemy/util/__pycache__/compat.cpython-312.pyc,,
sqlalchemy/util/__pycache__/concurrency.cpython-312.pyc,,
sqlalchemy/util/__pycache__/deprecations.cpython-312.pyc,,
sqlalchemy/util/__pycache__/langhelpers.cpython-312.pyc,,
sqlalchemy/util/__pycache__/preloaded.cpython-312.pyc,,
sqlalchemy/util/__pycache__/queue.cpython-312.pyc,,
sqlalchemy/util/__pycache__/tool_support.cpython-312.pyc,,
sqlalchemy/util/__pycache__/topological.cpython-312.pyc,,
sqlalchemy/util/__pycache__/typing.cpython-312.pyc,,
sqlalchemy/util/_collections.py,sha256=FYqVQg3CaqiEd21OFN1pNCfFbQ8gvlchW_TMtihSFNE,20169
sqlalchemy/util/_concurrency_py3k.py,sha256=31vs1oXaLzeTRgmOXRrWToRQskWmJk-CBs3-JxSTcck,8223
sqlalchemy/util/_has_cy.py,sha256=XMkeqCDGmhkd0uuzpCdyELz7gOjHxyFQ1AIlc5NneoY,1229
sqlalchemy/util/_py_collections.py,sha256=cYjsYLCLBy5jdGBJATLJCmtfzr_AaJ-HKTUN8OdAzxY,16630
sqlalchemy/util/compat.py,sha256=FkeHnW9asJYJvNmxVltee8jQNwQSdVRdKJlVRRInJI4,9388
sqlalchemy/util/concurrency.py,sha256=ZxcQYOKy-GBsQkPmCrBO5MzMpqW3JZme2Hiyqpbt9uc,2284
sqlalchemy/util/deprecations.py,sha256=pr9DSAf1ECqDk7X7F6TNc1jrhOeFihL33uEb5Wt2_T0,11971
sqlalchemy/util/langhelpers.py,sha256=CQQP2Q9c68nL5mcWL-Q38-INrtoDHDnBmq7QhnWyEDM,64980
sqlalchemy/util/preloaded.py,sha256=KKNLJEqChDW1TNUsM_TzKu7JYEA3kkuh2N-quM_2_Y4,5905
sqlalchemy/util/queue.py,sha256=ITejs6KS4Hz_ojrss2oFeUO9MoIeR3qWmZQ8J7yyrNU,10205
sqlalchemy/util/tool_support.py,sha256=epm8MzDZpVmhE6LIjrjJrP8BUf12Wab2m28A9lGq95s,5969
sqlalchemy/util/topological.py,sha256=hjJWL3C_B7Rpv9s7jj7wcTckcZUSkxc6xRDhiN1xyec,3458
sqlalchemy/util/typing.py,sha256=ESYm4oQtt-SarN04YTXCgovXT8tFupMiPmuGCDCMEIc,15831


@@ -1,6 +1,6 @@
-Metadata-Version: 2.1
+Metadata-Version: 2.3
 Name: aiofiles
-Version: 23.2.1
+Version: 24.1.0
 Summary: File support for asyncio.
 Project-URL: Changelog, https://github.com/Tinche/aiofiles#history
 Project-URL: Bug Tracker, https://github.com/Tinche/aiofiles/issues
@@ -13,15 +13,15 @@ Classifier: Development Status :: 5 - Production/Stable
 Classifier: Framework :: AsyncIO
 Classifier: License :: OSI Approved :: Apache Software License
 Classifier: Operating System :: OS Independent
-Classifier: Programming Language :: Python :: 3.7
 Classifier: Programming Language :: Python :: 3.8
 Classifier: Programming Language :: Python :: 3.9
 Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
 Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
 Classifier: Programming Language :: Python :: Implementation :: CPython
 Classifier: Programming Language :: Python :: Implementation :: PyPy
-Requires-Python: >=3.7
+Requires-Python: >=3.8
 Description-Content-Type: text/markdown
 
 # aiofiles: file support for asyncio
@@ -135,6 +135,8 @@ several useful `os` functions that deal with files:
 - `listdir`
 - `scandir`
 - `access`
+- `getcwd`
+- `path.abspath`
 - `path.exists`
 - `path.isfile`
 - `path.isdir`
@@ -176,25 +178,50 @@ as desired. The return type also needs to be registered with the
 ```python
 aiofiles.threadpool.wrap.register(mock.MagicMock)(
-    lambda *args, **kwargs: threadpool.AsyncBufferedIOBase(*args, **kwargs))
+    lambda *args, **kwargs: aiofiles.threadpool.AsyncBufferedIOBase(*args, **kwargs)
+)
 
 
 async def test_stuff():
-    data = 'data'
-    mock_file = mock.MagicMock()
+    write_data = 'data'
+    read_file_chunks = [
+        b'file chunks 1',
+        b'file chunks 2',
+        b'file chunks 3',
+        b'',
+    ]
+    file_chunks_iter = iter(read_file_chunks)
 
-    with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file) as mock_open:
+    mock_file_stream = mock.MagicMock(
+        read=lambda *args, **kwargs: next(file_chunks_iter)
+    )
+
+    with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file_stream) as mock_open:
         async with aiofiles.open('filename', 'w') as f:
-            await f.write(data)
+            await f.write(write_data)
+            assert f.read() == b'file chunks 1'
 
-        mock_file.write.assert_called_once_with(data)
+        mock_file_stream.write.assert_called_once_with(write_data)
 ```
 
 ### History
 
+#### 24.1.0 (2024-06-24)
+
+- Import `os.link` conditionally to fix importing on android.
+  [#175](https://github.com/Tinche/aiofiles/issues/175)
+- Remove spurious items from `aiofiles.os.__all__` when running on Windows.
+- Switch to more modern async idioms: Remove types.coroutine and make AiofilesContextManager an awaitable instead a coroutine.
+- Add `aiofiles.os.path.abspath` and `aiofiles.os.getcwd`.
+  [#174](https://github.com/Tinche/aiofiles/issues/181)
+- _aiofiles_ is now tested on Python 3.13 too.
+  [#184](https://github.com/Tinche/aiofiles/pull/184)
+- Dropped Python 3.7 support. If you require it, use version 23.2.1.
+
 #### 23.2.1 (2023-08-09)
 
 - Import `os.statvfs` conditionally to fix importing on non-UNIX systems.
   [#171](https://github.com/Tinche/aiofiles/issues/171) [#172](https://github.com/Tinche/aiofiles/pull/172)
+- aiofiles is now also tested on Windows.
 
 #### 23.2.0 (2023-08-09)
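
The changelog above is the substance of the aiofiles 23.2.1 → 24.1.0 bump this commit vendors in. A minimal sketch of the two new helpers it mentions, `aiofiles.os.getcwd` and `aiofiles.os.path.abspath` (both confirmed by the diffs below; the script itself and the `logs/calendar.log` path are illustrative only):

```python
import asyncio

import aiofiles.os


async def main() -> None:
    # Both helpers are plain os functions wrapped to run in the default
    # executor, so they are awaited rather than called directly.
    cwd = await aiofiles.os.getcwd()
    log_path = await aiofiles.os.path.abspath("logs/calendar.log")  # hypothetical path
    print(cwd, log_path)


asyncio.run(main())
```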


@@ -1,27 +1,27 @@
-aiofiles-23.2.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-aiofiles-23.2.1.dist-info/METADATA,sha256=cot28p_PNjdl_MK--l9Qu2e6QOv9OxdHrKbjLmYf9Uw,9673
-aiofiles-23.2.1.dist-info/RECORD,,
-aiofiles-23.2.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-aiofiles-23.2.1.dist-info/WHEEL,sha256=KGYbc1zXlYddvwxnNty23BeaKzh7YuoSIvIMO4jEhvw,87
-aiofiles-23.2.1.dist-info/licenses/LICENSE,sha256=y16Ofl9KOYjhBjwULGDcLfdWBfTEZRXnduOspt-XbhQ,11325
-aiofiles-23.2.1.dist-info/licenses/NOTICE,sha256=EExY0dRQvWR0wJ2LZLwBgnM6YKw9jCU-M0zegpRSD_E,55
+aiofiles-24.1.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+aiofiles-24.1.0.dist-info/METADATA,sha256=CvUJx21XclgI1Lp5Bt_4AyJesRYg0xCSx4exJZVmaSA,10708
+aiofiles-24.1.0.dist-info/RECORD,,
+aiofiles-24.1.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+aiofiles-24.1.0.dist-info/WHEEL,sha256=1yFddiXMmvYK7QYTqtRNtX66WJ0Mz8PYEiEUoOUUxRY,87
+aiofiles-24.1.0.dist-info/licenses/LICENSE,sha256=y16Ofl9KOYjhBjwULGDcLfdWBfTEZRXnduOspt-XbhQ,11325
+aiofiles-24.1.0.dist-info/licenses/NOTICE,sha256=EExY0dRQvWR0wJ2LZLwBgnM6YKw9jCU-M0zegpRSD_E,55
 aiofiles/__init__.py,sha256=1iAMJQyJtX3LGIS0AoFTJeO1aJ_RK2jpBSBhg0VoIrE,344
 aiofiles/__pycache__/__init__.cpython-312.pyc,,
 aiofiles/__pycache__/base.cpython-312.pyc,,
 aiofiles/__pycache__/os.cpython-312.pyc,,
 aiofiles/__pycache__/ospath.cpython-312.pyc,,
-aiofiles/base.py,sha256=rZwA151Ji8XlBkzvDmcF1CgDTY2iKNuJMfvNlM0s0E0,2684
-aiofiles/os.py,sha256=zuFGaIyGCGUuFb7trFFEm6SLdCRqTFsSV0mY6SO8z3M,970
-aiofiles/ospath.py,sha256=zqG2VFzRb6yYiIOWipqsdgvZmoMTFvZmBdkxkAl1FT4,764
+aiofiles/base.py,sha256=zo0FgkCqZ5aosjvxqIvDf2t-RFg1Lc6X8P6rZ56p6fQ,1784
+aiofiles/os.py,sha256=0DrsG-eH4h7xRzglv9pIWsQuzqe7ZhVYw5FQS18fIys,1153
+aiofiles/ospath.py,sha256=WaYelz_k6ykAFRLStr4bqYIfCVQ-5GGzIqIizykbY2Q,794
 aiofiles/tempfile/__init__.py,sha256=hFSNTOjOUv371Ozdfy6FIxeln46Nm3xOVh4ZR3Q94V0,10244
 aiofiles/tempfile/__pycache__/__init__.cpython-312.pyc,,
 aiofiles/tempfile/__pycache__/temptypes.cpython-312.pyc,,
 aiofiles/tempfile/temptypes.py,sha256=ddEvNjMLVlr7WUILCe6ypTqw77yREeIonTk16Uw_NVs,2093
-aiofiles/threadpool/__init__.py,sha256=c_aexl1t193iKdPZaolPEEbHDrQ0RrsH_HTAToMPQBo,3171
+aiofiles/threadpool/__init__.py,sha256=kt0hwwx3bLiYtnA1SORhW8mJ6z4W9Xr7MbY80UIJJrI,3133
 aiofiles/threadpool/__pycache__/__init__.cpython-312.pyc,,
 aiofiles/threadpool/__pycache__/binary.cpython-312.pyc,,
 aiofiles/threadpool/__pycache__/text.cpython-312.pyc,,
 aiofiles/threadpool/__pycache__/utils.cpython-312.pyc,,
 aiofiles/threadpool/binary.py,sha256=hp-km9VCRu0MLz_wAEUfbCz7OL7xtn9iGAawabpnp5U,2315
 aiofiles/threadpool/text.py,sha256=fNmpw2PEkj0BZSldipJXAgZqVGLxALcfOMiuDQ54Eas,1223
 aiofiles/threadpool/utils.py,sha256=B59dSZwO_WZs2dFFycKeA91iD2Xq2nNw1EFF8YMBI5k,1868


@@ -1,4 +1,4 @@
 Wheel-Version: 1.0
-Generator: hatchling 1.18.0
+Generator: hatchling 1.25.0
 Root-Is-Purelib: true
 Tag: py3-none-any


@@ -1,6 +1,6 @@
"""Various base classes.""" """Various base classes."""
from types import coroutine from collections.abc import Awaitable
from collections.abc import Coroutine from contextlib import AbstractAsyncContextManager
from asyncio import get_running_loop from asyncio import get_running_loop
@@ -45,66 +45,22 @@ class AsyncIndirectBase(AsyncBase):
pass # discard writes pass # discard writes
class _ContextManager(Coroutine): class AiofilesContextManager(Awaitable, AbstractAsyncContextManager):
"""An adjusted async context manager for aiofiles."""
__slots__ = ("_coro", "_obj") __slots__ = ("_coro", "_obj")
def __init__(self, coro): def __init__(self, coro):
self._coro = coro self._coro = coro
self._obj = None self._obj = None
def send(self, value):
return self._coro.send(value)
def throw(self, typ, val=None, tb=None):
if val is None:
return self._coro.throw(typ)
elif tb is None:
return self._coro.throw(typ, val)
else:
return self._coro.throw(typ, val, tb)
def close(self):
return self._coro.close()
@property
def gi_frame(self):
return self._coro.gi_frame
@property
def gi_running(self):
return self._coro.gi_running
@property
def gi_code(self):
return self._coro.gi_code
def __next__(self):
return self.send(None)
@coroutine
def __iter__(self):
resp = yield from self._coro
return resp
def __await__(self): def __await__(self):
resp = yield from self._coro if self._obj is None:
return resp self._obj = yield from self._coro.__await__()
async def __anext__(self):
resp = await self._coro
return resp
async def __aenter__(self):
self._obj = await self._coro
return self._obj return self._obj
async def __aexit__(self, exc_type, exc, tb): async def __aenter__(self):
self._obj.close() return await self
self._obj = None
class AiofilesContextManager(_ContextManager):
"""An adjusted async context manager for aiofiles."""
async def __aexit__(self, exc_type, exc_val, exc_tb): async def __aexit__(self, exc_type, exc_val, exc_tb):
await get_running_loop().run_in_executor( await get_running_loop().run_in_executor(
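
The rewrite above replaces the old generator-based `_ContextManager`/`AiofilesContextManager` pair with a single class that is both an `Awaitable` and an async context manager: `__await__` resolves the open coroutine once and caches the file object, and `__aenter__` is reduced to `return await self`. A minimal sketch of the two call styles this one object has to support (`/tmp/example.txt` is a placeholder path):

```python
import asyncio

import aiofiles


async def demo() -> None:
    # Style 1: await the result of open() directly; the context manager's
    # __await__ yields the wrapped async file object.
    f = await aiofiles.open("/tmp/example.txt", "w")
    try:
        await f.write("hello")
    finally:
        await f.close()

    # Style 2: use it as an async context manager; __aexit__ closes the
    # file in the executor.
    async with aiofiles.open("/tmp/example.txt") as f:
        print(await f.read())


asyncio.run(demo())
```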


@@ -1,4 +1,5 @@
"""Async executor versions of file functions from the os module.""" """Async executor versions of file functions from the os module."""
import os import os
from . import ospath as path from . import ospath as path
@@ -7,7 +8,6 @@ from .ospath import wrap
__all__ = [ __all__ = [
"path", "path",
"stat", "stat",
"statvfs",
"rename", "rename",
"renames", "renames",
"replace", "replace",
@@ -17,15 +17,20 @@ __all__ = [
"makedirs", "makedirs",
"rmdir", "rmdir",
"removedirs", "removedirs",
"link",
"symlink", "symlink",
"readlink", "readlink",
"listdir", "listdir",
"scandir", "scandir",
"access", "access",
"sendfile",
"wrap", "wrap",
"getcwd",
] ]
if hasattr(os, "link"):
__all__ += ["link"]
if hasattr(os, "sendfile"):
__all__ += ["sendfile"]
if hasattr(os, "statvfs"):
__all__ += ["statvfs"]
stat = wrap(os.stat) stat = wrap(os.stat)
@@ -38,13 +43,15 @@ mkdir = wrap(os.mkdir)
makedirs = wrap(os.makedirs) makedirs = wrap(os.makedirs)
rmdir = wrap(os.rmdir) rmdir = wrap(os.rmdir)
removedirs = wrap(os.removedirs) removedirs = wrap(os.removedirs)
link = wrap(os.link)
symlink = wrap(os.symlink) symlink = wrap(os.symlink)
readlink = wrap(os.readlink) readlink = wrap(os.readlink)
listdir = wrap(os.listdir) listdir = wrap(os.listdir)
scandir = wrap(os.scandir) scandir = wrap(os.scandir)
access = wrap(os.access) access = wrap(os.access)
getcwd = wrap(os.getcwd)
if hasattr(os, "link"):
link = wrap(os.link)
if hasattr(os, "sendfile"): if hasattr(os, "sendfile"):
sendfile = wrap(os.sendfile) sendfile = wrap(os.sendfile)
if hasattr(os, "statvfs"): if hasattr(os, "statvfs"):


@@ -1,4 +1,5 @@
"""Async executor versions of file functions from the os.path module.""" """Async executor versions of file functions from the os.path module."""
import asyncio import asyncio
from functools import partial, wraps from functools import partial, wraps
from os import path from os import path
@@ -26,3 +27,4 @@ getatime = wrap(path.getatime)
getctime = wrap(path.getctime) getctime = wrap(path.getctime)
samefile = wrap(path.samefile) samefile = wrap(path.samefile)
sameopenfile = wrap(path.sameopenfile) sameopenfile = wrap(path.sameopenfile)
abspath = wrap(path.abspath)


@@ -10,7 +10,6 @@ from io import (
     FileIO,
     TextIOBase,
 )
-from types import coroutine
 
 from ..base import AiofilesContextManager
 from .binary import (
@@ -63,8 +62,7 @@ def open(
     )
 
 
-@coroutine
-def _open(
+async def _open(
     file,
     mode="r",
     buffering=-1,
@@ -91,7 +89,7 @@ def _open(
         closefd=closefd,
         opener=opener,
     )
-    f = yield from loop.run_in_executor(executor, cb)
+    f = await loop.run_in_executor(executor, cb)
 
     return wrap(f, loop=loop, executor=executor)
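
The `_open` change above is the modernization named in the changelog: a `@types.coroutine`-decorated generator driven by `yield from` becomes a native `async def` using `await`. A minimal, self-contained illustration of the two idioms (function names are illustrative, not from aiofiles):

```python
import asyncio
import types


@types.coroutine
def legacy_style():
    # Generator-based coroutine: delegates via yield from over an
    # awaitable's __await__() iterator, as pre-async/await code did.
    yield from asyncio.sleep(0).__await__()
    return "legacy"


async def native_style():
    # Native coroutine: the same delegation expressed with await.
    await asyncio.sleep(0)
    return "native"


async def main() -> None:
    print(await legacy_style())
    print(await native_style())


asyncio.run(main())
```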


@@ -1,149 +0,0 @@
../../../bin/alembic,sha256=kheZTewTBSd6rruOpyoj8QhFdGKiaj38MUFgBD5whig,238
alembic-1.12.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
alembic-1.12.1.dist-info/LICENSE,sha256=soUmiob0QW6vTQWyrjiAwVb3xZqPk1pAK8BW6vszrwg,1058
alembic-1.12.1.dist-info/METADATA,sha256=D9-LeKL0unLPg2JKmlFMB5NAxt9N9y-8oVEGOUHbQnU,7306
alembic-1.12.1.dist-info/RECORD,,
alembic-1.12.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic-1.12.1.dist-info/WHEEL,sha256=yQN5g4mg4AybRjkgi-9yy4iQEFibGQmlz78Pik5Or-A,92
alembic-1.12.1.dist-info/entry_points.txt,sha256=aykM30soxwGN0pB7etLc1q0cHJbL9dy46RnK9VX4LLw,48
alembic-1.12.1.dist-info/top_level.txt,sha256=FwKWd5VsPFC8iQjpu1u9Cn-JnK3-V1RhUCmWqz1cl-s,8
alembic/__init__.py,sha256=gczqgDgBRw3aV70aNeH6WGu0WdASQf_YiChV12qCRRI,75
alembic/__main__.py,sha256=373m7-TBh72JqrSMYviGrxCHZo-cnweM8AGF8A22PmY,78
alembic/__pycache__/__init__.cpython-312.pyc,,
alembic/__pycache__/__main__.cpython-312.pyc,,
alembic/__pycache__/command.cpython-312.pyc,,
alembic/__pycache__/config.cpython-312.pyc,,
alembic/__pycache__/context.cpython-312.pyc,,
alembic/__pycache__/environment.cpython-312.pyc,,
alembic/__pycache__/migration.cpython-312.pyc,,
alembic/__pycache__/op.cpython-312.pyc,,
alembic/autogenerate/__init__.py,sha256=4IHgWH89pForRq-yCDZhGjjVtsfGX5ECWNPuUs8nGUk,351
alembic/autogenerate/__pycache__/__init__.cpython-312.pyc,,
alembic/autogenerate/__pycache__/api.cpython-312.pyc,,
alembic/autogenerate/__pycache__/compare.cpython-312.pyc,,
alembic/autogenerate/__pycache__/render.cpython-312.pyc,,
alembic/autogenerate/__pycache__/rewriter.cpython-312.pyc,,
alembic/autogenerate/api.py,sha256=MNn0Xtmj44aMFjfiR0LMkbxOynHyiyaRBnrj5EkImm4,21967
alembic/autogenerate/compare.py,sha256=gSCjxrkQAl0rJD6o9Ln8wNxGVNU6FrWzKZYVkH5Tmac,47042
alembic/autogenerate/render.py,sha256=Fik2aPZEIxOlTCrBd0UiPxnX5SFG__CvfXqMWoJr6lw,34475
alembic/autogenerate/rewriter.py,sha256=Osba8GFVeqiX1ypGJW7Axt0ui2EROlaFtVZdMFbhzZ0,7384
alembic/command.py,sha256=ze4pYvKpB-FtF8rduY6F6n3XHqeA-15iXaaEDeNHVzI,21588
alembic/config.py,sha256=68e1nmYU5Nfh0bNRqRWUygSilDl1p0G_U1zZ8ifgmD8,21931
alembic/context.py,sha256=hK1AJOQXJ29Bhn276GYcosxeG7pC5aZRT5E8c4bMJ4Q,195
alembic/context.pyi,sha256=FLsT0be_vO_ozlC05EJkWR5olDPoTVq-7tgtoM5wSAw,31463
alembic/ddl/__init__.py,sha256=xXr1W6PePe0gCLwR42ude0E6iru9miUFc1fCeQN4YP8,137
alembic/ddl/__pycache__/__init__.cpython-312.pyc,,
alembic/ddl/__pycache__/base.cpython-312.pyc,,
alembic/ddl/__pycache__/impl.cpython-312.pyc,,
alembic/ddl/__pycache__/mssql.cpython-312.pyc,,
alembic/ddl/__pycache__/mysql.cpython-312.pyc,,
alembic/ddl/__pycache__/oracle.cpython-312.pyc,,
alembic/ddl/__pycache__/postgresql.cpython-312.pyc,,
alembic/ddl/__pycache__/sqlite.cpython-312.pyc,,
alembic/ddl/base.py,sha256=cCY3NldMRggrKd9bZ0mFRBE9GNDaAy0UJcM3ey4Utgw,9638
alembic/ddl/impl.py,sha256=Z3GpNM2KwBpfl1UCam1YsYbSd0mQzRigOKQhUCLIPgE,25564
alembic/ddl/mssql.py,sha256=0k26xnUSZNj3qCHEMzRFbaWgUzKcV07I3_-Ns47VhO0,14105
alembic/ddl/mysql.py,sha256=ff8OE0zQ8YYjAgltBbtjQkDR-g9z65DNeFjEMm4sX6c,16675
alembic/ddl/oracle.py,sha256=E0VaZaUM_5mwqNiJVA3zOAK-cuHVVIv_-NmUbH1JuGQ,6097
alembic/ddl/postgresql.py,sha256=aO8pcVN5ycw1wG2m1RRt8dQUD1KgRa6T4rSzg9FPCkU,26457
alembic/ddl/sqlite.py,sha256=9q7NAxyeFwn9kWwQSc9RLeMFSos8waM7x9lnXdByh44,7613
alembic/environment.py,sha256=MM5lPayGT04H3aeng1H7GQ8HEAs3VGX5yy6mDLCPLT4,43
alembic/migration.py,sha256=MV6Fju6rZtn2fTREKzXrCZM6aIBGII4OMZFix0X-GLs,41
alembic/op.py,sha256=flHtcsVqOD-ZgZKK2pv-CJ5Cwh-KJ7puMUNXzishxLw,167
alembic/op.pyi,sha256=ldQBwAfzm_-ZsC3nizMuGoD34hjMKb4V_-Q1rR8q8LI,48591
alembic/operations/__init__.py,sha256=e0KQSZAgLpTWvyvreB7DWg7RJV_MWSOPVDgCqsd2FzY,318
alembic/operations/__pycache__/__init__.cpython-312.pyc,,
alembic/operations/__pycache__/base.cpython-312.pyc,,
alembic/operations/__pycache__/batch.cpython-312.pyc,,
alembic/operations/__pycache__/ops.cpython-312.pyc,,
alembic/operations/__pycache__/schemaobj.cpython-312.pyc,,
alembic/operations/__pycache__/toimpl.cpython-312.pyc,,
alembic/operations/base.py,sha256=2so4KisDNuOLw0CRiZqorIHrhuenpVoFbn3B0sNvDic,72471
alembic/operations/batch.py,sha256=uMvGJDlcTs0GSHasg4Gsdv1YcXeLOK_1lkRl3jk1ezY,26954
alembic/operations/ops.py,sha256=aP9Uz36k98O_Y-njKIAifyvyhi0g2zU6_igKMos91_s,93539
alembic/operations/schemaobj.py,sha256=-tWad8pgWUNWucbpTnPuFK_EEl913C0RADJhlBnrjhc,9393
alembic/operations/toimpl.py,sha256=K8nUmojtL94tyLSWdDD-e94IbghZ19k55iBIMvzMm5E,6993
alembic/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/runtime/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/runtime/__pycache__/__init__.cpython-312.pyc,,
alembic/runtime/__pycache__/environment.cpython-312.pyc,,
alembic/runtime/__pycache__/migration.cpython-312.pyc,,
alembic/runtime/environment.py,sha256=qaerrw5jB7zYliNnCvIziaju4-tvQ451MuGW8PHnfvw,41019
alembic/runtime/migration.py,sha256=5UtTI_T0JtYzt6ZpeUhannMZOvXWiEymKFOpeCefaPY,49407
alembic/script/__init__.py,sha256=lSj06O391Iy5avWAiq8SPs6N8RBgxkSPjP8wpXcNDGg,100
alembic/script/__pycache__/__init__.cpython-312.pyc,,
alembic/script/__pycache__/base.cpython-312.pyc,,
alembic/script/__pycache__/revision.cpython-312.pyc,,
alembic/script/__pycache__/write_hooks.cpython-312.pyc,,
alembic/script/base.py,sha256=90SpT8wyTMTUuS0Svsy5YIoqJSrR-6CtYSzStmRvFT0,37174
alembic/script/revision.py,sha256=DE0nwvDOzdFo843brvnhs1DfP0jRC5EVQHrNihC7PUQ,61471
alembic/script/write_hooks.py,sha256=Nqj4zz3sm97kAPOpK1m-i2znJchiybO_TWT50oljlJw,4917
alembic/templates/async/README,sha256=ISVtAOvqvKk_5ThM5ioJE-lMkvf9IbknFUFVU_vPma4,58
alembic/templates/async/__pycache__/env.cpython-312.pyc,,
alembic/templates/async/alembic.ini.mako,sha256=k3IyGDG15Rp1JDweC0TiDauaKYNvj3clrGfhw6oV6MI,3505
alembic/templates/async/env.py,sha256=zbOCf3Y7w2lg92hxSwmG1MM_7y56i_oRH4AKp0pQBYo,2389
alembic/templates/async/script.py.mako,sha256=MEqL-2qATlST9TAOeYgscMn1uy6HUS9NFvDgl93dMj8,635
alembic/templates/generic/README,sha256=MVlc9TYmr57RbhXET6QxgyCcwWP7w-vLkEsirENqiIQ,38
alembic/templates/generic/__pycache__/env.cpython-312.pyc,,
alembic/templates/generic/alembic.ini.mako,sha256=gZWFmH2A9sP0i7cxEDhJFkjGtTKUXaVna8QAbIaRqxk,3614
alembic/templates/generic/env.py,sha256=TLRWOVW3Xpt_Tpf8JFzlnoPn_qoUu8UV77Y4o9XD6yI,2103
alembic/templates/generic/script.py.mako,sha256=MEqL-2qATlST9TAOeYgscMn1uy6HUS9NFvDgl93dMj8,635
alembic/templates/multidb/README,sha256=dWLDhnBgphA4Nzb7sNlMfCS3_06YqVbHhz-9O5JNqyI,606
alembic/templates/multidb/__pycache__/env.cpython-312.pyc,,
alembic/templates/multidb/alembic.ini.mako,sha256=j_Y0yuZVoHy7sTPgSPd8DmbT2ItvAdWs7trYZSOmFnw,3708
alembic/templates/multidb/env.py,sha256=6zNjnW8mXGUk7erTsAvrfhvqoczJ-gagjVq1Ypg2YIQ,4230
alembic/templates/multidb/script.py.mako,sha256=N06nMtNSwHkgl0EBXDyMt8njp9tlOesR583gfq21nbY,1090
alembic/testing/__init__.py,sha256=kOxOh5nwmui9d-_CCq9WA4Udwy7ITjm453w74CTLZDo,1159
alembic/testing/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/__pycache__/assertions.cpython-312.pyc,,
alembic/testing/__pycache__/env.cpython-312.pyc,,
alembic/testing/__pycache__/fixtures.cpython-312.pyc,,
alembic/testing/__pycache__/requirements.cpython-312.pyc,,
alembic/testing/__pycache__/schemacompare.cpython-312.pyc,,
alembic/testing/__pycache__/util.cpython-312.pyc,,
alembic/testing/__pycache__/warnings.cpython-312.pyc,,
alembic/testing/assertions.py,sha256=1CbJk8c8-WO9eJ0XJ0jJvMsNRLUrXV41NOeIJUAlOBk,5015
alembic/testing/env.py,sha256=zJacVb_z6uLs2U1TtkmnFH9P3_F-3IfYbVv4UEPOvfo,10754
alembic/testing/fixtures.py,sha256=NyP4wE_dFN9ZzSGiBagRu1cdzkka03nwJYJYHYrrkSY,9112
alembic/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/testing/plugin/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/plugin/__pycache__/bootstrap.cpython-312.pyc,,
alembic/testing/plugin/bootstrap.py,sha256=9C6wtjGrIVztZ928w27hsQE0KcjDLIUtUN3dvZKsMVk,50
alembic/testing/requirements.py,sha256=WByOiJxn2crazIXPq6-0cfqV95cfd9vP_ZQ1Cf2l8hY,4841
alembic/testing/schemacompare.py,sha256=7_4_0Y4UvuMiZ66pz1RC_P8Z1kYOP-R4Y5qUcNmcMKA,4535
alembic/testing/suite/__init__.py,sha256=MvE7-hwbaVN1q3NM-ztGxORU9dnIelUCINKqNxewn7Y,288
alembic/testing/suite/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/suite/__pycache__/_autogen_fixtures.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_comments.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_computed.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_diffs.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_fks.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_identity.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_environment.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_op.cpython-312.pyc,,
alembic/testing/suite/_autogen_fixtures.py,sha256=cDq1pmzHe15S6dZPGNC6sqFaCQ3hLT_oPV2IDigUGQ0,9880
alembic/testing/suite/test_autogen_comments.py,sha256=aEGqKUDw4kHjnDk298aoGcQvXJWmZXcIX_2FxH4cJK8,6283
alembic/testing/suite/test_autogen_computed.py,sha256=qJeBpc8urnwTFvbwWrSTIbHVkRUuCXP-dKaNbUK2U2U,6077
alembic/testing/suite/test_autogen_diffs.py,sha256=T4SR1n_kmcOKYhR4W1-dA0e5sddJ69DSVL2HW96kAkE,8394
alembic/testing/suite/test_autogen_fks.py,sha256=AqFmb26Buex167HYa9dZWOk8x-JlB1OK3bwcvvjDFaU,32927
alembic/testing/suite/test_autogen_identity.py,sha256=kcuqngG7qXAKPJDX4U8sRzPKHEJECHuZ0DtuaS6tVkk,5824
alembic/testing/suite/test_environment.py,sha256=w9F0xnLEbALeR8k6_-Tz6JHvy91IqiTSypNasVzXfZQ,11877
alembic/testing/suite/test_op.py,sha256=2XQCdm_NmnPxHGuGj7hmxMzIhKxXNotUsKdACXzE1mM,1343
alembic/testing/util.py,sha256=CQrcQDA8fs_7ME85z5ydb-Bt70soIIID-qNY1vbR2dg,3350
alembic/testing/warnings.py,sha256=RxA7x_8GseANgw07Us8JN_1iGbANxaw6_VitX2ZGQH4,1078
alembic/util/__init__.py,sha256=cPF_jjFx7YRBByHHDqW3wxCIHsqnGfncEr_i238aduY,1202
alembic/util/__pycache__/__init__.cpython-312.pyc,,
alembic/util/__pycache__/compat.cpython-312.pyc,,
alembic/util/__pycache__/editor.cpython-312.pyc,,
alembic/util/__pycache__/exc.cpython-312.pyc,,
alembic/util/__pycache__/langhelpers.cpython-312.pyc,,
alembic/util/__pycache__/messaging.cpython-312.pyc,,
alembic/util/__pycache__/pyfiles.cpython-312.pyc,,
alembic/util/__pycache__/sqla_compat.cpython-312.pyc,,
alembic/util/compat.py,sha256=WN8jPPFB9ri_uuEM1HEaN1ak3RJc_H3x8NqvtFkoXuM,2279
alembic/util/editor.py,sha256=JIz6_BdgV8_oKtnheR6DZoB7qnrHrlRgWjx09AsTsUw,2546
alembic/util/exc.py,sha256=KQTru4zcgAmN4IxLMwLFS56XToUewaXB7oOLcPNjPwg,98
alembic/util/langhelpers.py,sha256=ZFGyGygHRbztOeajpajppyhd-Gp4PB5slMuvCFVrnmg,8591
alembic/util/messaging.py,sha256=B6T-loMhIOY3OTbG47Ywp1Df9LZn18PgjwpwLrD1VNg,3042
alembic/util/pyfiles.py,sha256=95J01FChN0j2uP3p72mjaOQvh5wC6XbdGtTDK8oEzsQ,3373
alembic/util/sqla_compat.py,sha256=94MHlkj43y-QQySz5dCUiJUNOPr3BF9TQ_BrP6ey-8w,18906
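Each RECORD entry above has the form `path,sha256=<digest>,<size>`, where the digest is an unpadded urlsafe-base64 SHA-256 of the file contents, per the wheel spec. A quick way to verify an entry, as a sketch (the helper below is illustrative, not part of this commit):

```python
import base64
import hashlib
from pathlib import Path


def record_digest(path: str) -> str:
    """RECORD-style digest: unpadded urlsafe-base64 SHA-256 of the file."""
    raw = hashlib.sha256(Path(path).read_bytes()).digest()
    return "sha256=" + base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")


# An empty file such as alembic/py.typed hashes to
# "sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU",
# matching its entry (size 0) in the listing above.
```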

View File

@@ -1,11 +1,10 @@
-Metadata-Version: 2.1
+Metadata-Version: 2.4
 Name: alembic
-Version: 1.12.1
+Version: 1.16.5
 Summary: A database migration tool for SQLAlchemy.
-Home-page: https://alembic.sqlalchemy.org
-Author: Mike Bayer
-Author-email: mike_mp@zzzcomputing.com
-License: MIT
+Author-email: Mike Bayer <mike_mp@zzzcomputing.com>
+License-Expression: MIT
+Project-URL: Homepage, https://alembic.sqlalchemy.org
 Project-URL: Documentation, https://alembic.sqlalchemy.org/en/latest/
 Project-URL: Changelog, https://alembic.sqlalchemy.org/en/latest/changelog.html
 Project-URL: Source, https://github.com/sqlalchemy/alembic/
@@ -13,27 +12,27 @@ Project-URL: Issue Tracker, https://github.com/sqlalchemy/alembic/issues/
 Classifier: Development Status :: 5 - Production/Stable
 Classifier: Intended Audience :: Developers
 Classifier: Environment :: Console
-Classifier: License :: OSI Approved :: MIT License
 Classifier: Operating System :: OS Independent
 Classifier: Programming Language :: Python
 Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
 Classifier: Programming Language :: Python :: 3.9
 Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
 Classifier: Programming Language :: Python :: Implementation :: CPython
 Classifier: Programming Language :: Python :: Implementation :: PyPy
 Classifier: Topic :: Database :: Front-Ends
-Requires-Python: >=3.7
+Requires-Python: >=3.9
 Description-Content-Type: text/x-rst
 License-File: LICENSE
-Requires-Dist: SQLAlchemy >=1.3.0
+Requires-Dist: SQLAlchemy>=1.4.0
 Requires-Dist: Mako
-Requires-Dist: typing-extensions >=4
-Requires-Dist: importlib-metadata ; python_version < "3.9"
-Requires-Dist: importlib-resources ; python_version < "3.9"
+Requires-Dist: typing-extensions>=4.12
+Requires-Dist: tomli; python_version < "3.11"
 Provides-Extra: tz
-Requires-Dist: python-dateutil ; extra == 'tz'
+Requires-Dist: tzdata; extra == "tz"
+Dynamic: license-file
 
 Alembic is a database migrations tool written by the author
 of `SQLAlchemy <http://www.sqlalchemy.org>`_. A migrations tool
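The vendored wheel moves from alembic 1.12.1 to 1.16.5, which tightens the floors: Python >= 3.9 (was 3.7), SQLAlchemy >= 1.4.0 (was 1.3.0), typing-extensions >= 4.12, with `tomli` replacing the `importlib-*` backports. A minimal startup guard for these new floors, as a sketch (the helper and its placement are hypothetical, not part of this commit):

```python
import sys
from importlib.metadata import version


def check_migration_runtime() -> None:
    # Floors taken from the alembic-1.16.5 METADATA shown above.
    if sys.version_info < (3, 9):
        raise RuntimeError("alembic 1.16.5 requires Python >= 3.9")
    print("alembic", version("alembic"), "/ SQLAlchemy", version("sqlalchemy"))


if __name__ == "__main__":
    check_migration_runtime()
```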

View File

@@ -0,0 +1,163 @@
../../../bin/alembic,sha256=_J6yD4KtWGrilKk3GrsJKTd-33Dqp4ejOp_LNh0fQNs,234
alembic-1.16.5.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
alembic-1.16.5.dist-info/METADATA,sha256=_hKTp0jnKI77a2esxmoCXgv5t2U8hDZS7yZDRkDBl0k,7265
alembic-1.16.5.dist-info/RECORD,,
alembic-1.16.5.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic-1.16.5.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
alembic-1.16.5.dist-info/entry_points.txt,sha256=aykM30soxwGN0pB7etLc1q0cHJbL9dy46RnK9VX4LLw,48
alembic-1.16.5.dist-info/licenses/LICENSE,sha256=NeqcNBmyYfrxvkSMT0fZJVKBv2s2tf_qVQUiJ9S6VN4,1059
alembic-1.16.5.dist-info/top_level.txt,sha256=FwKWd5VsPFC8iQjpu1u9Cn-JnK3-V1RhUCmWqz1cl-s,8
alembic/__init__.py,sha256=H_hItDeyDOrQAHc1AFoYXIRN3O3FSxw4zSNiVzz2JlM,63
alembic/__main__.py,sha256=373m7-TBh72JqrSMYviGrxCHZo-cnweM8AGF8A22PmY,78
alembic/__pycache__/__init__.cpython-312.pyc,,
alembic/__pycache__/__main__.cpython-312.pyc,,
alembic/__pycache__/command.cpython-312.pyc,,
alembic/__pycache__/config.cpython-312.pyc,,
alembic/__pycache__/context.cpython-312.pyc,,
alembic/__pycache__/environment.cpython-312.pyc,,
alembic/__pycache__/migration.cpython-312.pyc,,
alembic/__pycache__/op.cpython-312.pyc,,
alembic/autogenerate/__init__.py,sha256=ntmUTXhjLm4_zmqIwyVaECdpPDn6_u1yM9vYk6-553E,543
alembic/autogenerate/__pycache__/__init__.cpython-312.pyc,,
alembic/autogenerate/__pycache__/api.cpython-312.pyc,,
alembic/autogenerate/__pycache__/compare.cpython-312.pyc,,
alembic/autogenerate/__pycache__/render.cpython-312.pyc,,
alembic/autogenerate/__pycache__/rewriter.cpython-312.pyc,,
alembic/autogenerate/api.py,sha256=L4qkapSJO1Ypymx8HsjLl0vFFt202agwMYsQbIe6ZtI,22219
alembic/autogenerate/compare.py,sha256=LRTxNijEBvcTauuUXuJjC6Sg_gUn33FCYBTF0neZFwE,45979
alembic/autogenerate/render.py,sha256=ceQL8nk8m2kBtQq5gtxtDLR9iR0Sck8xG_61Oez-Sqs,37270
alembic/autogenerate/rewriter.py,sha256=NIASSS-KaNKPmbm1k4pE45aawwjSh1Acf6eZrOwnUGM,7814
alembic/command.py,sha256=pZPQUGSxCjFu7qy0HMe02HJmByM0LOqoiK2AXKfRO3A,24855
alembic/config.py,sha256=nfwN_OOFPpee-OY4o10DANh7VG_E4O7bdW00Wx8NNKY,34237
alembic/context.py,sha256=hK1AJOQXJ29Bhn276GYcosxeG7pC5aZRT5E8c4bMJ4Q,195
alembic/context.pyi,sha256=fdeFNTRc0bUgi7n2eZWVFh6NG-TzIv_0gAcapbfHnKY,31773
alembic/ddl/__init__.py,sha256=Df8fy4Vn_abP8B7q3x8gyFwEwnLw6hs2Ljt_bV3EZWE,152
alembic/ddl/__pycache__/__init__.cpython-312.pyc,,
alembic/ddl/__pycache__/_autogen.cpython-312.pyc,,
alembic/ddl/__pycache__/base.cpython-312.pyc,,
alembic/ddl/__pycache__/impl.cpython-312.pyc,,
alembic/ddl/__pycache__/mssql.cpython-312.pyc,,
alembic/ddl/__pycache__/mysql.cpython-312.pyc,,
alembic/ddl/__pycache__/oracle.cpython-312.pyc,,
alembic/ddl/__pycache__/postgresql.cpython-312.pyc,,
alembic/ddl/__pycache__/sqlite.cpython-312.pyc,,
alembic/ddl/_autogen.py,sha256=Blv2RrHNyF4cE6znCQXNXG5T9aO-YmiwD4Fz-qfoaWA,9275
alembic/ddl/base.py,sha256=A1f89-rCZvqw-hgWmBbIszRqx94lL6gKLFXE9kHettA,10478
alembic/ddl/impl.py,sha256=UL8-iza7CJk_T73lr5fjDLdhxEL56uD-AEjtmESAbLk,30439
alembic/ddl/mssql.py,sha256=NzORSIDHUll_g6iH4IyMTXZU1qjKzXrpespKrjWnfLY,14216
alembic/ddl/mysql.py,sha256=LSfwiABdT54sKY_uQ-h6RvjbGiG-1vCSDkO3ECeq3qM,18383
alembic/ddl/oracle.py,sha256=669YlkcZihlXFbnXhH2krdrvDry8q5pcUGfoqkg_R6Y,6243
alembic/ddl/postgresql.py,sha256=S7uye2NDSHLwV3w8SJ2Q9DLbcvQIxQfJ3EEK6JqyNag,29950
alembic/ddl/sqlite.py,sha256=u5tJgRUiY6bzVltl_NWlI6cy23v8XNagk_9gPI6Lnns,8006
alembic/environment.py,sha256=MM5lPayGT04H3aeng1H7GQ8HEAs3VGX5yy6mDLCPLT4,43
alembic/migration.py,sha256=MV6Fju6rZtn2fTREKzXrCZM6aIBGII4OMZFix0X-GLs,41
alembic/op.py,sha256=flHtcsVqOD-ZgZKK2pv-CJ5Cwh-KJ7puMUNXzishxLw,167
alembic/op.pyi,sha256=PQ4mKNp7EXrjVdIWQRoGiBSVke4PPxTc9I6qF8ZGGZE,50711
alembic/operations/__init__.py,sha256=e0KQSZAgLpTWvyvreB7DWg7RJV_MWSOPVDgCqsd2FzY,318
alembic/operations/__pycache__/__init__.cpython-312.pyc,,
alembic/operations/__pycache__/base.cpython-312.pyc,,
alembic/operations/__pycache__/batch.cpython-312.pyc,,
alembic/operations/__pycache__/ops.cpython-312.pyc,,
alembic/operations/__pycache__/schemaobj.cpython-312.pyc,,
alembic/operations/__pycache__/toimpl.cpython-312.pyc,,
alembic/operations/base.py,sha256=npw1iFboTlEsaQS0b7mb2SEHsRDV4GLQqnjhcfma6Nk,75157
alembic/operations/batch.py,sha256=1UmCFcsFWObinQWFRWoGZkjynl54HKpldbPs67aR4wg,26923
alembic/operations/ops.py,sha256=ftsFgcZIctxRDiuGgkQsaFHsMlRP7cLq7Dj_seKVBnQ,96276
alembic/operations/schemaobj.py,sha256=Wp-bBe4a8lXPTvIHJttBY0ejtpVR5Jvtb2kI-U2PztQ,9468
alembic/operations/toimpl.py,sha256=rgufuSUNwpgrOYzzY3Q3ELW1rQv2fQbQVokXgnIYIrs,7503
alembic/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/runtime/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/runtime/__pycache__/__init__.cpython-312.pyc,,
alembic/runtime/__pycache__/environment.cpython-312.pyc,,
alembic/runtime/__pycache__/migration.cpython-312.pyc,,
alembic/runtime/environment.py,sha256=L6bDW1dvw8L4zwxlTG8KnT0xcCgLXxUfdRpzqlJoFjo,41479
alembic/runtime/migration.py,sha256=lu9_z_qyWmNzSM52_FgdXP_G52PTmTTeOeMBQAkQTFg,49997
alembic/script/__init__.py,sha256=lSj06O391Iy5avWAiq8SPs6N8RBgxkSPjP8wpXcNDGg,100
alembic/script/__pycache__/__init__.cpython-312.pyc,,
alembic/script/__pycache__/base.cpython-312.pyc,,
alembic/script/__pycache__/revision.cpython-312.pyc,,
alembic/script/__pycache__/write_hooks.cpython-312.pyc,,
alembic/script/base.py,sha256=4jINClsNNwQIvnf4Kwp9JPAMrANLXdLItylXmcMqAkI,36896
alembic/script/revision.py,sha256=BQcJoMCIXtSJRLCvdasgLOtCx9O7A8wsSym1FsqLW4s,62307
alembic/script/write_hooks.py,sha256=uQWAtguSCrxU_k9d87NX19y6EzyjJRRQ5HS9cyPnK9o,5092
alembic/templates/async/README,sha256=ISVtAOvqvKk_5ThM5ioJE-lMkvf9IbknFUFVU_vPma4,58
alembic/templates/async/__pycache__/env.cpython-312.pyc,,
alembic/templates/async/alembic.ini.mako,sha256=Bgi4WkaHYsT7xvsX-4WOGkcXKFroNoQLaUvZA23ZwGs,4864
alembic/templates/async/env.py,sha256=zbOCf3Y7w2lg92hxSwmG1MM_7y56i_oRH4AKp0pQBYo,2389
alembic/templates/async/script.py.mako,sha256=04kgeBtNMa4cCnG8CfQcKt6P6rnloIfj8wy0u_DBydM,704
alembic/templates/generic/README,sha256=MVlc9TYmr57RbhXET6QxgyCcwWP7w-vLkEsirENqiIQ,38
alembic/templates/generic/__pycache__/env.cpython-312.pyc,,
alembic/templates/generic/alembic.ini.mako,sha256=LCpLL02bi9Qr3KRTEj9NbQqAu0ckUmYBwPtrMtQkv-Y,4864
alembic/templates/generic/env.py,sha256=TLRWOVW3Xpt_Tpf8JFzlnoPn_qoUu8UV77Y4o9XD6yI,2103
alembic/templates/generic/script.py.mako,sha256=04kgeBtNMa4cCnG8CfQcKt6P6rnloIfj8wy0u_DBydM,704
alembic/templates/multidb/README,sha256=dWLDhnBgphA4Nzb7sNlMfCS3_06YqVbHhz-9O5JNqyI,606
alembic/templates/multidb/__pycache__/env.cpython-312.pyc,,
alembic/templates/multidb/alembic.ini.mako,sha256=rIp1LTdE1xcoFT2G7X72KshzYjUTRrHTvnkvFL___-8,5190
alembic/templates/multidb/env.py,sha256=6zNjnW8mXGUk7erTsAvrfhvqoczJ-gagjVq1Ypg2YIQ,4230
alembic/templates/multidb/script.py.mako,sha256=ZbCXMkI5Wj2dwNKcxuVGkKZ7Iav93BNx_bM4zbGi3c8,1235
alembic/templates/pyproject/README,sha256=dMhIiFoeM7EdeaOXBs3mVQ6zXACMyGXDb_UBB6sGRA0,60
alembic/templates/pyproject/__pycache__/env.cpython-312.pyc,,
alembic/templates/pyproject/alembic.ini.mako,sha256=bQnEoydnLOUgg9vNbTOys4r5MaW8lmwYFXSrlfdEEkw,782
alembic/templates/pyproject/env.py,sha256=TLRWOVW3Xpt_Tpf8JFzlnoPn_qoUu8UV77Y4o9XD6yI,2103
alembic/templates/pyproject/pyproject.toml.mako,sha256=Gf16ZR9OMG9zDlFO5PVQlfiL1DTKwSA--sTNzK7Lba0,2852
alembic/templates/pyproject/script.py.mako,sha256=04kgeBtNMa4cCnG8CfQcKt6P6rnloIfj8wy0u_DBydM,704
alembic/templates/pyproject_async/README,sha256=2Q5XcEouiqQ-TJssO9805LROkVUd0F6d74rTnuLrifA,45
alembic/templates/pyproject_async/__pycache__/env.cpython-312.pyc,,
alembic/templates/pyproject_async/alembic.ini.mako,sha256=bQnEoydnLOUgg9vNbTOys4r5MaW8lmwYFXSrlfdEEkw,782
alembic/templates/pyproject_async/env.py,sha256=zbOCf3Y7w2lg92hxSwmG1MM_7y56i_oRH4AKp0pQBYo,2389
alembic/templates/pyproject_async/pyproject.toml.mako,sha256=Gf16ZR9OMG9zDlFO5PVQlfiL1DTKwSA--sTNzK7Lba0,2852
alembic/templates/pyproject_async/script.py.mako,sha256=04kgeBtNMa4cCnG8CfQcKt6P6rnloIfj8wy0u_DBydM,704
alembic/testing/__init__.py,sha256=PTMhi_2PZ1T_3atQS2CIr0V4YRZzx_doKI-DxKdQS44,1297
alembic/testing/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/__pycache__/assertions.cpython-312.pyc,,
alembic/testing/__pycache__/env.cpython-312.pyc,,
alembic/testing/__pycache__/fixtures.cpython-312.pyc,,
alembic/testing/__pycache__/requirements.cpython-312.pyc,,
alembic/testing/__pycache__/schemacompare.cpython-312.pyc,,
alembic/testing/__pycache__/util.cpython-312.pyc,,
alembic/testing/__pycache__/warnings.cpython-312.pyc,,
alembic/testing/assertions.py,sha256=qcqf3tRAUe-A12NzuK_yxlksuX9OZKRC5E8pKIdBnPg,5302
alembic/testing/env.py,sha256=pka7fjwOC8hYL6X0XE4oPkJpy_1WX01bL7iP7gpO_4I,11551
alembic/testing/fixtures.py,sha256=fOzsRF8SW6CWpAH0sZpUHcgsJjun9EHnp4k2S3Lq5eU,9920
alembic/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
alembic/testing/plugin/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/plugin/__pycache__/bootstrap.cpython-312.pyc,,
alembic/testing/plugin/bootstrap.py,sha256=9C6wtjGrIVztZ928w27hsQE0KcjDLIUtUN3dvZKsMVk,50
alembic/testing/requirements.py,sha256=gNnnvgPCuiqKeHmiNymdQuYIjQ0BrxiPxu_in4eHEsc,4180
alembic/testing/schemacompare.py,sha256=N5UqSNCOJetIKC4vKhpYzQEpj08XkdgIoqBmEPQ3tlc,4838
alembic/testing/suite/__init__.py,sha256=MvE7-hwbaVN1q3NM-ztGxORU9dnIelUCINKqNxewn7Y,288
alembic/testing/suite/__pycache__/__init__.cpython-312.pyc,,
alembic/testing/suite/__pycache__/_autogen_fixtures.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_comments.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_computed.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_diffs.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_fks.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_autogen_identity.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_environment.cpython-312.pyc,,
alembic/testing/suite/__pycache__/test_op.cpython-312.pyc,,
alembic/testing/suite/_autogen_fixtures.py,sha256=Drrz_FKb9KDjq8hkwxtPkJVY1sCY7Biw-Muzb8kANp8,13480
alembic/testing/suite/test_autogen_comments.py,sha256=aEGqKUDw4kHjnDk298aoGcQvXJWmZXcIX_2FxH4cJK8,6283
alembic/testing/suite/test_autogen_computed.py,sha256=-5wran56qXo3afAbSk8cuSDDpbQweyJ61RF-GaVuZbA,4126
alembic/testing/suite/test_autogen_diffs.py,sha256=T4SR1n_kmcOKYhR4W1-dA0e5sddJ69DSVL2HW96kAkE,8394
alembic/testing/suite/test_autogen_fks.py,sha256=AqFmb26Buex167HYa9dZWOk8x-JlB1OK3bwcvvjDFaU,32927
alembic/testing/suite/test_autogen_identity.py,sha256=kcuqngG7qXAKPJDX4U8sRzPKHEJECHuZ0DtuaS6tVkk,5824
alembic/testing/suite/test_environment.py,sha256=OwD-kpESdLoc4byBrGrXbZHvqtPbzhFCG4W9hJOJXPQ,11877
alembic/testing/suite/test_op.py,sha256=2XQCdm_NmnPxHGuGj7hmxMzIhKxXNotUsKdACXzE1mM,1343
alembic/testing/util.py,sha256=CQrcQDA8fs_7ME85z5ydb-Bt70soIIID-qNY1vbR2dg,3350
alembic/testing/warnings.py,sha256=cDDWzvxNZE6x9dME2ACTXSv01G81JcIbE1GIE_s1kvg,831
alembic/util/__init__.py,sha256=_Zj_xp6ssKLyoLHUFzmKhnc8mhwXW8D8h7qyX-wO56M,1519
alembic/util/__pycache__/__init__.cpython-312.pyc,,
alembic/util/__pycache__/compat.cpython-312.pyc,,
alembic/util/__pycache__/editor.cpython-312.pyc,,
alembic/util/__pycache__/exc.cpython-312.pyc,,
alembic/util/__pycache__/langhelpers.cpython-312.pyc,,
alembic/util/__pycache__/messaging.cpython-312.pyc,,
alembic/util/__pycache__/pyfiles.cpython-312.pyc,,
alembic/util/__pycache__/sqla_compat.cpython-312.pyc,,
alembic/util/compat.py,sha256=Vt5xCn5Y675jI4seKNBV4IVnCl9V4wyH3OBI2w7U0EY,4248
alembic/util/editor.py,sha256=JIz6_BdgV8_oKtnheR6DZoB7qnrHrlRgWjx09AsTsUw,2546
alembic/util/exc.py,sha256=ZBlTQ8g-Jkb1iYFhFHs9djilRz0SSQ0Foc5SSoENs5o,564
alembic/util/langhelpers.py,sha256=LpOcovnhMnP45kTt8zNJ4BHpyQrlF40OL6yDXjqKtsE,10026
alembic/util/messaging.py,sha256=3bEBoDy4EAXETXAvArlYjeMITXDTgPTu6ZoE3ytnzSw,3294
alembic/util/pyfiles.py,sha256=kOBjZEytRkBKsQl0LAj2sbKJMQazjwQ_5UeMKSIvVFo,4730
alembic/util/sqla_compat.py,sha256=9OYPTf-GCultAIuv1PoiaqYXAApZQxUOqjrOaeJDAik,14790

View File

@@ -1,5 +1,5 @@
 Wheel-Version: 1.0
-Generator: bdist_wheel (0.41.3)
+Generator: setuptools (80.9.0)
 Root-Is-Purelib: true
 Tag: py3-none-any

View File

@@ -1,4 +1,4 @@
-Copyright 2009-2023 Michael Bayer.
+Copyright 2009-2025 Michael Bayer.
 
 Permission is hereby granted, free of charge, to any person obtaining a copy of
 this software and associated documentation files (the "Software"), to deal in
@@ -16,4 +16,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 SOFTWARE.

View File

@@ -1,6 +1,4 @@
-import sys
-
 from . import context
 from . import op
 
-__version__ = "1.12.1"
+__version__ = "1.16.5"

View File

@@ -1,10 +1,10 @@
-from .api import _render_migration_diffs
-from .api import compare_metadata
-from .api import produce_migrations
-from .api import render_python_code
-from .api import RevisionContext
-from .compare import _produce_net_changes
-from .compare import comparators
-from .render import render_op_text
-from .render import renderers
-from .rewriter import Rewriter
+from .api import _render_migration_diffs as _render_migration_diffs
+from .api import compare_metadata as compare_metadata
+from .api import produce_migrations as produce_migrations
+from .api import render_python_code as render_python_code
+from .api import RevisionContext as RevisionContext
+from .compare import _produce_net_changes as _produce_net_changes
+from .compare import comparators as comparators
+from .render import render_op_text as render_op_text
+from .render import renderers as renderers
+from .rewriter import Rewriter as Rewriter
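The `import X as X` spelling introduced here is the PEP 484 explicit re-export idiom: under `mypy --no-implicit-reexport` (implied by `--strict`), a name imported into a module is treated as private unless re-exported explicitly. A minimal illustration with hypothetical names:

```python
# pkg/__init__.py (hypothetical package)

# Implicit import: strict type checkers consider "helper" private here.
# from ._impl import helper

# Explicit re-export: "helper" becomes part of the package's public surface.
from ._impl import helper as helper

# Listing the name in __all__ achieves the same effect:
# __all__ = ["helper"]
```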

View File

@@ -17,6 +17,7 @@ from . import compare
 from . import render
 from .. import util
 from ..operations import ops
+from ..util import sqla_compat
 
 """Provide the 'autogenerate' feature which can produce migration operations
 automatically."""
@@ -27,6 +28,7 @@ if TYPE_CHECKING:
     from sqlalchemy.engine import Inspector
     from sqlalchemy.sql.schema import MetaData
     from sqlalchemy.sql.schema import SchemaItem
+    from sqlalchemy.sql.schema import Table
 
     from ..config import Config
     from ..operations.ops import DowngradeOps
@@ -164,6 +166,7 @@ def compare_metadata(context: MigrationContext, metadata: MetaData) -> Any:
     """
     migration_script = produce_migrations(context, metadata)
+    assert migration_script.upgrade_ops is not None
     return migration_script.upgrade_ops.as_diffs()
@@ -274,7 +277,7 @@ class AutogenContext:
     """Maintains configuration and state that's specific to an
     autogenerate operation."""
 
-    metadata: Optional[MetaData] = None
+    metadata: Union[MetaData, Sequence[MetaData], None] = None
     """The :class:`~sqlalchemy.schema.MetaData` object
     representing the destination.
@@ -329,8 +332,8 @@ class AutogenContext:
     def __init__(
         self,
         migration_context: MigrationContext,
-        metadata: Optional[MetaData] = None,
-        opts: Optional[dict] = None,
+        metadata: Union[MetaData, Sequence[MetaData], None] = None,
+        opts: Optional[Dict[str, Any]] = None,
         autogenerate: bool = True,
     ) -> None:
         if (
@@ -440,7 +443,7 @@ class AutogenContext:
     def run_object_filters(
         self,
         object_: SchemaItem,
-        name: Optional[str],
+        name: sqla_compat._ConstraintName,
         type_: NameFilterType,
         reflected: bool,
         compare_to: Optional[SchemaItem],
@@ -464,7 +467,7 @@ class AutogenContext:
     run_filters = run_object_filters
 
     @util.memoized_property
-    def sorted_tables(self):
+    def sorted_tables(self) -> List[Table]:
        """Return an aggregate of the :attr:`.MetaData.sorted_tables`
        collection(s).
@@ -480,7 +483,7 @@ class AutogenContext:
         return result
 
     @util.memoized_property
-    def table_key_to_table(self):
+    def table_key_to_table(self) -> Dict[str, Table]:
        """Return an aggregate of the :attr:`.MetaData.tables` dictionaries.
 
        The :attr:`.MetaData.tables` collection is a dictionary of table key
@@ -491,7 +494,7 @@ class AutogenContext:
        objects contain the same table key, an exception is raised.
 
        """
-        result = {}
+        result: Dict[str, Table] = {}
         for m in util.to_list(self.metadata):
             intersect = set(result).intersection(set(m.tables))
             if intersect:
@@ -593,9 +596,9 @@ class RevisionContext:
         migration_script = self.generated_revisions[-1]
         if not getattr(migration_script, "_needs_render", False):
             migration_script.upgrade_ops_list[-1].upgrade_token = upgrade_token
-            migration_script.downgrade_ops_list[
-                -1
-            ].downgrade_token = downgrade_token
+            migration_script.downgrade_ops_list[-1].downgrade_token = (
+                downgrade_token
+            )
             migration_script._needs_render = True
         else:
             migration_script._upgrade_ops.append(
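`compare_metadata`, touched in this hunk, is the public entry point for these diffs; a typical invocation (sketch; the connection URL is a placeholder) compares a live schema against the declared metadata:

```python
from sqlalchemy import MetaData, create_engine

from alembic.autogenerate import compare_metadata
from alembic.migration import MigrationContext

metadata = MetaData()  # in practice, Base.metadata from the models

engine = create_engine("sqlite:///example.db")  # placeholder URL

with engine.connect() as conn:
    ctx = MigrationContext.configure(conn)
    # Yields diff tuples such as ("add_table", Table(...))
    # or ("add_index", Index(...)).
    for diff in compare_metadata(ctx, metadata):
        print(diff)
```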

View File

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations
 
 import contextlib
@@ -7,12 +10,12 @@ from typing import Any
 from typing import cast
 from typing import Dict
 from typing import Iterator
-from typing import List
 from typing import Mapping
 from typing import Optional
 from typing import Set
 from typing import Tuple
 from typing import TYPE_CHECKING
+from typing import TypeVar
 from typing import Union
 
 from sqlalchemy import event
@@ -21,10 +24,15 @@ from sqlalchemy import schema as sa_schema
 from sqlalchemy import text
 from sqlalchemy import types as sqltypes
 from sqlalchemy.sql import expression
+from sqlalchemy.sql.elements import conv
+from sqlalchemy.sql.schema import ForeignKeyConstraint
+from sqlalchemy.sql.schema import Index
+from sqlalchemy.sql.schema import UniqueConstraint
 from sqlalchemy.util import OrderedSet
 
-from alembic.ddl.base import _fk_spec
 from .. import util
+from ..ddl._autogen import is_index_sig
+from ..ddl._autogen import is_uq_sig
 from ..operations import ops
 from ..util import sqla_compat
@@ -35,10 +43,7 @@ if TYPE_CHECKING:
     from sqlalchemy.sql.elements import quoted_name
     from sqlalchemy.sql.elements import TextClause
     from sqlalchemy.sql.schema import Column
-    from sqlalchemy.sql.schema import ForeignKeyConstraint
-    from sqlalchemy.sql.schema import Index
     from sqlalchemy.sql.schema import Table
-    from sqlalchemy.sql.schema import UniqueConstraint
 
     from alembic.autogenerate.api import AutogenContext
     from alembic.ddl.impl import DefaultImpl
@@ -46,6 +51,8 @@ if TYPE_CHECKING:
     from alembic.operations.ops import MigrationScript
     from alembic.operations.ops import ModifyTableOps
     from alembic.operations.ops import UpgradeOps
+
+    from ..ddl._autogen import _constraint_sig
 
 log = logging.getLogger(__name__)
@@ -210,7 +217,7 @@ def _compare_tables(
             (inspector),
             # fmt: on
         )
-        sqla_compat._reflect_table(inspector, t)
+        _InspectorConv(inspector).reflect_table(t, include_columns=None)
         if autogen_context.run_object_filters(t, tname, "table", True, None):
             modify_table_ops = ops.ModifyTableOps(tname, [], schema=s)
@@ -240,7 +247,8 @@
             _compat_autogen_column_reflect(inspector),
             # fmt: on
         )
-        sqla_compat._reflect_table(inspector, t)
+        _InspectorConv(inspector).reflect_table(t, include_columns=None)
         conn_column_info[(s, tname)] = t
 
     for s, tname in sorted(existing_tables, key=lambda x: (x[0] or "", x[1])):
@@ -429,102 +437,56 @@ def _compare_columns(
         log.info("Detected removed column '%s.%s'", name, cname)
 
 
-class _constraint_sig:
-    const: Union[UniqueConstraint, ForeignKeyConstraint, Index]
-
-    def md_name_to_sql_name(self, context: AutogenContext) -> Optional[str]:
-        return sqla_compat._get_constraint_final_name(
-            self.const, context.dialect
-        )
-
-    def __eq__(self, other):
-        return self.const == other.const
-
-    def __ne__(self, other):
-        return self.const != other.const
-
-    def __hash__(self) -> int:
-        return hash(self.const)
-
-
-class _uq_constraint_sig(_constraint_sig):
-    is_index = False
-    is_unique = True
-
-    def __init__(self, const: UniqueConstraint, impl: DefaultImpl) -> None:
-        self.const = const
-        self.name = const.name
-        self.sig = ("UNIQUE_CONSTRAINT",) + impl.create_unique_constraint_sig(
-            const
-        )
-
-    @property
-    def column_names(self) -> List[str]:
-        return [col.name for col in self.const.columns]
-
-
-class _ix_constraint_sig(_constraint_sig):
-    is_index = True
-
-    def __init__(self, const: Index, impl: DefaultImpl) -> None:
-        self.const = const
-        self.name = const.name
-        self.sig = ("INDEX",) + impl.create_index_sig(const)
-        self.is_unique = bool(const.unique)
-
-    def md_name_to_sql_name(self, context: AutogenContext) -> Optional[str]:
-        return sqla_compat._get_constraint_final_name(
-            self.const, context.dialect
-        )
-
-    @property
-    def column_names(self) -> Union[List[quoted_name], List[None]]:
-        return sqla_compat._get_index_column_names(self.const)
-
-
-class _fk_constraint_sig(_constraint_sig):
-    def __init__(
-        self, const: ForeignKeyConstraint, include_options: bool = False
-    ) -> None:
-        self.const = const
-        self.name = const.name
-        (
-            self.source_schema,
-            self.source_table,
-            self.source_columns,
-            self.target_schema,
-            self.target_table,
-            self.target_columns,
-            onupdate,
-            ondelete,
-            deferrable,
-            initially,
-        ) = _fk_spec(const)
-
-        self.sig: Tuple[Any, ...] = (
-            self.source_schema,
-            self.source_table,
-            tuple(self.source_columns),
-            self.target_schema,
-            self.target_table,
-            tuple(self.target_columns),
-        )
-        if include_options:
-            self.sig += (
-                (None if onupdate.lower() == "no action" else onupdate.lower())
-                if onupdate
-                else None,
-                (None if ondelete.lower() == "no action" else ondelete.lower())
-                if ondelete
-                else None,
-                # convert initially + deferrable into one three-state value
-                "initially_deferrable"
-                if initially and initially.lower() == "deferred"
-                else "deferrable"
-                if deferrable
-                else "not deferrable",
-            )
+_C = TypeVar("_C", bound=Union[UniqueConstraint, ForeignKeyConstraint, Index])
+
+
+class _InspectorConv:
+    __slots__ = ("inspector",)
+
+    def __init__(self, inspector):
+        self.inspector = inspector
+
+    def _apply_reflectinfo_conv(self, consts):
+        if not consts:
+            return consts
+        for const in consts:
+            if const["name"] is not None and not isinstance(
+                const["name"], conv
+            ):
+                const["name"] = conv(const["name"])
+        return consts
+
+    def _apply_constraint_conv(self, consts):
+        if not consts:
+            return consts
+        for const in consts:
+            if const.name is not None and not isinstance(const.name, conv):
+                const.name = conv(const.name)
+        return consts
+
+    def get_indexes(self, *args, **kw):
+        return self._apply_reflectinfo_conv(
+            self.inspector.get_indexes(*args, **kw)
+        )
+
+    def get_unique_constraints(self, *args, **kw):
+        return self._apply_reflectinfo_conv(
+            self.inspector.get_unique_constraints(*args, **kw)
+        )
+
+    def get_foreign_keys(self, *args, **kw):
+        return self._apply_reflectinfo_conv(
+            self.inspector.get_foreign_keys(*args, **kw)
+        )
+
+    def reflect_table(self, table, *, include_columns):
+        self.inspector.reflect_table(table, include_columns=include_columns)
+
+        # I had a cool version of this using _ReflectInfo, however that doesn't
+        # work in 1.4 and it's not public API in 2.x.  Then this is just a two
+        # liner.  So there's no competition...
+        self._apply_constraint_conv(table.constraints)
+        self._apply_constraint_conv(table.indexes)
 
 
 @comparators.dispatch_for("table")
@@ -561,34 +523,34 @@
     if conn_table is not None:
         # 1b. ... and from connection, if the table exists
-        if hasattr(inspector, "get_unique_constraints"):
-            try:
-                conn_uniques = inspector.get_unique_constraints(  # type:ignore[assignment] # noqa
-                    tname, schema=schema
-                )
-                supports_unique_constraints = True
-            except NotImplementedError:
-                pass
-            except TypeError:
-                # number of arguments is off for the base
-                # method in SQLAlchemy due to the cache decorator
-                # not being present
-                pass
-            else:
-                conn_uniques = [  # type:ignore[assignment]
-                    uq
-                    for uq in conn_uniques
-                    if autogen_context.run_name_filters(
-                        uq["name"],
-                        "unique_constraint",
-                        {"table_name": tname, "schema_name": schema},
-                    )
-                ]
-                for uq in conn_uniques:
-                    if uq.get("duplicates_index"):
-                        unique_constraints_duplicate_unique_indexes = True
-        try:
-            conn_indexes = inspector.get_indexes(  # type:ignore[assignment]
+        try:
+            conn_uniques = _InspectorConv(inspector).get_unique_constraints(
+                tname, schema=schema
+            )
+            supports_unique_constraints = True
+        except NotImplementedError:
+            pass
+        except TypeError:
+            # number of arguments is off for the base
+            # method in SQLAlchemy due to the cache decorator
+            # not being present
+            pass
+        else:
+            conn_uniques = [  # type:ignore[assignment]
+                uq
+                for uq in conn_uniques
+                if autogen_context.run_name_filters(
+                    uq["name"],
+                    "unique_constraint",
+                    {"table_name": tname, "schema_name": schema},
+                )
+            ]
+            for uq in conn_uniques:
+                if uq.get("duplicates_index"):
+                    unique_constraints_duplicate_unique_indexes = True
+
+        try:
+            conn_indexes = _InspectorConv(inspector).get_indexes(
                 tname, schema=schema
             )
         except NotImplementedError:
@@ -639,7 +601,7 @@
     # 3. give the dialect a chance to omit indexes and constraints that
     # we know are either added implicitly by the DB or that the DB
     # can't accurately report on
-    autogen_context.migration_context.impl.correct_for_autogen_constraints(
+    impl.correct_for_autogen_constraints(
         conn_uniques,  # type: ignore[arg-type]
         conn_indexes,  # type: ignore[arg-type]
         metadata_unique_constraints,
@@ -651,31 +613,31 @@
     # Index and UniqueConstraint so we can easily work with them
     # interchangeably
     metadata_unique_constraints_sig = {
-        _uq_constraint_sig(uq, impl) for uq in metadata_unique_constraints
+        impl._create_metadata_constraint_sig(uq)
+        for uq in metadata_unique_constraints
     }
     metadata_indexes_sig = {
-        _ix_constraint_sig(ix, impl) for ix in metadata_indexes
+        impl._create_metadata_constraint_sig(ix) for ix in metadata_indexes
    }
     conn_unique_constraints = {
-        _uq_constraint_sig(uq, impl) for uq in conn_uniques
+        impl._create_reflected_constraint_sig(uq) for uq in conn_uniques
    }
-    conn_indexes_sig = {_ix_constraint_sig(ix, impl) for ix in conn_indexes}
+    conn_indexes_sig = {
+        impl._create_reflected_constraint_sig(ix) for ix in conn_indexes
+    }
 
     # 5. index things by name, for those objects that have names
     metadata_names = {
         cast(str, c.md_name_to_sql_name(autogen_context)): c
-        for c in metadata_unique_constraints_sig.union(
-            metadata_indexes_sig  # type:ignore[arg-type]
-        )
-        if isinstance(c, _ix_constraint_sig)
-        or sqla_compat._constraint_is_named(c.const, autogen_context.dialect)
+        for c in metadata_unique_constraints_sig.union(metadata_indexes_sig)
+        if c.is_named
     }
 
-    conn_uniques_by_name: Dict[sqla_compat._ConstraintName, _uq_constraint_sig]
-    conn_indexes_by_name: Dict[sqla_compat._ConstraintName, _ix_constraint_sig]
+    conn_uniques_by_name: Dict[sqla_compat._ConstraintName, _constraint_sig]
+    conn_indexes_by_name: Dict[sqla_compat._ConstraintName, _constraint_sig]
     conn_uniques_by_name = {c.name: c for c in conn_unique_constraints}
     conn_indexes_by_name = {c.name: c for c in conn_indexes_sig}
@@ -694,13 +656,12 @@
     # 6. index things by "column signature", to help with unnamed unique
     # constraints.
-    conn_uniques_by_sig = {uq.sig: uq for uq in conn_unique_constraints}
+    conn_uniques_by_sig = {uq.unnamed: uq for uq in conn_unique_constraints}
     metadata_uniques_by_sig = {
-        uq.sig: uq for uq in metadata_unique_constraints_sig
+        uq.unnamed: uq for uq in metadata_unique_constraints_sig
     }
-    metadata_indexes_by_sig = {ix.sig: ix for ix in metadata_indexes_sig}
     unnamed_metadata_uniques = {
-        uq.sig: uq
+        uq.unnamed: uq
         for uq in metadata_unique_constraints_sig
         if not sqla_compat._constraint_is_named(
             uq.const, autogen_context.dialect
@@ -715,18 +676,18 @@
     # 4. The backend may double up indexes as unique constraints and
     # vice versa (e.g. MySQL, Postgresql)
-    def obj_added(obj):
-        if obj.is_index:
+    def obj_added(obj: _constraint_sig):
+        if is_index_sig(obj):
             if autogen_context.run_object_filters(
                 obj.const, obj.name, "index", False, None
             ):
                 modify_ops.ops.append(ops.CreateIndexOp.from_index(obj.const))
                 log.info(
-                    "Detected added index '%s' on %s",
+                    "Detected added index %r on '%s'",
                     obj.name,
-                    ", ".join(["'%s'" % obj.column_names]),
+                    obj.column_names,
                 )
-        else:
+        elif is_uq_sig(obj):
             if not supports_unique_constraints:
                 # can't report unique indexes as added if we don't
                 # detect them
@@ -741,13 +702,15 @@
                     ops.AddConstraintOp.from_constraint(obj.const)
                 )
                 log.info(
-                    "Detected added unique constraint '%s' on %s",
+                    "Detected added unique constraint %r on '%s'",
                     obj.name,
-                    ", ".join(["'%s'" % obj.column_names]),
+                    obj.column_names,
                 )
+        else:
+            assert False
 
-    def obj_removed(obj):
-        if obj.is_index:
+    def obj_removed(obj: _constraint_sig):
+        if is_index_sig(obj):
             if obj.is_unique and not supports_unique_constraints:
                 # many databases double up unique constraints
                 # as unique indexes. without that list we can't
@@ -758,10 +721,8 @@
                 obj.const, obj.name, "index", True, None
             ):
                 modify_ops.ops.append(ops.DropIndexOp.from_index(obj.const))
-                log.info(
-                    "Detected removed index '%s' on '%s'", obj.name, tname
-                )
-        else:
+                log.info("Detected removed index %r on %r", obj.name, tname)
+        elif is_uq_sig(obj):
             if is_create_table or is_drop_table:
                 # if the whole table is being dropped, we don't need to
                 # consider unique constraint separately
@@ -773,33 +734,40 @@
                     ops.DropConstraintOp.from_constraint(obj.const)
                 )
                 log.info(
-                    "Detected removed unique constraint '%s' on '%s'",
+                    "Detected removed unique constraint %r on %r",
                     obj.name,
                     tname,
                 )
+        else:
+            assert False
 
-    def obj_changed(old, new, msg):
-        if old.is_index:
+    def obj_changed(
+        old: _constraint_sig,
+        new: _constraint_sig,
+        msg: str,
+    ):
+        if is_index_sig(old):
+            assert is_index_sig(new)
+
             if autogen_context.run_object_filters(
                 new.const, new.name, "index", False, old.const
             ):
                 log.info(
-                    "Detected changed index '%s' on '%s':%s",
-                    old.name,
-                    tname,
-                    ", ".join(msg),
+                    "Detected changed index %r on %r: %s", old.name, tname, msg
                 )
                 modify_ops.ops.append(ops.DropIndexOp.from_index(old.const))
                 modify_ops.ops.append(ops.CreateIndexOp.from_index(new.const))
-        else:
+        elif is_uq_sig(old):
+            assert is_uq_sig(new)
+
             if autogen_context.run_object_filters(
                 new.const, new.name, "unique_constraint", False, old.const
             ):
                 log.info(
-                    "Detected changed unique constraint '%s' on '%s':%s",
+                    "Detected changed unique constraint %r on %r: %s",
                     old.name,
                     tname,
-                    ", ".join(msg),
+                    msg,
                 )
                 modify_ops.ops.append(
                     ops.DropConstraintOp.from_constraint(old.const)
@@ -807,18 +775,24 @@
                 modify_ops.ops.append(
                     ops.AddConstraintOp.from_constraint(new.const)
                 )
+        else:
+            assert False
 
     for removed_name in sorted(set(conn_names).difference(metadata_names)):
-        conn_obj: Union[_ix_constraint_sig, _uq_constraint_sig] = conn_names[
-            removed_name
-        ]
-        if not conn_obj.is_index and conn_obj.sig in unnamed_metadata_uniques:
+        conn_obj = conn_names[removed_name]
+        if (
+            is_uq_sig(conn_obj)
+            and conn_obj.unnamed in unnamed_metadata_uniques
+        ):
             continue
         elif removed_name in doubled_constraints:
             conn_uq, conn_idx = doubled_constraints[removed_name]
             if (
-                conn_idx.sig not in metadata_indexes_by_sig
-                and conn_uq.sig not in metadata_uniques_by_sig
+                all(
+                    conn_idx.unnamed != meta_idx.unnamed
+                    for meta_idx in metadata_indexes_sig
+                )
+                and conn_uq.unnamed not in metadata_uniques_by_sig
             ):
                 obj_removed(conn_uq)
                 obj_removed(conn_idx)
@@ -830,30 +804,36 @@
         if existing_name in doubled_constraints:
             conn_uq, conn_idx = doubled_constraints[existing_name]
-            if metadata_obj.is_index:
+            if is_index_sig(metadata_obj):
                 conn_obj = conn_idx
             else:
                 conn_obj = conn_uq
         else:
             conn_obj = conn_names[existing_name]
 
-        if conn_obj.is_index != metadata_obj.is_index:
+        if type(conn_obj) != type(metadata_obj):
             obj_removed(conn_obj)
             obj_added(metadata_obj)
         else:
-            msg = []
-            if conn_obj.is_unique != metadata_obj.is_unique:
-                msg.append(
-                    " unique=%r to unique=%r"
-                    % (conn_obj.is_unique, metadata_obj.is_unique)
-                )
-            if conn_obj.sig != metadata_obj.sig:
-                msg.append(
-                    " expression %r to %r" % (conn_obj.sig, metadata_obj.sig)
-                )
-
-            if msg:
-                obj_changed(conn_obj, metadata_obj, msg)
+            comparison = metadata_obj.compare_to_reflected(conn_obj)
+
+            if comparison.is_different:
+                # constraint are different
+                obj_changed(conn_obj, metadata_obj, comparison.message)
+            elif comparison.is_skip:
+                # constraint cannot be compared, skip them
+                thing = (
+                    "index" if is_index_sig(conn_obj) else "unique constraint"
+                )
+                log.info(
+                    "Cannot compare %s %r, assuming equal and skipping. %s",
+                    thing,
+                    conn_obj.name,
+                    comparison.message,
+                )
+            else:
+                # constraint are equal
+                assert comparison.is_equal
 
     for added_name in sorted(set(metadata_names).difference(conn_names)):
         obj = metadata_names[added_name]
@@ -893,7 +873,7 @@ def _correct_for_uq_duplicates_uix(
     }
     unnamed_metadata_uqs = {
-        _uq_constraint_sig(cons, impl).sig
+        impl._create_metadata_constraint_sig(cons).unnamed
         for name, cons in metadata_cons_names
         if name is None
    }
@@ -917,7 +897,9 @@
    for overlap in uqs_dupe_indexes:
        if overlap not in metadata_uq_names:
            if (
-                _uq_constraint_sig(uqs_dupe_indexes[overlap], impl).sig
+                impl._create_reflected_constraint_sig(
+                    uqs_dupe_indexes[overlap]
+                ).unnamed
                not in unnamed_metadata_uqs
            ):
                conn_unique_constraints.discard(uqs_dupe_indexes[overlap])
@@ -1053,7 +1035,7 @@ def _normalize_computed_default(sqltext: str) -> str:
     """
-    return re.sub(r"[ \(\)'\"`\[\]]", "", sqltext).lower()
+    return re.sub(r"[ \(\)'\"`\[\]\t\r\n]", "", sqltext).lower()
 
 
 def _compare_computed_default(
@@ -1137,27 +1119,15 @@
         return False
 
     if sqla_compat._server_default_is_computed(metadata_default):
-        # return False in case of a computed column as the server
-        # default. Note that DDL for adding or removing "GENERATED AS"
-        # an existing column is not currently known for any backend.
-        # Once SQLAlchemy can reflect "GENERATED" as the "computed" element,
-        # we would also want to ignore and/or warn for changes vs. the
-        # metadata (or support backend specific DDL if applicable).
-        if not sqla_compat.has_computed_reflection:
-            return False
-        else:
-            return (
-                _compare_computed_default(  # type:ignore[func-returns-value]
-                    autogen_context,
-                    alter_column_op,
-                    schema,
-                    tname,
-                    cname,
-                    conn_col,
-                    metadata_col,
-                )
-            )
+        return _compare_computed_default(  # type:ignore[func-returns-value]
+            autogen_context,
+            alter_column_op,
+            schema,
+            tname,
+            cname,
+            conn_col,
+            metadata_col,
+        )
 
     if sqla_compat._server_default_is_computed(conn_col_default):
         _warn_computed_not_supported(tname, cname)
         return False
@@ -1243,8 +1213,8 @@ def _compare_foreign_keys(
     modify_table_ops: ModifyTableOps,
     schema: Optional[str],
     tname: Union[quoted_name, str],
-    conn_table: Optional[Table],
-    metadata_table: Optional[Table],
+    conn_table: Table,
+    metadata_table: Table,
 ) -> None:
     # if we're doing CREATE TABLE, all FKs are created
     # inline within the table def
@@ -1260,7 +1230,9 @@
     conn_fks_list = [
         fk
-        for fk in inspector.get_foreign_keys(tname, schema=schema)
+        for fk in _InspectorConv(inspector).get_foreign_keys(
+            tname, schema=schema
+        )
         if autogen_context.run_name_filters(
             fk["name"],
             "foreign_key_constraint",
@@ -1268,15 +1240,12 @@
         )
     ]
 
-    backend_reflects_fk_options = bool(
-        conn_fks_list and "options" in conn_fks_list[0]
-    )
-
     conn_fks = {
-        _make_foreign_key(const, conn_table)  # type: ignore[arg-type]
-        for const in conn_fks_list
+        _make_foreign_key(const, conn_table) for const in conn_fks_list
    }
 
+    impl = autogen_context.migration_context.impl
+
    # give the dialect a chance to correct the FKs to match more
    # closely
    autogen_context.migration_context.impl.correct_for_autogen_foreignkeys(
@@ -1284,17 +1253,24 @@
    )
 
    metadata_fks_sig = {
-        _fk_constraint_sig(fk, include_options=backend_reflects_fk_options)
-        for fk in metadata_fks
+        impl._create_metadata_constraint_sig(fk) for fk in metadata_fks
    }
 
    conn_fks_sig = {
-        _fk_constraint_sig(fk, include_options=backend_reflects_fk_options)
-        for fk in conn_fks
+        impl._create_reflected_constraint_sig(fk) for fk in conn_fks
    }
 
-    conn_fks_by_sig = {c.sig: c for c in conn_fks_sig}
-    metadata_fks_by_sig = {c.sig: c for c in metadata_fks_sig}
+    # check if reflected FKs include options, indicating the backend
+    # can reflect FK options
+    if conn_fks_list and "options" in conn_fks_list[0]:
+        conn_fks_by_sig = {c.unnamed: c for c in conn_fks_sig}
+        metadata_fks_by_sig = {c.unnamed: c for c in metadata_fks_sig}
+    else:
+        # otherwise compare by sig without options added
+        conn_fks_by_sig = {c.unnamed_no_options: c for c in conn_fks_sig}
+        metadata_fks_by_sig = {
+            c.unnamed_no_options: c for c in metadata_fks_sig
+        }
 
    metadata_fks_by_name = {
        c.name: c for c in metadata_fks_sig if c.name is not None
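A large part of this hunk wraps reflected constraint names in SQLAlchemy's `conv()` marker, which flags a name as already final so that `MetaData` naming conventions are not applied to it a second time. Roughly the behavior being relied on, as a sketch with made-up table names:

```python
from sqlalchemy import Column, Integer, MetaData, Table, UniqueConstraint
from sqlalchemy.sql.elements import conv

convention = {"uq": "uq_%(table_name)s_%(column_0_name)s"}
metadata = MetaData(naming_convention=convention)

# Unnamed constraint: the convention generates "uq_entries_uuid" at DDL time.
Table("entries", metadata, Column("uuid", Integer), UniqueConstraint("uuid"))

# conv() marks the name as pre-rendered, so the template is not reapplied.
Table(
    "entries_archive",
    metadata,
    Column("uuid", Integer),
    UniqueConstraint("uuid", name=conv("uq_custom_name")),
)
```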

View File

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations
 
 from io import StringIO
@@ -15,7 +18,9 @@ from mako.pygen import PythonPrinter
 from sqlalchemy import schema as sa_schema
 from sqlalchemy import sql
 from sqlalchemy import types as sqltypes
+from sqlalchemy.sql.base import _DialectArgView
 from sqlalchemy.sql.elements import conv
+from sqlalchemy.sql.elements import Label
 from sqlalchemy.sql.elements import quoted_name
 
 from .. import util
@@ -25,7 +30,8 @@ from ..util import sqla_compat
 if TYPE_CHECKING:
     from typing import Literal
 
-    from sqlalchemy.sql.base import DialectKWArgs
+    from sqlalchemy import Computed
+    from sqlalchemy import Identity
     from sqlalchemy.sql.elements import ColumnElement
     from sqlalchemy.sql.elements import TextClause
     from sqlalchemy.sql.schema import CheckConstraint
@@ -45,8 +51,6 @@ if TYPE_CHECKING:
     from alembic.config import Config
     from alembic.operations.ops import MigrationScript
     from alembic.operations.ops import ModifyTableOps
-    from alembic.util.sqla_compat import Computed
-    from alembic.util.sqla_compat import Identity
 
 MAX_PYTHON_ARGS = 255
@@ -164,21 +168,31 @@ def _render_modify_table(
 def _render_create_table_comment(
     autogen_context: AutogenContext, op: ops.CreateTableCommentOp
 ) -> str:
-    templ = (
-        "{prefix}create_table_comment(\n"
-        "{indent}'{tname}',\n"
-        "{indent}{comment},\n"
-        "{indent}existing_comment={existing},\n"
-        "{indent}schema={schema}\n"
-        ")"
-    )
+    if autogen_context._has_batch:
+        templ = (
+            "{prefix}create_table_comment(\n"
+            "{indent}{comment},\n"
+            "{indent}existing_comment={existing}\n"
+            ")"
+        )
+    else:
+        templ = (
+            "{prefix}create_table_comment(\n"
+            "{indent}'{tname}',\n"
+            "{indent}{comment},\n"
+            "{indent}existing_comment={existing},\n"
+            "{indent}schema={schema}\n"
+            ")"
+        )
     return templ.format(
         prefix=_alembic_autogenerate_prefix(autogen_context),
         tname=op.table_name,
         comment="%r" % op.comment if op.comment is not None else None,
-        existing="%r" % op.existing_comment
-        if op.existing_comment is not None
-        else None,
+        existing=(
+            "%r" % op.existing_comment
+            if op.existing_comment is not None
+            else None
+        ),
         schema="'%s'" % op.schema if op.schema is not None else None,
         indent="    ",
     )
@@ -188,19 +202,28 @@ def _render_create_table_comment(
 def _render_drop_table_comment(
     autogen_context: AutogenContext, op: ops.DropTableCommentOp
 ) -> str:
-    templ = (
-        "{prefix}drop_table_comment(\n"
-        "{indent}'{tname}',\n"
-        "{indent}existing_comment={existing},\n"
-        "{indent}schema={schema}\n"
-        ")"
-    )
+    if autogen_context._has_batch:
+        templ = (
+            "{prefix}drop_table_comment(\n"
+            "{indent}existing_comment={existing}\n"
+            ")"
+        )
+    else:
+        templ = (
+            "{prefix}drop_table_comment(\n"
+            "{indent}'{tname}',\n"
+            "{indent}existing_comment={existing},\n"
+            "{indent}schema={schema}\n"
+            ")"
+        )
     return templ.format(
         prefix=_alembic_autogenerate_prefix(autogen_context),
         tname=op.table_name,
-        existing="%r" % op.existing_comment
-        if op.existing_comment is not None
-        else None,
+        existing=(
+            "%r" % op.existing_comment
+            if op.existing_comment is not None
+            else None
+        ),
         schema="'%s'" % op.schema if op.schema is not None else None,
         indent="    ",
     )
@@ -257,6 +280,9 @@ def _add_table(autogen_context: AutogenContext, op: ops.CreateTableOp) -> str:
         prefixes = ", ".join("'%s'" % p for p in table._prefixes)
         text += ",\nprefixes=[%s]" % prefixes
 
+    if op.if_not_exists is not None:
+        text += ",\nif_not_exists=%r" % bool(op.if_not_exists)
+
     text += "\n)"
     return text
@@ -269,16 +295,20 @@ def _drop_table(autogen_context: AutogenContext, op: ops.DropTableOp) -> str:
     }
     if op.schema:
         text += ", schema=%r" % _ident(op.schema)
+
+    if op.if_exists is not None:
+        text += ", if_exists=%r" % bool(op.if_exists)
+
     text += ")"
     return text
 
 
 def _render_dialect_kwargs_items(
-    autogen_context: AutogenContext, item: DialectKWArgs
+    autogen_context: AutogenContext, dialect_kwargs: _DialectArgView
 ) -> list[str]:
     return [
         f"{key}={_render_potential_expr(val, autogen_context)}"
-        for key, val in item.dialect_kwargs.items()
+        for key, val in dialect_kwargs.items()
     ]
@@ -301,7 +331,9 @@ def _add_index(autogen_context: AutogenContext, op: ops.CreateIndexOp) -> str:
     assert index.table is not None
 
-    opts = _render_dialect_kwargs_items(autogen_context, index)
+    opts = _render_dialect_kwargs_items(autogen_context, index.dialect_kwargs)
+    if op.if_not_exists is not None:
+        opts.append("if_not_exists=%r" % bool(op.if_not_exists))
     text = tmpl % {
         "prefix": _alembic_autogenerate_prefix(autogen_context),
         "name": _render_gen_name(autogen_context, index.name),
@@ -310,9 +342,11 @@ def _add_index(autogen_context: AutogenContext, op: ops.CreateIndexOp) -> str:
             _get_index_rendered_expressions(index, autogen_context)
         ),
         "unique": index.unique or False,
-        "schema": (", schema=%r" % _ident(index.table.schema))
-        if index.table.schema
-        else "",
+        "schema": (
+            (", schema=%r" % _ident(index.table.schema))
+            if index.table.schema
+            else ""
+        ),
         "kwargs": ", " + ", ".join(opts) if opts else "",
     }
     return text
@@ -331,7 +365,9 @@ def _drop_index(autogen_context: AutogenContext, op: ops.DropIndexOp) -> str:
         "%(prefix)sdrop_index(%(name)r, "
         "table_name=%(table_name)r%(schema)s%(kwargs)s)"
     )
-    opts = _render_dialect_kwargs_items(autogen_context, index)
+    opts = _render_dialect_kwargs_items(autogen_context, index.dialect_kwargs)
+    if op.if_exists is not None:
+        opts.append("if_exists=%r" % bool(op.if_exists))
     text = tmpl % {
         "prefix": _alembic_autogenerate_prefix(autogen_context),
         "name": _render_gen_name(autogen_context, op.index_name),
@@ -353,6 +389,7 @@ def _add_unique_constraint(
 def _add_fk_constraint(
     autogen_context: AutogenContext, op: ops.CreateForeignKeyOp
 ) -> str:
+    constraint = op.to_constraint()
     args = [repr(_render_gen_name(autogen_context, op.constraint_name))]
     if not autogen_context._has_batch:
         args.append(repr(_ident(op.source_table)))
@@ -382,9 +419,16 @@ def _add_fk_constraint(
         if value is not None:
             args.append("%s=%r" % (k, value))
 
-    return "%(prefix)screate_foreign_key(%(args)s)" % {
+    dialect_kwargs = _render_dialect_kwargs_items(
+        autogen_context, constraint.dialect_kwargs
+    )
+
+    return "%(prefix)screate_foreign_key(%(args)s%(dialect_kwargs)s)" % {
         "prefix": _alembic_autogenerate_prefix(autogen_context),
         "args": ", ".join(args),
+        "dialect_kwargs": (
+            ", " + ", ".join(dialect_kwargs) if dialect_kwargs else ""
+        ),
     }
@@ -406,7 +450,7 @@ def _drop_constraint(
 
     name = _render_gen_name(autogen_context, op.constraint_name)
schema = _ident(op.schema) if op.schema else None schema = _ident(op.schema) if op.schema else None
type_ = _ident(op.constraint_type) if op.constraint_type else None type_ = _ident(op.constraint_type) if op.constraint_type else None
if_exists = op.if_exists
params_strs = [] params_strs = []
params_strs.append(repr(name)) params_strs.append(repr(name))
if not autogen_context._has_batch: if not autogen_context._has_batch:
@@ -415,32 +459,47 @@ def _drop_constraint(
params_strs.append(f"schema={schema!r}") params_strs.append(f"schema={schema!r}")
if type_ is not None: if type_ is not None:
params_strs.append(f"type_={type_!r}") params_strs.append(f"type_={type_!r}")
if if_exists is not None:
params_strs.append(f"if_exists={if_exists}")
return f"{prefix}drop_constraint({', '.join(params_strs)})" return f"{prefix}drop_constraint({', '.join(params_strs)})"
@renderers.dispatch_for(ops.AddColumnOp) @renderers.dispatch_for(ops.AddColumnOp)
def _add_column(autogen_context: AutogenContext, op: ops.AddColumnOp) -> str: def _add_column(autogen_context: AutogenContext, op: ops.AddColumnOp) -> str:
schema, tname, column = op.schema, op.table_name, op.column schema, tname, column, if_not_exists = (
op.schema,
op.table_name,
op.column,
op.if_not_exists,
)
if autogen_context._has_batch: if autogen_context._has_batch:
template = "%(prefix)sadd_column(%(column)s)" template = "%(prefix)sadd_column(%(column)s)"
else: else:
template = "%(prefix)sadd_column(%(tname)r, %(column)s" template = "%(prefix)sadd_column(%(tname)r, %(column)s"
if schema: if schema:
template += ", schema=%(schema)r" template += ", schema=%(schema)r"
if if_not_exists is not None:
template += ", if_not_exists=%(if_not_exists)r"
template += ")" template += ")"
text = template % { text = template % {
"prefix": _alembic_autogenerate_prefix(autogen_context), "prefix": _alembic_autogenerate_prefix(autogen_context),
"tname": tname, "tname": tname,
"column": _render_column(column, autogen_context), "column": _render_column(column, autogen_context),
"schema": schema, "schema": schema,
"if_not_exists": if_not_exists,
} }
return text return text
@renderers.dispatch_for(ops.DropColumnOp) @renderers.dispatch_for(ops.DropColumnOp)
def _drop_column(autogen_context: AutogenContext, op: ops.DropColumnOp) -> str: def _drop_column(autogen_context: AutogenContext, op: ops.DropColumnOp) -> str:
schema, tname, column_name = op.schema, op.table_name, op.column_name schema, tname, column_name, if_exists = (
op.schema,
op.table_name,
op.column_name,
op.if_exists,
)
if autogen_context._has_batch: if autogen_context._has_batch:
template = "%(prefix)sdrop_column(%(cname)r)" template = "%(prefix)sdrop_column(%(cname)r)"
@@ -448,6 +507,8 @@ def _drop_column(autogen_context: AutogenContext, op: ops.DropColumnOp) -> str:
template = "%(prefix)sdrop_column(%(tname)r, %(cname)r" template = "%(prefix)sdrop_column(%(tname)r, %(cname)r"
if schema: if schema:
template += ", schema=%(schema)r" template += ", schema=%(schema)r"
if if_exists is not None:
template += ", if_exists=%(if_exists)r"
template += ")" template += ")"
text = template % { text = template % {
@@ -455,6 +516,7 @@ def _drop_column(autogen_context: AutogenContext, op: ops.DropColumnOp) -> str:
"tname": _ident(tname), "tname": _ident(tname),
"cname": _ident(column_name), "cname": _ident(column_name),
"schema": _ident(schema), "schema": _ident(schema),
"if_exists": if_exists,
} }
return text return text
@@ -469,6 +531,7 @@ def _alter_column(
type_ = op.modify_type type_ = op.modify_type
nullable = op.modify_nullable nullable = op.modify_nullable
comment = op.modify_comment comment = op.modify_comment
newname = op.modify_name
autoincrement = op.kw.get("autoincrement", None) autoincrement = op.kw.get("autoincrement", None)
existing_type = op.existing_type existing_type = op.existing_type
existing_nullable = op.existing_nullable existing_nullable = op.existing_nullable
@@ -497,6 +560,8 @@ def _alter_column(
rendered = _render_server_default(server_default, autogen_context) rendered = _render_server_default(server_default, autogen_context)
text += ",\n%sserver_default=%s" % (indent, rendered) text += ",\n%sserver_default=%s" % (indent, rendered)
if newname is not None:
text += ",\n%snew_column_name=%r" % (indent, newname)
if type_ is not None: if type_ is not None:
text += ",\n%stype_=%s" % (indent, _repr_type(type_, autogen_context)) text += ",\n%stype_=%s" % (indent, _repr_type(type_, autogen_context))
if nullable is not None: if nullable is not None:
@@ -549,23 +614,28 @@ def _render_potential_expr(
value: Any, value: Any,
autogen_context: AutogenContext, autogen_context: AutogenContext,
*, *,
wrap_in_text: bool = True, wrap_in_element: bool = True,
is_server_default: bool = False, is_server_default: bool = False,
is_index: bool = False, is_index: bool = False,
) -> str: ) -> str:
if isinstance(value, sql.ClauseElement): if isinstance(value, sql.ClauseElement):
if wrap_in_text: sql_text = autogen_context.migration_context.impl.render_ddl_sql_expr(
template = "%(prefix)stext(%(sql)r)" value, is_server_default=is_server_default, is_index=is_index
)
if wrap_in_element:
prefix = _sqlalchemy_autogenerate_prefix(autogen_context)
element = "literal_column" if is_index else "text"
value_str = f"{prefix}{element}({sql_text!r})"
if (
is_index
and isinstance(value, Label)
and type(value.name) is str
):
return value_str + f".label({value.name!r})"
else:
return value_str
else: else:
template = "%(sql)r" return repr(sql_text)
return template % {
"prefix": _sqlalchemy_autogenerate_prefix(autogen_context),
"sql": autogen_context.migration_context.impl.render_ddl_sql_expr(
value, is_server_default=is_server_default, is_index=is_index
),
}
else: else:
return repr(value) return repr(value)
@@ -574,9 +644,11 @@ def _get_index_rendered_expressions(
idx: Index, autogen_context: AutogenContext idx: Index, autogen_context: AutogenContext
) -> List[str]: ) -> List[str]:
return [ return [
repr(_ident(getattr(exp, "name", None))) (
if isinstance(exp, sa_schema.Column) repr(_ident(getattr(exp, "name", None)))
else _render_potential_expr(exp, autogen_context, is_index=True) if isinstance(exp, sa_schema.Column)
else _render_potential_expr(exp, autogen_context, is_index=True)
)
for exp in idx.expressions for exp in idx.expressions
] ]
@@ -591,16 +663,18 @@ def _uq_constraint(
has_batch = autogen_context._has_batch has_batch = autogen_context._has_batch
if constraint.deferrable: if constraint.deferrable:
opts.append(("deferrable", str(constraint.deferrable))) opts.append(("deferrable", constraint.deferrable))
if constraint.initially: if constraint.initially:
opts.append(("initially", str(constraint.initially))) opts.append(("initially", constraint.initially))
if not has_batch and alter and constraint.table.schema: if not has_batch and alter and constraint.table.schema:
opts.append(("schema", _ident(constraint.table.schema))) opts.append(("schema", _ident(constraint.table.schema)))
if not alter and constraint.name: if not alter and constraint.name:
opts.append( opts.append(
("name", _render_gen_name(autogen_context, constraint.name)) ("name", _render_gen_name(autogen_context, constraint.name))
) )
dialect_options = _render_dialect_kwargs_items(autogen_context, constraint) dialect_options = _render_dialect_kwargs_items(
autogen_context, constraint.dialect_kwargs
)
if alter: if alter:
args = [repr(_render_gen_name(autogen_context, constraint.name))] args = [repr(_render_gen_name(autogen_context, constraint.name))]
@@ -704,7 +778,7 @@ def _render_column(
+ [ + [
"%s=%s" "%s=%s"
% (key, _render_potential_expr(val, autogen_context)) % (key, _render_potential_expr(val, autogen_context))
for key, val in sqla_compat._column_kwargs(column).items() for key, val in column.kwargs.items()
] ]
) )
), ),
@@ -739,6 +813,8 @@ def _render_server_default(
return _render_potential_expr( return _render_potential_expr(
default.arg, autogen_context, is_server_default=True default.arg, autogen_context, is_server_default=True
) )
elif isinstance(default, sa_schema.FetchedValue):
return _render_fetched_value(autogen_context)
if isinstance(default, str) and repr_: if isinstance(default, str) and repr_:
default = repr(re.sub(r"^'|'$", "", default)) default = repr(re.sub(r"^'|'$", "", default))
@@ -750,7 +826,7 @@ def _render_computed(
computed: Computed, autogen_context: AutogenContext computed: Computed, autogen_context: AutogenContext
) -> str: ) -> str:
text = _render_potential_expr( text = _render_potential_expr(
computed.sqltext, autogen_context, wrap_in_text=False computed.sqltext, autogen_context, wrap_in_element=False
) )
kwargs = {} kwargs = {}
@@ -776,6 +852,12 @@ def _render_identity(
} }
def _render_fetched_value(autogen_context: AutogenContext) -> str:
return "%(prefix)sFetchedValue()" % {
"prefix": _sqlalchemy_autogenerate_prefix(autogen_context),
}
def _repr_type( def _repr_type(
type_: TypeEngine, type_: TypeEngine,
autogen_context: AutogenContext, autogen_context: AutogenContext,
@@ -794,7 +876,10 @@ def _repr_type(
mod = type(type_).__module__ mod = type(type_).__module__
imports = autogen_context.imports imports = autogen_context.imports
if mod.startswith("sqlalchemy.dialects"):
if not _skip_variants and sqla_compat._type_has_variants(type_):
return _render_Variant_type(type_, autogen_context)
elif mod.startswith("sqlalchemy.dialects"):
match = re.match(r"sqlalchemy\.dialects\.(\w+)", mod) match = re.match(r"sqlalchemy\.dialects\.(\w+)", mod)
assert match is not None assert match is not None
dname = match.group(1) dname = match.group(1)
@@ -806,8 +891,6 @@ def _repr_type(
return "%s.%r" % (dname, type_) return "%s.%r" % (dname, type_)
elif impl_rt: elif impl_rt:
return impl_rt return impl_rt
elif not _skip_variants and sqla_compat._type_has_variants(type_):
return _render_Variant_type(type_, autogen_context)
elif mod.startswith("sqlalchemy."): elif mod.startswith("sqlalchemy."):
if "_render_%s_type" % type_.__visit_name__ in globals(): if "_render_%s_type" % type_.__visit_name__ in globals():
fn = globals()["_render_%s_type" % type_.__visit_name__] fn = globals()["_render_%s_type" % type_.__visit_name__]
@@ -834,7 +917,7 @@ def _render_Variant_type(
) -> str: ) -> str:
base_type, variant_mapping = sqla_compat._get_variant_mapping(type_) base_type, variant_mapping = sqla_compat._get_variant_mapping(type_)
base = _repr_type(base_type, autogen_context, _skip_variants=True) base = _repr_type(base_type, autogen_context, _skip_variants=True)
assert base is not None and base is not False assert base is not None and base is not False # type: ignore[comparison-overlap] # noqa:E501
for dialect in sorted(variant_mapping): for dialect in sorted(variant_mapping):
typ = variant_mapping[dialect] typ = variant_mapping[dialect]
base += ".with_variant(%s, %r)" % ( base += ".with_variant(%s, %r)" % (
@@ -925,13 +1008,13 @@ def _render_primary_key(
def _fk_colspec( def _fk_colspec(
fk: ForeignKey, fk: ForeignKey,
metadata_schema: Optional[str], metadata_schema: Optional[str],
namespace_metadata: MetaData, namespace_metadata: Optional[MetaData],
) -> str: ) -> str:
"""Implement a 'safe' version of ForeignKey._get_colspec() that """Implement a 'safe' version of ForeignKey._get_colspec() that
won't fail if the remote table can't be resolved. won't fail if the remote table can't be resolved.
""" """
colspec = fk._get_colspec() # type:ignore[attr-defined] colspec = fk._get_colspec()
tokens = colspec.split(".") tokens = colspec.split(".")
tname, colname = tokens[-2:] tname, colname = tokens[-2:]
@@ -949,7 +1032,10 @@ def _fk_colspec(
# the FK constraint needs to be rendered in terms of the column # the FK constraint needs to be rendered in terms of the column
# name. # name.
if table_fullname in namespace_metadata.tables: if (
namespace_metadata is not None
and table_fullname in namespace_metadata.tables
):
col = namespace_metadata.tables[table_fullname].c.get(colname) col = namespace_metadata.tables[table_fullname].c.get(colname)
if col is not None: if col is not None:
colname = _ident(col.name) # type: ignore[assignment] colname = _ident(col.name) # type: ignore[assignment]
@@ -980,7 +1066,7 @@ def _populate_render_fk_opts(
def _render_foreign_key( def _render_foreign_key(
constraint: ForeignKeyConstraint, constraint: ForeignKeyConstraint,
autogen_context: AutogenContext, autogen_context: AutogenContext,
namespace_metadata: MetaData, namespace_metadata: Optional[MetaData],
) -> Optional[str]: ) -> Optional[str]:
rendered = _user_defined_render("foreign_key", constraint, autogen_context) rendered = _user_defined_render("foreign_key", constraint, autogen_context)
if rendered is not False: if rendered is not False:
@@ -994,15 +1080,16 @@ def _render_foreign_key(
_populate_render_fk_opts(constraint, opts) _populate_render_fk_opts(constraint, opts)
apply_metadata_schema = namespace_metadata.schema apply_metadata_schema = (
namespace_metadata.schema if namespace_metadata is not None else None
)
return ( return (
"%(prefix)sForeignKeyConstraint([%(cols)s], " "%(prefix)sForeignKeyConstraint([%(cols)s], "
"[%(refcols)s], %(args)s)" "[%(refcols)s], %(args)s)"
% { % {
"prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "prefix": _sqlalchemy_autogenerate_prefix(autogen_context),
"cols": ", ".join( "cols": ", ".join(
"%r" % _ident(cast("Column", f.parent).name) repr(_ident(f.parent.name)) for f in constraint.elements
for f in constraint.elements
), ),
"refcols": ", ".join( "refcols": ", ".join(
repr(_fk_colspec(f, apply_metadata_schema, namespace_metadata)) repr(_fk_colspec(f, apply_metadata_schema, namespace_metadata))
@@ -1043,12 +1130,10 @@ def _render_check_constraint(
# ideally SQLAlchemy would give us more of a first class # ideally SQLAlchemy would give us more of a first class
# way to detect this. # way to detect this.
if ( if (
constraint._create_rule # type:ignore[attr-defined] constraint._create_rule
and hasattr( and hasattr(constraint._create_rule, "target")
constraint._create_rule, "target" # type:ignore[attr-defined]
)
and isinstance( and isinstance(
constraint._create_rule.target, # type:ignore[attr-defined] constraint._create_rule.target,
sqltypes.TypeEngine, sqltypes.TypeEngine,
) )
): ):
@@ -1060,11 +1145,13 @@ def _render_check_constraint(
) )
return "%(prefix)sCheckConstraint(%(sqltext)s%(opts)s)" % { return "%(prefix)sCheckConstraint(%(sqltext)s%(opts)s)" % {
"prefix": _sqlalchemy_autogenerate_prefix(autogen_context), "prefix": _sqlalchemy_autogenerate_prefix(autogen_context),
"opts": ", " + (", ".join("%s=%s" % (k, v) for k, v in opts)) "opts": (
if opts ", " + (", ".join("%s=%s" % (k, v) for k, v in opts))
else "", if opts
else ""
),
"sqltext": _render_potential_expr( "sqltext": _render_potential_expr(
constraint.sqltext, autogen_context, wrap_in_text=False constraint.sqltext, autogen_context, wrap_in_element=False
), ),
} }
@@ -1076,7 +1163,10 @@ def _execute_sql(autogen_context: AutogenContext, op: ops.ExecuteSQLOp) -> str:
"Autogenerate rendering of SQL Expression language constructs " "Autogenerate rendering of SQL Expression language constructs "
"not supported here; please use a plain SQL string" "not supported here; please use a plain SQL string"
) )
return "op.execute(%r)" % op.sqltext return "{prefix}execute({sqltext!r})".format(
prefix=_alembic_autogenerate_prefix(autogen_context),
sqltext=op.sqltext,
)
renderers = default_renderers.branch() renderers = default_renderers.branch()
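
The renderer changes above mean autogenerate can now emit the `if_not_exists` / `if_exists` keywords on table, index, and column operations. A hedged sketch of how such a generated migration body could look (table and column names are illustrative, and actual support for `IF [NOT] EXISTS` depends on the target backend):

```python
# illustrative autogenerated migration body; names are examples only
from alembic import op
import sqlalchemy as sa


def upgrade() -> None:
    op.add_column(
        "users",
        sa.Column("nickname", sa.String(length=50)),
        if_not_exists=True,
    )
    op.create_index(
        op.f("ix_users_nickname"), "users", ["nickname"], if_not_exists=True
    )


def downgrade() -> None:
    op.drop_index(
        op.f("ix_users_nickname"), table_name="users", if_exists=True
    )
    op.drop_column("users", "nickname", if_exists=True)
```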

View File

@@ -4,7 +4,7 @@ from typing import Any
 from typing import Callable
 from typing import Iterator
 from typing import List
-from typing import Optional
+from typing import Tuple
 from typing import Type
 from typing import TYPE_CHECKING
 from typing import Union
@@ -16,12 +16,18 @@ if TYPE_CHECKING:
     from ..operations.ops import AddColumnOp
     from ..operations.ops import AlterColumnOp
     from ..operations.ops import CreateTableOp
+    from ..operations.ops import DowngradeOps
     from ..operations.ops import MigrateOperation
     from ..operations.ops import MigrationScript
     from ..operations.ops import ModifyTableOps
     from ..operations.ops import OpContainer
-    from ..runtime.environment import _GetRevArg
+    from ..operations.ops import UpgradeOps
     from ..runtime.migration import MigrationContext
+    from ..script.revision import _GetRevArg
+
+ProcessRevisionDirectiveFn = Callable[
+    ["MigrationContext", "_GetRevArg", List["MigrationScript"]], None
+]
 
 
 class Rewriter:
@@ -52,15 +58,21 @@ class Rewriter:
     _traverse = util.Dispatcher()
 
-    _chained: Optional[Rewriter] = None
+    _chained: Tuple[Union[ProcessRevisionDirectiveFn, Rewriter], ...] = ()
 
     def __init__(self) -> None:
         self.dispatch = util.Dispatcher()
 
-    def chain(self, other: Rewriter) -> Rewriter:
+    def chain(
+        self,
+        other: Union[
+            ProcessRevisionDirectiveFn,
+            Rewriter,
+        ],
+    ) -> Rewriter:
         """Produce a "chain" of this :class:`.Rewriter` to another.
 
-        This allows two rewriters to operate serially on a stream,
+        This allows two or more rewriters to operate serially on a stream,
         e.g.::
 
             writer1 = autogenerate.Rewriter()
@@ -89,7 +101,7 @@ class Rewriter:
         """
         wr = self.__class__.__new__(self.__class__)
         wr.__dict__.update(self.__dict__)
-        wr._chained = other
+        wr._chained += (other,)
         return wr
 
     def rewrites(
@@ -101,7 +113,7 @@ class Rewriter:
             Type[CreateTableOp],
             Type[ModifyTableOps],
         ],
-    ) -> Callable:
+    ) -> Callable[..., Any]:
         """Register a function as rewriter for a given type.
 
         The function should receive three arguments, which are
@@ -146,8 +158,8 @@ class Rewriter:
         directives: List[MigrationScript],
     ) -> None:
         self.process_revision_directives(context, revision, directives)
-        if self._chained:
-            self._chained(context, revision, directives)
+        for process_revision_directives in self._chained:
+            process_revision_directives(context, revision, directives)
 
     @_traverse.dispatch_for(ops.MigrationScript)
     def _traverse_script(
@@ -156,7 +168,7 @@ class Rewriter:
         revision: _GetRevArg,
         directive: MigrationScript,
     ) -> None:
-        upgrade_ops_list = []
+        upgrade_ops_list: List[UpgradeOps] = []
         for upgrade_ops in directive.upgrade_ops_list:
             ret = self._traverse_for(context, revision, upgrade_ops)
             if len(ret) != 1:
@@ -164,9 +176,10 @@ class Rewriter:
                     "Can only return single object for UpgradeOps traverse"
                 )
             upgrade_ops_list.append(ret[0])
+
         directive.upgrade_ops = upgrade_ops_list
 
-        downgrade_ops_list = []
+        downgrade_ops_list: List[DowngradeOps] = []
         for downgrade_ops in directive.downgrade_ops_list:
             ret = self._traverse_for(context, revision, downgrade_ops)
             if len(ret) != 1:
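
With `_chained` now a tuple, `chain()` accepts plain `process_revision_directives`-style callables in addition to other `Rewriter` instances, and any number of them can be stacked. A minimal sketch (the rewrite and hook functions are illustrative):

```python
from alembic.autogenerate import rewriter
from alembic.operations import ops

writer = rewriter.Rewriter()


@writer.rewrites(ops.AddColumnOp)
def make_nullable(context, revision, op):
    # example rewrite: newly added columns start out nullable
    op.column.nullable = True
    return op


def report(context, revision, directives):
    # a plain callable can now be chained after the rewriter
    print(f"{len(directives)} migration script(s) generated")


# pass the chained result as process_revision_directives= in env.py
process_revision_directives = writer.chain(report)
```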

View File

@@ -1,6 +1,9 @@
+# mypy: allow-untyped-defs, allow-untyped-calls
+
 from __future__ import annotations
 
 import os
+import pathlib
 from typing import List
 from typing import Optional
 from typing import TYPE_CHECKING
@@ -10,6 +13,7 @@ from . import autogenerate as autogen
 from . import util
 from .runtime.environment import EnvironmentContext
 from .script import ScriptDirectory
+from .util import compat
 
 if TYPE_CHECKING:
     from alembic.config import Config
@@ -18,7 +22,7 @@ if TYPE_CHECKING:
     from .runtime.environment import ProcessRevisionDirectiveFn
 
 
-def list_templates(config: Config):
+def list_templates(config: Config) -> None:
     """List available templates.
 
     :param config: a :class:`.Config` object.
@@ -26,12 +30,10 @@
     """
 
     config.print_stdout("Available templates:\n")
-    for tempname in os.listdir(config.get_template_directory()):
-        with open(
-            os.path.join(config.get_template_directory(), tempname, "README")
-        ) as readme:
+    for tempname in config._get_template_path().iterdir():
+        with (tempname / "README").open() as readme:
             synopsis = next(readme).rstrip()
-        config.print_stdout("%s - %s", tempname, synopsis)
+        config.print_stdout("%s - %s", tempname.name, synopsis)
 
     config.print_stdout("\nTemplates are used via the 'init' command, e.g.:")
     config.print_stdout("\n  alembic init --template generic ./scripts")
@@ -47,7 +49,7 @@ def init(
     :param config: a :class:`.Config` object.
 
-    :param directory: string path of the target directory
+    :param directory: string path of the target directory.
 
     :param template: string name of the migration environment template to
      use.
@@ -57,65 +59,136 @@ def init(
     """
-    if os.access(directory, os.F_OK) and os.listdir(directory):
+    directory_path = pathlib.Path(directory)
+    if directory_path.exists() and list(directory_path.iterdir()):
         raise util.CommandError(
-            "Directory %s already exists and is not empty" % directory
+            "Directory %s already exists and is not empty" % directory_path
         )
 
-    template_dir = os.path.join(config.get_template_directory(), template)
-    if not os.access(template_dir, os.F_OK):
-        raise util.CommandError("No such template %r" % template)
+    template_path = config._get_template_path() / template
 
-    if not os.access(directory, os.F_OK):
+    if not template_path.exists():
+        raise util.CommandError(f"No such template {template_path}")
+
+    # left as os.access() to suit unit test mocking
+    if not os.access(directory_path, os.F_OK):
         with util.status(
-            f"Creating directory {os.path.abspath(directory)!r}",
+            f"Creating directory {directory_path.absolute()}",
             **config.messaging_opts,
         ):
-            os.makedirs(directory)
+            os.makedirs(directory_path)
 
-    versions = os.path.join(directory, "versions")
+    versions = directory_path / "versions"
     with util.status(
-        f"Creating directory {os.path.abspath(versions)!r}",
+        f"Creating directory {versions.absolute()}",
         **config.messaging_opts,
     ):
         os.makedirs(versions)
 
-    script = ScriptDirectory(directory)
+    if not directory_path.is_absolute():
+        # for non-absolute path, state config file in .ini / pyproject
+        # as relative to the %(here)s token, which is where the config
+        # file itself would be
+
+        if config._config_file_path is not None:
+            rel_dir = compat.path_relative_to(
+                directory_path.absolute(),
+                config._config_file_path.absolute().parent,
+                walk_up=True,
+            )
+            ini_script_location_directory = ("%(here)s" / rel_dir).as_posix()
+        if config._toml_file_path is not None:
+            rel_dir = compat.path_relative_to(
+                directory_path.absolute(),
+                config._toml_file_path.absolute().parent,
+                walk_up=True,
+            )
+            toml_script_location_directory = ("%(here)s" / rel_dir).as_posix()
+    else:
+        ini_script_location_directory = directory_path.as_posix()
+        toml_script_location_directory = directory_path.as_posix()
+
+    script = ScriptDirectory(directory_path)
+
+    has_toml = False
 
-    config_file: str | None = None
-    for file_ in os.listdir(template_dir):
-        file_path = os.path.join(template_dir, file_)
+    config_file: pathlib.Path | None = None
+    for file_path in template_path.iterdir():
+        file_ = file_path.name
         if file_ == "alembic.ini.mako":
             assert config.config_file_name is not None
-            config_file = os.path.abspath(config.config_file_name)
-            if os.access(config_file, os.F_OK):
+            config_file = pathlib.Path(config.config_file_name).absolute()
+            if config_file.exists():
                 util.msg(
-                    f"File {config_file!r} already exists, skipping",
+                    f"File {config_file} already exists, skipping",
                     **config.messaging_opts,
                 )
             else:
                 script._generate_template(
-                    file_path, config_file, script_location=directory
+                    file_path,
+                    config_file,
+                    script_location=ini_script_location_directory,
                 )
-        elif os.path.isfile(file_path):
-            output_file = os.path.join(directory, file_)
+        elif file_ == "pyproject.toml.mako":
+            has_toml = True
+            assert config._toml_file_path is not None
+            toml_path = config._toml_file_path.absolute()
+            if toml_path.exists():
+                # left as open() to suit unit test mocking
+                with open(toml_path, "rb") as f:
+                    toml_data = compat.tomllib.load(f)
+                if "tool" in toml_data and "alembic" in toml_data["tool"]:
+                    util.msg(
+                        f"File {toml_path} already exists "
+                        "and already has a [tool.alembic] section, "
+                        "skipping",
+                    )
+                    continue
+                script._append_template(
+                    file_path,
+                    toml_path,
+                    script_location=toml_script_location_directory,
+                )
+            else:
+                script._generate_template(
+                    file_path,
+                    toml_path,
+                    script_location=toml_script_location_directory,
+                )
+        elif file_path.is_file():
+            output_file = directory_path / file_
             script._copy_file(file_path, output_file)
 
     if package:
         for path in [
-            os.path.join(os.path.abspath(directory), "__init__.py"),
-            os.path.join(os.path.abspath(versions), "__init__.py"),
+            directory_path.absolute() / "__init__.py",
+            versions.absolute() / "__init__.py",
         ]:
-            with util.status(f"Adding {path!r}", **config.messaging_opts):
+            with util.status(f"Adding {path!s}", **config.messaging_opts):
+                # left as open() to suit unit test mocking
                 with open(path, "w"):
                     pass
 
     assert config_file is not None
-    util.msg(
-        "Please edit configuration/connection/logging "
-        f"settings in {config_file!r} before proceeding.",
-        **config.messaging_opts,
-    )
+
+    if has_toml:
+        util.msg(
+            f"Please edit configuration settings in {toml_path} and "
+            "configuration/connection/logging "
+            f"settings in {config_file} before proceeding.",
+            **config.messaging_opts,
+        )
+    else:
+        util.msg(
+            "Please edit configuration/connection/logging "
+            f"settings in {config_file} before proceeding.",
+            **config.messaging_opts,
        )
 
 
 def revision(
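
For reference, the reworked `init()` above is still driven the same way from code; a minimal sketch (config path, directory, and template name are just examples):

```python
from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")  # example config file path
command.init(cfg, directory="migrations", template="generic")
```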
@@ -126,7 +199,7 @@ def revision(
     head: str = "head",
     splice: bool = False,
     branch_label: Optional[_RevIdType] = None,
-    version_path: Optional[str] = None,
+    version_path: Union[str, os.PathLike[str], None] = None,
     rev_id: Optional[str] = None,
     depends_on: Optional[str] = None,
     process_revision_directives: Optional[ProcessRevisionDirectiveFn] = None,
@@ -172,7 +245,7 @@ def revision(
        will be applied to the structure generated by the revision process
        where it can be altered programmatically.  Note that unlike all
        the other parameters, this option is only available via programmatic
-       use of :func:`.command.revision`
+       use of :func:`.command.revision`.
 
     """
@@ -196,7 +269,9 @@ def revision(
         process_revision_directives=process_revision_directives,
     )
 
-    environment = util.asbool(config.get_main_option("revision_environment"))
+    environment = util.asbool(
+        config.get_alembic_option("revision_environment")
+    )
 
     if autogenerate:
         environment = True
@@ -290,10 +365,15 @@ def check(config: "Config") -> None:
     # the revision_context now has MigrationScript structure(s) present.
 
     migration_script = revision_context.generated_revisions[-1]
-    diffs = migration_script.upgrade_ops.as_diffs()
+    diffs = []
+    for upgrade_ops in migration_script.upgrade_ops_list:
+        diffs.extend(upgrade_ops.as_diffs())
+
     if diffs:
         raise util.AutogenerateDiffsDetected(
-            f"New upgrade operations detected: {diffs}"
+            f"New upgrade operations detected: {diffs}",
+            revision_context=revision_context,
+            diffs=diffs,
         )
     else:
         config.print_stdout("No new upgrade operations detected.")
@@ -310,9 +390,11 @@ def merge(
     :param config: a :class:`.Config` instance
 
-    :param message: string message to apply to the revision
+    :param revisions: The revisions to merge.
 
-    :param branch_label: string label name to apply to the new revision
+    :param message: string message to apply to the revision.
+
+    :param branch_label: string label name to apply to the new revision.
 
     :param rev_id: hardcoded revision identifier instead of generating a new
      one.
@@ -329,7 +411,9 @@ def merge(
         # e.g. multiple databases
     }
 
-    environment = util.asbool(config.get_main_option("revision_environment"))
+    environment = util.asbool(
+        config.get_alembic_option("revision_environment")
+    )
 
     if environment:
@@ -365,9 +449,10 @@ def upgrade(
     :param config: a :class:`.Config` instance.
 
-    :param revision: string revision target or range for --sql mode
+    :param revision: string revision target or range for --sql mode. May be
+     ``"heads"`` to target the most recent revision(s).
 
-    :param sql: if True, use ``--sql`` mode
+    :param sql: if True, use ``--sql`` mode.
 
     :param tag: an arbitrary "tag" that can be intercepted by custom
      ``env.py`` scripts via the :meth:`.EnvironmentContext.get_tag_argument`
@@ -408,9 +493,10 @@ def downgrade(
     :param config: a :class:`.Config` instance.
 
-    :param revision: string revision target or range for --sql mode
+    :param revision: string revision target or range for --sql mode. May
+     be ``"base"`` to target the first revision.
 
-    :param sql: if True, use ``--sql`` mode
+    :param sql: if True, use ``--sql`` mode.
 
     :param tag: an arbitrary "tag" that can be intercepted by custom
      ``env.py`` scripts via the :meth:`.EnvironmentContext.get_tag_argument`
@@ -444,12 +530,13 @@ def downgrade(
         script.run_env()
 
 
-def show(config, rev):
+def show(config: Config, rev: str) -> None:
     """Show the revision(s) denoted by the given symbol.
 
     :param config: a :class:`.Config` instance.
 
-    :param revision: string revision target
+    :param rev: string revision target. May be ``"current"`` to show the
+     revision(s) currently applied in the database.
 
     """
@@ -479,7 +566,7 @@ def history(
     :param config: a :class:`.Config` instance.
 
-    :param rev_range: string revision range
+    :param rev_range: string revision range.
 
     :param verbose: output in verbose mode.
@@ -499,7 +586,7 @@ def history(
         base = head = None
 
     environment = (
-        util.asbool(config.get_main_option("revision_environment"))
+        util.asbool(config.get_alembic_option("revision_environment"))
         or indicate_current
     )
@@ -538,7 +625,9 @@ def history(
         _display_history(config, script, base, head)
 
 
-def heads(config, verbose=False, resolve_dependencies=False):
+def heads(
+    config: Config, verbose: bool = False, resolve_dependencies: bool = False
+) -> None:
     """Show current available heads in the script directory.
 
     :param config: a :class:`.Config` instance.
@@ -563,7 +652,7 @@ def heads(config, verbose=False, resolve_dependencies=False):
     )
 
 
-def branches(config, verbose=False):
+def branches(config: Config, verbose: bool = False) -> None:
     """Show current branch points.
 
     :param config: a :class:`.Config` instance.
@@ -633,7 +722,9 @@ def stamp(
     :param config: a :class:`.Config` instance.
 
     :param revision: target revision or list of revisions. May be a list
-     to indicate stamping of multiple branch heads.
+     to indicate stamping of multiple branch heads; may be ``"base"``
+     to remove all revisions from the table or ``"heads"`` to stamp the
+     most recent revision(s).
 
     .. note:: this parameter is called "revisions" in the command line
        interface.
@@ -723,7 +814,7 @@ def ensure_version(config: Config, sql: bool = False) -> None:
     :param config: a :class:`.Config` instance.
 
-    :param sql: use ``--sql`` mode
+    :param sql: use ``--sql`` mode.
 
     .. versionadded:: 1.7.6
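
Since `check()` now aggregates diffs across the whole `upgrade_ops_list` and passes them to the exception, a caller can inspect them. A hedged sketch, assuming the constructor keywords shown above are exposed as attributes on `AutogenerateDiffsDetected`:

```python
from alembic import command
from alembic.config import Config
from alembic.util import AutogenerateDiffsDetected

cfg = Config("alembic.ini")  # example config file path
try:
    command.check(cfg)
except AutogenerateDiffsDetected as err:
    # the diffs passed to the exception above; attribute access is assumed
    print("pending upgrade operations:", err.diffs)
```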

File diff suppressed because it is too large.

View File

@@ -5,7 +5,6 @@ from __future__ import annotations
 from typing import Any
 from typing import Callable
 from typing import Collection
-from typing import ContextManager
 from typing import Dict
 from typing import Iterable
 from typing import List
@@ -14,11 +13,14 @@ from typing import Mapping
 from typing import MutableMapping
 from typing import Optional
 from typing import overload
+from typing import Sequence
 from typing import TextIO
 from typing import Tuple
 from typing import TYPE_CHECKING
 from typing import Union
 
+from typing_extensions import ContextManager
+
 if TYPE_CHECKING:
     from sqlalchemy.engine.base import Connection
     from sqlalchemy.engine.url import URL
@@ -39,7 +41,9 @@ if TYPE_CHECKING:
 ### end imports ###
 
-def begin_transaction() -> Union[_ProxyTransaction, ContextManager[None]]:
+def begin_transaction() -> (
+    Union[_ProxyTransaction, ContextManager[None, Optional[bool]]]
+):
     """Return a context manager that will
     enclose an operation within a "transaction",
     as defined by the environment's offline
@@ -97,7 +101,7 @@ def configure(
     tag: Optional[str] = None,
     template_args: Optional[Dict[str, Any]] = None,
     render_as_batch: bool = False,
-    target_metadata: Optional[MetaData] = None,
+    target_metadata: Union[MetaData, Sequence[MetaData], None] = None,
     include_name: Optional[
         Callable[
             [
@@ -159,8 +163,8 @@ def configure(
             MigrationContext,
             Column[Any],
             Column[Any],
-            TypeEngine,
-            TypeEngine,
+            TypeEngine[Any],
+            TypeEngine[Any],
         ],
         Optional[bool],
     ],
@@ -635,7 +639,8 @@ def configure(
     """
 
 def execute(
-    sql: Union[Executable, str], execution_options: Optional[dict] = None
+    sql: Union[Executable, str],
+    execution_options: Optional[Dict[str, Any]] = None,
 ) -> None:
     """Execute the given SQL using the current change context.
@@ -758,7 +763,11 @@ def get_x_argument(
     The return value is a list, returned directly from the ``argparse``
     structure.  If ``as_dictionary=True`` is passed, the ``x`` arguments
     are parsed using ``key=value`` format into a dictionary that is
-    then returned.
+    then returned. If there is no ``=`` in the argument, value is an empty
+    string.
+
+    .. versionchanged:: 1.13.1 Support ``as_dictionary=True`` when
+       arguments are passed without the ``=`` symbol.
 
     For example, to support passing a database URL on the command line,
     the standard ``env.py`` script can be modified like this::
@@ -800,7 +809,7 @@ def is_offline_mode() -> bool:
     """
 
-def is_transactional_ddl():
+def is_transactional_ddl() -> bool:
     """Return True if the context is configured to expect a
     transactional DDL capable backend.
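
The clarified `get_x_argument()` behavior means a bare ``-x`` flag maps to an empty-string value when `as_dictionary=True`. A small `env.py` fragment illustrating both forms (only runnable inside an Alembic environment script):

```python
from alembic import context

x_args = context.get_x_argument(as_dictionary=True)
db_url = x_args.get("db_url")  # alembic -x db_url=postgresql://...
dry_run = "dry_run" in x_args  # alembic -x dry_run  ->  {"dry_run": ""}
```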

View File

@@ -3,4 +3,4 @@ from . import mysql
 from . import oracle
 from . import postgresql
 from . import sqlite
-from .impl import DefaultImpl
+from .impl import DefaultImpl as DefaultImpl
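
The redundant ``as`` alias is the standard typing idiom for an explicit re-export, so strict checkers such as mypy with ``--no-implicit-reexport`` accept:

```python
from alembic.ddl import DefaultImpl  # recognized as an intentional re-export
```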

View File

@@ -0,0 +1,329 @@
# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
# mypy: no-warn-return-any, allow-any-generics

from __future__ import annotations

from typing import Any
from typing import ClassVar
from typing import Dict
from typing import Generic
from typing import NamedTuple
from typing import Optional
from typing import Sequence
from typing import Tuple
from typing import Type
from typing import TYPE_CHECKING
from typing import TypeVar
from typing import Union

from sqlalchemy.sql.schema import Constraint
from sqlalchemy.sql.schema import ForeignKeyConstraint
from sqlalchemy.sql.schema import Index
from sqlalchemy.sql.schema import UniqueConstraint
from typing_extensions import TypeGuard

from .. import util
from ..util import sqla_compat

if TYPE_CHECKING:
    from typing import Literal

    from alembic.autogenerate.api import AutogenContext
    from alembic.ddl.impl import DefaultImpl

CompareConstraintType = Union[Constraint, Index]

_C = TypeVar("_C", bound=CompareConstraintType)

_clsreg: Dict[str, Type[_constraint_sig]] = {}


class ComparisonResult(NamedTuple):
    status: Literal["equal", "different", "skip"]
    message: str

    @property
    def is_equal(self) -> bool:
        return self.status == "equal"

    @property
    def is_different(self) -> bool:
        return self.status == "different"

    @property
    def is_skip(self) -> bool:
        return self.status == "skip"

    @classmethod
    def Equal(cls) -> ComparisonResult:
        """the constraints are equal."""
        return cls("equal", "The two constraints are equal")

    @classmethod
    def Different(cls, reason: Union[str, Sequence[str]]) -> ComparisonResult:
        """the constraints are different for the provided reason(s)."""
        return cls("different", ", ".join(util.to_list(reason)))

    @classmethod
    def Skip(cls, reason: Union[str, Sequence[str]]) -> ComparisonResult:
        """the constraint cannot be compared for the provided reason(s).

        The message is logged, but the constraints will be otherwise
        considered equal, meaning that no migration command will be
        generated.
        """
        return cls("skip", ", ".join(util.to_list(reason)))


class _constraint_sig(Generic[_C]):
    const: _C
    _sig: Tuple[Any, ...]
    name: Optional[sqla_compat._ConstraintNameDefined]

    impl: DefaultImpl

    _is_index: ClassVar[bool] = False
    _is_fk: ClassVar[bool] = False
    _is_uq: ClassVar[bool] = False

    _is_metadata: bool

    def __init_subclass__(cls) -> None:
        cls._register()

    @classmethod
    def _register(cls):
        raise NotImplementedError()

    def __init__(
        self, is_metadata: bool, impl: DefaultImpl, const: _C
    ) -> None:
        raise NotImplementedError()

    def compare_to_reflected(
        self, other: _constraint_sig[Any]
    ) -> ComparisonResult:
        assert self.impl is other.impl
        assert self._is_metadata
        assert not other._is_metadata

        return self._compare_to_reflected(other)

    def _compare_to_reflected(
        self, other: _constraint_sig[_C]
    ) -> ComparisonResult:
        raise NotImplementedError()

    @classmethod
    def from_constraint(
        cls, is_metadata: bool, impl: DefaultImpl, constraint: _C
    ) -> _constraint_sig[_C]:
        # these could be cached by constraint/impl, however, if the
        # constraint is modified in place, then the sig is wrong. the mysql
        # impl currently does this, and if we fixed that we can't be sure
        # someone else might do it too, so play it safe.
        sig = _clsreg[constraint.__visit_name__](is_metadata, impl, constraint)
        return sig

    def md_name_to_sql_name(self, context: AutogenContext) -> Optional[str]:
        return sqla_compat._get_constraint_final_name(
            self.const, context.dialect
        )

    @util.memoized_property
    def is_named(self):
        return sqla_compat._constraint_is_named(self.const, self.impl.dialect)

    @util.memoized_property
    def unnamed(self) -> Tuple[Any, ...]:
        return self._sig

    @util.memoized_property
    def unnamed_no_options(self) -> Tuple[Any, ...]:
        raise NotImplementedError()

    @util.memoized_property
    def _full_sig(self) -> Tuple[Any, ...]:
        return (self.name,) + self.unnamed

    def __eq__(self, other) -> bool:
        return self._full_sig == other._full_sig

    def __ne__(self, other) -> bool:
        return self._full_sig != other._full_sig

    def __hash__(self) -> int:
        return hash(self._full_sig)


class _uq_constraint_sig(_constraint_sig[UniqueConstraint]):
    _is_uq = True

    @classmethod
    def _register(cls) -> None:
        _clsreg["unique_constraint"] = cls

    is_unique = True

    def __init__(
        self,
        is_metadata: bool,
        impl: DefaultImpl,
        const: UniqueConstraint,
    ) -> None:
        self.impl = impl
        self.const = const
        self.name = sqla_compat.constraint_name_or_none(const.name)
        self._sig = tuple(sorted([col.name for col in const.columns]))
        self._is_metadata = is_metadata

    @property
    def column_names(self) -> Tuple[str, ...]:
        return tuple([col.name for col in self.const.columns])

    def _compare_to_reflected(
        self, other: _constraint_sig[_C]
    ) -> ComparisonResult:
        assert self._is_metadata
        metadata_obj = self
        conn_obj = other

        assert is_uq_sig(conn_obj)
        return self.impl.compare_unique_constraint(
            metadata_obj.const, conn_obj.const
        )


class _ix_constraint_sig(_constraint_sig[Index]):
    _is_index = True

    name: sqla_compat._ConstraintName

    @classmethod
    def _register(cls) -> None:
        _clsreg["index"] = cls

    def __init__(
        self, is_metadata: bool, impl: DefaultImpl, const: Index
    ) -> None:
        self.impl = impl
        self.const = const
        self.name = const.name
        self.is_unique = bool(const.unique)
        self._is_metadata = is_metadata

    def _compare_to_reflected(
        self, other: _constraint_sig[_C]
    ) -> ComparisonResult:
        assert self._is_metadata
        metadata_obj = self
        conn_obj = other

        assert is_index_sig(conn_obj)
        return self.impl.compare_indexes(metadata_obj.const, conn_obj.const)

    @util.memoized_property
    def has_expressions(self):
        return sqla_compat.is_expression_index(self.const)

    @util.memoized_property
    def column_names(self) -> Tuple[str, ...]:
        return tuple([col.name for col in self.const.columns])

    @util.memoized_property
    def column_names_optional(self) -> Tuple[Optional[str], ...]:
        return tuple(
            [getattr(col, "name", None) for col in self.const.expressions]
        )

    @util.memoized_property
    def is_named(self):
        return True

    @util.memoized_property
    def unnamed(self):
        return (self.is_unique,) + self.column_names_optional


class _fk_constraint_sig(_constraint_sig[ForeignKeyConstraint]):
    _is_fk = True

    @classmethod
    def _register(cls) -> None:
        _clsreg["foreign_key_constraint"] = cls

    def __init__(
        self,
        is_metadata: bool,
        impl: DefaultImpl,
        const: ForeignKeyConstraint,
    ) -> None:
        self._is_metadata = is_metadata

        self.impl = impl
        self.const = const

        self.name = sqla_compat.constraint_name_or_none(const.name)

        (
            self.source_schema,
            self.source_table,
            self.source_columns,
            self.target_schema,
            self.target_table,
            self.target_columns,
            onupdate,
            ondelete,
            deferrable,
            initially,
        ) = sqla_compat._fk_spec(const)

        self._sig: Tuple[Any, ...] = (
            self.source_schema,
            self.source_table,
            tuple(self.source_columns),
            self.target_schema,
            self.target_table,
            tuple(self.target_columns),
        ) + (
            (
                (None if onupdate.lower() == "no action" else onupdate.lower())
                if onupdate
                else None
            ),
            (
                (None if ondelete.lower() == "no action" else ondelete.lower())
                if ondelete
                else None
            ),
            # convert initially + deferrable into one three-state value
            (
                "initially_deferrable"
                if initially and initially.lower() == "deferred"
                else "deferrable" if deferrable else "not deferrable"
            ),
        )

    @util.memoized_property
    def unnamed_no_options(self):
        return (
            self.source_schema,
            self.source_table,
            tuple(self.source_columns),
            self.target_schema,
            self.target_table,
            tuple(self.target_columns),
        )


def is_index_sig(sig: _constraint_sig) -> TypeGuard[_ix_constraint_sig]:
    return sig._is_index


def is_uq_sig(sig: _constraint_sig) -> TypeGuard[_uq_constraint_sig]:
    return sig._is_uq


def is_fk_sig(sig: _constraint_sig) -> TypeGuard[_fk_constraint_sig]:
    return sig._is_fk
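
To make the three-state result above concrete, a short illustration of the `ComparisonResult` helpers (internal API, shown for clarity only):

```python
result = ComparisonResult.Different(["columns differ", "uniqueness differs"])
assert result.is_different and not result.is_equal and not result.is_skip
print(result.message)  # "columns differ, uniqueness differs"
```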

View File

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations
 
 import functools
@@ -22,6 +25,8 @@ from ..util.sqla_compat import _table_for_constraint  # noqa
 if TYPE_CHECKING:
     from typing import Any
 
+    from sqlalchemy import Computed
+    from sqlalchemy import Identity
     from sqlalchemy.sql.compiler import Compiled
     from sqlalchemy.sql.compiler import DDLCompiler
     from sqlalchemy.sql.elements import TextClause
@@ -30,14 +35,11 @@ if TYPE_CHECKING:
     from sqlalchemy.sql.type_api import TypeEngine
 
     from .impl import DefaultImpl
-    from ..util.sqla_compat import Computed
-    from ..util.sqla_compat import Identity
-
 
 _ServerDefault = Union["TextClause", "FetchedValue", "Function[Any]", str]
 
 
 class AlterTable(DDLElement):
     """Represent an ALTER TABLE statement.
 
     Only the string name and optional schema name of the table
@@ -152,17 +154,24 @@ class AddColumn(AlterTable):
         name: str,
         column: Column[Any],
         schema: Optional[Union[quoted_name, str]] = None,
+        if_not_exists: Optional[bool] = None,
     ) -> None:
         super().__init__(name, schema=schema)
         self.column = column
+        self.if_not_exists = if_not_exists
 
 
 class DropColumn(AlterTable):
     def __init__(
-        self, name: str, column: Column[Any], schema: Optional[str] = None
+        self,
+        name: str,
+        column: Column[Any],
+        schema: Optional[str] = None,
+        if_exists: Optional[bool] = None,
     ) -> None:
         super().__init__(name, schema=schema)
         self.column = column
+        self.if_exists = if_exists
 
 
 class ColumnComment(AlterColumn):
@@ -187,7 +196,9 @@ def visit_rename_table(
 def visit_add_column(element: AddColumn, compiler: DDLCompiler, **kw) -> str:
     return "%s %s" % (
         alter_table(compiler, element.table_name, element.schema),
-        add_column(compiler, element.column, **kw),
+        add_column(
+            compiler, element.column, if_not_exists=element.if_not_exists, **kw
+        ),
     )
@@ -195,7 +206,9 @@ def visit_add_column(element: AddColumn, compiler: DDLCompiler, **kw) -> str:
 def visit_drop_column(element: DropColumn, compiler: DDLCompiler, **kw) -> str:
     return "%s %s" % (
         alter_table(compiler, element.table_name, element.schema),
-        drop_column(compiler, element.column.name, **kw),
+        drop_column(
+            compiler, element.column.name, if_exists=element.if_exists, **kw
+        ),
     )
@@ -235,9 +248,11 @@ def visit_column_default(
     return "%s %s %s" % (
         alter_table(compiler, element.table_name, element.schema),
         alter_column(compiler, element.column_name),
-        "SET DEFAULT %s" % format_server_default(compiler, element.default)
-        if element.default is not None
-        else "DROP DEFAULT",
+        (
+            "SET DEFAULT %s" % format_server_default(compiler, element.default)
+            if element.default is not None
+            else "DROP DEFAULT"
+        ),
     )
@@ -295,9 +310,13 @@ def format_server_default(
     compiler: DDLCompiler,
     default: Optional[_ServerDefault],
 ) -> str:
-    return compiler.get_column_default_string(
+    # this can be updated to use compiler.render_default_string
+    # for SQLAlchemy 2.0 and above; not in 1.4
+    default_str = compiler.get_column_default_string(
         Column("x", Integer, server_default=default)
     )
+    assert default_str is not None
+    return default_str
 
 
 def format_type(compiler: DDLCompiler, type_: TypeEngine) -> str:
@@ -312,16 +331,29 @@ def alter_table(
     return "ALTER TABLE %s" % format_table_name(compiler, name, schema)
 
 
-def drop_column(compiler: DDLCompiler, name: str, **kw) -> str:
-    return "DROP COLUMN %s" % format_column_name(compiler, name)
+def drop_column(
+    compiler: DDLCompiler, name: str, if_exists: Optional[bool] = None, **kw
+) -> str:
+    return "DROP COLUMN %s%s" % (
+        "IF EXISTS " if if_exists else "",
+        format_column_name(compiler, name),
+    )
 
 
 def alter_column(compiler: DDLCompiler, name: str) -> str:
     return "ALTER COLUMN %s" % format_column_name(compiler, name)
 
 
-def add_column(compiler: DDLCompiler, column: Column[Any], **kw) -> str:
-    text = "ADD COLUMN %s" % compiler.get_column_specification(column, **kw)
+def add_column(
+    compiler: DDLCompiler,
+    column: Column[Any],
+    if_not_exists: Optional[bool] = None,
+    **kw,
+) -> str:
+    text = "ADD COLUMN %s%s" % (
+        "IF NOT EXISTS " if if_not_exists else "",
+        compiler.get_column_specification(column, **kw),
+    )
 
     const = " ".join(
         compiler.process(constraint) for constraint in column.constraints
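
With the flags plumbed into the DDL elements, the compiled statements gain the guard clauses. A hedged sketch constructing the elements directly (a real connection's dialect would compile them to ``ADD COLUMN IF NOT EXISTS ...`` / ``DROP COLUMN IF EXISTS ...``):

```python
from sqlalchemy import Column, String

from alembic.ddl.base import AddColumn, DropColumn

nickname = Column("nickname", String(50))
add = AddColumn("users", nickname, if_not_exists=True)
drop = DropColumn("users", nickname, if_exists=True)
```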

View File

@@ -1,6 +1,9 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations
 
-from collections import namedtuple
+import logging
 import re
 from typing import Any
 from typing import Callable
@@ -8,6 +11,7 @@ from typing import Dict
 from typing import Iterable
 from typing import List
 from typing import Mapping
+from typing import NamedTuple
 from typing import Optional
 from typing import Sequence
 from typing import Set
@@ -17,10 +21,18 @@ from typing import TYPE_CHECKING
 from typing import Union
 
 from sqlalchemy import cast
+from sqlalchemy import Column
+from sqlalchemy import MetaData
+from sqlalchemy import PrimaryKeyConstraint
 from sqlalchemy import schema
+from sqlalchemy import String
+from sqlalchemy import Table
 from sqlalchemy import text
 
+from . import _autogen
 from . import base
+from ._autogen import _constraint_sig as _constraint_sig
+from ._autogen import ComparisonResult as ComparisonResult
 from .. import util
 from ..util import sqla_compat
@@ -34,13 +46,10 @@ if TYPE_CHECKING:
     from sqlalchemy.engine.reflection import Inspector
     from sqlalchemy.sql import ClauseElement
     from sqlalchemy.sql import Executable
-    from sqlalchemy.sql.elements import ColumnElement
     from sqlalchemy.sql.elements import quoted_name
-    from sqlalchemy.sql.schema import Column
     from sqlalchemy.sql.schema import Constraint
     from sqlalchemy.sql.schema import ForeignKeyConstraint
     from sqlalchemy.sql.schema import Index
-    from sqlalchemy.sql.schema import Table
     from sqlalchemy.sql.schema import UniqueConstraint
     from sqlalchemy.sql.selectable import TableClause
     from sqlalchemy.sql.type_api import TypeEngine
@@ -50,6 +59,8 @@ if TYPE_CHECKING:
     from ..operations.batch import ApplyBatchImpl
     from ..operations.batch import BatchOperationsImpl
 
+log = logging.getLogger(__name__)
+
 
 class ImplMeta(type):
     def __init__(
@@ -66,11 +77,8 @@ class ImplMeta(type):
 _impls: Dict[str, Type[DefaultImpl]] = {}
 
-Params = namedtuple("Params", ["token0", "tokens", "args", "kwargs"])
-
 
 class DefaultImpl(metaclass=ImplMeta):
     """Provide the entrypoint for major migration operations,
     including database-specific behavioral variances.
@@ -130,6 +138,40 @@ class DefaultImpl(metaclass=ImplMeta):
         self.output_buffer.write(text + "\n\n")
         self.output_buffer.flush()
 
+    def version_table_impl(
+        self,
+        *,
+        version_table: str,
+        version_table_schema: Optional[str],
+        version_table_pk: bool,
+        **kw: Any,
+    ) -> Table:
+        """Generate a :class:`.Table` object which will be used as the
+        structure for the Alembic version table.
+
+        Third party dialects may override this hook to provide an alternate
+        structure for this :class:`.Table`; requirements are only that it
+        be named based on the ``version_table`` parameter and contains
+        at least a single string-holding column named ``version_num``.
+
+        .. versionadded:: 1.14
+
+        """
+        vt = Table(
+            version_table,
+            MetaData(),
+            Column("version_num", String(32), nullable=False),
+            schema=version_table_schema,
+        )
+        if version_table_pk:
+            vt.append_constraint(
+                PrimaryKeyConstraint(
+                    "version_num", name=f"{version_table}_pkc"
+                )
+            )
+
+        return vt
def requires_recreate_in_batch( def requires_recreate_in_batch(
self, batch_op: BatchOperationsImpl self, batch_op: BatchOperationsImpl
) -> bool: ) -> bool:
@@ -161,16 +203,15 @@ class DefaultImpl(metaclass=ImplMeta):
def _exec( def _exec(
self, self,
construct: Union[Executable, str], construct: Union[Executable, str],
execution_options: Optional[dict[str, Any]] = None, execution_options: Optional[Mapping[str, Any]] = None,
multiparams: Sequence[dict] = (), multiparams: Optional[Sequence[Mapping[str, Any]]] = None,
params: Dict[str, Any] = util.immutabledict(), params: Mapping[str, Any] = util.immutabledict(),
) -> Optional[CursorResult]: ) -> Optional[CursorResult]:
if isinstance(construct, str): if isinstance(construct, str):
construct = text(construct) construct = text(construct)
if self.as_sql: if self.as_sql:
if multiparams or params: if multiparams is not None or params:
# TODO: coverage raise TypeError("SQL parameters not allowed with as_sql")
raise Exception("Execution arguments not allowed with as_sql")
compile_kw: dict[str, Any] compile_kw: dict[str, Any]
if self.literal_binds and not isinstance( if self.literal_binds and not isinstance(
@@ -193,11 +234,16 @@ class DefaultImpl(metaclass=ImplMeta):
assert conn is not None assert conn is not None
if execution_options: if execution_options:
conn = conn.execution_options(**execution_options) conn = conn.execution_options(**execution_options)
if params:
assert isinstance(multiparams, tuple)
multiparams += (params,)
return conn.execute(construct, multiparams) if params and multiparams is not None:
raise TypeError(
"Can't send params and multiparams at the same time"
)
if multiparams:
return conn.execute(construct, multiparams)
else:
return conn.execute(construct, params)
def execute( def execute(
self, self,
@@ -210,8 +256,11 @@ class DefaultImpl(metaclass=ImplMeta):
self, self,
table_name: str, table_name: str,
column_name: str, column_name: str,
*,
nullable: Optional[bool] = None, nullable: Optional[bool] = None,
server_default: Union[_ServerDefault, Literal[False]] = False, server_default: Optional[
Union[_ServerDefault, Literal[False]]
] = False,
name: Optional[str] = None, name: Optional[str] = None,
type_: Optional[TypeEngine] = None, type_: Optional[TypeEngine] = None,
schema: Optional[str] = None, schema: Optional[str] = None,
@@ -322,25 +371,40 @@ class DefaultImpl(metaclass=ImplMeta):
self, self,
table_name: str, table_name: str,
column: Column[Any], column: Column[Any],
*,
schema: Optional[Union[str, quoted_name]] = None, schema: Optional[Union[str, quoted_name]] = None,
if_not_exists: Optional[bool] = None,
) -> None: ) -> None:
self._exec(base.AddColumn(table_name, column, schema=schema)) self._exec(
base.AddColumn(
table_name,
column,
schema=schema,
if_not_exists=if_not_exists,
)
)
def drop_column( def drop_column(
self, self,
table_name: str, table_name: str,
column: Column[Any], column: Column[Any],
*,
schema: Optional[str] = None, schema: Optional[str] = None,
if_exists: Optional[bool] = None,
**kw, **kw,
) -> None: ) -> None:
self._exec(base.DropColumn(table_name, column, schema=schema)) self._exec(
base.DropColumn(
table_name, column, schema=schema, if_exists=if_exists
)
)
def add_constraint(self, const: Any) -> None: def add_constraint(self, const: Any) -> None:
if const._create_rule is None or const._create_rule(self): if const._create_rule is None or const._create_rule(self):
self._exec(schema.AddConstraint(const)) self._exec(schema.AddConstraint(const))
def drop_constraint(self, const: Constraint) -> None: def drop_constraint(self, const: Constraint, **kw: Any) -> None:
self._exec(schema.DropConstraint(const)) self._exec(schema.DropConstraint(const, **kw))
def rename_table( def rename_table(
self, self,
@@ -352,11 +416,11 @@ class DefaultImpl(metaclass=ImplMeta):
base.RenameTable(old_table_name, new_table_name, schema=schema) base.RenameTable(old_table_name, new_table_name, schema=schema)
) )
def create_table(self, table: Table) -> None: def create_table(self, table: Table, **kw: Any) -> None:
table.dispatch.before_create( table.dispatch.before_create(
table, self.connection, checkfirst=False, _ddl_runner=self table, self.connection, checkfirst=False, _ddl_runner=self
) )
self._exec(schema.CreateTable(table)) self._exec(schema.CreateTable(table, **kw))
table.dispatch.after_create( table.dispatch.after_create(
table, self.connection, checkfirst=False, _ddl_runner=self table, self.connection, checkfirst=False, _ddl_runner=self
) )
@@ -375,11 +439,11 @@ class DefaultImpl(metaclass=ImplMeta):
if comment and with_comment: if comment and with_comment:
self.create_column_comment(column) self.create_column_comment(column)
def drop_table(self, table: Table) -> None: def drop_table(self, table: Table, **kw: Any) -> None:
table.dispatch.before_drop( table.dispatch.before_drop(
table, self.connection, checkfirst=False, _ddl_runner=self table, self.connection, checkfirst=False, _ddl_runner=self
) )
self._exec(schema.DropTable(table)) self._exec(schema.DropTable(table, **kw))
table.dispatch.after_drop( table.dispatch.after_drop(
table, self.connection, checkfirst=False, _ddl_runner=self table, self.connection, checkfirst=False, _ddl_runner=self
) )
@@ -393,7 +457,7 @@ class DefaultImpl(metaclass=ImplMeta):
def drop_table_comment(self, table: Table) -> None: def drop_table_comment(self, table: Table) -> None:
self._exec(schema.DropTableComment(table)) self._exec(schema.DropTableComment(table))
def create_column_comment(self, column: ColumnElement[Any]) -> None: def create_column_comment(self, column: Column[Any]) -> None:
self._exec(schema.SetColumnComment(column)) self._exec(schema.SetColumnComment(column))
def drop_index(self, index: Index, **kw: Any) -> None: def drop_index(self, index: Index, **kw: Any) -> None:
@@ -412,15 +476,19 @@ class DefaultImpl(metaclass=ImplMeta):
if self.as_sql: if self.as_sql:
for row in rows: for row in rows:
self._exec( self._exec(
sqla_compat._insert_inline(table).values( table.insert()
.inline()
.values(
**{ **{
k: sqla_compat._literal_bindparam( k: (
k, v, type_=table.c[k].type sqla_compat._literal_bindparam(
k, v, type_=table.c[k].type
)
if not isinstance(
v, sqla_compat._literal_bindparam
)
else v
) )
if not isinstance(
v, sqla_compat._literal_bindparam
)
else v
for k, v in row.items() for k, v in row.items()
} }
) )
@@ -428,16 +496,13 @@ class DefaultImpl(metaclass=ImplMeta):
else: else:
if rows: if rows:
if multiinsert: if multiinsert:
self._exec( self._exec(table.insert().inline(), multiparams=rows)
sqla_compat._insert_inline(table), multiparams=rows
)
else: else:
for row in rows: for row in rows:
self._exec( self._exec(table.insert().inline().values(**row))
sqla_compat._insert_inline(table).values(**row)
)
def _tokenize_column_type(self, column: Column) -> Params: def _tokenize_column_type(self, column: Column) -> Params:
definition: str
definition = self.dialect.type_compiler.process(column.type).lower() definition = self.dialect.type_compiler.process(column.type).lower()
# tokenize the SQLAlchemy-generated version of a type, so that # tokenize the SQLAlchemy-generated version of a type, so that
@@ -452,9 +517,9 @@ class DefaultImpl(metaclass=ImplMeta):
# varchar character set utf8 # varchar character set utf8
# #
tokens = re.findall(r"[\w\-_]+|\(.+?\)", definition) tokens: List[str] = re.findall(r"[\w\-_]+|\(.+?\)", definition)
term_tokens = [] term_tokens: List[str] = []
paren_term = None paren_term = None
for token in tokens: for token in tokens:
@@ -466,6 +531,7 @@ class DefaultImpl(metaclass=ImplMeta):
params = Params(term_tokens[0], term_tokens[1:], [], {}) params = Params(term_tokens[0], term_tokens[1:], [], {})
if paren_term: if paren_term:
term: str
for term in re.findall("[^(),]+", paren_term): for term in re.findall("[^(),]+", paren_term):
if "=" in term: if "=" in term:
key, val = term.split("=") key, val = term.split("=")
@@ -642,7 +708,7 @@ class DefaultImpl(metaclass=ImplMeta):
diff, ignored = _compare_identity_options( diff, ignored = _compare_identity_options(
metadata_identity, metadata_identity,
inspector_identity, inspector_identity,
sqla_compat.Identity(), schema.Identity(),
skip={"always"}, skip={"always"},
) )
@@ -664,15 +730,96 @@ class DefaultImpl(metaclass=ImplMeta):
bool(diff) or bool(metadata_identity) != bool(inspector_identity), bool(diff) or bool(metadata_identity) != bool(inspector_identity),
) )
def create_index_sig(self, index: Index) -> Tuple[Any, ...]: def _compare_index_unique(
# order of col matters in an index self, metadata_index: Index, reflected_index: Index
return tuple(col.name for col in index.columns) ) -> Optional[str]:
conn_unique = bool(reflected_index.unique)
meta_unique = bool(metadata_index.unique)
if conn_unique != meta_unique:
return f"unique={conn_unique} to unique={meta_unique}"
else:
return None
def create_unique_constraint_sig( def _create_metadata_constraint_sig(
self, const: UniqueConstraint self, constraint: _autogen._C, **opts: Any
) -> Tuple[Any, ...]: ) -> _constraint_sig[_autogen._C]:
# order of col does not matters in an unique constraint return _constraint_sig.from_constraint(True, self, constraint, **opts)
return tuple(sorted([col.name for col in const.columns]))
def _create_reflected_constraint_sig(
self, constraint: _autogen._C, **opts: Any
) -> _constraint_sig[_autogen._C]:
return _constraint_sig.from_constraint(False, self, constraint, **opts)
def compare_indexes(
self,
metadata_index: Index,
reflected_index: Index,
) -> ComparisonResult:
"""Compare two indexes by comparing the signature generated by
``create_index_sig``.
This method returns a ``ComparisonResult``.
"""
msg: List[str] = []
unique_msg = self._compare_index_unique(
metadata_index, reflected_index
)
if unique_msg:
msg.append(unique_msg)
m_sig = self._create_metadata_constraint_sig(metadata_index)
r_sig = self._create_reflected_constraint_sig(reflected_index)
assert _autogen.is_index_sig(m_sig)
assert _autogen.is_index_sig(r_sig)
# The assumption is that the index have no expression
for sig in m_sig, r_sig:
if sig.has_expressions:
log.warning(
"Generating approximate signature for index %s. "
"The dialect "
"implementation should either skip expression indexes "
"or provide a custom implementation.",
sig.const,
)
if m_sig.column_names != r_sig.column_names:
msg.append(
f"expression {r_sig.column_names} to {m_sig.column_names}"
)
if msg:
return ComparisonResult.Different(msg)
else:
return ComparisonResult.Equal()
def compare_unique_constraint(
self,
metadata_constraint: UniqueConstraint,
reflected_constraint: UniqueConstraint,
) -> ComparisonResult:
"""Compare two unique constraints by comparing the two signatures.
The arguments are two tuples that contain the unique constraint and
the signatures generated by ``create_unique_constraint_sig``.
This method returns a ``ComparisonResult``.
"""
metadata_tup = self._create_metadata_constraint_sig(
metadata_constraint
)
reflected_tup = self._create_reflected_constraint_sig(
reflected_constraint
)
meta_sig = metadata_tup.unnamed
conn_sig = reflected_tup.unnamed
if conn_sig != meta_sig:
return ComparisonResult.Different(
f"expression {conn_sig} to {meta_sig}"
)
else:
return ComparisonResult.Equal()
def _skip_functional_indexes(self, metadata_indexes, conn_indexes): def _skip_functional_indexes(self, metadata_indexes, conn_indexes):
conn_indexes_by_name = {c.name: c for c in conn_indexes} conn_indexes_by_name = {c.name: c for c in conn_indexes}
@@ -697,6 +844,13 @@ class DefaultImpl(metaclass=ImplMeta):
return reflected_object.get("dialect_options", {}) return reflected_object.get("dialect_options", {})
class Params(NamedTuple):
token0: str
tokens: List[str]
args: List[str]
kwargs: Dict[str, str]
def _compare_identity_options( def _compare_identity_options(
metadata_io: Union[schema.Identity, schema.Sequence, None], metadata_io: Union[schema.Identity, schema.Sequence, None],
inspector_io: Union[schema.Identity, schema.Sequence, None], inspector_io: Union[schema.Identity, schema.Sequence, None],
@@ -735,12 +889,13 @@ def _compare_identity_options(
set(meta_d).union(insp_d), set(meta_d).union(insp_d),
) )
if sqla_compat.identity_has_dialect_kwargs: if sqla_compat.identity_has_dialect_kwargs:
assert hasattr(default_io, "dialect_kwargs")
# use only the dialect kwargs in inspector_io since metadata_io # use only the dialect kwargs in inspector_io since metadata_io
# can have options for many backends # can have options for many backends
check_dicts( check_dicts(
getattr(metadata_io, "dialect_kwargs", {}), getattr(metadata_io, "dialect_kwargs", {}),
getattr(inspector_io, "dialect_kwargs", {}), getattr(inspector_io, "dialect_kwargs", {}),
default_io.dialect_kwargs, # type: ignore[union-attr] default_io.dialect_kwargs,
getattr(inspector_io, "dialect_kwargs", {}), getattr(inspector_io, "dialect_kwargs", {}),
) )

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
 from __future__ import annotations

 import re
@@ -9,7 +12,6 @@
 from typing import Union

 from sqlalchemy import types as sqltypes
-from sqlalchemy.ext.compiler import compiles
 from sqlalchemy.schema import Column
 from sqlalchemy.schema import CreateIndex
 from sqlalchemy.sql.base import Executable
@@ -30,6 +32,7 @@ from .base import RenameTable
 from .impl import DefaultImpl
 from .. import util
 from ..util import sqla_compat
+from ..util.sqla_compat import compiles

 if TYPE_CHECKING:
     from typing import Literal
@@ -80,10 +83,11 @@ class MSSQLImpl(DefaultImpl):
         if self.as_sql and self.batch_separator:
             self.static_output(self.batch_separator)

-    def alter_column(  # type:ignore[override]
+    def alter_column(
         self,
         table_name: str,
         column_name: str,
+        *,
         nullable: Optional[bool] = None,
         server_default: Optional[
             Union[_ServerDefault, Literal[False]]
@@ -199,6 +203,7 @@ class MSSQLImpl(DefaultImpl):
         self,
         table_name: str,
         column: Column[Any],
+        *,
         schema: Optional[str] = None,
         **kw,
     ) -> None:
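Most of the MSSQL changes are the bare `*` inserted into method signatures; the effect is purely call-site hygiene. A tiny sketch with a hypothetical stand-in function:

```python
# All parameters after the bare * become keyword-only, so positional
# call sites fail loudly instead of silently shifting arguments.
def alter_column(table_name: str, column_name: str, *, nullable=None) -> None:
    ...

alter_column("t", "c", nullable=True)  # fine
# alter_column("t", "c", True)  # TypeError: takes 2 positional arguments
```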

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
 from __future__ import annotations

 import re
@@ -8,7 +11,9 @@ from typing import Union

 from sqlalchemy import schema
 from sqlalchemy import types as sqltypes
-from sqlalchemy.ext.compiler import compiles
+from sqlalchemy.sql import elements
+from sqlalchemy.sql import functions
+from sqlalchemy.sql import operators

 from .base import alter_table
 from .base import AlterColumn
@@ -20,16 +25,16 @@ from .base import format_column_name
 from .base import format_server_default
 from .impl import DefaultImpl
 from .. import util
-from ..autogenerate import compare
 from ..util import sqla_compat
-from ..util.sqla_compat import _is_mariadb
 from ..util.sqla_compat import _is_type_bound
+from ..util.sqla_compat import compiles

 if TYPE_CHECKING:
     from typing import Literal

     from sqlalchemy.dialects.mysql.base import MySQLDDLCompiler
     from sqlalchemy.sql.ddl import DropConstraint
+    from sqlalchemy.sql.elements import ClauseElement
     from sqlalchemy.sql.schema import Constraint
     from sqlalchemy.sql.type_api import TypeEngine
@@ -46,12 +51,40 @@ class MySQLImpl(DefaultImpl):
     )
     type_arg_extract = [r"character set ([\w\-_]+)", r"collate ([\w\-_]+)"]

-    def alter_column(  # type:ignore[override]
+    def render_ddl_sql_expr(
+        self,
+        expr: ClauseElement,
+        is_server_default: bool = False,
+        is_index: bool = False,
+        **kw: Any,
+    ) -> str:
+        # apply Grouping to index expressions;
+        # see https://github.com/sqlalchemy/sqlalchemy/blob/
+        # 36da2eaf3e23269f2cf28420ae73674beafd0661/
+        # lib/sqlalchemy/dialects/mysql/base.py#L2191
+        if is_index and (
+            isinstance(expr, elements.BinaryExpression)
+            or (
+                isinstance(expr, elements.UnaryExpression)
+                and expr.modifier not in (operators.desc_op, operators.asc_op)
+            )
+            or isinstance(expr, functions.FunctionElement)
+        ):
+            expr = elements.Grouping(expr)
+        return super().render_ddl_sql_expr(
+            expr, is_server_default=is_server_default, is_index=is_index, **kw
+        )
+
+    def alter_column(
         self,
         table_name: str,
         column_name: str,
+        *,
         nullable: Optional[bool] = None,
-        server_default: Union[_ServerDefault, Literal[False]] = False,
+        server_default: Optional[
+            Union[_ServerDefault, Literal[False]]
+        ] = False,
         name: Optional[str] = None,
         type_: Optional[TypeEngine] = None,
         schema: Optional[str] = None,
@@ -92,21 +125,29 @@ class MySQLImpl(DefaultImpl):
                     column_name,
                     schema=schema,
                     newname=name if name is not None else column_name,
-                    nullable=nullable
-                    if nullable is not None
-                    else existing_nullable
-                    if existing_nullable is not None
-                    else True,
+                    nullable=(
+                        nullable
+                        if nullable is not None
+                        else (
+                            existing_nullable
+                            if existing_nullable is not None
+                            else True
+                        )
+                    ),
                     type_=type_ if type_ is not None else existing_type,
-                    default=server_default
-                    if server_default is not False
-                    else existing_server_default,
-                    autoincrement=autoincrement
-                    if autoincrement is not None
-                    else existing_autoincrement,
-                    comment=comment
-                    if comment is not False
-                    else existing_comment,
+                    default=(
+                        server_default
+                        if server_default is not False
+                        else existing_server_default
+                    ),
+                    autoincrement=(
+                        autoincrement
+                        if autoincrement is not None
+                        else existing_autoincrement
+                    ),
+                    comment=(
+                        comment if comment is not False else existing_comment
+                    ),
                 )
             )
         elif (
@@ -121,21 +162,29 @@ class MySQLImpl(DefaultImpl):
                     column_name,
                     schema=schema,
                     newname=name if name is not None else column_name,
-                    nullable=nullable
-                    if nullable is not None
-                    else existing_nullable
-                    if existing_nullable is not None
-                    else True,
+                    nullable=(
+                        nullable
+                        if nullable is not None
+                        else (
+                            existing_nullable
+                            if existing_nullable is not None
+                            else True
+                        )
+                    ),
                     type_=type_ if type_ is not None else existing_type,
-                    default=server_default
-                    if server_default is not False
-                    else existing_server_default,
-                    autoincrement=autoincrement
-                    if autoincrement is not None
-                    else existing_autoincrement,
-                    comment=comment
-                    if comment is not False
-                    else existing_comment,
+                    default=(
+                        server_default
+                        if server_default is not False
+                        else existing_server_default
+                    ),
+                    autoincrement=(
+                        autoincrement
+                        if autoincrement is not None
+                        else existing_autoincrement
+                    ),
+                    comment=(
+                        comment if comment is not False else existing_comment
+                    ),
                 )
             )
         elif server_default is not False:
@@ -148,6 +197,7 @@ class MySQLImpl(DefaultImpl):
     def drop_constraint(
         self,
         const: Constraint,
+        **kw: Any,
     ) -> None:
         if isinstance(const, schema.CheckConstraint) and _is_type_bound(const):
             return
@@ -157,12 +207,11 @@ class MySQLImpl(DefaultImpl):
     def _is_mysql_allowed_functional_default(
         self,
         type_: Optional[TypeEngine],
-        server_default: Union[_ServerDefault, Literal[False]],
+        server_default: Optional[Union[_ServerDefault, Literal[False]]],
     ) -> bool:
         return (
             type_ is not None
-            and type_._type_affinity  # type:ignore[attr-defined]
-            is sqltypes.DateTime
+            and type_._type_affinity is sqltypes.DateTime
             and server_default is not None
         )
@@ -272,10 +321,12 @@ class MySQLImpl(DefaultImpl):
     def correct_for_autogen_foreignkeys(self, conn_fks, metadata_fks):
         conn_fk_by_sig = {
-            compare._fk_constraint_sig(fk).sig: fk for fk in conn_fks
+            self._create_reflected_constraint_sig(fk).unnamed_no_options: fk
+            for fk in conn_fks
         }
         metadata_fk_by_sig = {
-            compare._fk_constraint_sig(fk).sig: fk for fk in metadata_fks
+            self._create_metadata_constraint_sig(fk).unnamed_no_options: fk
+            for fk in metadata_fks
         }

         for sig in set(conn_fk_by_sig).intersection(metadata_fk_by_sig):
@@ -307,7 +358,7 @@ class MySQLAlterDefault(AlterColumn):
         self,
         name: str,
         column_name: str,
-        default: _ServerDefault,
+        default: Optional[_ServerDefault],
         schema: Optional[str] = None,
     ) -> None:
         super(AlterColumn, self).__init__(name, schema=schema)
@@ -365,9 +416,11 @@ def _mysql_alter_default(
     return "%s ALTER COLUMN %s %s" % (
         alter_table(compiler, element.table_name, element.schema),
         format_column_name(compiler, element.column_name),
-        "SET DEFAULT %s" % format_server_default(compiler, element.default)
-        if element.default is not None
-        else "DROP DEFAULT",
+        (
+            "SET DEFAULT %s" % format_server_default(compiler, element.default)
+            if element.default is not None
+            else "DROP DEFAULT"
+        ),
     )
@@ -454,7 +507,7 @@ def _mysql_drop_constraint(
     # note that SQLAlchemy as of 1.2 does not yet support
    # DROP CONSTRAINT for MySQL/MariaDB, so we implement fully
     # here.
-    if _is_mariadb(compiler.dialect):
+    if compiler.dialect.is_mariadb:
         return "ALTER TABLE %s DROP CONSTRAINT %s" % (
             compiler.preparer.format_table(constraint.table),
             compiler.preparer.format_constraint(constraint),
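The new `render_ddl_sql_expr()` exists because MySQL insists that a functional key part be parenthesized on its own. A hedged sketch of the effect, assuming SQLAlchemy 1.4+ with its bundled MySQL dialect (table and index names are illustrative):

```python
from sqlalchemy import Column, Index, Integer, MetaData, String, Table, func
from sqlalchemy.dialects import mysql
from sqlalchemy.schema import CreateIndex

m = MetaData()
t = Table("person", m, Column("id", Integer), Column("name", String(50)))
ix = Index("ix_lower_name", func.lower(t.c.name))

# On MySQL the functional expression must render wrapped in its own
# parentheses, i.e. CREATE INDEX ix_lower_name ON person ((lower(name)))
print(CreateIndex(ix).compile(dialect=mysql.dialect()))
```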

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
 from __future__ import annotations

 import re
@@ -5,7 +8,6 @@ from typing import Any
 from typing import Optional
 from typing import TYPE_CHECKING

-from sqlalchemy.ext.compiler import compiles
 from sqlalchemy.sql import sqltypes

 from .base import AddColumn
@@ -22,6 +24,7 @@ from .base import format_type
 from .base import IdentityColumnDefault
 from .base import RenameTable
 from .impl import DefaultImpl
+from ..util.sqla_compat import compiles

 if TYPE_CHECKING:
     from sqlalchemy.dialects.oracle.base import OracleDDLCompiler
@@ -138,9 +141,11 @@ def visit_column_default(
     return "%s %s %s" % (
         alter_table(compiler, element.table_name, element.schema),
         alter_column(compiler, element.column_name),
-        "DEFAULT %s" % format_server_default(compiler, element.default)
-        if element.default is not None
-        else "DEFAULT NULL",
+        (
+            "DEFAULT %s" % format_server_default(compiler, element.default)
+            if element.default is not None
+            else "DEFAULT NULL"
+        ),
     )
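The Oracle branch preserves one dialect quirk worth calling out: there is no `DROP DEFAULT`, so clearing a server default renders as `DEFAULT NULL`. A one-function sketch of the conditional above:

```python
from typing import Optional

def oracle_default_clause(default: Optional[str]) -> str:
    # mirrors the conditional in visit_column_default() above
    return "DEFAULT %s" % default if default is not None else "DEFAULT NULL"

assert oracle_default_clause("'abc'") == "DEFAULT 'abc'"
assert oracle_default_clause(None) == "DEFAULT NULL"
```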

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
 from __future__ import annotations

 import logging
@@ -13,18 +16,19 @@ from typing import TYPE_CHECKING
 from typing import Union

 from sqlalchemy import Column
+from sqlalchemy import Float
+from sqlalchemy import Identity
 from sqlalchemy import literal_column
 from sqlalchemy import Numeric
+from sqlalchemy import select
 from sqlalchemy import text
 from sqlalchemy import types as sqltypes
 from sqlalchemy.dialects.postgresql import BIGINT
 from sqlalchemy.dialects.postgresql import ExcludeConstraint
 from sqlalchemy.dialects.postgresql import INTEGER
 from sqlalchemy.schema import CreateIndex
-from sqlalchemy.sql import operators
 from sqlalchemy.sql.elements import ColumnClause
 from sqlalchemy.sql.elements import TextClause
-from sqlalchemy.sql.elements import UnaryExpression
 from sqlalchemy.sql.functions import FunctionElement
 from sqlalchemy.types import NULLTYPE
@@ -32,12 +36,12 @@ from .base import alter_column
 from .base import alter_table
 from .base import AlterColumn
 from .base import ColumnComment
-from .base import compiles
 from .base import format_column_name
 from .base import format_table_name
 from .base import format_type
 from .base import IdentityColumnDefault
 from .base import RenameTable
+from .impl import ComparisonResult
 from .impl import DefaultImpl
 from .. import util
 from ..autogenerate import render
@@ -46,6 +50,8 @@ from ..operations import schemaobj
 from ..operations.base import BatchOperations
 from ..operations.base import Operations
 from ..util import sqla_compat
+from ..util.sqla_compat import compiles

 if TYPE_CHECKING:
     from typing import Literal
@@ -130,25 +136,28 @@ class PostgresqlImpl(DefaultImpl):
             metadata_default = metadata_column.server_default.arg

             if isinstance(metadata_default, str):
-                if not isinstance(inspector_column.type, Numeric):
+                if not isinstance(inspector_column.type, (Numeric, Float)):
                     metadata_default = re.sub(r"^'|'$", "", metadata_default)
                     metadata_default = f"'{metadata_default}'"

                 metadata_default = literal_column(metadata_default)

             # run a real compare against the server
-            return not self.connection.scalar(
-                sqla_compat._select(
-                    literal_column(conn_col_default) == metadata_default
-                )
+            conn = self.connection
+            assert conn is not None
+            return not conn.scalar(
+                select(literal_column(conn_col_default) == metadata_default)
             )

-    def alter_column(  # type:ignore[override]
+    def alter_column(
         self,
         table_name: str,
         column_name: str,
+        *,
         nullable: Optional[bool] = None,
-        server_default: Union[_ServerDefault, Literal[False]] = False,
+        server_default: Optional[
+            Union[_ServerDefault, Literal[False]]
+        ] = False,
         name: Optional[str] = None,
         type_: Optional[TypeEngine] = None,
         schema: Optional[str] = None,
@@ -214,7 +223,8 @@ class PostgresqlImpl(DefaultImpl):
                 "join pg_class t on t.oid=d.refobjid "
                 "join pg_attribute a on a.attrelid=t.oid and "
                 "a.attnum=d.refobjsubid "
-                "where c.relkind='S' and c.relname=:seqname"
+                "where c.relkind='S' and "
+                "c.oid=cast(:seqname as regclass)"
             ),
             seqname=seq_match.group(1),
         ).first()
@@ -252,62 +262,60 @@ class PostgresqlImpl(DefaultImpl):
         if not sqla_compat.sqla_2:
             self._skip_functional_indexes(metadata_indexes, conn_indexes)

-    def _cleanup_index_expr(
-        self, index: Index, expr: str, remove_suffix: str
-    ) -> str:
-        # start = expr
+    # pg behavior regarding modifiers
+    # | # | compiled sql     | returned sql     | regexp. group is removed |
+    # | - | ---------------- | -----------------| ------------------------ |
+    # | 1 | nulls first      | nulls first      | -                        |
+    # | 2 | nulls last       |                  | (?<! desc)( nulls last)$ |
+    # | 3 | asc              |                  | ( asc)$                  |
+    # | 4 | asc nulls first  | nulls first      | ( asc) nulls first$      |
+    # | 5 | asc nulls last   |                  | ( asc nulls last)$       |
+    # | 6 | desc             | desc             | -                        |
+    # | 7 | desc nulls first | desc             | desc( nulls first)$      |
+    # | 8 | desc nulls last  | desc nulls last  | -                        |
+    _default_modifiers_re = (  # order of case 2 and 5 matters
+        re.compile("( asc nulls last)$"),  # case 5
+        re.compile("(?<! desc)( nulls last)$"),  # case 2
+        re.compile("( asc)$"),  # case 3
+        re.compile("( asc) nulls first$"),  # case 4
+        re.compile(" desc( nulls first)$"),  # case 7
+    )
+
+    def _cleanup_index_expr(self, index: Index, expr: str) -> str:
         expr = expr.lower().replace('"', "").replace("'", "")
         if index.table is not None:
             # should not be needed, since include_table=False is in compile
             expr = expr.replace(f"{index.table.name.lower()}.", "")

+        while expr and expr[0] == "(" and expr[-1] == ")":
+            expr = expr[1:-1]
+
         if "::" in expr:
             # strip :: cast. types can have spaces in them
             expr = re.sub(r"(::[\w ]+\w)", "", expr)

-        if remove_suffix and expr.endswith(remove_suffix):
-            expr = expr[: -len(remove_suffix)]
+        while expr and expr[0] == "(" and expr[-1] == ")":
+            expr = expr[1:-1]

-        # print(f"START: {start} END: {expr}")
+        # NOTE: when parsing the connection expression this cleanup could
+        # be skipped
+        for rs in self._default_modifiers_re:
+            if match := rs.search(expr):
+                start, end = match.span(1)
+                expr = expr[:start] + expr[end:]
+                break
+
+        while expr and expr[0] == "(" and expr[-1] == ")":
+            expr = expr[1:-1]
+        # strip casts
+        cast_re = re.compile(r"cast\s*\(")
+        if cast_re.match(expr):
+            expr = cast_re.sub("", expr)
+            # remove the as type
+            expr = re.sub(r"as\s+[^)]+\)", "", expr)
+        # remove spaces
+        expr = expr.replace(" ", "")
         return expr

-    def _default_modifiers(self, exp: ClauseElement) -> str:
-        to_remove = ""
-        while isinstance(exp, UnaryExpression):
-            if exp.modifier is None:
-                exp = exp.element
-            else:
-                op = exp.modifier
-                if isinstance(exp.element, UnaryExpression):
-                    inner_op = exp.element.modifier
-                else:
-                    inner_op = None
-                if inner_op is None:
-                    if op == operators.asc_op:
-                        # default is asc
-                        to_remove = " asc"
-                    elif op == operators.nullslast_op:
-                        # default is nulls last
-                        to_remove = " nulls last"
-                else:
-                    if (
-                        inner_op == operators.asc_op
-                        and op == operators.nullslast_op
-                    ):
-                        # default is asc nulls last
-                        to_remove = " asc nulls last"
-                    elif (
-                        inner_op == operators.desc_op
-                        and op == operators.nullsfirst_op
-                    ):
-                        # default for desc is nulls first
-                        to_remove = " nulls first"
-                break
-        return to_remove
-
-    def _dialect_sig(
+    def _dialect_options(
         self, item: Union[Index, UniqueConstraint]
     ) -> Tuple[Any, ...]:
         # only the positive case is returned by sqlalchemy reflection so
@@ -316,25 +324,93 @@ class PostgresqlImpl(DefaultImpl):
             return ("nulls_not_distinct",)
         return ()

-    def create_index_sig(self, index: Index) -> Tuple[Any, ...]:
-        return tuple(
-            self._cleanup_index_expr(
-                index,
-                *(
-                    (e, "")
-                    if isinstance(e, str)
-                    else (self._compile_element(e), self._default_modifiers(e))
-                ),
-            )
-            for e in index.expressions
-        ) + self._dialect_sig(index)
+    def compare_indexes(
+        self,
+        metadata_index: Index,
+        reflected_index: Index,
+    ) -> ComparisonResult:
+        msg = []
+        unique_msg = self._compare_index_unique(
+            metadata_index, reflected_index
+        )
+        if unique_msg:
+            msg.append(unique_msg)
+        m_exprs = metadata_index.expressions
+        r_exprs = reflected_index.expressions
+        if len(m_exprs) != len(r_exprs):
+            msg.append(f"expression number {len(r_exprs)} to {len(m_exprs)}")
+        if msg:
+            # no point going further, return early
+            return ComparisonResult.Different(msg)
+        skip = []
+        for pos, (m_e, r_e) in enumerate(zip(m_exprs, r_exprs), 1):
+            m_compile = self._compile_element(m_e)
+            m_text = self._cleanup_index_expr(metadata_index, m_compile)
+            # print(f"META ORIG: {m_compile!r} CLEANUP: {m_text!r}")
+            r_compile = self._compile_element(r_e)
+            r_text = self._cleanup_index_expr(metadata_index, r_compile)
+            # print(f"CONN ORIG: {r_compile!r} CLEANUP: {r_text!r}")
+            if m_text == r_text:
+                continue  # expressions these are equal
+            elif m_compile.strip().endswith("_ops") and (
+                " " in m_compile or ")" in m_compile  # is an expression
+            ):
+                skip.append(
+                    f"expression #{pos} {m_compile!r} detected "
+                    "as including operator clause."
+                )
+                util.warn(
+                    f"Expression #{pos} {m_compile!r} in index "
+                    f"{reflected_index.name!r} detected to include "
+                    "an operator clause. Expression compare cannot proceed. "
+                    "Please move the operator clause to the "
+                    "``postgresql_ops`` dict to enable proper compare "
+                    "of the index expressions: "
+                    "https://docs.sqlalchemy.org/en/latest/dialects/postgresql.html#operator-classes",  # noqa: E501
+                )
+            else:
+                msg.append(f"expression #{pos} {r_compile!r} to {m_compile!r}")

-    def create_unique_constraint_sig(
-        self, const: UniqueConstraint
-    ) -> Tuple[Any, ...]:
-        return tuple(
-            sorted([col.name for col in const.columns])
-        ) + self._dialect_sig(const)
+        m_options = self._dialect_options(metadata_index)
+        r_options = self._dialect_options(reflected_index)
+        if m_options != r_options:
+            msg.extend(f"options {r_options} to {m_options}")
+
+        if msg:
+            return ComparisonResult.Different(msg)
+        elif skip:
+            # if there are other changes detected don't skip the index
+            return ComparisonResult.Skip(skip)
+        else:
+            return ComparisonResult.Equal()
+
+    def compare_unique_constraint(
+        self,
+        metadata_constraint: UniqueConstraint,
+        reflected_constraint: UniqueConstraint,
+    ) -> ComparisonResult:
+        metadata_tup = self._create_metadata_constraint_sig(
+            metadata_constraint
+        )
+        reflected_tup = self._create_reflected_constraint_sig(
+            reflected_constraint
+        )
+
+        meta_sig = metadata_tup.unnamed
+        conn_sig = reflected_tup.unnamed
+        if conn_sig != meta_sig:
+            return ComparisonResult.Different(
+                f"expression {conn_sig} to {meta_sig}"
+            )
+
+        metadata_do = self._dialect_options(metadata_tup.const)
+        conn_do = self._dialect_options(reflected_tup.const)
+        if metadata_do != conn_do:
+            return ComparisonResult.Different(
+                f"expression {conn_do} to {metadata_do}"
+            )
+
+        return ComparisonResult.Equal()

     def adjust_reflected_dialect_options(
         self, reflected_options: Dict[str, Any], kind: str
@@ -345,7 +421,9 @@ class PostgresqlImpl(DefaultImpl):
         options.pop("postgresql_include", None)
         return options

-    def _compile_element(self, element: ClauseElement) -> str:
+    def _compile_element(self, element: Union[ClauseElement, str]) -> str:
+        if isinstance(element, str):
+            return element
         return element.compile(
             dialect=self.dialect,
             compile_kwargs={"literal_binds": True, "include_table": False},
@@ -512,7 +590,7 @@
         )
     else:
         text += "SET %s " % compiler.get_identity_options(
-            sqla_compat.Identity(**{attr: getattr(identity, attr)})
+            Identity(**{attr: getattr(identity, attr)})
         )
     return text
@@ -556,9 +634,8 @@ class CreateExcludeConstraintOp(ops.AddConstraintOp):
         return cls(
             constraint.name,
             constraint_table.name,
-            [
-                (expr, op)
-                for expr, name, op in constraint._render_exprs  # type:ignore[attr-defined] # noqa
-            ],
+            [  # type: ignore
+                (expr, op) for expr, name, op in constraint._render_exprs
+            ],
             where=cast("ColumnElement[bool] | None", constraint.where),
             schema=constraint_table.schema,
@@ -585,7 +662,7 @@ class CreateExcludeConstraintOp(ops.AddConstraintOp):
             expr,
             name,
             oper,
-        ) in excl._render_exprs:  # type:ignore[attr-defined]
+        ) in excl._render_exprs:
             t.append_column(Column(name, NULLTYPE))
         t.append_constraint(excl)
         return excl
@@ -643,7 +720,7 @@ class CreateExcludeConstraintOp(ops.AddConstraintOp):
         constraint_name: str,
         *elements: Any,
         **kw: Any,
-    ):
+    ) -> Optional[Table]:
         """Issue a "create exclude constraint" instruction using the
         current batch migration context.
@@ -715,10 +792,13 @@ def _exclude_constraint(
     args = [
         "(%s, %r)"
         % (
-            _render_potential_column(sqltext, autogen_context),
+            _render_potential_column(
+                sqltext,  # type:ignore[arg-type]
+                autogen_context,
+            ),
             opstring,
         )
-        for sqltext, name, opstring in constraint._render_exprs  # type:ignore[attr-defined] # noqa
+        for sqltext, name, opstring in constraint._render_exprs
     ]
     if constraint.where is not None:
         args.append(
@@ -770,5 +850,5 @@ def _render_potential_column(
     return render._render_potential_expr(
         value,
         autogen_context,
-        wrap_in_text=isinstance(value, (TextClause, FunctionElement)),
+        wrap_in_element=isinstance(value, (TextClause, FunctionElement)),
     )
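The `_default_modifiers_re` table above is the core of the new expression compare: PostgreSQL reflects index expressions with default ordering modifiers omitted, so redundant `asc` / `nulls last` suffixes must be stripped from the metadata side before comparing. A standalone sketch of that stripping step (the helper name is illustrative):

```python
import re

# Re-creation of the modifier-stripping pass from the diff above.
_default_modifiers_re = (
    re.compile("( asc nulls last)$"),        # case 5
    re.compile("(?<! desc)( nulls last)$"),  # case 2
    re.compile("( asc)$"),                   # case 3
    re.compile("( asc) nulls first$"),       # case 4
    re.compile(" desc( nulls first)$"),      # case 7
)

def strip_default_modifiers(expr: str) -> str:
    # remove only the matched group, i.e. the part that is a default
    for rs in _default_modifiers_re:
        if (match := rs.search(expr)) is not None:
            start, end = match.span(1)
            return expr[:start] + expr[end:]
    return expr

assert strip_default_modifiers("name asc nulls last") == "name"
assert strip_default_modifiers("name desc nulls first") == "name desc"
assert strip_default_modifiers("name desc nulls last") == "name desc nulls last"
```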

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
 from __future__ import annotations

 import re
@@ -8,16 +11,19 @@ from typing import TYPE_CHECKING
 from typing import Union

 from sqlalchemy import cast
+from sqlalchemy import Computed
 from sqlalchemy import JSON
 from sqlalchemy import schema
 from sqlalchemy import sql
-from sqlalchemy.ext.compiler import compiles

 from .base import alter_table
+from .base import ColumnName
+from .base import format_column_name
 from .base import format_table_name
 from .base import RenameTable
 from .impl import DefaultImpl
 from .. import util
+from ..util.sqla_compat import compiles

 if TYPE_CHECKING:
     from sqlalchemy.engine.reflection import Inspector
@@ -59,7 +65,7 @@ class SQLiteImpl(DefaultImpl):
         ) and isinstance(col.server_default.arg, sql.ClauseElement):
             return True
         elif (
-            isinstance(col.server_default, util.sqla_compat.Computed)
+            isinstance(col.server_default, Computed)
             and col.server_default.persisted
         ):
             return True
@@ -71,13 +77,13 @@ class SQLiteImpl(DefaultImpl):
     def add_constraint(self, const: Constraint):
         # attempt to distinguish between an
         # auto-gen constraint and an explicit one
-        if const._create_rule is None:  # type:ignore[attr-defined]
+        if const._create_rule is None:
             raise NotImplementedError(
                 "No support for ALTER of constraints in SQLite dialect. "
                 "Please refer to the batch mode feature which allows for "
                 "SQLite migrations using a copy-and-move strategy."
             )
-        elif const._create_rule(self):  # type:ignore[attr-defined]
+        elif const._create_rule(self):
             util.warn(
                 "Skipping unsupported ALTER for "
                 "creation of implicit constraint. "
@@ -85,8 +91,8 @@ class SQLiteImpl(DefaultImpl):
                 "SQLite migrations using a copy-and-move strategy."
             )

-    def drop_constraint(self, const: Constraint):
-        if const._create_rule is None:  # type:ignore[attr-defined]
+    def drop_constraint(self, const: Constraint, **kw: Any):
+        if const._create_rule is None:
             raise NotImplementedError(
                 "No support for ALTER of constraints in SQLite dialect. "
                 "Please refer to the batch mode feature which allows for "
@@ -177,8 +183,7 @@ class SQLiteImpl(DefaultImpl):
         new_type: TypeEngine,
     ) -> None:
         if (
-            existing.type._type_affinity  # type:ignore[attr-defined]
-            is not new_type._type_affinity  # type:ignore[attr-defined]
+            existing.type._type_affinity is not new_type._type_affinity
             and not isinstance(new_type, JSON)
         ):
             existing_transfer["expr"] = cast(
@@ -205,6 +210,15 @@ def visit_rename_table(
     )


+@compiles(ColumnName, "sqlite")
+def visit_column_name(element: ColumnName, compiler: DDLCompiler, **kw) -> str:
+    return "%s RENAME COLUMN %s TO %s" % (
+        alter_table(compiler, element.table_name, element.schema),
+        format_column_name(compiler, element.column_name),
+        format_column_name(compiler, element.newname),
+    )
+
+
 # @compiles(AddColumn, 'sqlite')
 # def visit_add_column(element, compiler, **kw):
 #     return "%s %s" % (
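The new `visit_column_name()` hook gives SQLite a direct `RENAME COLUMN` statement (native in SQLite since 3.25) instead of a full table rebuild. The SQL shape it emits, sketched standalone without identifier quoting:

```python
def rename_column_sql(table: str, old: str, new: str) -> str:
    # same template as the compiles() hook above, minus quoting/schema
    return "ALTER TABLE %s RENAME COLUMN %s TO %s" % (table, old, new)

assert (
    rename_column_sql("person", "name", "full_name")
    == "ALTER TABLE person RENAME COLUMN name TO full_name"
)
```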

@@ -12,6 +12,7 @@ from typing import List
from typing import Literal from typing import Literal
from typing import Mapping from typing import Mapping
from typing import Optional from typing import Optional
from typing import overload
from typing import Sequence from typing import Sequence
from typing import Tuple from typing import Tuple
from typing import Type from typing import Type
@@ -26,7 +27,6 @@ if TYPE_CHECKING:
from sqlalchemy.sql.elements import conv from sqlalchemy.sql.elements import conv
from sqlalchemy.sql.elements import TextClause from sqlalchemy.sql.elements import TextClause
from sqlalchemy.sql.expression import TableClause from sqlalchemy.sql.expression import TableClause
from sqlalchemy.sql.functions import Function
from sqlalchemy.sql.schema import Column from sqlalchemy.sql.schema import Column
from sqlalchemy.sql.schema import Computed from sqlalchemy.sql.schema import Computed
from sqlalchemy.sql.schema import Identity from sqlalchemy.sql.schema import Identity
@@ -35,16 +35,36 @@ if TYPE_CHECKING:
from sqlalchemy.sql.type_api import TypeEngine from sqlalchemy.sql.type_api import TypeEngine
from sqlalchemy.util import immutabledict from sqlalchemy.util import immutabledict
from .operations.ops import BatchOperations from .operations.base import BatchOperations
from .operations.ops import AddColumnOp
from .operations.ops import AddConstraintOp
from .operations.ops import AlterColumnOp
from .operations.ops import AlterTableOp
from .operations.ops import BulkInsertOp
from .operations.ops import CreateIndexOp
from .operations.ops import CreateTableCommentOp
from .operations.ops import CreateTableOp
from .operations.ops import DropColumnOp
from .operations.ops import DropConstraintOp
from .operations.ops import DropIndexOp
from .operations.ops import DropTableCommentOp
from .operations.ops import DropTableOp
from .operations.ops import ExecuteSQLOp
from .operations.ops import MigrateOperation from .operations.ops import MigrateOperation
from .runtime.migration import MigrationContext from .runtime.migration import MigrationContext
from .util.sqla_compat import _literal_bindparam from .util.sqla_compat import _literal_bindparam
_T = TypeVar("_T") _T = TypeVar("_T")
_C = TypeVar("_C", bound=Callable[..., Any])
### end imports ### ### end imports ###
def add_column( def add_column(
table_name: str, column: Column[Any], *, schema: Optional[str] = None table_name: str,
column: Column[Any],
*,
schema: Optional[str] = None,
if_not_exists: Optional[bool] = None,
) -> None: ) -> None:
"""Issue an "add column" instruction using the current """Issue an "add column" instruction using the current
migration context. migration context.
@@ -121,6 +141,10 @@ def add_column(
quoting of the schema outside of the default behavior, use quoting of the schema outside of the default behavior, use
the SQLAlchemy construct the SQLAlchemy construct
:class:`~sqlalchemy.sql.elements.quoted_name`. :class:`~sqlalchemy.sql.elements.quoted_name`.
:param if_not_exists: If True, adds IF NOT EXISTS operator
when creating the new column for compatible dialects
.. versionadded:: 1.16.0
""" """
@@ -130,12 +154,14 @@ def alter_column(
*, *,
nullable: Optional[bool] = None, nullable: Optional[bool] = None,
comment: Union[str, Literal[False], None] = False, comment: Union[str, Literal[False], None] = False,
server_default: Any = False, server_default: Union[
str, bool, Identity, Computed, TextClause, None
] = False,
new_column_name: Optional[str] = None, new_column_name: Optional[str] = None,
type_: Union[TypeEngine, Type[TypeEngine], None] = None, type_: Union[TypeEngine[Any], Type[TypeEngine[Any]], None] = None,
existing_type: Union[TypeEngine, Type[TypeEngine], None] = None, existing_type: Union[TypeEngine[Any], Type[TypeEngine[Any]], None] = None,
existing_server_default: Union[ existing_server_default: Union[
str, bool, Identity, Computed, None str, bool, Identity, Computed, TextClause, None
] = False, ] = False,
existing_nullable: Optional[bool] = None, existing_nullable: Optional[bool] = None,
existing_comment: Optional[str] = None, existing_comment: Optional[str] = None,
@@ -230,7 +256,7 @@ def batch_alter_table(
table_name: str, table_name: str,
schema: Optional[str] = None, schema: Optional[str] = None,
recreate: Literal["auto", "always", "never"] = "auto", recreate: Literal["auto", "always", "never"] = "auto",
partial_reordering: Optional[tuple] = None, partial_reordering: Optional[Tuple[Any, ...]] = None,
copy_from: Optional[Table] = None, copy_from: Optional[Table] = None,
table_args: Tuple[Any, ...] = (), table_args: Tuple[Any, ...] = (),
table_kwargs: Mapping[str, Any] = immutabledict({}), table_kwargs: Mapping[str, Any] = immutabledict({}),
@@ -377,7 +403,7 @@ def batch_alter_table(
def bulk_insert( def bulk_insert(
table: Union[Table, TableClause], table: Union[Table, TableClause],
rows: List[dict], rows: List[Dict[str, Any]],
*, *,
multiinsert: bool = True, multiinsert: bool = True,
) -> None: ) -> None:
@@ -633,7 +659,7 @@ def create_foreign_key(
def create_index( def create_index(
index_name: Optional[str], index_name: Optional[str],
table_name: str, table_name: str,
columns: Sequence[Union[str, TextClause, Function[Any]]], columns: Sequence[Union[str, TextClause, ColumnElement[Any]]],
*, *,
schema: Optional[str] = None, schema: Optional[str] = None,
unique: bool = False, unique: bool = False,
@@ -730,7 +756,12 @@ def create_primary_key(
""" """
def create_table(table_name: str, *columns: SchemaItem, **kw: Any) -> Table: def create_table(
table_name: str,
*columns: SchemaItem,
if_not_exists: Optional[bool] = None,
**kw: Any,
) -> Table:
r"""Issue a "create table" instruction using the current migration r"""Issue a "create table" instruction using the current migration
context. context.
@@ -801,6 +832,10 @@ def create_table(table_name: str, *columns: SchemaItem, **kw: Any) -> Table:
quoting of the schema outside of the default behavior, use quoting of the schema outside of the default behavior, use
the SQLAlchemy construct the SQLAlchemy construct
:class:`~sqlalchemy.sql.elements.quoted_name`. :class:`~sqlalchemy.sql.elements.quoted_name`.
    :param if_not_exists: If True, adds IF NOT EXISTS operator when
     creating the new table.

     .. versionadded:: 1.13.3

    :param \**kw: Other keyword arguments are passed to the underlying
     :class:`sqlalchemy.schema.Table` object created for the command.
@@ -900,6 +935,11 @@ def drop_column(
     quoting of the schema outside of the default behavior, use
     the SQLAlchemy construct
     :class:`~sqlalchemy.sql.elements.quoted_name`.
    :param if_exists: If True, adds IF EXISTS operator when
     dropping the new column for compatible dialects

     .. versionadded:: 1.16.0

    :param mssql_drop_check: Optional boolean. When ``True``, on
     Microsoft SQL Server only, first
     drop the CHECK constraint on the column using a
@@ -921,7 +961,6 @@ def drop_column(
     then exec's a separate DROP CONSTRAINT for that default. Only
     works if the column has exactly one FK constraint which refers to
     it, at the moment.

    """

def drop_constraint(
@@ -930,6 +969,7 @@ def drop_constraint(
    type_: Optional[str] = None,
    *,
    schema: Optional[str] = None,
    if_exists: Optional[bool] = None,
) -> None:
    r"""Drop a constraint of the given name, typically via DROP CONSTRAINT.
@@ -941,6 +981,10 @@ def drop_constraint(
     quoting of the schema outside of the default behavior, use
     the SQLAlchemy construct
     :class:`~sqlalchemy.sql.elements.quoted_name`.
    :param if_exists: If True, adds IF EXISTS operator when
     dropping the constraint

     .. versionadded:: 1.16.0

    """
@@ -981,7 +1025,11 @@ def drop_index(
    """

def drop_table(
    table_name: str,
    *,
    schema: Optional[str] = None,
    if_exists: Optional[bool] = None,
    **kw: Any,
) -> None:
    r"""Issue a "drop table" instruction using the current
    migration context.
@@ -996,6 +1044,10 @@ def drop_table(
     quoting of the schema outside of the default behavior, use
     the SQLAlchemy construct
     :class:`~sqlalchemy.sql.elements.quoted_name`.
    :param if_exists: If True, adds IF EXISTS operator when
     dropping the table.

     .. versionadded:: 1.13.3

    :param \**kw: Other keyword arguments are passed to the underlying
     :class:`sqlalchemy.schema.Table` object created for the command.
@@ -1132,7 +1184,7 @@ def f(name: str) -> conv:
    names will be converted along conventions. If the ``target_metadata``
    contains the naming convention
    ``{"ck": "ck_bool_%(table_name)s_%(constraint_name)s"}``, then the
    output of the following::

        op.add_column("t", "x", Boolean(name="x"))
@@ -1162,7 +1214,7 @@ def get_context() -> MigrationContext:
    """

def implementation_for(op_cls: Any) -> Callable[[_C], _C]:
    """Register an implementation for a given :class:`.MigrateOperation`.

    This is part of the operation extensibility API.
@@ -1174,7 +1226,7 @@ def implementation_for(op_cls: Any) -> Callable[..., Any]:
    """

def inline_literal(
    value: Union[str, int], type_: Optional[TypeEngine[Any]] = None
) -> _literal_bindparam:
    r"""Produce an 'inline literal' expression, suitable for
    using in an INSERT, UPDATE, or DELETE statement.
@@ -1218,6 +1270,27 @@ def inline_literal(
    """

@overload
def invoke(operation: CreateTableOp) -> Table: ...
@overload
def invoke(
    operation: Union[
        AddConstraintOp,
        DropConstraintOp,
        CreateIndexOp,
        DropIndexOp,
        AddColumnOp,
        AlterColumnOp,
        AlterTableOp,
        CreateTableCommentOp,
        DropTableCommentOp,
        DropColumnOp,
        BulkInsertOp,
        DropTableOp,
        ExecuteSQLOp,
    ],
) -> None: ...
@overload
def invoke(operation: MigrateOperation) -> Any:
    """Given a :class:`.MigrateOperation`, invoke it in terms of
    this :class:`.Operations` instance.
@@ -1226,7 +1299,7 @@ def invoke(operation: MigrateOperation) -> Any:

def register_operation(
    name: str, sourcename: Optional[str] = None
) -> Callable[[Type[_T]], Type[_T]]:
    """Register a new operation for this class.

    This method is normally used to add new operations
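
Taken together, the stub changes above expose the new `if_not_exists` / `if_exists` switches on the module-level `op` API (documented here as added in 1.13.3 for tables and 1.16.0 for columns and constraints). A minimal sketch of a migration script that uses them; the table and constraint names are made up:

```python
# sketch of an idempotent-leaning migration using the flags above
import sqlalchemy as sa
from alembic import op


def upgrade() -> None:
    # emits CREATE TABLE IF NOT EXISTS on dialects that support it
    op.create_table(
        "account",
        sa.Column("id", sa.Integer, primary_key=True),
        sa.Column("email", sa.String(255), nullable=False),
        if_not_exists=True,
    )


def downgrade() -> None:
    # emits DROP ... IF EXISTS, so re-running a partial downgrade is safe
    op.drop_constraint(
        "uq_account_email", "account", type_="unique", if_exists=True
    )
    op.drop_table("account", if_exists=True)
```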

View File

@@ -1,3 +1,5 @@
# mypy: allow-untyped-calls

from __future__ import annotations

from contextlib import contextmanager
@@ -10,7 +12,9 @@ from typing import Dict
from typing import Iterator
from typing import List  # noqa
from typing import Mapping
from typing import NoReturn
from typing import Optional
from typing import overload
from typing import Sequence  # noqa
from typing import Tuple
from typing import Type  # noqa
@@ -39,7 +43,6 @@ if TYPE_CHECKING:
    from sqlalchemy.sql.expression import ColumnElement
    from sqlalchemy.sql.expression import TableClause
    from sqlalchemy.sql.expression import TextClause
    from sqlalchemy.sql.schema import Column
    from sqlalchemy.sql.schema import Computed
    from sqlalchemy.sql.schema import Identity
@@ -47,12 +50,28 @@ if TYPE_CHECKING:
    from sqlalchemy.types import TypeEngine

    from .batch import BatchOperationsImpl
    from .ops import AddColumnOp
    from .ops import AddConstraintOp
    from .ops import AlterColumnOp
    from .ops import AlterTableOp
    from .ops import BulkInsertOp
    from .ops import CreateIndexOp
    from .ops import CreateTableCommentOp
    from .ops import CreateTableOp
    from .ops import DropColumnOp
    from .ops import DropConstraintOp
    from .ops import DropIndexOp
    from .ops import DropTableCommentOp
    from .ops import DropTableOp
    from .ops import ExecuteSQLOp
    from .ops import MigrateOperation
    from ..ddl import DefaultImpl
    from ..runtime.migration import MigrationContext

__all__ = ("Operations", "BatchOperations")

_T = TypeVar("_T")
_C = TypeVar("_C", bound=Callable[..., Any])


class AbstractOperations(util.ModuleClsProxy):
    """Base class for Operations and BatchOperations.
@@ -86,7 +105,7 @@ class AbstractOperations(util.ModuleClsProxy):
    @classmethod
    def register_operation(
        cls, name: str, sourcename: Optional[str] = None
    ) -> Callable[[Type[_T]], Type[_T]]:
        """Register a new operation for this class.

        This method is normally used to add new operations
@@ -103,7 +122,7 @@ class AbstractOperations(util.ModuleClsProxy):
        """

        def register(op_cls: Type[_T]) -> Type[_T]:
            if sourcename is None:
                fn = getattr(op_cls, name)
                source_name = fn.__name__
@@ -122,8 +141,11 @@ class AbstractOperations(util.ModuleClsProxy):
                *spec, formatannotation=formatannotation_fwdref
            )

            num_defaults = len(spec[3]) if spec[3] else 0

            defaulted_vals: Tuple[Any, ...]

            if num_defaults:
                defaulted_vals = tuple(name_args[0 - num_defaults :])
            else:
                defaulted_vals = ()
@@ -164,7 +186,7 @@ class AbstractOperations(util.ModuleClsProxy):
            globals_ = dict(globals())
            globals_.update({"op_cls": op_cls})

            lcl: Dict[str, Any] = {}

            exec(func_text, globals_, lcl)
            setattr(cls, name, lcl[name])
@@ -180,7 +202,7 @@ class AbstractOperations(util.ModuleClsProxy):
        return register

    @classmethod
    def implementation_for(cls, op_cls: Any) -> Callable[[_C], _C]:
        """Register an implementation for a given :class:`.MigrateOperation`.

        This is part of the operation extensibility API.
@@ -191,7 +213,7 @@ class AbstractOperations(util.ModuleClsProxy):
        """

        def decorate(fn: _C) -> _C:
            cls._to_impl.dispatch_for(op_cls)(fn)
            return fn
@@ -213,7 +235,7 @@ class AbstractOperations(util.ModuleClsProxy):
        table_name: str,
        schema: Optional[str] = None,
        recreate: Literal["auto", "always", "never"] = "auto",
        partial_reordering: Optional[Tuple[Any, ...]] = None,
        copy_from: Optional[Table] = None,
        table_args: Tuple[Any, ...] = (),
        table_kwargs: Mapping[str, Any] = util.immutabledict(),
@@ -382,6 +404,32 @@ class AbstractOperations(util.ModuleClsProxy):
        return self.migration_context

    @overload
    def invoke(self, operation: CreateTableOp) -> Table: ...

    @overload
    def invoke(
        self,
        operation: Union[
            AddConstraintOp,
            DropConstraintOp,
            CreateIndexOp,
            DropIndexOp,
            AddColumnOp,
            AlterColumnOp,
            AlterTableOp,
            CreateTableCommentOp,
            DropTableCommentOp,
            DropColumnOp,
            BulkInsertOp,
            DropTableOp,
            ExecuteSQLOp,
        ],
    ) -> None: ...

    @overload
    def invoke(self, operation: MigrateOperation) -> Any: ...

    def invoke(self, operation: MigrateOperation) -> Any:
        """Given a :class:`.MigrateOperation`, invoke it in terms of
        this :class:`.Operations` instance.
@@ -416,7 +464,7 @@ class AbstractOperations(util.ModuleClsProxy):
        names will be converted along conventions. If the ``target_metadata``
        contains the naming convention
        ``{"ck": "ck_bool_%(table_name)s_%(constraint_name)s"}``, then the
        output of the following::

            op.add_column("t", "x", Boolean(name="x"))
@@ -570,6 +618,7 @@ class Operations(AbstractOperations):
        column: Column[Any],
        *,
        schema: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
    ) -> None:
        """Issue an "add column" instruction using the current
        migration context.
@@ -646,6 +695,10 @@ class Operations(AbstractOperations):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_not_exists: If True, adds IF NOT EXISTS operator
         when creating the new column for compatible dialects

         .. versionadded:: 1.16.0

        """  # noqa: E501
        ...
@@ -657,12 +710,16 @@ class Operations(AbstractOperations):
        *,
        nullable: Optional[bool] = None,
        comment: Union[str, Literal[False], None] = False,
        server_default: Union[
            str, bool, Identity, Computed, TextClause, None
        ] = False,
        new_column_name: Optional[str] = None,
        type_: Union[TypeEngine[Any], Type[TypeEngine[Any]], None] = None,
        existing_type: Union[
            TypeEngine[Any], Type[TypeEngine[Any]], None
        ] = None,
        existing_server_default: Union[
            str, bool, Identity, Computed, TextClause, None
        ] = False,
        existing_nullable: Optional[bool] = None,
        existing_comment: Optional[str] = None,
@@ -756,7 +813,7 @@ class Operations(AbstractOperations):
    def bulk_insert(
        self,
        table: Union[Table, TableClause],
        rows: List[Dict[str, Any]],
        *,
        multiinsert: bool = True,
    ) -> None:
@@ -1023,7 +1080,7 @@ class Operations(AbstractOperations):
        self,
        index_name: Optional[str],
        table_name: str,
        columns: Sequence[Union[str, TextClause, ColumnElement[Any]]],
        *,
        schema: Optional[str] = None,
        unique: bool = False,
@@ -1124,7 +1181,11 @@ class Operations(AbstractOperations):
        ...

    def create_table(
        self,
        table_name: str,
        *columns: SchemaItem,
        if_not_exists: Optional[bool] = None,
        **kw: Any,
    ) -> Table:
        r"""Issue a "create table" instruction using the current migration
        context.
@@ -1196,6 +1257,10 @@ class Operations(AbstractOperations):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_not_exists: If True, adds IF NOT EXISTS operator when
         creating the new table.

         .. versionadded:: 1.13.3

        :param \**kw: Other keyword arguments are passed to the underlying
         :class:`sqlalchemy.schema.Table` object created for the command.
@@ -1301,6 +1366,11 @@ class Operations(AbstractOperations):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the new column for compatible dialects

         .. versionadded:: 1.16.0

        :param mssql_drop_check: Optional boolean. When ``True``, on
         Microsoft SQL Server only, first
         drop the CHECK constraint on the column using a
@@ -1322,7 +1392,6 @@ class Operations(AbstractOperations):
         then exec's a separate DROP CONSTRAINT for that default. Only
         works if the column has exactly one FK constraint which refers to
         it, at the moment.

        """  # noqa: E501
        ...
@@ -1333,6 +1402,7 @@ class Operations(AbstractOperations):
        type_: Optional[str] = None,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
    ) -> None:
        r"""Drop a constraint of the given name, typically via DROP CONSTRAINT.
@@ -1344,6 +1414,10 @@ class Operations(AbstractOperations):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the constraint

         .. versionadded:: 1.16.0

        """  # noqa: E501
        ...
@@ -1387,7 +1461,12 @@ class Operations(AbstractOperations):
        ...

    def drop_table(
        self,
        table_name: str,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
        **kw: Any,
    ) -> None:
        r"""Issue a "drop table" instruction using the current
        migration context.
@@ -1402,6 +1481,10 @@ class Operations(AbstractOperations):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the table.

         .. versionadded:: 1.13.3

        :param \**kw: Other keyword arguments are passed to the underlying
         :class:`sqlalchemy.schema.Table` object created for the command.
@@ -1560,7 +1643,7 @@ class BatchOperations(AbstractOperations):
    impl: BatchOperationsImpl

    def _noop(self, operation: Any) -> NoReturn:
        raise NotImplementedError(
            "The %s method does not apply to a batch table alter operation."
            % operation
@@ -1577,6 +1660,7 @@ class BatchOperations(AbstractOperations):
        *,
        insert_before: Optional[str] = None,
        insert_after: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
    ) -> None:
        """Issue an "add column" instruction using the current
        batch migration context.
@@ -1596,8 +1680,10 @@ class BatchOperations(AbstractOperations):
        comment: Union[str, Literal[False], None] = False,
        server_default: Any = False,
        new_column_name: Optional[str] = None,
        type_: Union[TypeEngine[Any], Type[TypeEngine[Any]], None] = None,
        existing_type: Union[
            TypeEngine[Any], Type[TypeEngine[Any]], None
        ] = None,
        existing_server_default: Union[
            str, bool, Identity, Computed, None
        ] = False,
@@ -1652,7 +1738,7 @@ class BatchOperations(AbstractOperations):
    def create_exclude_constraint(
        self, constraint_name: str, *elements: Any, **kw: Any
    ) -> Optional[Table]:
        """Issue a "create exclude constraint" instruction using the
        current batch migration context.
@@ -1668,7 +1754,7 @@ class BatchOperations(AbstractOperations):
    def create_foreign_key(
        self,
        constraint_name: Optional[str],
        referent_table: str,
        local_cols: List[str],
        remote_cols: List[str],
@@ -1718,7 +1804,7 @@ class BatchOperations(AbstractOperations):
        ...

    def create_primary_key(
        self, constraint_name: Optional[str], columns: List[str]
    ) -> None:
        """Issue a "create primary key" instruction using the
        current batch migration context.
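
The tightened `register_operation` / `implementation_for` signatures above (`Callable[[Type[_T]], Type[_T]]` and `Callable[[_C], _C]`) now type these hooks as decorator factories. A sketch of the extensibility pattern they decorate, following the shape of the Alembic cookbook; the `GrantSelectOp` operation and its SQL are illustrative only, not part of this changeset:

```python
# hypothetical custom operation registered through the typed decorators above
from alembic.operations import MigrateOperation, Operations


@Operations.register_operation("grant_select")
class GrantSelectOp(MigrateOperation):
    def __init__(self, table_name: str, role: str) -> None:
        self.table_name = table_name
        self.role = role

    @classmethod
    def grant_select(cls, operations: Operations, table_name: str, role: str):
        # exposed as op.grant_select(...) inside migration scripts
        return operations.invoke(cls(table_name, role))


@Operations.implementation_for(GrantSelectOp)
def grant_select(operations: Operations, operation: GrantSelectOp) -> None:
    # the implementation receives the op instance and emits the DDL
    operations.execute(
        f"GRANT SELECT ON {operation.table_name} TO {operation.role}"
    )
```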

View File

@@ -1,3 +1,6 @@
# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
# mypy: no-warn-return-any, allow-any-generics

from __future__ import annotations

from typing import Any
@@ -15,9 +18,10 @@ from sqlalchemy import Index
from sqlalchemy import MetaData
from sqlalchemy import PrimaryKeyConstraint
from sqlalchemy import schema as sql_schema
from sqlalchemy import select
from sqlalchemy import Table
from sqlalchemy import types as sqltypes
from sqlalchemy.sql.schema import SchemaEventTarget
from sqlalchemy.util import OrderedDict
from sqlalchemy.util import topological
@@ -28,11 +32,9 @@ from ..util.sqla_compat import _copy_expression
from ..util.sqla_compat import _ensure_scope_for_ddl
from ..util.sqla_compat import _fk_is_self_referential
from ..util.sqla_compat import _idx_table_bound_expressions
from ..util.sqla_compat import _is_type_bound
from ..util.sqla_compat import _remove_column_from_collection
from ..util.sqla_compat import _resolve_for_variant
from ..util.sqla_compat import constraint_name_defined
from ..util.sqla_compat import constraint_name_string
@@ -374,7 +376,7 @@ class ApplyBatchImpl:
        for idx_existing in self.indexes.values():
            # this is a lift-and-move from Table.to_metadata

            if idx_existing._column_flag:
                continue

            idx_copy = Index(
@@ -403,9 +405,7 @@ class ApplyBatchImpl:
    def _setup_referent(
        self, metadata: MetaData, constraint: ForeignKeyConstraint
    ) -> None:
        spec = constraint.elements[0]._get_colspec()
        parts = spec.split(".")
        tname = parts[-2]
        if len(parts) == 3:
@@ -448,13 +448,15 @@ class ApplyBatchImpl:
        try:
            op_impl._exec(
                self.new_table.insert()
                .inline()
                .from_select(
                    list(
                        k
                        for k, transfer in self.column_transfers.items()
                        if "expr" in transfer
                    ),
                    select(
                        *[
                            transfer["expr"]
                            for transfer in self.column_transfers.values()
@@ -546,9 +548,7 @@ class ApplyBatchImpl:
            else:
                sql_schema.DefaultClause(
                    server_default  # type: ignore[arg-type]
                )._set_parent(existing)

        if autoincrement is not None:
            existing.autoincrement = bool(autoincrement)
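
The batch "move and copy" above now builds its data-copy step from public SQLAlchemy constructs (`Table.insert().inline().from_select(...)` and `select(...)`) instead of the `_insert_inline` / `_select` compat shims. A standalone sketch of the same INSERT .. FROM SELECT shape, with made-up table names:

```python
# minimal sketch of the insert-from-select used during a batch table rewrite
import sqlalchemy as sa

metadata = sa.MetaData()
old = sa.Table("t", metadata,
               sa.Column("id", sa.Integer), sa.Column("x", sa.Integer))
new = sa.Table("_alembic_tmp_t", metadata,
               sa.Column("id", sa.Integer), sa.Column("x", sa.Integer))

stmt = new.insert().inline().from_select(
    ["id", "x"],                   # target column names on the new table
    sa.select(old.c.id, old.c.x),  # source expressions from the old table
)
# str(stmt) renders roughly:
#   INSERT INTO _alembic_tmp_t (id, x) SELECT t.id, t.x FROM t
```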

View File

@@ -1,10 +1,13 @@
from __future__ import annotations

from abc import abstractmethod
import os
import pathlib
import re
from typing import Any
from typing import Callable
from typing import cast
from typing import Dict
from typing import FrozenSet
from typing import Iterator
from typing import List
@@ -15,6 +18,7 @@ from typing import Set
from typing import Tuple
from typing import Type
from typing import TYPE_CHECKING
from typing import TypeVar
from typing import Union

from sqlalchemy.types import NULLTYPE
@@ -33,7 +37,6 @@ if TYPE_CHECKING:
    from sqlalchemy.sql.elements import conv
    from sqlalchemy.sql.elements import quoted_name
    from sqlalchemy.sql.elements import TextClause
    from sqlalchemy.sql.schema import CheckConstraint
    from sqlalchemy.sql.schema import Column
    from sqlalchemy.sql.schema import Computed
@@ -53,6 +56,9 @@ if TYPE_CHECKING:
    from ..runtime.migration import MigrationContext
    from ..script.revision import _RevIdType

_T = TypeVar("_T", bound=Any)
_AC = TypeVar("_AC", bound="AddConstraintOp")


class MigrateOperation:
    """base class for migration command and organization objects.
@@ -70,7 +76,7 @@ class MigrateOperation:
    """

    @util.memoized_property
    def info(self) -> Dict[Any, Any]:
        """A dictionary that may be used to store arbitrary information
        along with this :class:`.MigrateOperation` object.
@@ -92,12 +98,14 @@ class AddConstraintOp(MigrateOperation):
    add_constraint_ops = util.Dispatcher()

    @property
    def constraint_type(self) -> str:
        raise NotImplementedError()

    @classmethod
    def register_add_constraint(
        cls, type_: str
    ) -> Callable[[Type[_AC]], Type[_AC]]:
        def go(klass: Type[_AC]) -> Type[_AC]:
            cls.add_constraint_ops.dispatch_for(type_)(klass.from_constraint)
            return klass
@@ -105,7 +113,7 @@ class AddConstraintOp(MigrateOperation):
    @classmethod
    def from_constraint(cls, constraint: Constraint) -> AddConstraintOp:
        return cls.add_constraint_ops.dispatch(constraint.__visit_name__)(  # type: ignore[no-any-return] # noqa: E501
            constraint
        )
@@ -134,12 +142,14 @@ class DropConstraintOp(MigrateOperation):
        type_: Optional[str] = None,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
        _reverse: Optional[AddConstraintOp] = None,
    ) -> None:
        self.constraint_name = constraint_name
        self.table_name = table_name
        self.constraint_type = type_
        self.schema = schema
        self.if_exists = if_exists
        self._reverse = _reverse

    def reverse(self) -> AddConstraintOp:
@@ -197,6 +207,7 @@ class DropConstraintOp(MigrateOperation):
        type_: Optional[str] = None,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
    ) -> None:
        r"""Drop a constraint of the given name, typically via DROP CONSTRAINT.
@@ -208,10 +219,20 @@ class DropConstraintOp(MigrateOperation):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the constraint

         .. versionadded:: 1.16.0

        """
        op = cls(
            constraint_name,
            table_name,
            type_=type_,
            schema=schema,
            if_exists=if_exists,
        )
        return operations.invoke(op)

    @classmethod
@@ -342,7 +363,7 @@ class CreatePrimaryKeyOp(AddConstraintOp):
    def batch_create_primary_key(
        cls,
        operations: BatchOperations,
        constraint_name: Optional[str],
        columns: List[str],
    ) -> None:
        """Issue a "create primary key" instruction using the
@@ -398,7 +419,7 @@ class CreateUniqueConstraintOp(AddConstraintOp):
        uq_constraint = cast("UniqueConstraint", constraint)

        kw: Dict[str, Any] = {}
        if uq_constraint.deferrable:
            kw["deferrable"] = uq_constraint.deferrable
        if uq_constraint.initially:
@@ -532,7 +553,7 @@ class CreateForeignKeyOp(AddConstraintOp):
    @classmethod
    def from_constraint(cls, constraint: Constraint) -> CreateForeignKeyOp:
        fk_constraint = cast("ForeignKeyConstraint", constraint)
        kw: Dict[str, Any] = {}
        if fk_constraint.onupdate:
            kw["onupdate"] = fk_constraint.onupdate
        if fk_constraint.ondelete:
@@ -674,7 +695,7 @@ class CreateForeignKeyOp(AddConstraintOp):
    def batch_create_foreign_key(
        cls,
        operations: BatchOperations,
        constraint_name: Optional[str],
        referent_table: str,
        local_cols: List[str],
        remote_cols: List[str],
@@ -897,9 +918,9 @@ class CreateIndexOp(MigrateOperation):
    def from_index(cls, index: Index) -> CreateIndexOp:
        assert index.table is not None
        return cls(
            index.name,
            index.table.name,
            index.expressions,
            schema=index.table.schema,
            unique=index.unique,
            **index.kwargs,
@@ -926,7 +947,7 @@ class CreateIndexOp(MigrateOperation):
        operations: Operations,
        index_name: Optional[str],
        table_name: str,
        columns: Sequence[Union[str, TextClause, ColumnElement[Any]]],
        *,
        schema: Optional[str] = None,
        unique: bool = False,
@@ -1054,6 +1075,7 @@ class DropIndexOp(MigrateOperation):
            table_name=index.table.name,
            schema=index.table.schema,
            _reverse=CreateIndexOp.from_index(index),
            unique=index.unique,
            **index.kwargs,
        )
@@ -1151,6 +1173,7 @@ class CreateTableOp(MigrateOperation):
        columns: Sequence[SchemaItem],
        *,
        schema: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
        _namespace_metadata: Optional[MetaData] = None,
        _constraints_included: bool = False,
        **kw: Any,
@@ -1158,6 +1181,7 @@ class CreateTableOp(MigrateOperation):
        self.table_name = table_name
        self.columns = columns
        self.schema = schema
        self.if_not_exists = if_not_exists
        self.info = kw.pop("info", {})
        self.comment = kw.pop("comment", None)
        self.prefixes = kw.pop("prefixes", None)
@@ -1182,7 +1206,7 @@ class CreateTableOp(MigrateOperation):
        return cls(
            table.name,
            list(table.c) + list(table.constraints),
            schema=table.schema,
            _namespace_metadata=_namespace_metadata,
            # given a Table() object, this Table will contain full Index()
@@ -1220,6 +1244,7 @@ class CreateTableOp(MigrateOperation):
        operations: Operations,
        table_name: str,
        *columns: SchemaItem,
        if_not_exists: Optional[bool] = None,
        **kw: Any,
    ) -> Table:
        r"""Issue a "create table" instruction using the current migration
@@ -1292,6 +1317,10 @@ class CreateTableOp(MigrateOperation):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_not_exists: If True, adds IF NOT EXISTS operator when
         creating the new table.

         .. versionadded:: 1.13.3

        :param \**kw: Other keyword arguments are passed to the underlying
         :class:`sqlalchemy.schema.Table` object created for the command.
@@ -1299,7 +1328,7 @@ class CreateTableOp(MigrateOperation):
        to the parameters given.

        """
        op = cls(table_name, columns, if_not_exists=if_not_exists, **kw)
        return operations.invoke(op)
@@ -1312,11 +1341,13 @@ class DropTableOp(MigrateOperation):
        table_name: str,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
        table_kw: Optional[MutableMapping[Any, Any]] = None,
        _reverse: Optional[CreateTableOp] = None,
    ) -> None:
        self.table_name = table_name
        self.schema = schema
        self.if_exists = if_exists
        self.table_kw = table_kw or {}
        self.comment = self.table_kw.pop("comment", None)
        self.info = self.table_kw.pop("info", None)
@@ -1363,9 +1394,9 @@ class DropTableOp(MigrateOperation):
            info=self.info.copy() if self.info else {},
            prefixes=list(self.prefixes) if self.prefixes else [],
            schema=self.schema,
            _constraints_included=(
                self._reverse._constraints_included if self._reverse else False
            ),
            **self.table_kw,
        )
        return t
@@ -1377,6 +1408,7 @@ class DropTableOp(MigrateOperation):
        table_name: str,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
        **kw: Any,
    ) -> None:
        r"""Issue a "drop table" instruction using the current
@@ -1392,11 +1424,15 @@ class DropTableOp(MigrateOperation):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the table.

         .. versionadded:: 1.13.3

        :param \**kw: Other keyword arguments are passed to the underlying
         :class:`sqlalchemy.schema.Table` object created for the command.

        """
        op = cls(table_name, schema=schema, if_exists=if_exists, table_kw=kw)
        operations.invoke(op)
@@ -1534,7 +1570,7 @@ class CreateTableCommentOp(AlterTableOp):
        )
        return operations.invoke(op)

    def reverse(self) -> Union[CreateTableCommentOp, DropTableCommentOp]:
        """Reverses the COMMENT ON operation against a table."""
        if self.existing_comment is None:
            return DropTableCommentOp(
@@ -1550,14 +1586,16 @@ class CreateTableCommentOp(AlterTableOp):
            schema=self.schema,
        )

    def to_table(
        self, migration_context: Optional[MigrationContext] = None
    ) -> Table:
        schema_obj = schemaobj.SchemaObjects(migration_context)

        return schema_obj.table(
            self.table_name, schema=self.schema, comment=self.comment
        )

    def to_diff_tuple(self) -> Tuple[Any, ...]:
        return ("add_table_comment", self.to_table(), self.existing_comment)
@@ -1629,18 +1667,20 @@ class DropTableCommentOp(AlterTableOp):
        )
        return operations.invoke(op)

    def reverse(self) -> CreateTableCommentOp:
        """Reverses the COMMENT ON operation against a table."""
        return CreateTableCommentOp(
            self.table_name, self.existing_comment, schema=self.schema
        )

    def to_table(
        self, migration_context: Optional[MigrationContext] = None
    ) -> Table:
        schema_obj = schemaobj.SchemaObjects(migration_context)

        return schema_obj.table(self.table_name, schema=self.schema)

    def to_diff_tuple(self) -> Tuple[Any, ...]:
        return ("remove_table_comment", self.to_table())
@@ -1815,12 +1855,16 @@ class AlterColumnOp(AlterTableOp):
        *,
        nullable: Optional[bool] = None,
        comment: Optional[Union[str, Literal[False]]] = False,
        server_default: Union[
            str, bool, Identity, Computed, TextClause, None
        ] = False,
        new_column_name: Optional[str] = None,
        type_: Optional[Union[TypeEngine[Any], Type[TypeEngine[Any]]]] = None,
        existing_type: Optional[
            Union[TypeEngine[Any], Type[TypeEngine[Any]]]
        ] = None,
        existing_server_default: Union[
            str, bool, Identity, Computed, TextClause, None
        ] = False,
        existing_nullable: Optional[bool] = None,
        existing_comment: Optional[str] = None,
@@ -1938,8 +1982,10 @@ class AlterColumnOp(AlterTableOp):
        comment: Optional[Union[str, Literal[False]]] = False,
        server_default: Any = False,
        new_column_name: Optional[str] = None,
        type_: Optional[Union[TypeEngine[Any], Type[TypeEngine[Any]]]] = None,
        existing_type: Optional[
            Union[TypeEngine[Any], Type[TypeEngine[Any]]]
        ] = None,
        existing_server_default: Optional[
            Union[str, bool, Identity, Computed]
        ] = False,
@@ -2003,27 +2049,31 @@ class AddColumnOp(AlterTableOp):
        column: Column[Any],
        *,
        schema: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
        **kw: Any,
    ) -> None:
        super().__init__(table_name, schema=schema)
        self.column = column
        self.if_not_exists = if_not_exists
        self.kw = kw

    def reverse(self) -> DropColumnOp:
        op = DropColumnOp.from_column_and_tablename(
            self.schema, self.table_name, self.column
        )
        op.if_exists = self.if_not_exists
        return op

    def to_diff_tuple(
        self,
    ) -> Tuple[str, Optional[str], str, Column[Any]]:
        return ("add_column", self.schema, self.table_name, self.column)

    def to_column(self) -> Column[Any]:
        return self.column

    @classmethod
    def from_column(cls, col: Column[Any]) -> AddColumnOp:
        return cls(col.table.name, col, schema=col.table.schema)

    @classmethod
@@ -2043,6 +2093,7 @@ class AddColumnOp(AlterTableOp):
        column: Column[Any],
        *,
        schema: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
    ) -> None:
        """Issue an "add column" instruction using the current
        migration context.
@@ -2119,10 +2170,19 @@ class AddColumnOp(AlterTableOp):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_not_exists: If True, adds IF NOT EXISTS operator
         when creating the new column for compatible dialects

         .. versionadded:: 1.16.0

        """

        op = cls(
            table_name,
            column,
            schema=schema,
            if_not_exists=if_not_exists,
        )
        return operations.invoke(op)

    @classmethod
@@ -2133,6 +2193,7 @@ class AddColumnOp(AlterTableOp):
        *,
        insert_before: Optional[str] = None,
        insert_after: Optional[str] = None,
        if_not_exists: Optional[bool] = None,
    ) -> None:
        """Issue an "add column" instruction using the current
        batch migration context.
@@ -2153,6 +2214,7 @@ class AddColumnOp(AlterTableOp):
            operations.impl.table_name,
            column,
            schema=operations.impl.schema,
            if_not_exists=if_not_exists,
            **kw,
        )
        return operations.invoke(op)
@@ -2169,12 +2231,14 @@ class DropColumnOp(AlterTableOp):
        column_name: str,
        *,
        schema: Optional[str] = None,
        if_exists: Optional[bool] = None,
        _reverse: Optional[AddColumnOp] = None,
        **kw: Any,
    ) -> None:
        super().__init__(table_name, schema=schema)
        self.column_name = column_name
        self.kw = kw
        self.if_exists = if_exists
        self._reverse = _reverse

    def to_diff_tuple(
@@ -2194,9 +2258,11 @@ class DropColumnOp(AlterTableOp):
                "original column is not present"
            )

        op = AddColumnOp.from_column_and_tablename(
            self.schema, self.table_name, self._reverse.column
        )
        op.if_not_exists = self.if_exists
        return op

    @classmethod
    def from_column_and_tablename(
@@ -2214,7 +2280,7 @@ class DropColumnOp(AlterTableOp):
    def to_column(
        self, migration_context: Optional[MigrationContext] = None
    ) -> Column[Any]:
        if self._reverse is not None:
            return self._reverse.column
        schema_obj = schemaobj.SchemaObjects(migration_context)
@@ -2243,6 +2309,11 @@ class DropColumnOp(AlterTableOp):
         quoting of the schema outside of the default behavior, use
         the SQLAlchemy construct
         :class:`~sqlalchemy.sql.elements.quoted_name`.
        :param if_exists: If True, adds IF EXISTS operator when
         dropping the new column for compatible dialects

         .. versionadded:: 1.16.0

        :param mssql_drop_check: Optional boolean. When ``True``, on
         Microsoft SQL Server only, first
         drop the CHECK constraint on the column using a
@@ -2264,7 +2335,6 @@ class DropColumnOp(AlterTableOp):
         then exec's a separate DROP CONSTRAINT for that default. Only
         works if the column has exactly one FK constraint which refers to
         it, at the moment.

        """
        op = cls(table_name, column_name, schema=schema, **kw)
@@ -2298,7 +2368,7 @@ class BulkInsertOp(MigrateOperation):
    def __init__(
        self,
        table: Union[Table, TableClause],
        rows: List[Dict[str, Any]],
        *,
        multiinsert: bool = True,
    ) -> None:
@@ -2311,7 +2381,7 @@ class BulkInsertOp(MigrateOperation):
        cls,
        operations: Operations,
        table: Union[Table, TableClause],
        rows: List[Dict[str, Any]],
        *,
        multiinsert: bool = True,
    ) -> None:
@@ -2607,7 +2677,7 @@ class UpgradeOps(OpContainer):
        self.upgrade_token = upgrade_token

    def reverse_into(self, downgrade_ops: DowngradeOps) -> DowngradeOps:
        downgrade_ops.ops[:] = list(
            reversed([op.reverse() for op in self.ops])
        )
        return downgrade_ops
@@ -2634,7 +2704,7 @@ class DowngradeOps(OpContainer):
        super().__init__(ops=ops)
        self.downgrade_token = downgrade_token

    def reverse(self) -> UpgradeOps:
        return UpgradeOps(
            ops=list(reversed([op.reverse() for op in self.ops]))
        )
@@ -2665,6 +2735,8 @@ class MigrationScript(MigrateOperation):
    """

    _needs_render: Optional[bool]
    _upgrade_ops: List[UpgradeOps]
    _downgrade_ops: List[DowngradeOps]

    def __init__(
        self,
@@ -2677,7 +2749,7 @@ class MigrationScript(MigrateOperation):
        head: Optional[str] = None,
        splice: Optional[bool] = None,
        branch_label: Optional[_RevIdType] = None,
        version_path: Union[str, os.PathLike[str], None] = None,
        depends_on: Optional[_RevIdType] = None,
    ) -> None:
        self.rev_id = rev_id
@@ -2686,13 +2758,15 @@ class MigrationScript(MigrateOperation):
        self.head = head
        self.splice = splice
        self.branch_label = branch_label
        self.version_path = (
            pathlib.Path(version_path).as_posix() if version_path else None
        )
        self.depends_on = depends_on
        self.upgrade_ops = upgrade_ops
        self.downgrade_ops = downgrade_ops

    @property
    def upgrade_ops(self) -> Optional[UpgradeOps]:
        """An instance of :class:`.UpgradeOps`.

        .. seealso::
@@ -2711,13 +2785,15 @@ class MigrationScript(MigrateOperation):
        return self._upgrade_ops[0]

    @upgrade_ops.setter
    def upgrade_ops(
        self, upgrade_ops: Union[UpgradeOps, List[UpgradeOps]]
    ) -> None:
        self._upgrade_ops = util.to_list(upgrade_ops)
        for elem in self._upgrade_ops:
            assert isinstance(elem, UpgradeOps)

    @property
    def downgrade_ops(self) -> Optional[DowngradeOps]:
        """An instance of :class:`.DowngradeOps`.

        .. seealso::
@@ -2736,7 +2812,9 @@ class MigrationScript(MigrateOperation):
        return self._downgrade_ops[0]

    @downgrade_ops.setter
    def downgrade_ops(
        self, downgrade_ops: Union[DowngradeOps, List[DowngradeOps]]
    ) -> None:
        self._downgrade_ops = util.to_list(downgrade_ops)
        for elem in self._downgrade_ops:
            assert isinstance(elem, DowngradeOps)
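
One consequence of the `reverse()` changes above is that the IF (NOT) EXISTS flag now survives the add/drop round trip, so autogenerated downgrades keep the guard. A quick illustration built only from the operations shown in this diff; the table and column names are arbitrary:

```python
# sketch: the flag is carried from AddColumnOp to its reversed DropColumnOp
import sqlalchemy as sa
from alembic.operations import ops

add = ops.AddColumnOp(
    "account",
    sa.Column("nickname", sa.String(50)),
    if_not_exists=True,
)
drop = add.reverse()             # DropColumnOp for the same column
assert drop.if_exists is True    # flag survives the round trip
```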

View File

@@ -1,3 +1,6 @@
# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
# mypy: no-warn-return-any, allow-any-generics

from __future__ import annotations

from typing import Any
@@ -220,10 +223,12 @@ class SchemaObjects:
        t = sa_schema.Table(name, m, *cols, **kw)

        constraints = [
            (
                sqla_compat._copy(elem, target_table=t)
                if getattr(elem, "parent", None) is not t
                and getattr(elem, "parent", None) is not None
                else elem
            )
            for elem in columns
            if isinstance(elem, (Constraint, Index))
        ]
@@ -274,10 +279,8 @@ class SchemaObjects:
        ForeignKey.

        """
        if isinstance(fk._colspec, str):
            table_key, cname = fk._colspec.rsplit(".", 1)
            sname, tname = self._parse_table_key(table_key)
            if table_key not in metadata.tables:
                rel_t = sa_schema.Table(tname, metadata, schema=sname)
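
The now-unannotated `rsplit` above splits a string ForeignKey colspec into its table key and column name; for a hypothetical three-part spec:

```python
# how the colspec split above behaves for a schema-qualified target
spec = "reporting.account.id"
table_key, cname = spec.rsplit(".", 1)  # ("reporting.account", "id")
```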

View File

@@ -1,3 +1,6 @@
# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
# mypy: no-warn-return-any, allow-any-generics

from typing import TYPE_CHECKING

from sqlalchemy import schema as sa_schema
@@ -76,8 +79,11 @@ def alter_column(

@Operations.implementation_for(ops.DropTableOp)
def drop_table(operations: "Operations", operation: "ops.DropTableOp") -> None:
    kw = {}
    if operation.if_exists is not None:
        kw["if_exists"] = operation.if_exists
    operations.impl.drop_table(
        operation.to_table(operations.migration_context), **kw
    )
@@ -87,7 +93,11 @@ def drop_column(
) -> None:
    column = operation.to_column(operations.migration_context)
    operations.impl.drop_column(
        operation.table_name,
        column,
        schema=operation.schema,
        if_exists=operation.if_exists,
        **operation.kw,
    )
@@ -98,9 +108,6 @@ def create_index(
    idx = operation.to_index(operations.migration_context)
    kw = {}
    if operation.if_not_exists is not None:
        kw["if_not_exists"] = operation.if_not_exists
    operations.impl.create_index(idx, **kw)
@@ -109,9 +116,6 @@ def create_index(
def drop_index(operations: "Operations", operation: "ops.DropIndexOp") -> None:
    kw = {}
    if operation.if_exists is not None:
        kw["if_exists"] = operation.if_exists
    operations.impl.drop_index(
@@ -124,8 +128,11 @@ def drop_index(operations: "Operations", operation: "ops.DropIndexOp") -> None:
def create_table(
    operations: "Operations", operation: "ops.CreateTableOp"
) -> "Table":
    kw = {}
    if operation.if_not_exists is not None:
        kw["if_not_exists"] = operation.if_not_exists
    table = operation.to_table(operations.migration_context)
    operations.impl.create_table(table, **kw)
    return table
@@ -165,7 +172,13 @@ def add_column(operations: "Operations", operation: "ops.AddColumnOp") -> None:
        column = _copy(column)

    t = operations.schema_obj.table(table_name, column, schema=schema)
    operations.impl.add_column(
        table_name,
        column,
        schema=schema,
        if_not_exists=operation.if_not_exists,
        **kw,
    )

    for constraint in t.constraints:
        if not isinstance(constraint, sa_schema.PrimaryKeyConstraint):
@@ -195,13 +208,19 @@ def create_constraint(
def drop_constraint(
    operations: "Operations", operation: "ops.DropConstraintOp"
) -> None:
    kw = {}
    if operation.if_exists is not None:
        if not sqla_2:
            raise NotImplementedError("SQLAlchemy 2.0 required")

        kw["if_exists"] = operation.if_exists
    operations.impl.drop_constraint(
        operations.schema_obj.generic_constraint(
            operation.constraint_name,
            operation.table_name,
            operation.constraint_type,
            schema=operation.schema,
        ),
        **kw,
    )
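
Note the tri-state pattern running through these handlers: the flag is only forwarded when the operation actually set it, so dialect implementations that do not accept the keyword are left untouched. The same shape in isolation, as a minimal sketch:

```python
# sketch of the conditional-kwargs pattern used by the handlers above
from typing import Any, Dict, Optional


def build_kw(if_exists: Optional[bool]) -> Dict[str, Any]:
    kw: Dict[str, Any] = {}
    if if_exists is not None:  # tri-state: None means "not requested"
        kw["if_exists"] = if_exists
    return kw


assert build_kw(None) == {}
assert build_kw(False) == {"if_exists": False}
```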

View File

@@ -3,13 +3,13 @@ from __future__ import annotations
 from typing import Any
 from typing import Callable
 from typing import Collection
-from typing import ContextManager
 from typing import Dict
 from typing import List
 from typing import Mapping
 from typing import MutableMapping
 from typing import Optional
 from typing import overload
+from typing import Sequence
 from typing import TextIO
 from typing import Tuple
 from typing import TYPE_CHECKING
@@ -17,6 +17,7 @@ from typing import Union
 from sqlalchemy.sql.schema import Column
 from sqlalchemy.sql.schema import FetchedValue
+from typing_extensions import ContextManager
 from typing_extensions import Literal

 from .migration import _ProxyTransaction
@@ -107,7 +108,6 @@ CompareType = Callable[

 class EnvironmentContext(util.ModuleClsProxy):
-
     """A configurational facade made available in an ``env.py`` script.

     The :class:`.EnvironmentContext` acts as a *facade* to the more
@@ -227,9 +227,9 @@ class EnvironmentContext(util.ModuleClsProxy):
         has been configured.

         """
-        return self.context_opts.get("as_sql", False)
+        return self.context_opts.get("as_sql", False)  # type: ignore[no-any-return]  # noqa: E501

-    def is_transactional_ddl(self):
+    def is_transactional_ddl(self) -> bool:
         """Return True if the context is configured to expect a
         transactional DDL capable backend.
@@ -341,18 +341,17 @@ class EnvironmentContext(util.ModuleClsProxy):
         return self.context_opts.get("tag", None)

     @overload
-    def get_x_argument(self, as_dictionary: Literal[False]) -> List[str]:
-        ...
+    def get_x_argument(self, as_dictionary: Literal[False]) -> List[str]: ...

     @overload
-    def get_x_argument(self, as_dictionary: Literal[True]) -> Dict[str, str]:
-        ...
+    def get_x_argument(
+        self, as_dictionary: Literal[True]
+    ) -> Dict[str, str]: ...

     @overload
     def get_x_argument(
         self, as_dictionary: bool = ...
-    ) -> Union[List[str], Dict[str, str]]:
-        ...
+    ) -> Union[List[str], Dict[str, str]]: ...

     def get_x_argument(
         self, as_dictionary: bool = False
@@ -366,7 +365,11 @@ class EnvironmentContext(util.ModuleClsProxy):
         The return value is a list, returned directly from the ``argparse``
         structure.  If ``as_dictionary=True`` is passed, the ``x`` arguments
         are parsed using ``key=value`` format into a dictionary that is
-        then returned.
+        then returned. If there is no ``=`` in the argument, value is an empty
+        string.
+
+        .. versionchanged:: 1.13.1 Support ``as_dictionary=True`` when
+           arguments are passed without the ``=`` symbol.

         For example, to support passing a database URL on the command line,
         the standard ``env.py`` script can be modified like this::
@@ -400,7 +403,12 @@ class EnvironmentContext(util.ModuleClsProxy):
         else:
             value = []
         if as_dictionary:
-            value = dict(arg.split("=", 1) for arg in value)
+            dict_value = {}
+            for arg in value:
+                x_key, _, x_value = arg.partition("=")
+                dict_value[x_key] = x_value
+            value = dict_value
         return value

     def configure(
@@ -416,7 +424,7 @@ class EnvironmentContext(util.ModuleClsProxy):
         tag: Optional[str] = None,
         template_args: Optional[Dict[str, Any]] = None,
         render_as_batch: bool = False,
-        target_metadata: Optional[MetaData] = None,
+        target_metadata: Union[MetaData, Sequence[MetaData], None] = None,
         include_name: Optional[IncludeNameFn] = None,
         include_object: Optional[IncludeObjectFn] = None,
         include_schemas: bool = False,
@@ -940,7 +948,7 @@ class EnvironmentContext(util.ModuleClsProxy):
     def execute(
         self,
         sql: Union[Executable, str],
-        execution_options: Optional[dict] = None,
+        execution_options: Optional[Dict[str, Any]] = None,
     ) -> None:
         """Execute the given SQL using the current change context.
@@ -968,7 +976,7 @@ class EnvironmentContext(util.ModuleClsProxy):
     def begin_transaction(
         self,
-    ) -> Union[_ProxyTransaction, ContextManager[None]]:
+    ) -> Union[_ProxyTransaction, ContextManager[None, Optional[bool]]]:
         """Return a context manager that will
         enclose an operation within a "transaction",
         as defined by the environment's offline
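The `get_x_argument` rewrite replaces `dict(arg.split("=", 1) ...)`, which raised on arguments lacking `=`, with `str.partition`, which maps them to empty strings. A small `env.py` fragment relying on that behaviour (the `db_url` and `verbose` keys are arbitrary examples, not part of the diff):

```python
# env.py fragment; runs inside an alembic invocation
from alembic import context

x_args = context.get_x_argument(as_dictionary=True)

# alembic -x db_url=postgresql://... upgrade head
db_url = x_args.get("db_url")

# alembic -x verbose upgrade head -> {"verbose": ""} instead of an error
if "verbose" in x_args:
    print("x arguments:", x_args)
```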

View File

@@ -1,3 +1,6 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations

 from contextlib import contextmanager
@@ -8,7 +11,6 @@ from typing import Any
 from typing import Callable
 from typing import cast
 from typing import Collection
-from typing import ContextManager
 from typing import Dict
 from typing import Iterable
 from typing import Iterator
@@ -21,13 +23,11 @@ from typing import Union
 from sqlalchemy import Column
 from sqlalchemy import literal_column
-from sqlalchemy import MetaData
-from sqlalchemy import PrimaryKeyConstraint
-from sqlalchemy import String
-from sqlalchemy import Table
+from sqlalchemy import select
 from sqlalchemy.engine import Engine
 from sqlalchemy.engine import url as sqla_url
 from sqlalchemy.engine.strategies import MockEngineStrategy
+from typing_extensions import ContextManager

 from .. import ddl
 from .. import util
@@ -83,7 +83,6 @@ class _ProxyTransaction:

 class MigrationContext:
-
     """Represent the database state made available to a migration
     script.
@@ -176,7 +175,11 @@ class MigrationContext:
                 opts["output_encoding"],
             )
         else:
-            self.output_buffer = opts.get("output_buffer", sys.stdout)
+            self.output_buffer = opts.get(
+                "output_buffer", sys.stdout
+            )  # type:ignore[assignment]  # noqa: E501
+
+        self.transactional_ddl = transactional_ddl
         self._user_compare_type = opts.get("compare_type", True)
         self._user_compare_server_default = opts.get(
@@ -188,18 +191,6 @@ class MigrationContext:
         self.version_table_schema = version_table_schema = opts.get(
             "version_table_schema", None
         )
-        self._version = Table(
-            version_table,
-            MetaData(),
-            Column("version_num", String(32), nullable=False),
-            schema=version_table_schema,
-        )
-        if opts.get("version_table_pk", True):
-            self._version.append_constraint(
-                PrimaryKeyConstraint(
-                    "version_num", name="%s_pkc" % version_table
-                )
-            )

         self._start_from_rev: Optional[str] = opts.get("starting_rev")
         self.impl = ddl.DefaultImpl.get_by_dialect(dialect)(
@@ -210,14 +201,23 @@ class MigrationContext:
             self.output_buffer,
             opts,
         )
+        self._version = self.impl.version_table_impl(
+            version_table=version_table,
+            version_table_schema=version_table_schema,
+            version_table_pk=opts.get("version_table_pk", True),
+        )

         log.info("Context impl %s.", self.impl.__class__.__name__)
         if self.as_sql:
             log.info("Generating static SQL")
         log.info(
             "Will assume %s DDL.",
-            "transactional"
-            if self.impl.transactional_ddl
-            else "non-transactional",
+            (
+                "transactional"
+                if self.impl.transactional_ddl
+                else "non-transactional"
+            ),
         )

     @classmethod
@@ -342,9 +342,9 @@ class MigrationContext:
             # except that it will not know it's in "autocommit" and will
             # emit deprecation warnings when an autocommit action takes
             # place.
-            self.connection = (
-                self.impl.connection
-            ) = base_connection.execution_options(isolation_level="AUTOCOMMIT")
+            self.connection = self.impl.connection = (
+                base_connection.execution_options(isolation_level="AUTOCOMMIT")
+            )

             # sqlalchemy future mode will "autobegin" in any case, so take
             # control of that "transaction" here
@@ -372,7 +372,7 @@ class MigrationContext:
     def begin_transaction(
         self, _per_migration: bool = False
-    ) -> Union[_ProxyTransaction, ContextManager[None]]:
+    ) -> Union[_ProxyTransaction, ContextManager[None, Optional[bool]]]:
         """Begin a logical transaction for migration operations.

         This method is used within an ``env.py`` script to demarcate where
@@ -521,7 +521,7 @@ class MigrationContext:
             start_from_rev = None
         elif start_from_rev is not None and self.script:
             start_from_rev = [
-                cast("Script", self.script.get_revision(sfr)).revision
+                self.script.get_revision(sfr).revision
                 for sfr in util.to_list(start_from_rev)
                 if sfr not in (None, "base")
             ]
@@ -536,7 +536,10 @@ class MigrationContext:
             return ()
         assert self.connection is not None
         return tuple(
-            row[0] for row in self.connection.execute(self._version.select())
+            row[0]
+            for row in self.connection.execute(
+                select(self._version.c.version_num)
+            )
         )

     def _ensure_version_table(self, purge: bool = False) -> None:
@@ -652,7 +655,7 @@ class MigrationContext:
     def execute(
         self,
         sql: Union[Executable, str],
-        execution_options: Optional[dict] = None,
+        execution_options: Optional[Dict[str, Any]] = None,
     ) -> None:
         """Execute a SQL construct or string statement.
@@ -1000,6 +1003,11 @@ class MigrationStep:
     is_upgrade: bool
     migration_fn: Any

+    if TYPE_CHECKING:
+
+        @property
+        def doc(self) -> Optional[str]: ...
+
     @property
     def name(self) -> str:
         return self.migration_fn.__name__
@@ -1048,13 +1056,9 @@ class RevisionStep(MigrationStep):
         self.revision = revision
         self.is_upgrade = is_upgrade
         if is_upgrade:
-            self.migration_fn = (
-                revision.module.upgrade  # type:ignore[attr-defined]
-            )
+            self.migration_fn = revision.module.upgrade
         else:
-            self.migration_fn = (
-                revision.module.downgrade  # type:ignore[attr-defined]
-            )
+            self.migration_fn = revision.module.downgrade

     def __repr__(self):
         return "RevisionStep(%r, is_upgrade=%r)" % (
@@ -1070,7 +1074,7 @@ class RevisionStep(MigrationStep):
         )

     @property
-    def doc(self) -> str:
+    def doc(self) -> Optional[str]:
         return self.revision.doc

     @property
@@ -1168,7 +1172,18 @@ class RevisionStep(MigrationStep):
             }
             return tuple(set(self.to_revisions).difference(ancestors))
         else:
-            return self.to_revisions
+            # for each revision we plan to return, compute its ancestors
+            # (excluding self), and remove those from the final output since
+            # they are already accounted for.
+            ancestors = {
+                r.revision
+                for to_revision in self.to_revisions
+                for r in self.revision_map._get_ancestor_nodes(
+                    self.revision_map.get_revisions(to_revision), check=False
+                )
+                if r.revision != to_revision
+            }
+            return tuple(set(self.to_revisions).difference(ancestors))

     def unmerge_branch_idents(
         self, heads: Set[str]
@@ -1283,7 +1298,7 @@ class StampStep(MigrationStep):
     def __eq__(self, other):
         return (
             isinstance(other, StampStep)
-            and other.from_revisions == self.revisions
+            and other.from_revisions == self.from_revisions
             and other.to_revisions == self.to_revisions
             and other.branch_move == self.branch_move
             and self.is_upgrade == other.is_upgrade
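`_current_heads()` now reads the version table through a Core `select()` of the single `version_num` column rather than `Table.select()`. The same data is reachable through the public API; a minimal sketch, assuming a local SQLite file purely for illustration:

```python
from sqlalchemy import create_engine
from alembic.migration import MigrationContext

engine = create_engine("sqlite:///example.db")  # illustrative URL
with engine.connect() as conn:
    ctx = MigrationContext.configure(conn)
    # tuple of version_num values; empty if no alembic_version table exists
    print(ctx.get_current_heads())
```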

View File

@@ -3,6 +3,7 @@ from __future__ import annotations
 from contextlib import contextmanager
 import datetime
 import os
+from pathlib import Path
 import re
 import shutil
 import sys
@@ -11,7 +12,6 @@ from typing import Any
 from typing import cast
 from typing import Iterator
 from typing import List
-from typing import Mapping
 from typing import Optional
 from typing import Sequence
 from typing import Set
@@ -23,7 +23,9 @@ from . import revision
 from . import write_hooks
 from .. import util
 from ..runtime import migration
+from ..util import compat
 from ..util import not_none
+from ..util.pyfiles import _preserving_path_as_str

 if TYPE_CHECKING:
     from .revision import _GetRevArg
@@ -31,26 +33,28 @@ if TYPE_CHECKING:
     from .revision import Revision
     from ..config import Config
     from ..config import MessagingOptions
+    from ..config import PostWriteHookConfig
     from ..runtime.migration import RevisionStep
     from ..runtime.migration import StampStep

 try:
-    from dateutil import tz
+    if compat.py39:
+        from zoneinfo import ZoneInfo
+        from zoneinfo import ZoneInfoNotFoundError
+    else:
+        from backports.zoneinfo import ZoneInfo  # type: ignore[import-not-found,no-redef] # noqa: E501
+        from backports.zoneinfo import ZoneInfoNotFoundError  # type: ignore[no-redef] # noqa: E501
 except ImportError:
-    tz = None  # type: ignore[assignment]
+    ZoneInfo = None  # type: ignore[assignment, misc]

 _sourceless_rev_file = re.compile(r"(?!\.\#|__init__)(.*\.py)(c|o)?$")
 _only_source_rev_file = re.compile(r"(?!\.\#|__init__)(.*\.py)$")
 _legacy_rev = re.compile(r"([a-f0-9]+)\.py$")
 _slug_re = re.compile(r"\w+")
 _default_file_template = "%(rev)s_%(slug)s"
-_split_on_space_comma = re.compile(r", *|(?: +)")
-_split_on_space_comma_colon = re.compile(r", *|(?: +)|\:")


 class ScriptDirectory:
-
     """Provides operations upon an Alembic script directory.

     This object is useful to get information as to current revisions,
@@ -72,40 +76,55 @@ class ScriptDirectory:
     def __init__(
         self,
-        dir: str,  # noqa
+        dir: Union[str, os.PathLike[str]],  # noqa: A002
         file_template: str = _default_file_template,
         truncate_slug_length: Optional[int] = 40,
-        version_locations: Optional[List[str]] = None,
+        version_locations: Optional[
+            Sequence[Union[str, os.PathLike[str]]]
+        ] = None,
         sourceless: bool = False,
         output_encoding: str = "utf-8",
         timezone: Optional[str] = None,
-        hook_config: Optional[Mapping[str, str]] = None,
+        hooks: list[PostWriteHookConfig] = [],
        recursive_version_locations: bool = False,
         messaging_opts: MessagingOptions = cast(
             "MessagingOptions", util.EMPTY_DICT
         ),
     ) -> None:
-        self.dir = dir
+        self.dir = _preserving_path_as_str(dir)
+        self.version_locations = [
+            _preserving_path_as_str(p) for p in version_locations or ()
+        ]
         self.file_template = file_template
-        self.version_locations = version_locations
         self.truncate_slug_length = truncate_slug_length or 40
         self.sourceless = sourceless
         self.output_encoding = output_encoding
         self.revision_map = revision.RevisionMap(self._load_revisions)
         self.timezone = timezone
-        self.hook_config = hook_config
+        self.hooks = hooks
         self.recursive_version_locations = recursive_version_locations
         self.messaging_opts = messaging_opts

         if not os.access(dir, os.F_OK):
             raise util.CommandError(
-                "Path doesn't exist: %r.  Please use "
+                f"Path doesn't exist: {dir}.  Please use "
                 "the 'init' command to create a new "
-                "scripts folder." % os.path.abspath(dir)
+                "scripts folder."
             )

     @property
     def versions(self) -> str:
+        """return a single version location based on the sole path passed
+        within version_locations.
+
+        If multiple version locations are configured, an error is raised.
+        """
+        return str(self._singular_version_location)
+
+    @util.memoized_property
+    def _singular_version_location(self) -> Path:
         loc = self._version_locations
         if len(loc) > 1:
             raise util.CommandError("Multiple version_locations present")
@@ -113,40 +132,31 @@ class ScriptDirectory:
             return loc[0]

     @util.memoized_property
-    def _version_locations(self):
+    def _version_locations(self) -> Sequence[Path]:
         if self.version_locations:
             return [
-                os.path.abspath(util.coerce_resource_to_filename(location))
+                util.coerce_resource_to_filename(location).absolute()
                 for location in self.version_locations
             ]
         else:
-            return (os.path.abspath(os.path.join(self.dir, "versions")),)
+            return [Path(self.dir, "versions").absolute()]

     def _load_revisions(self) -> Iterator[Script]:
-        if self.version_locations:
-            paths = [
-                vers
-                for vers in self._version_locations
-                if os.path.exists(vers)
-            ]
-        else:
-            paths = [self.versions]
+        paths = [vers for vers in self._version_locations if vers.exists()]

         dupes = set()
         for vers in paths:
             for file_path in Script._list_py_dir(self, vers):
-                real_path = os.path.realpath(file_path)
+                real_path = file_path.resolve()
                 if real_path in dupes:
                     util.warn(
-                        "File %s loaded twice! ignoring. Please ensure "
-                        "version_locations is unique." % real_path
+                        f"File {real_path} loaded twice! ignoring. "
+                        "Please ensure version_locations is unique."
                     )
                     continue
                 dupes.add(real_path)

-                filename = os.path.basename(real_path)
-                dir_name = os.path.dirname(real_path)
-                script = Script._from_filename(self, dir_name, filename)
+                script = Script._from_path(self, real_path)
                 if script is None:
                     continue
                 yield script
@@ -160,74 +170,36 @@ class ScriptDirectory:
         present.

         """
-        script_location = config.get_main_option("script_location")
+        script_location = config.get_alembic_option("script_location")
         if script_location is None:
             raise util.CommandError(
-                "No 'script_location' key " "found in configuration."
+                "No 'script_location' key found in configuration."
             )
         truncate_slug_length: Optional[int]
-        tsl = config.get_main_option("truncate_slug_length")
+        tsl = config.get_alembic_option("truncate_slug_length")
         if tsl is not None:
             truncate_slug_length = int(tsl)
         else:
             truncate_slug_length = None

-        version_locations_str = config.get_main_option("version_locations")
-        version_locations: Optional[List[str]]
-        if version_locations_str:
-            version_path_separator = config.get_main_option(
-                "version_path_separator"
-            )
-
-            split_on_path = {
-                None: None,
-                "space": " ",
-                "os": os.pathsep,
-                ":": ":",
-                ";": ";",
-            }
-
-            try:
-                split_char: Optional[str] = split_on_path[
-                    version_path_separator
-                ]
-            except KeyError as ke:
-                raise ValueError(
-                    "'%s' is not a valid value for "
-                    "version_path_separator; "
-                    "expected 'space', 'os', ':', ';'" % version_path_separator
-                ) from ke
-            else:
-                if split_char is None:
-                    # legacy behaviour for backwards compatibility
-                    version_locations = _split_on_space_comma.split(
-                        version_locations_str
-                    )
-                else:
-                    version_locations = [
-                        x for x in version_locations_str.split(split_char) if x
-                    ]
-        else:
-            version_locations = None
-
-        prepend_sys_path = config.get_main_option("prepend_sys_path")
+        prepend_sys_path = config.get_prepend_sys_paths_list()
         if prepend_sys_path:
-            sys.path[:0] = list(
-                _split_on_space_comma_colon.split(prepend_sys_path)
-            )
+            sys.path[:0] = prepend_sys_path

-        rvl = config.get_main_option("recursive_version_locations") == "true"
+        rvl = config.get_alembic_boolean_option("recursive_version_locations")

         return ScriptDirectory(
             util.coerce_resource_to_filename(script_location),
-            file_template=config.get_main_option(
+            file_template=config.get_alembic_option(
                 "file_template", _default_file_template
             ),
             truncate_slug_length=truncate_slug_length,
-            sourceless=config.get_main_option("sourceless") == "true",
-            output_encoding=config.get_main_option("output_encoding", "utf-8"),
-            version_locations=version_locations,
-            timezone=config.get_main_option("timezone"),
-            hook_config=config.get_section("post_write_hooks", {}),
+            sourceless=config.get_alembic_boolean_option("sourceless"),
+            output_encoding=config.get_alembic_option(
+                "output_encoding", "utf-8"
+            ),
+            version_locations=config.get_version_locations_list(),
+            timezone=config.get_alembic_option("timezone"),
+            hooks=config.get_hooks_list(),
             recursive_version_locations=rvl,
             messaging_opts=config.messaging_opts,
         )
@@ -297,24 +269,22 @@ class ScriptDirectory:
         ):
             yield cast(Script, rev)

-    def get_revisions(self, id_: _GetRevArg) -> Tuple[Optional[Script], ...]:
+    def get_revisions(self, id_: _GetRevArg) -> Tuple[Script, ...]:
         """Return the :class:`.Script` instance with the given rev identifier,
         symbolic name, or sequence of identifiers.

         """
         with self._catch_revision_errors():
             return cast(
-                Tuple[Optional[Script], ...],
+                Tuple[Script, ...],
                 self.revision_map.get_revisions(id_),
             )

-    def get_all_current(self, id_: Tuple[str, ...]) -> Set[Optional[Script]]:
+    def get_all_current(self, id_: Tuple[str, ...]) -> Set[Script]:
         with self._catch_revision_errors():
-            return cast(
-                Set[Optional[Script]], self.revision_map._get_all_current(id_)
-            )
+            return cast(Set[Script], self.revision_map._get_all_current(id_))

-    def get_revision(self, id_: str) -> Optional[Script]:
+    def get_revision(self, id_: str) -> Script:
         """Return the :class:`.Script` instance with the given rev id.

         .. seealso::
@@ -324,7 +294,7 @@ class ScriptDirectory:
         """
         with self._catch_revision_errors():
-            return cast(Optional[Script], self.revision_map.get_revision(id_))
+            return cast(Script, self.revision_map.get_revision(id_))

     def as_revision_number(
         self, id_: Optional[str]
@@ -579,24 +549,37 @@ class ScriptDirectory:
         util.load_python_file(self.dir, "env.py")

     @property
-    def env_py_location(self):
-        return os.path.abspath(os.path.join(self.dir, "env.py"))
+    def env_py_location(self) -> str:
+        return str(Path(self.dir, "env.py"))

-    def _generate_template(self, src: str, dest: str, **kw: Any) -> None:
+    def _append_template(self, src: Path, dest: Path, **kw: Any) -> None:
         with util.status(
-            f"Generating {os.path.abspath(dest)}", **self.messaging_opts
+            f"Appending to existing {dest.absolute()}",
+            **self.messaging_opts,
+        ):
+            util.template_to_file(
+                src,
+                dest,
+                self.output_encoding,
+                append_with_newlines=True,
+                **kw,
+            )
+
+    def _generate_template(self, src: Path, dest: Path, **kw: Any) -> None:
+        with util.status(
+            f"Generating {dest.absolute()}", **self.messaging_opts
         ):
             util.template_to_file(src, dest, self.output_encoding, **kw)

-    def _copy_file(self, src: str, dest: str) -> None:
+    def _copy_file(self, src: Path, dest: Path) -> None:
         with util.status(
-            f"Generating {os.path.abspath(dest)}", **self.messaging_opts
+            f"Generating {dest.absolute()}", **self.messaging_opts
         ):
             shutil.copy(src, dest)

-    def _ensure_directory(self, path: str) -> None:
-        path = os.path.abspath(path)
-        if not os.path.exists(path):
+    def _ensure_directory(self, path: Path) -> None:
+        path = path.absolute()
+        if not path.exists():
             with util.status(
                 f"Creating directory {path}", **self.messaging_opts
             ):
@@ -604,25 +587,27 @@ class ScriptDirectory:

     def _generate_create_date(self) -> datetime.datetime:
         if self.timezone is not None:
-            if tz is None:
+            if ZoneInfo is None:
                 raise util.CommandError(
-                    "The library 'python-dateutil' is required "
-                    "for timezone support"
+                    "Python >= 3.9 is required for timezone support or "
+                    "the 'backports.zoneinfo' package must be installed."
                 )
             # First, assume correct capitalization
-            tzinfo = tz.gettz(self.timezone)
+            try:
+                tzinfo = ZoneInfo(self.timezone)
+            except ZoneInfoNotFoundError:
+                tzinfo = None
             if tzinfo is None:
-                # Fall back to uppercase
-                tzinfo = tz.gettz(self.timezone.upper())
-            if tzinfo is None:
-                raise util.CommandError(
-                    "Can't locate timezone: %s" % self.timezone
-                )
-            create_date = (
-                datetime.datetime.utcnow()
-                .replace(tzinfo=tz.tzutc())
-                .astimezone(tzinfo)
-            )
+                try:
+                    tzinfo = ZoneInfo(self.timezone.upper())
+                except ZoneInfoNotFoundError:
+                    raise util.CommandError(
+                        "Can't locate timezone: %s" % self.timezone
+                    ) from None
+
+            create_date = datetime.datetime.now(
+                tz=datetime.timezone.utc
+            ).astimezone(tzinfo)
         else:
             create_date = datetime.datetime.now()
         return create_date
@@ -634,7 +619,8 @@ class ScriptDirectory:
         head: Optional[_RevIdType] = None,
         splice: Optional[bool] = False,
         branch_labels: Optional[_RevIdType] = None,
-        version_path: Optional[str] = None,
+        version_path: Union[str, os.PathLike[str], None] = None,
+        file_template: Optional[str] = None,
         depends_on: Optional[_RevIdType] = None,
         **kw: Any,
     ) -> Optional[Script]:
@@ -675,7 +661,7 @@ class ScriptDirectory:
             self.revision_map.get_revisions(head),
         )
         for h in heads:
-            assert h != "base"
+            assert h != "base"  # type: ignore[comparison-overlap]

         if len(set(heads)) != len(heads):
             raise util.CommandError("Duplicate head revisions specified")
@@ -687,7 +673,7 @@ class ScriptDirectory:
             for head_ in heads:
                 if head_ is not None:
                     assert isinstance(head_, Script)
-                    version_path = os.path.dirname(head_.path)
+                    version_path = head_._script_path.parent
                     break
             else:
                 raise util.CommandError(
@@ -695,16 +681,19 @@ class ScriptDirectory:
                     "please specify --version-path"
                 )
             else:
-                version_path = self.versions
+                version_path = self._singular_version_location
+        else:
+            version_path = Path(version_path)

-        norm_path = os.path.normpath(os.path.abspath(version_path))
+        assert isinstance(version_path, Path)
+        norm_path = version_path.absolute()
         for vers_path in self._version_locations:
-            if os.path.normpath(vers_path) == norm_path:
+            if vers_path.absolute() == norm_path:
                 break
         else:
             raise util.CommandError(
-                "Path %s is not represented in current "
-                "version locations" % version_path
+                f"Path {version_path} is not represented in current "
+                "version locations"
             )

         if self.version_locations:
@@ -725,9 +714,11 @@ class ScriptDirectory:
         if depends_on:
             with self._catch_revision_errors():
                 resolved_depends_on = [
-                    dep
-                    if dep in rev.branch_labels  # maintain branch labels
-                    else rev.revision  # resolve partial revision identifiers
+                    (
+                        dep
+                        if dep in rev.branch_labels  # maintain branch labels
+                        else rev.revision
+                    )  # resolve partial revision identifiers
                     for rev, dep in [
                         (not_none(self.revision_map.get_revision(dep)), dep)
                         for dep in util.to_list(depends_on)
@@ -737,7 +728,7 @@ class ScriptDirectory:
             resolved_depends_on = None

         self._generate_template(
-            os.path.join(self.dir, "script.py.mako"),
+            Path(self.dir, "script.py.mako"),
             path,
             up_revision=str(revid),
             down_revision=revision.tuple_rev_as_scalar(
@@ -751,7 +742,7 @@ class ScriptDirectory:
             **kw,
         )

-        post_write_hooks = self.hook_config
+        post_write_hooks = self.hooks
         if post_write_hooks:
             write_hooks._run_hooks(path, post_write_hooks)
@@ -774,11 +765,11 @@ class ScriptDirectory:
     def _rev_path(
         self,
-        path: str,
+        path: Union[str, os.PathLike[str]],
         rev_id: str,
         message: Optional[str],
         create_date: datetime.datetime,
-    ) -> str:
+    ) -> Path:
         epoch = int(create_date.timestamp())
         slug = "_".join(_slug_re.findall(message or "")).lower()
         if len(slug) > self.truncate_slug_length:
@@ -797,11 +788,10 @@ class ScriptDirectory:
                 "second": create_date.second,
             }
         )
-        return os.path.join(path, filename)
+        return Path(path) / filename


 class Script(revision.Revision):
-
     """Represent a single revision file in a ``versions/`` directory.

     The :class:`.Script` instance is returned by methods
@@ -809,12 +799,17 @@ class Script(revision.Revision):

     """

-    def __init__(self, module: ModuleType, rev_id: str, path: str):
+    def __init__(
+        self,
+        module: ModuleType,
+        rev_id: str,
+        path: Union[str, os.PathLike[str]],
+    ):
         self.module = module
-        self.path = path
+        self.path = _preserving_path_as_str(path)
         super().__init__(
             rev_id,
-            module.down_revision,  # type: ignore[attr-defined]
+            module.down_revision,
             branch_labels=util.to_tuple(
                 getattr(module, "branch_labels", None), default=()
             ),
@@ -829,6 +824,10 @@ class Script(revision.Revision):
     path: str
     """Filesystem path of the script."""

+    @property
+    def _script_path(self) -> Path:
+        return Path(self.path)
+
     _db_current_indicator: Optional[bool] = None
     """Utility variable which when set will cause string output to indicate
     this is a "current" version in some database"""
@@ -847,9 +846,9 @@ class Script(revision.Revision):
         if doc:
             if hasattr(self.module, "_alembic_source_encoding"):
                 doc = doc.decode(  # type: ignore[attr-defined]
-                    self.module._alembic_source_encoding  # type: ignore[attr-defined] # noqa
+                    self.module._alembic_source_encoding
                 )
-            return doc.strip()  # type: ignore[union-attr]
+            return doc.strip()
         else:
             return ""
@@ -889,7 +888,7 @@ class Script(revision.Revision):
         )
         return entry

-    def __str__(self):
+    def __str__(self) -> str:
         return "%s -> %s%s%s%s, %s" % (
             self._format_down_revision(),
             self.revision,
@@ -923,9 +922,11 @@ class Script(revision.Revision):
         if head_indicators or tree_indicators:
             text += "%s%s%s" % (
                 " (head)" if self._is_real_head else "",
-                " (effective head)"
-                if self.is_head and not self._is_real_head
-                else "",
+                (
+                    " (effective head)"
+                    if self.is_head and not self._is_real_head
+                    else ""
+                ),
                 " (current)" if self._db_current_indicator else "",
             )
         if tree_indicators:
@@ -959,36 +960,33 @@ class Script(revision.Revision):
         return util.format_as_comma(self._versioned_down_revisions)

     @classmethod
-    def _from_path(
-        cls, scriptdir: ScriptDirectory, path: str
-    ) -> Optional[Script]:
-        dir_, filename = os.path.split(path)
-        return cls._from_filename(scriptdir, dir_, filename)
-
-    @classmethod
-    def _list_py_dir(cls, scriptdir: ScriptDirectory, path: str) -> List[str]:
+    def _list_py_dir(
+        cls, scriptdir: ScriptDirectory, path: Path
+    ) -> List[Path]:
         paths = []
-        for root, dirs, files in os.walk(path, topdown=True):
-            if root.endswith("__pycache__"):
+        for root, dirs, files in compat.path_walk(path, top_down=True):
+            if root.name.endswith("__pycache__"):
                 # a special case - we may include these files
                 # if a `sourceless` option is specified
                 continue

             for filename in sorted(files):
-                paths.append(os.path.join(root, filename))
+                paths.append(root / filename)

             if scriptdir.sourceless:
                 # look for __pycache__
-                py_cache_path = os.path.join(root, "__pycache__")
-                if os.path.exists(py_cache_path):
+                py_cache_path = root / "__pycache__"
+                if py_cache_path.exists():
                     # add all files from __pycache__ whose filename is not
                     # already in the names we got from the version directory.
                     # add as relative paths including __pycache__ token
-                    names = {filename.split(".")[0] for filename in files}
+                    names = {
+                        Path(filename).name.split(".")[0] for filename in files
+                    }
                     paths.extend(
-                        os.path.join(py_cache_path, pyc)
-                        for pyc in os.listdir(py_cache_path)
-                        if pyc.split(".")[0] not in names
+                        py_cache_path / pyc
+                        for pyc in py_cache_path.iterdir()
+                        if pyc.name.split(".")[0] not in names
                     )

             if not scriptdir.recursive_version_locations:
@@ -1003,9 +1001,13 @@ class Script(revision.Revision):
         return paths

     @classmethod
-    def _from_filename(
-        cls, scriptdir: ScriptDirectory, dir_: str, filename: str
-    ) -> Optional[Script]:
+    def _from_path(
+        cls, scriptdir: ScriptDirectory, path: Union[str, os.PathLike[str]]
+    ) -> Optional[Script]:
+        path = Path(path)
+        dir_, filename = path.parent, path.name
         if scriptdir.sourceless:
             py_match = _sourceless_rev_file.match(filename)
         else:
@@ -1023,8 +1025,8 @@ class Script(revision.Revision):
             is_c = is_o = False

         if is_o or is_c:
-            py_exists = os.path.exists(os.path.join(dir_, py_filename))
-            pyc_exists = os.path.exists(os.path.join(dir_, py_filename + "c"))
+            py_exists = (dir_ / py_filename).exists()
+            pyc_exists = (dir_ / (py_filename + "c")).exists()

             # prefer .py over .pyc because we'd like to get the
             # source encoding; prefer .pyc over .pyo because we'd like to
@@ -1040,14 +1042,14 @@ class Script(revision.Revision):
             m = _legacy_rev.match(filename)
             if not m:
                 raise util.CommandError(
-                    "Could not determine revision id from filename %s. "
+                    "Could not determine revision id from "
+                    f"filename {filename}. "
                     "Be sure the 'revision' variable is "
                     "declared inside the script (please see 'Upgrading "
                     "from Alembic 0.1 to 0.2' in the documentation)."
-                    % filename
                 )
             else:
                 revision = m.group(1)
         else:
             revision = module.revision
-        return Script(module, revision, os.path.join(dir_, filename))
+        return Script(module, revision, dir_ / filename)
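The `dateutil`-to-`zoneinfo` switch changes the lookup order visible in `_generate_create_date()`: exact name first, then upper-cased name, with an aware UTC timestamp converted into the target zone. A standalone sketch of that logic on Python 3.9+ (the function name is ours, and the `tzdata` package may be needed on systems without an OS zone database):

```python
import datetime
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError


def create_date_for(timezone_name: str) -> datetime.datetime:
    try:
        tzinfo = ZoneInfo(timezone_name)
    except ZoneInfoNotFoundError:
        try:
            # fall back to uppercase, e.g. "utc" -> "UTC"
            tzinfo = ZoneInfo(timezone_name.upper())
        except ZoneInfoNotFoundError:
            raise ValueError(
                f"Can't locate timezone: {timezone_name}"
            ) from None
    # aware UTC "now", rendered in the requested zone
    return datetime.datetime.now(tz=datetime.timezone.utc).astimezone(tzinfo)


print(create_date_for("Asia/Seoul"))  # matches the +09:00 timestamps in this repo
```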

View File

@@ -14,6 +14,7 @@ from typing import Iterator
 from typing import List
 from typing import Optional
 from typing import overload
+from typing import Protocol
 from typing import Sequence
 from typing import Set
 from typing import Tuple
@@ -47,6 +48,17 @@ _relative_destination = re.compile(r"(?:(.+?)@)?(\w+)?((?:\+|-)\d+)")
 _revision_illegal_chars = ["@", "-", "+"]


+class _CollectRevisionsProtocol(Protocol):
+    def __call__(
+        self,
+        upper: _RevisionIdentifierType,
+        lower: _RevisionIdentifierType,
+        inclusive: bool,
+        implicit_base: bool,
+        assert_relative_length: bool,
+    ) -> Tuple[Set[Revision], Tuple[Optional[_RevisionOrBase], ...]]: ...
+
+
 class RevisionError(Exception):
     pass
@@ -396,7 +408,7 @@ class RevisionMap:
             for rev in self._get_ancestor_nodes(
                 [revision],
                 include_dependencies=False,
-                map_=cast(_RevisionMapType, map_),
+                map_=map_,
             ):
                 if rev is revision:
                     continue
@@ -707,9 +719,11 @@ class RevisionMap:
             resolved_target = target

         resolved_test_against_revs = [
-            self._revision_for_ident(test_against_rev)
-            if not isinstance(test_against_rev, Revision)
-            else test_against_rev
+            (
+                self._revision_for_ident(test_against_rev)
+                if not isinstance(test_against_rev, Revision)
+                else test_against_rev
+            )
             for test_against_rev in util.to_tuple(
                 test_against_revs, default=()
             )
@@ -791,7 +805,7 @@ class RevisionMap:
         The iterator yields :class:`.Revision` objects.

         """
-        fn: Callable
+        fn: _CollectRevisionsProtocol
         if select_for_downgrade:
             fn = self._collect_downgrade_revisions
         else:
@@ -818,7 +832,7 @@ class RevisionMap:
     ) -> Iterator[Any]:
         if omit_immediate_dependencies:

-            def fn(rev):
+            def fn(rev: Revision) -> Iterable[str]:
                 if rev not in targets:
                     return rev._all_nextrev
                 else:
@@ -826,12 +840,12 @@ class RevisionMap:

         elif include_dependencies:

-            def fn(rev):
+            def fn(rev: Revision) -> Iterable[str]:
                 return rev._all_nextrev

         else:

-            def fn(rev):
+            def fn(rev: Revision) -> Iterable[str]:
                 return rev.nextrev

         return self._iterate_related_revisions(
@@ -847,12 +861,12 @@ class RevisionMap:
     ) -> Iterator[Revision]:
         if include_dependencies:

-            def fn(rev):
+            def fn(rev: Revision) -> Iterable[str]:
                 return rev._normalized_down_revisions

         else:

-            def fn(rev):
+            def fn(rev: Revision) -> Iterable[str]:
                 return rev._versioned_down_revisions

         return self._iterate_related_revisions(
@@ -861,7 +875,7 @@ class RevisionMap:
     def _iterate_related_revisions(
         self,
-        fn: Callable,
+        fn: Callable[[Revision], Iterable[str]],
         targets: Collection[Optional[_RevisionOrBase]],
         map_: Optional[_RevisionMapType],
         check: bool = False,
@@ -923,7 +937,7 @@ class RevisionMap:
         id_to_rev = self._revision_map

-        def get_ancestors(rev_id):
+        def get_ancestors(rev_id: str) -> Set[str]:
             return {
                 r.revision
                 for r in self._get_ancestor_nodes([id_to_rev[rev_id]])
@@ -1003,9 +1017,9 @@ class RevisionMap:
                 # each time but it was getting complicated
                 current_heads[current_candidate_idx] = heads_to_add[0]
                 current_heads.extend(heads_to_add[1:])
-                ancestors_by_idx[
-                    current_candidate_idx
-                ] = get_ancestors(heads_to_add[0])
+                ancestors_by_idx[current_candidate_idx] = (
+                    get_ancestors(heads_to_add[0])
+                )
                 ancestors_by_idx.extend(
                     get_ancestors(head) for head in heads_to_add[1:]
                 )
@@ -1041,7 +1055,7 @@ class RevisionMap:
         children: Sequence[Optional[_RevisionOrBase]]
         for _ in range(abs(steps)):
             if steps > 0:
-                assert initial != "base"
+                assert initial != "base"  # type: ignore[comparison-overlap]
                 # Walk up
                 walk_up = [
                     is_revision(rev)
@@ -1055,7 +1069,7 @@ class RevisionMap:
                 children = walk_up
             else:
                 # Walk down
-                if initial == "base":
+                if initial == "base":  # type: ignore[comparison-overlap]
                     children = ()
                 else:
                     children = self.get_revisions(
@@ -1170,9 +1184,13 @@ class RevisionMap:
             branch_label = symbol
             # Walk down the tree to find downgrade target.
             rev = self._walk(
-                start=self.get_revision(symbol)
-                if branch_label is None
-                else self.get_revision("%s@%s" % (branch_label, symbol)),
+                start=(
+                    self.get_revision(symbol)
+                    if branch_label is None
+                    else self.get_revision(
+                        "%s@%s" % (branch_label, symbol)
+                    )
+                ),
                 steps=rel_int,
                 no_overwalk=assert_relative_length,
             )
@@ -1189,7 +1207,7 @@ class RevisionMap:
         # No relative destination given, revision specified is absolute.
         branch_label, _, symbol = target.rpartition("@")
         if not branch_label:
-            branch_label = None  # type:ignore[assignment]
+            branch_label = None
         return branch_label, self.get_revision(symbol)

     def _parse_upgrade_target(
@@ -1290,9 +1308,13 @@ class RevisionMap:
             )
         return (
             self._walk(
-                start=self.get_revision(symbol)
-                if branch_label is None
-                else self.get_revision("%s@%s" % (branch_label, symbol)),
+                start=(
+                    self.get_revision(symbol)
+                    if branch_label is None
+                    else self.get_revision(
+                        "%s@%s" % (branch_label, symbol)
+                    )
+                ),
                 steps=relative,
                 no_overwalk=assert_relative_length,
             ),
@@ -1301,11 +1323,11 @@ class RevisionMap:
     def _collect_downgrade_revisions(
         self,
         upper: _RevisionIdentifierType,
-        target: _RevisionIdentifierType,
+        lower: _RevisionIdentifierType,
         inclusive: bool,
         implicit_base: bool,
         assert_relative_length: bool,
-    ) -> Any:
+    ) -> Tuple[Set[Revision], Tuple[Optional[_RevisionOrBase], ...]]:
         """
         Compute the set of current revisions specified by :upper, and the
         downgrade target specified by :target. Return all dependents of target
@@ -1316,7 +1338,7 @@ class RevisionMap:
         branch_label, target_revision = self._parse_downgrade_target(
             current_revisions=upper,
-            target=target,
+            target=lower,
             assert_relative_length=assert_relative_length,
         )
         if target_revision == "base":
@@ -1408,7 +1430,7 @@ class RevisionMap:
         inclusive: bool,
         implicit_base: bool,
         assert_relative_length: bool,
-    ) -> Tuple[Set[Revision], Tuple[Optional[_RevisionOrBase]]]:
+    ) -> Tuple[Set[Revision], Tuple[Revision, ...]]:
         """
         Compute the set of required revisions specified by :upper, and the
         current set of active revisions specified by :lower. Find the
@@ -1500,7 +1522,7 @@ class RevisionMap:
             )
             needs.intersection_update(lower_descendents)

-        return needs, tuple(targets)  # type:ignore[return-value]
+        return needs, tuple(targets)

     def _get_all_current(
         self, id_: Tuple[str, ...]
@@ -1681,15 +1703,13 @@ class Revision:

 @overload
-def tuple_rev_as_scalar(rev: None) -> None:
-    ...
+def tuple_rev_as_scalar(rev: None) -> None: ...


 @overload
 def tuple_rev_as_scalar(
-    rev: Union[Tuple[_T, ...], List[_T]]
-) -> Union[_T, Tuple[_T, ...], List[_T]]:
-    ...
+    rev: Union[Tuple[_T, ...], List[_T]],
+) -> Union[_T, Tuple[_T, ...], List[_T]]: ...


 def tuple_rev_as_scalar(
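The `fn: Callable` annotation becomes `fn: _CollectRevisionsProtocol` because a `Protocol` with a `__call__` method preserves parameter names and keyword-only markers where a bare `Callable` cannot. A toy illustration of the same pattern (these names are ours, not Alembic's):

```python
from typing import List, Protocol


class _SplitterProtocol(Protocol):
    def __call__(self, text: str, *, sep: str) -> List[str]: ...


def split_nonempty(text: str, *, sep: str) -> List[str]:
    return [part for part in text.split(sep) if part]


# a function with a mismatched signature would fail type checking here
fn: _SplitterProtocol = split_nonempty
print(fn("a,b,,c", sep=","))
```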

View File

@@ -1,5 +1,10 @@
+# mypy: allow-untyped-defs, allow-incomplete-defs, allow-untyped-calls
+# mypy: no-warn-return-any, allow-any-generics
+
 from __future__ import annotations

+import importlib.util
+import os
 import shlex
 import subprocess
 import sys
@@ -7,13 +12,16 @@ from typing import Any
 from typing import Callable
 from typing import Dict
 from typing import List
-from typing import Mapping
 from typing import Optional
+from typing import TYPE_CHECKING
 from typing import Union

 from .. import util
 from ..util import compat
+from ..util.pyfiles import _preserving_path_as_str
+
+if TYPE_CHECKING:
+    from ..config import PostWriteHookConfig

 REVISION_SCRIPT_TOKEN = "REVISION_SCRIPT_FILENAME"
@@ -40,16 +48,19 @@ def register(name: str) -> Callable:


 def _invoke(
-    name: str, revision: str, options: Mapping[str, Union[str, int]]
+    name: str,
+    revision_path: Union[str, os.PathLike[str]],
+    options: PostWriteHookConfig,
 ) -> Any:
     """Invokes the formatter registered for the given name.

     :param name: The name of a formatter in the registry
-    :param revision: A :class:`.MigrationRevision` instance
+    :param revision_path: string path to the revision file
     :param options: A dict containing kwargs passed to the
        specified formatter.
     :raises: :class:`alembic.util.CommandError`
     """
+    revision_path = _preserving_path_as_str(revision_path)
     try:
         hook = _registry[name]
     except KeyError as ke:
@@ -57,36 +68,28 @@ def _invoke(
             f"No formatter with name '{name}' registered"
         ) from ke
     else:
-        return hook(revision, options)
+        return hook(revision_path, options)


-def _run_hooks(path: str, hook_config: Mapping[str, str]) -> None:
+def _run_hooks(
+    path: Union[str, os.PathLike[str]], hooks: list[PostWriteHookConfig]
+) -> None:
     """Invoke hooks for a generated revision."""

-    from .base import _split_on_space_comma
-
-    names = _split_on_space_comma.split(hook_config.get("hooks", ""))
-
-    for name in names:
-        if not name:
-            continue
-        opts = {
-            key[len(name) + 1 :]: hook_config[key]
-            for key in hook_config
-            if key.startswith(name + ".")
-        }
-        opts["_hook_name"] = name
+    for hook in hooks:
+        name = hook["_hook_name"]
         try:
-            type_ = opts["type"]
+            type_ = hook["type"]
         except KeyError as ke:
             raise util.CommandError(
-                f"Key {name}.type is required for post write hook {name!r}"
+                f"Key '{name}.type' (or 'type' in toml) is required "
+                f"for post write hook {name!r}"
            ) from ke
         else:
             with util.status(
                 f"Running post write hook {name!r}", newline=True
             ):
-                _invoke(type_, path, opts)
+                _invoke(type_, path, hook)


 def _parse_cmdline_options(cmdline_options_str: str, path: str) -> List[str]:
@@ -110,17 +113,35 @@ def _parse_cmdline_options(cmdline_options_str: str, path: str) -> List[str]:
     return cmdline_options_list


+def _get_required_option(options: dict, name: str) -> str:
+    try:
+        return options[name]
+    except KeyError as ke:
+        raise util.CommandError(
+            f"Key {options['_hook_name']}.{name} is required for post "
+            f"write hook {options['_hook_name']!r}"
+        ) from ke
+
+
+def _run_hook(
+    path: str, options: dict, ignore_output: bool, command: List[str]
+) -> None:
+    cwd: Optional[str] = options.get("cwd", None)
+    cmdline_options_str = options.get("options", "")
+    cmdline_options_list = _parse_cmdline_options(cmdline_options_str, path)
+
+    kw: Dict[str, Any] = {}
+    if ignore_output:
+        kw["stdout"] = kw["stderr"] = subprocess.DEVNULL
+
+    subprocess.run([*command, *cmdline_options_list], cwd=cwd, **kw)
+
+
 @register("console_scripts")
 def console_scripts(
     path: str, options: dict, ignore_output: bool = False
 ) -> None:
-    try:
-        entrypoint_name = options["entrypoint"]
-    except KeyError as ke:
-        raise util.CommandError(
-            f"Key {options['_hook_name']}.entrypoint is required for post "
-            f"write hook {options['_hook_name']!r}"
-        ) from ke
+    entrypoint_name = _get_required_option(options, "entrypoint")
     for entry in compat.importlib_metadata_get("console_scripts"):
         if entry.name == entrypoint_name:
             impl: Any = entry
@@ -129,48 +150,27 @@ def console_scripts(
         raise util.CommandError(
             f"Could not find entrypoint console_scripts.{entrypoint_name}"
         )
-    cwd: Optional[str] = options.get("cwd", None)
-    cmdline_options_str = options.get("options", "")
-    cmdline_options_list = _parse_cmdline_options(cmdline_options_str, path)
-
-    kw: Dict[str, Any] = {}
-    if ignore_output:
-        kw["stdout"] = kw["stderr"] = subprocess.DEVNULL
-
-    subprocess.run(
-        [
-            sys.executable,
-            "-c",
-            f"import {impl.module}; {impl.module}.{impl.attr}()",
-        ]
-        + cmdline_options_list,
-        cwd=cwd,
-        **kw,
-    )
+    command = [
+        sys.executable,
+        "-c",
+        f"import {impl.module}; {impl.module}.{impl.attr}()",
+    ]
+    _run_hook(path, options, ignore_output, command)


 @register("exec")
 def exec_(path: str, options: dict, ignore_output: bool = False) -> None:
-    try:
-        executable = options["executable"]
-    except KeyError as ke:
-        raise util.CommandError(
-            f"Key {options['_hook_name']}.executable is required for post "
-            f"write hook {options['_hook_name']!r}"
-        ) from ke
-    cwd: Optional[str] = options.get("cwd", None)
-    cmdline_options_str = options.get("options", "")
-    cmdline_options_list = _parse_cmdline_options(cmdline_options_str, path)
-
-    kw: Dict[str, Any] = {}
-    if ignore_output:
-        kw["stdout"] = kw["stderr"] = subprocess.DEVNULL
-
-    subprocess.run(
-        [
-            executable,
-            *cmdline_options_list,
-        ],
-        cwd=cwd,
-        **kw,
-    )
+    executable = _get_required_option(options, "executable")
+    _run_hook(path, options, ignore_output, command=[executable])
+
+
+@register("module")
+def module(path: str, options: dict, ignore_output: bool = False) -> None:
+    module_name = _get_required_option(options, "module")
+
+    if importlib.util.find_spec(module_name) is None:
+        raise util.CommandError(f"Could not find module {module_name}")
+
+    command = [sys.executable, "-m", module_name]
+    _run_hook(path, options, ignore_output, command)

View File

@@ -1,27 +1,32 @@
 # A generic, single database configuration.

 [alembic]
-# path to migration scripts
+# path to migration scripts.
+# this is typically a path given in POSIX (e.g. forward slashes)
+# format, relative to the token %(here)s which refers to the location of this
+# ini file
 script_location = ${script_location}

 # template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
 # Uncomment the line below if you want the files to be prepended with date and time
+# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
+# for all available tokens
 # file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

 # sys.path path, will be prepended to sys.path if present.
-# defaults to the current working directory.
+# defaults to the current working directory. for multiple paths, the path separator
+# is defined by "path_separator" below.
 prepend_sys_path = .

 # timezone to use when rendering the date within the migration file
 # as well as the filename.
-# If specified, requires the python-dateutil library that can be
-# installed by adding `alembic[tz]` to the pip requirements
-# string value is passed to dateutil.tz.gettz()
+# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
+# Any required deps can installed by adding `alembic[tz]` to the pip requirements
+# string value is passed to ZoneInfo()
 # leave blank for localtime
 # timezone =

-# max length of characters to apply to the
-# "slug" field
+# max length of characters to apply to the "slug" field
 # truncate_slug_length = 40

 # set to 'true' to run the environment during
@@ -34,20 +39,38 @@ prepend_sys_path = .
 # sourceless = false

 # version location specification; This defaults
-# to ${script_location}/versions.  When using multiple version
+# to <script_location>/versions.  When using multiple version
 # directories, initial revisions must be specified with --version-path.
-# The path separator used here should be the separator specified by "version_path_separator" below.
-# version_locations = %(here)s/bar:%(here)s/bat:${script_location}/versions
+# The path separator used here should be the separator specified by "path_separator"
+# below.
+# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

-# version path separator; As mentioned above, this is the character used to split
-# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
-# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
-# Valid values for version_path_separator are:
-#
-# version_path_separator = :
-# version_path_separator = ;
-# version_path_separator = space
-version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.
+# path_separator; This indicates what character is used to split lists of file
+# paths, including version_locations and prepend_sys_path within configparser
+# files such as alembic.ini.
+# The default rendered in new alembic.ini files is "os", which uses os.pathsep
+# to provide os-dependent path splitting.
+#
+# Note that in order to support legacy alembic.ini files, this default does NOT
+# take place if path_separator is not present in alembic.ini.  If this
+# option is omitted entirely, fallback logic is as follows:
+#
+# 1. Parsing of the version_locations option falls back to using the legacy
+#    "version_path_separator" key, which if absent then falls back to the legacy
+#    behavior of splitting on spaces and/or commas.
+# 2. Parsing of the prepend_sys_path option falls back to the legacy
+#    behavior of splitting on spaces, commas, or colons.
+#
+# Valid values for path_separator are:
+#
+# path_separator = :
+# path_separator = ;
+# path_separator = space
+# path_separator = newline
+#
+# Use os.pathsep. Default configuration used for new projects.
+path_separator = os

 # set to 'true' to search source files recursively
 # in each "version_locations" directory
@@ -58,6 +81,9 @@ version_path_separator = os # Use os.pathsep. Default configuration used for ne
 # are written from script.py.mako
 # output_encoding = utf-8

+# database URL.  This is consumed by the user-maintained env.py script only.
+# other means of configuring database URLs may be customized within the env.py
+# file.
 sqlalchemy.url = driver://user:pass@localhost/dbname
@@ -72,13 +98,20 @@ sqlalchemy.url = driver://user:pass@localhost/dbname
 # black.entrypoint = black
 # black.options = -l 79 REVISION_SCRIPT_FILENAME

-# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
+# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
+# hooks = ruff
+# ruff.type = module
+# ruff.module = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

+# Alternatively, use the exec runner to execute a binary found on your PATH
 # hooks = ruff
 # ruff.type = exec
-# ruff.executable = %(here)s/.venv/bin/ruff
-# ruff.options = --fix REVISION_SCRIPT_FILENAME
+# ruff.executable = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

-# Logging configuration
+# Logging configuration.  This is also consumed by the user-maintained
+# env.py script only.
 [loggers]
 keys = root,sqlalchemy,alembic
@@ -89,12 +122,12 @@ keys = console
 keys = generic

 [logger_root]
-level = WARN
+level = WARNING
 handlers = console
 qualname =

 [logger_sqlalchemy]
-level = WARN
+level = WARNING
 handlers =
 qualname = sqlalchemy.engine
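Editor's note: to make the renamed option concrete, here is an illustrative fragment (not part of this commit) that splits two version directories with an explicit separator instead of the os-dependent default:

```ini
# Hypothetical fragment: two version directories split on ":" rather than
# the os.pathsep value selected by path_separator = os.
[alembic]
script_location = %(here)s/alembic
version_locations = %(here)s/alembic/versions:%(here)s/extra_versions
path_separator = :
```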
View File
@@ -13,14 +13,16 @@ ${imports if imports else ""}
 # revision identifiers, used by Alembic.
 revision: str = ${repr(up_revision)}
-down_revision: Union[str, None] = ${repr(down_revision)}
+down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
 branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
 depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


 def upgrade() -> None:
+    """Upgrade schema."""
     ${upgrades if upgrades else "pass"}


 def downgrade() -> None:
+    """Downgrade schema."""
     ${downgrades if downgrades else "pass"}
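Editor's note: the widened `down_revision` annotation matters for merge revisions, which have more than one parent. A hypothetical rendered file (revision ids invented) would look like:

```python
"""merge feature branches

Revision ID: 3f2a1c9d8e7b
Revises: 1a2b3c4d5e6f, 9f8e7d6c5b4a
"""
from typing import Sequence, Union

# revision identifiers, used by Alembic.
revision: str = "3f2a1c9d8e7b"
# A merge point has two parents, hence the new Sequence[str] in the annotation.
down_revision: Union[str, Sequence[str], None] = ("1a2b3c4d5e6f", "9f8e7d6c5b4a")
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    pass


def downgrade() -> None:
    """Downgrade schema."""
    pass
```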
View File
@@ -1,7 +1,10 @@
 # A generic, single database configuration.

 [alembic]
-# path to migration scripts
+# path to migration scripts.
+# this is typically a path given in POSIX (e.g. forward slashes)
+# format, relative to the token %(here)s which refers to the location of this
+# ini file
 script_location = ${script_location}

 # template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
@@ -11,19 +14,20 @@ script_location = ${script_location}
 # file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

 # sys.path path, will be prepended to sys.path if present.
-# defaults to the current working directory.
+# defaults to the current working directory. for multiple paths, the path separator
+# is defined by "path_separator" below.
 prepend_sys_path = .

 # timezone to use when rendering the date within the migration file
 # as well as the filename.
-# If specified, requires the python-dateutil library that can be
-# installed by adding `alembic[tz]` to the pip requirements
-# string value is passed to dateutil.tz.gettz()
+# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
+# Any required deps can installed by adding `alembic[tz]` to the pip requirements
+# string value is passed to ZoneInfo()
 # leave blank for localtime
 # timezone =

-# max length of characters to apply to the
-# "slug" field
+# max length of characters to apply to the "slug" field
 # truncate_slug_length = 40

 # set to 'true' to run the environment during
@@ -36,20 +40,37 @@ prepend_sys_path = .
 # sourceless = false

 # version location specification; This defaults
-# to ${script_location}/versions.  When using multiple version
+# to <script_location>/versions.  When using multiple version
 # directories, initial revisions must be specified with --version-path.
-# The path separator used here should be the separator specified by "version_path_separator" below.
-# version_locations = %(here)s/bar:%(here)s/bat:${script_location}/versions
+# The path separator used here should be the separator specified by "path_separator"
+# below.
+# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

-# version path separator; As mentioned above, this is the character used to split
-# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
-# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
-# Valid values for version_path_separator are:
-#
-# version_path_separator = :
-# version_path_separator = ;
-# version_path_separator = space
-version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.
+# path_separator; This indicates what character is used to split lists of file
+# paths, including version_locations and prepend_sys_path within configparser
+# files such as alembic.ini.
+# The default rendered in new alembic.ini files is "os", which uses os.pathsep
+# to provide os-dependent path splitting.
+#
+# Note that in order to support legacy alembic.ini files, this default does NOT
+# take place if path_separator is not present in alembic.ini.  If this
+# option is omitted entirely, fallback logic is as follows:
+#
+# 1. Parsing of the version_locations option falls back to using the legacy
+#    "version_path_separator" key, which if absent then falls back to the legacy
+#    behavior of splitting on spaces and/or commas.
+# 2. Parsing of the prepend_sys_path option falls back to the legacy
+#    behavior of splitting on spaces, commas, or colons.
+#
+# Valid values for path_separator are:
+#
+# path_separator = :
+# path_separator = ;
+# path_separator = space
+# path_separator = newline
+#
+# Use os.pathsep. Default configuration used for new projects.
+path_separator = os

 # set to 'true' to search source files recursively
 # in each "version_locations" directory
@@ -60,6 +81,9 @@ version_path_separator = os # Use os.pathsep. Default configuration used for ne
 # are written from script.py.mako
 # output_encoding = utf-8

+# database URL.  This is consumed by the user-maintained env.py script only.
+# other means of configuring database URLs may be customized within the env.py
+# file.
 sqlalchemy.url = driver://user:pass@localhost/dbname
@@ -74,13 +98,20 @@ sqlalchemy.url = driver://user:pass@localhost/dbname
 # black.entrypoint = black
 # black.options = -l 79 REVISION_SCRIPT_FILENAME

-# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
+# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
+# hooks = ruff
+# ruff.type = module
+# ruff.module = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

+# Alternatively, use the exec runner to execute a binary found on your PATH
 # hooks = ruff
 # ruff.type = exec
-# ruff.executable = %(here)s/.venv/bin/ruff
-# ruff.options = --fix REVISION_SCRIPT_FILENAME
+# ruff.executable = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

-# Logging configuration
+# Logging configuration.  This is also consumed by the user-maintained
+# env.py script only.
 [loggers]
 keys = root,sqlalchemy,alembic
@@ -91,12 +122,12 @@ keys = console
 keys = generic

 [logger_root]
-level = WARN
+level = WARNING
 handlers = console
 qualname =

 [logger_sqlalchemy]
-level = WARN
+level = WARNING
 handlers =
 qualname = sqlalchemy.engine
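Editor's note: the timezone handling in this template moves from dateutil to the standard-library zoneinfo. An illustrative setting (not from this commit; requires the `alembic[tz]` extra):

```ini
# Hypothetical: render the migration file date and filename in UTC;
# the string is passed to zoneinfo.ZoneInfo().
timezone = UTC
```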
View File
@@ -13,14 +13,16 @@ ${imports if imports else ""}
 # revision identifiers, used by Alembic.
 revision: str = ${repr(up_revision)}
-down_revision: Union[str, None] = ${repr(down_revision)}
+down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
 branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
 depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


 def upgrade() -> None:
+    """Upgrade schema."""
     ${upgrades if upgrades else "pass"}


 def downgrade() -> None:
+    """Downgrade schema."""
     ${downgrades if downgrades else "pass"}
View File
@@ -1,7 +1,10 @@
 # a multi-database configuration.

 [alembic]
-# path to migration scripts
+# path to migration scripts.
+# this is typically a path given in POSIX (e.g. forward slashes)
+# format, relative to the token %(here)s which refers to the location of this
+# ini file
 script_location = ${script_location}

 # template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
@@ -11,19 +14,19 @@ script_location = ${script_location}
 # file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

 # sys.path path, will be prepended to sys.path if present.
-# defaults to the current working directory.
+# defaults to the current working directory. for multiple paths, the path separator
+# is defined by "path_separator" below.
 prepend_sys_path = .

 # timezone to use when rendering the date within the migration file
 # as well as the filename.
-# If specified, requires the python-dateutil library that can be
-# installed by adding `alembic[tz]` to the pip requirements
-# string value is passed to dateutil.tz.gettz()
+# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
+# Any required deps can installed by adding `alembic[tz]` to the pip requirements
+# string value is passed to ZoneInfo()
 # leave blank for localtime
 # timezone =

-# max length of characters to apply to the
-# "slug" field
+# max length of characters to apply to the "slug" field
 # truncate_slug_length = 40

 # set to 'true' to run the environment during
@@ -36,20 +39,37 @@ prepend_sys_path = .
 # sourceless = false

 # version location specification; This defaults
-# to ${script_location}/versions.  When using multiple version
+# to <script_location>/versions.  When using multiple version
 # directories, initial revisions must be specified with --version-path.
-# The path separator used here should be the separator specified by "version_path_separator" below.
-# version_locations = %(here)s/bar:%(here)s/bat:${script_location}/versions
+# The path separator used here should be the separator specified by "path_separator"
+# below.
+# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

-# version path separator; As mentioned above, this is the character used to split
-# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
-# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
-# Valid values for version_path_separator are:
-#
-# version_path_separator = :
-# version_path_separator = ;
-# version_path_separator = space
-version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.
+# path_separator; This indicates what character is used to split lists of file
+# paths, including version_locations and prepend_sys_path within configparser
+# files such as alembic.ini.
+# The default rendered in new alembic.ini files is "os", which uses os.pathsep
+# to provide os-dependent path splitting.
+#
+# Note that in order to support legacy alembic.ini files, this default does NOT
+# take place if path_separator is not present in alembic.ini.  If this
+# option is omitted entirely, fallback logic is as follows:
+#
+# 1. Parsing of the version_locations option falls back to using the legacy
+#    "version_path_separator" key, which if absent then falls back to the legacy
+#    behavior of splitting on spaces and/or commas.
+# 2. Parsing of the prepend_sys_path option falls back to the legacy
+#    behavior of splitting on spaces, commas, or colons.
+#
+# Valid values for path_separator are:
+#
+# path_separator = :
+# path_separator = ;
+# path_separator = space
+# path_separator = newline
+#
+# Use os.pathsep. Default configuration used for new projects.
+path_separator = os

 # set to 'true' to search source files recursively
 # in each "version_locations" directory
@@ -60,6 +80,13 @@ version_path_separator = os # Use os.pathsep. Default configuration used for ne
 # are written from script.py.mako
 # output_encoding = utf-8

+# for multiple database configuration, new named sections are added
+# which each include a distinct ``sqlalchemy.url`` entry.  A custom value
+# ``databases`` is added which indicates a listing of the per-database sections.
+# The ``databases`` entry as well as the URLs present in the ``[engine1]``
+# and ``[engine2]`` sections continue to be consumed by the user-maintained env.py
+# script only.
 databases = engine1, engine2

 [engine1]
@@ -79,13 +106,20 @@ sqlalchemy.url = driver://user:pass@localhost/dbname2
 # black.entrypoint = black
 # black.options = -l 79 REVISION_SCRIPT_FILENAME

-# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
+# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
+# hooks = ruff
+# ruff.type = module
+# ruff.module = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

+# Alternatively, use the exec runner to execute a binary found on your PATH
 # hooks = ruff
 # ruff.type = exec
-# ruff.executable = %(here)s/.venv/bin/ruff
-# ruff.options = --fix REVISION_SCRIPT_FILENAME
+# ruff.executable = ruff
+# ruff.options = check --fix REVISION_SCRIPT_FILENAME

-# Logging configuration
+# Logging configuration.  This is also consumed by the user-maintained
+# env.py script only.
 [loggers]
 keys = root,sqlalchemy,alembic
@@ -96,12 +130,12 @@ keys = console
 keys = generic

 [logger_root]
-level = WARN
+level = WARNING
 handlers = console
 qualname =

 [logger_sqlalchemy]
-level = WARN
+level = WARNING
 handlers =
 qualname = sqlalchemy.engine
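Editor's note: as the new comment block stresses, `databases` and the per-engine URLs are read only by the user-maintained env.py, never by Alembic itself. A hypothetical sketch (not from this commit) of how such an env.py might consume them:

```python
# Hypothetical excerpt from a multidb env.py: iterate the sections named
# by the custom "databases" entry and pick up each sqlalchemy.url.
from alembic import context

config = context.config
names = config.get_main_option("databases", "")
for name in (n.strip() for n in names.split(",") if n.strip()):
    section = config.get_section(name, {})
    print(name, section.get("sqlalchemy.url"))  # e.g. feed into engine_from_config()
```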
View File
@@ -16,16 +16,18 @@ ${imports if imports else ""}
 # revision identifiers, used by Alembic.
 revision: str = ${repr(up_revision)}
-down_revision: Union[str, None] = ${repr(down_revision)}
+down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
 branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
 depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


 def upgrade(engine_name: str) -> None:
+    """Upgrade schema."""
     globals()["upgrade_%s" % engine_name]()


 def downgrade(engine_name: str) -> None:
+    """Downgrade schema."""
     globals()["downgrade_%s" % engine_name]()

 <%
@@ -38,10 +40,12 @@ def downgrade(engine_name: str) -> None:
 % for db_name in re.split(r',\s*', db_names):

 def upgrade_${db_name}() -> None:
+    """Upgrade ${db_name} schema."""
     ${context.get("%s_upgrades" % db_name, "pass")}


 def downgrade_${db_name}() -> None:
+    """Downgrade ${db_name} schema."""
     ${context.get("%s_downgrades" % db_name, "pass")}

 % endfor
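Editor's note: for `databases = engine1, engine2`, the mako loop above expands to one upgrade/downgrade pair per engine. A hypothetical rendering (engine2's pair omitted for brevity):

```python
def upgrade(engine_name: str) -> None:
    """Upgrade schema."""
    globals()["upgrade_%s" % engine_name]()


def downgrade(engine_name: str) -> None:
    """Downgrade schema."""
    globals()["downgrade_%s" % engine_name]()


def upgrade_engine1() -> None:
    """Upgrade engine1 schema."""
    pass


def downgrade_engine1() -> None:
    """Downgrade engine1 schema."""
    pass
```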
View File
@@ -0,0 +1 @@
pyproject configuration, based on the generic configuration.
View File
@@ -0,0 +1,44 @@
# A generic, single database configuration.
[alembic]
# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = driver://user:pass@localhost/dbname
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARNING
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
View File
@@ -0,0 +1,78 @@
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata
)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
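Editor's note: to enable autogenerate with this template, point `target_metadata` at your models' metadata instead of leaving it `None`, as the template's own comments suggest. A hypothetical wiring (package name invented):

```python
# Hypothetical: "myapp" stands in for your application package.
from myapp.models import Base

target_metadata = Base.metadata
```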
View File
@@ -0,0 +1,82 @@
[tool.alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = "${script_location}"
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = "%%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s"
# additional paths to be prepended to sys.path. defaults to the current working directory.
prepend_sys_path = [
"."
]
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# version_locations = [
# "%(here)s/alembic/versions",
# "%(here)s/foo/bar"
# ]
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = "utf-8"
# This section defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# [[tool.alembic.post_write_hooks]]
# format using "black" - use the console_scripts runner,
# against the "black" entrypoint
# name = "black"
# type = "console_scripts"
# entrypoint = "black"
# options = "-l 79 REVISION_SCRIPT_FILENAME"
#
# [[tool.alembic.post_write_hooks]]
# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# name = "ruff"
# type = "module"
# module = "ruff"
# options = "check --fix REVISION_SCRIPT_FILENAME"
#
# [[tool.alembic.post_write_hooks]]
# Alternatively, use the exec runner to execute a binary found on your PATH
# name = "ruff"
# type = "exec"
# executable = "ruff"
# options = "check --fix REVISION_SCRIPT_FILENAME"
View File
@@ -0,0 +1,28 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
"""Upgrade schema."""
${upgrades if upgrades else "pass"}
def downgrade() -> None:
"""Downgrade schema."""
${downgrades if downgrades else "pass"}
View File
@@ -0,0 +1 @@
pyproject configuration, with an async dbapi.
View File
@@ -0,0 +1,44 @@
# A generic, single database configuration.
[alembic]
# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = driver://user:pass@localhost/dbname
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARNING
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
View File
@@ -0,0 +1,89 @@
import asyncio
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def do_run_migrations(connection: Connection) -> None:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
async def run_async_migrations() -> None:
"""In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = async_engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
async with connectable.connect() as connection:
await connection.run_sync(do_run_migrations)
await connectable.dispose()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode."""
asyncio.run(run_async_migrations())
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
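Editor's note: this template assumes `sqlalchemy.url` names an asyncio-capable driver, since `async_engine_from_config()` cannot work with a sync DBAPI. An illustrative offline check (URL invented):

```python
# Illustrative, not part of the template: verify the dialect is async,
# e.g. sqlite+aiosqlite or postgresql+asyncpg.
from sqlalchemy.engine import make_url

url = make_url("sqlite+aiosqlite:///./app.db")  # hypothetical URL
print(url.get_dialect().is_async)  # expected: True
```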
View File
@@ -0,0 +1,82 @@
[tool.alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = "${script_location}"
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = "%%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s"
# additional paths to be prepended to sys.path. defaults to the current working directory.
prepend_sys_path = [
"."
]
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# version_locations = [
# "%(here)s/alembic/versions",
# "%(here)s/foo/bar"
# ]
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = "utf-8"
# This section defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# [[tool.alembic.post_write_hooks]]
# format using "black" - use the console_scripts runner,
# against the "black" entrypoint
# name = "black"
# type = "console_scripts"
# entrypoint = "black"
# options = "-l 79 REVISION_SCRIPT_FILENAME"
#
# [[tool.alembic.post_write_hooks]]
# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# name = "ruff"
# type = "module"
# module = "ruff"
# options = "check --fix REVISION_SCRIPT_FILENAME"
#
# [[tool.alembic.post_write_hooks]]
# Alternatively, use the exec runner to execute a binary found on your PATH
# name = "ruff"
# type = "exec"
# executable = "ruff"
# options = "check --fix REVISION_SCRIPT_FILENAME"
View File
@@ -0,0 +1,28 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
"""Upgrade schema."""
${upgrades if upgrades else "pass"}
def downgrade() -> None:
"""Downgrade schema."""
${downgrades if downgrades else "pass"}
View File
@@ -9,12 +9,15 @@ from sqlalchemy.testing import uses_deprecated
 from sqlalchemy.testing.config import combinations
 from sqlalchemy.testing.config import fixture
 from sqlalchemy.testing.config import requirements as requires
+from sqlalchemy.testing.config import Variation
+from sqlalchemy.testing.config import variation
 from .assertions import assert_raises
 from .assertions import assert_raises_message
 from .assertions import emits_python_deprecation_warning
 from .assertions import eq_
 from .assertions import eq_ignore_whitespace
+from .assertions import expect_deprecated
 from .assertions import expect_raises
 from .assertions import expect_raises_message
 from .assertions import expect_sqlalchemy_deprecated
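Editor's note: a sketch of how the newly re-exported helpers are typically used, following SQLAlchemy's test-suite conventions rather than code from this commit:

```python
# Sketch: variation() parametrizes a test over named cases; exactly one
# attribute on the received Variation object is truthy per run. Requires
# the SQLAlchemy testing plugin to be active under pytest.
from alembic.testing import variation


@variation("dialect", ["sqlite", "postgresql"])
def test_dialect_cases(dialect):
    if dialect.sqlite:
        url = "sqlite://"
    else:
        url = "postgresql://scott:tiger@localhost/test"
    assert url
```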
Some files were not shown because too many files have changed in this diff.