Overhaul script handling. Closes #125.

Jack Kingsman
2026-03-30 15:40:13 -07:00
parent 60f3fa8e36
commit 3f6efaae1d
19 changed files with 184 additions and 67 deletions

.gitignore

@@ -2,6 +2,8 @@
__pycache__/
*.py[oc]
build/
!scripts/build/
!scripts/build/**
wheels/
*.egg-info


@@ -7,7 +7,7 @@
If instructed to "run all tests", "get ready for a commit", or other summative, work-ending directives, run:
```bash
./scripts/all_quality.sh
./scripts/quality/all_quality.sh
```
This is the repo's end-to-end quality gate. It runs backend/frontend autofixers first, then type checking, tests, and the standard frontend build. All checks must pass green, and the script may leave formatting/lint edits behind.
@@ -210,10 +210,16 @@ This message-layer echo/path handling is independent of raw-packet storage dedup
│ │ └── ...
│ └── vite.config.ts
├── scripts/ # Quality / release helpers (listing below is representative, not exhaustive)
│   ├── all_quality.sh       # Repo-standard autofix + validate gate
│   ├── collect_licenses.sh  # Gather third-party license attributions
│   ├── e2e.sh               # End-to-end test runner
│   └── publish.sh           # Version bump, changelog, docker build & push
│   ├── build/
│   │   ├── collect_licenses.sh        # Gather third-party license attributions
│   │   └── publish.sh                 # Version bump, changelog, docker build & push
│   ├── quality/
│   │   ├── all_quality.sh             # Repo-standard autofix + validate gate
│   │   ├── e2e.sh                     # End-to-end test runner
│   │   └── extended_quality.sh        # Quality gate plus e2e and Docker matrix
│   └── setup/
│       ├── fetch_prebuilt_frontend.py # Download release frontend fallback
│       └── install_service.sh         # Install/configure Linux systemd service
├── README_ADVANCED.md # Advanced setup, troubleshooting, and service guidance
├── CONTRIBUTING.md # Contributor workflow and testing guidance
├── tests/ # Backend tests (pytest)
@@ -298,7 +304,7 @@ npm run test:run
### Before Completing Major Changes
**Run `./scripts/all_quality.sh` before finishing major changes that have modified code or tests.** It is the standard repo gate: autofix first, then type checks, tests, and the standard frontend build. This is not necessary for docs-only changes. For minor changes (like wording, color, spacing, etc.), wait until prompted to run the quality gate.
**Run `./scripts/quality/all_quality.sh` before finishing major changes that have modified code or tests.** It is the standard repo gate: autofix first, then type checks, tests, and the standard frontend build. This is not necessary for docs-only changes. For minor changes (like wording, color, spacing, etc.), wait until prompted to run the quality gate.
## API Summary


@@ -48,7 +48,7 @@ Run both the backend and `npm run dev` for hot-reloading frontend development.
Run the full quality suite before proposing or handing off code changes:
```bash
./scripts/all_quality.sh
./scripts/quality/all_quality.sh
```
That runs linting, formatting, type checking, tests, and builds for both backend and frontend.


@@ -1,6 +1,6 @@
# Third-Party Licenses
Auto-generated by `scripts/collect_licenses.sh` — do not edit by hand.
Auto-generated by `scripts/build/collect_licenses.sh` — do not edit by hand.
## Backend (Python) Dependencies
@@ -1748,4 +1748,3 @@ THE SOFTWARE.
```
</details>


@@ -95,7 +95,7 @@ Access the app at http://localhost:8000.
Source checkouts expect a normal frontend build in `frontend/dist`.
On Linux, if you want this installed as a persistent `systemd` service that starts on boot and restarts automatically on failure, run `bash scripts/install_service.sh` from the repo root.
On Linux, if you want this installed as a persistent `systemd` service that starts on boot and restarts automatically on failure, run `bash scripts/setup/install_service.sh` from the repo root.
## Path 1.5: Use The Prebuilt Release Zip
@@ -111,7 +111,7 @@ uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
The release bundle includes `frontend/prebuilt`, so it does not require a local frontend build.
Alternatively, if you have already cloned the repo, you can fetch just the prebuilt frontend into your working tree without downloading the full release zip via `python3 scripts/fetch_prebuilt_frontend.py`.
Alternatively, if you have already cloned the repo, you can fetch just the prebuilt frontend into your working tree without downloading the full release zip via `python3 scripts/setup/fetch_prebuilt_frontend.py`.
## Path 2: Docker


@@ -53,7 +53,7 @@ Two paths are available depending on your comfort level with Linux system admini
On Linux systems, this is the recommended installation method if you want RemoteTerm set up as a persistent systemd service that starts automatically on boot and restarts automatically if it crashes. Run the installer script from the repo root. It runs as your current user, installs from wherever you cloned the repo, and prints a quick-reference cheatsheet when done — no separate service account or path juggling required.
```bash
bash scripts/install_service.sh
bash scripts/setup/install_service.sh
```
The script interactively asks which transport to use (serial auto-detect, serial with explicit port, TCP, or BLE), whether to build the frontend locally or download a prebuilt copy, whether to enable the bot system, and whether to set up HTTP Basic Auth. It handles dependency installation (`uv sync`), validates `node`/`npm` for local builds, adds your user to the `dialout` group if needed, writes the systemd unit file, and enables the service. After installation, normal operations work without any `sudo -u` gymnastics:
@@ -69,7 +69,7 @@ cd frontend && npm install && npm run build && cd ..
sudo systemctl restart remoteterm
# Refresh prebuilt frontend only (skips local build)
python3 scripts/fetch_prebuilt_frontend.py
python3 scripts/setup/fetch_prebuilt_frontend.py
sudo systemctl restart remoteterm
# View live logs
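The unit file the installer generates is not shown in this diff. Purely as an illustration of what a user-scoped service with boot-time start and on-failure restart tends to look like, under the assumption that the installer launches the same `uvicorn` command used elsewhere in this README (user, paths, and unit name here are hypothetical):

```
# Hypothetical sketch — the real unit is written by install_service.sh
[Unit]
Description=RemoteTerm for MeshCore
After=network.target

[Service]
User=youruser
WorkingDirectory=/home/youruser/remoteterm
ExecStart=/usr/bin/env uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```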


@@ -96,8 +96,12 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_messages_dedup_null_safe
ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0))
WHERE type = 'CHAN';
CREATE INDEX IF NOT EXISTS idx_raw_packets_message_id ON raw_packets(message_id);
CREATE INDEX IF NOT EXISTS idx_raw_packets_timestamp ON raw_packets(timestamp);
CREATE UNIQUE INDEX IF NOT EXISTS idx_raw_packets_payload_hash ON raw_packets(payload_hash);
CREATE INDEX IF NOT EXISTS idx_contacts_on_radio ON contacts(on_radio);
CREATE INDEX IF NOT EXISTS idx_contacts_type_last_seen ON contacts(type, last_seen);
CREATE INDEX IF NOT EXISTS idx_messages_type_received_conversation
ON messages(type, received_at, conversation_key);
-- idx_messages_sender_key is created by migration 25 (after adding the sender_key column)
-- idx_messages_incoming_priv_dedup is created by migration 44 after legacy rows are reconciled
CREATE INDEX IF NOT EXISTS idx_contact_advert_paths_recent
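The three new indexes above back the statistics endpoint's time-windowed scans. A quick standalone way to confirm SQLite actually picks one up is `EXPLAIN QUERY PLAN` (a sketch using a pared-down `raw_packets` with only the indexed column, not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Pared-down stand-in for raw_packets: only the indexed column matters here.
conn.execute("CREATE TABLE raw_packets (id INTEGER PRIMARY KEY, timestamp INTEGER NOT NULL)")
conn.execute("CREATE INDEX IF NOT EXISTS idx_raw_packets_timestamp ON raw_packets(timestamp)")

# EXPLAIN QUERY PLAN reports whether a time-windowed scan uses the index
# (a SEARCH via the index) instead of a full-table SCAN.
rows = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM raw_packets WHERE timestamp >= ?",
    (1700000000,),
).fetchall()
plan_text = " ".join(str(row) for row in rows)
print(plan_text)
```

The same check works for the `contacts` and `messages` indexes by swapping in their columns.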


@@ -360,6 +360,13 @@ async def run_migrations(conn: aiosqlite.Connection) -> int:
        await set_version(conn, 46)
        applied += 1
    # Migration 47: Add statistics indexes for time-windowed scans
    if version < 47:
        logger.info("Applying migration 47: add statistics indexes")
        await _migrate_047_add_statistics_indexes(conn)
        await set_version(conn, 47)
        applied += 1
    if applied > 0:
        logger.info(
            "Applied %d migration(s), schema now at version %d", applied, await get_version(conn)
@@ -2868,3 +2875,37 @@ async def _migrate_046_cleanup_orphaned_contact_child_rows(conn: aiosqlite.Conne
    )
    await conn.commit()

async def _migrate_047_add_statistics_indexes(conn: aiosqlite.Connection) -> None:
    """Add indexes used by the statistics endpoint's time-windowed scans."""
    cursor = await conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
    tables = {row[0] for row in await cursor.fetchall()}
    if "raw_packets" in tables:
        cursor = await conn.execute("PRAGMA table_info(raw_packets)")
        raw_packet_columns = {row[1] for row in await cursor.fetchall()}
        if "timestamp" in raw_packet_columns:
            await conn.execute(
                "CREATE INDEX IF NOT EXISTS idx_raw_packets_timestamp ON raw_packets(timestamp)"
            )
    if "contacts" in tables:
        cursor = await conn.execute("PRAGMA table_info(contacts)")
        contact_columns = {row[1] for row in await cursor.fetchall()}
        if {"type", "last_seen"}.issubset(contact_columns):
            await conn.execute(
                "CREATE INDEX IF NOT EXISTS idx_contacts_type_last_seen ON contacts(type, last_seen)"
            )
    if "messages" in tables:
        cursor = await conn.execute("PRAGMA table_info(messages)")
        message_columns = {row[1] for row in await cursor.fetchall()}
        if {"type", "received_at", "conversation_key"}.issubset(message_columns):
            await conn.execute(
                """
                CREATE INDEX IF NOT EXISTS idx_messages_type_received_conversation
                ON messages(type, received_at, conversation_key)
                """
            )
    await conn.commit()
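The table/column guards in the migration make it a no-op on databases that never created a given table, rather than raising `OperationalError`. The same defensive pattern, sketched synchronously with the stdlib `sqlite3` module (the helper name `add_index_if_possible` is hypothetical, not from the codebase):

```python
import sqlite3

def add_index_if_possible(conn, table, columns, index_sql):
    """Create an index only if the target table and all its columns exist."""
    tables = {r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")}
    if table not in tables:
        return False
    cols = {r[1] for r in conn.execute(f"PRAGMA table_info({table})")}
    if not set(columns).issubset(cols):
        return False
    conn.execute(index_sql)
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (public_key TEXT PRIMARY KEY, type INTEGER, last_seen INTEGER)")

# contacts exists with the right columns -> index is created
created = add_index_if_possible(
    conn, "contacts", ["type", "last_seen"],
    "CREATE INDEX IF NOT EXISTS idx_contacts_type_last_seen ON contacts(type, last_seen)",
)
# raw_packets does not exist -> silently skipped, no OperationalError
skipped = add_index_if_possible(
    conn, "raw_packets", ["timestamp"],
    "CREATE INDEX IF NOT EXISTS idx_raw_packets_timestamp ON raw_packets(timestamp)",
)
```

Combined with `CREATE INDEX IF NOT EXISTS`, this keeps the migration idempotent for fresh databases whose schema already includes the indexes.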


@@ -404,7 +404,7 @@ Do not rely on old class-only layout assumptions.
Run all quality checks (backend + frontend) from the repo root:
```bash
./scripts/all_quality.sh
./scripts/quality/all_quality.sh
```
Or run frontend checks individually:


@@ -2,10 +2,10 @@
set -euo pipefail
# Collect third-party license texts into LICENSES.md
# Usage: scripts/collect_licenses.sh [output-path]
# Usage: scripts/build/collect_licenses.sh [output-path]
# output-path defaults to LICENSES.md at the repo root
REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
OUT="${1:-$REPO_ROOT/LICENSES.md}"
FRONTEND_LICENSE_IMAGE="${FRONTEND_LICENSE_IMAGE:-node:20-slim}"
FRONTEND_LICENSE_NPM="${FRONTEND_LICENSE_NPM:-10.9.5}"
@@ -59,7 +59,7 @@ for d in data:
# ── Frontend (npm) ───────────────────────────────────────────────────
frontend_licenses_local() {
cd "$REPO_ROOT/frontend"
node "$REPO_ROOT/scripts/print_frontend_licenses.cjs"
node "$REPO_ROOT/scripts/build/print_frontend_licenses.cjs"
}
frontend_licenses_docker() {
@@ -73,7 +73,7 @@ frontend_licenses_docker() {
cd frontend
npm i -g npm@$FRONTEND_LICENSE_NPM >/dev/null
npm ci --ignore-scripts >/dev/null
node /src/scripts/print_frontend_licenses.cjs
node /src/scripts/build/print_frontend_licenses.cjs
"
}
@@ -85,7 +85,7 @@ frontend_licenses() {
{
echo "# Third-Party Licenses"
echo
echo "Auto-generated by \`scripts/collect_licenses.sh\` — do not edit by hand."
echo "Auto-generated by \`scripts/build/collect_licenses.sh\` — do not edit by hand."
echo
echo "## Backend (Python) Dependencies"
echo

scripts/publish.sh → scripts/build/publish.sh Executable file → Normal file

@@ -7,8 +7,8 @@ GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
cd "$SCRIPT_DIR"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
cd "$REPO_ROOT"
RELEASE_WORK_DIR=""
RELEASE_BUNDLE_DIR_NAME="Remote-Terminal-for-MeshCore"
@@ -17,14 +17,14 @@ DOCKER_IMAGE="jkingsman/remoteterm-meshcore"
DOCKER_PLATFORMS="linux/amd64,linux/arm64"
cleanup_release_build_artifacts() {
if [ -d "$SCRIPT_DIR/frontend/prebuilt" ]; then
rm -rf "$SCRIPT_DIR/frontend/prebuilt"
if [ -d "$REPO_ROOT/frontend/prebuilt" ]; then
rm -rf "$REPO_ROOT/frontend/prebuilt"
fi
if [ -n "$RELEASE_WORK_DIR" ] && [ -d "$RELEASE_WORK_DIR" ]; then
rm -rf "$RELEASE_WORK_DIR"
fi
if [ -n "$RELEASE_ASSET" ] && [ -f "$SCRIPT_DIR/$RELEASE_ASSET" ]; then
rm -f "$SCRIPT_DIR/$RELEASE_ASSET"
if [ -n "$RELEASE_ASSET" ] && [ -f "$REPO_ROOT/$RELEASE_ASSET" ]; then
rm -f "$REPO_ROOT/$RELEASE_ASSET"
fi
}
@@ -78,7 +78,7 @@ echo
# Run frontend linting and formatting check
echo -e "${YELLOW}Running frontend lint (ESLint)...${NC}"
cd "$SCRIPT_DIR/frontend"
cd "$REPO_ROOT/frontend"
npm run lint
echo -e "${GREEN}Frontend lint passed!${NC}"
echo
@@ -97,11 +97,11 @@ echo
echo -e "${YELLOW}Building frontend...${NC}"
npm run build
echo -e "${GREEN}Frontend build complete!${NC}"
cd "$SCRIPT_DIR"
cd "$REPO_ROOT"
echo
echo -e "${YELLOW}Regenerating LICENSES.md...${NC}"
bash scripts/collect_licenses.sh LICENSES.md
bash scripts/build/collect_licenses.sh LICENSES.md
echo -e "${GREEN}LICENSES.md updated!${NC}"
echo
@@ -202,16 +202,16 @@ FULL_GIT_HASH=$(git rev-parse HEAD)
RELEASE_ASSET="remoteterm-prebuilt-frontend-v${VERSION}-${GIT_HASH}.zip"
echo -e "${YELLOW}Building packaged frontend artifact...${NC}"
cd "$SCRIPT_DIR/frontend"
cd "$REPO_ROOT/frontend"
npm run packaged-build
cd "$SCRIPT_DIR"
cd "$REPO_ROOT"
RELEASE_WORK_DIR=$(mktemp -d)
RELEASE_BUNDLE_DIR="$RELEASE_WORK_DIR/$RELEASE_BUNDLE_DIR_NAME"
mkdir -p "$RELEASE_BUNDLE_DIR"
git archive "$FULL_GIT_HASH" | tar -x -C "$RELEASE_BUNDLE_DIR"
mkdir -p "$RELEASE_BUNDLE_DIR/frontend"
cp -R "$SCRIPT_DIR/frontend/prebuilt" "$RELEASE_BUNDLE_DIR/frontend/prebuilt"
cp -R "$REPO_ROOT/frontend/prebuilt" "$RELEASE_BUNDLE_DIR/frontend/prebuilt"
cat > "$RELEASE_BUNDLE_DIR/build_info.json" <<EOF
{
"version": "$VERSION",
@@ -219,10 +219,10 @@ cat > "$RELEASE_BUNDLE_DIR/build_info.json" <<EOF
"build_source": "prebuilt-release"
}
EOF
rm -f "$SCRIPT_DIR/$RELEASE_ASSET"
rm -f "$REPO_ROOT/$RELEASE_ASSET"
(
cd "$RELEASE_WORK_DIR"
zip -qr "$SCRIPT_DIR/$RELEASE_ASSET" "$(basename "$RELEASE_BUNDLE_DIR")"
zip -qr "$REPO_ROOT/$RELEASE_ASSET" "$(basename "$RELEASE_BUNDLE_DIR")"
)
echo -e "${GREEN}Packaged release artifact created: $RELEASE_ASSET${NC}"
echo


@@ -14,7 +14,7 @@ YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
echo -e "${YELLOW}=== RemoteTerm Quality Checks ===${NC}"
echo
@@ -24,13 +24,13 @@ echo
echo -e "${YELLOW}=== Phase 1: Lint & Format ===${NC}"
echo -e "${BLUE}[backend lint]${NC} Running ruff check + format..."
cd "$SCRIPT_DIR"
cd "$REPO_ROOT"
uv run ruff check app/ tests/ --fix
uv run ruff format app/ tests/
echo -e "${GREEN}[backend lint]${NC} Passed!"
echo -e "${BLUE}[frontend lint]${NC} Running eslint + prettier..."
cd "$SCRIPT_DIR/frontend"
cd "$REPO_ROOT/frontend"
npm run lint:fix
npm run format
echo -e "${GREEN}[frontend lint]${NC} Passed!"
@@ -43,17 +43,17 @@ echo
echo -e "${YELLOW}=== Phase 2: Typecheck, Tests & Build ===${NC}"
echo -e "${BLUE}[pyright]${NC} Running type check..."
cd "$SCRIPT_DIR"
cd "$REPO_ROOT"
uv run pyright app/
echo -e "${GREEN}[pyright]${NC} Passed!"
echo -e "${BLUE}[pytest]${NC} Running backend tests..."
cd "$SCRIPT_DIR"
cd "$REPO_ROOT"
PYTHONPATH=. uv run pytest tests/ -v
echo -e "${GREEN}[pytest]${NC} Passed!"
echo -e "${BLUE}[frontend]${NC} Running tests + build..."
cd "$SCRIPT_DIR/frontend"
cd "$REPO_ROOT/frontend"
npm run test:run
npm run build
echo -e "${GREEN}[frontend]${NC} Passed!"

scripts/docker_ci.sh → scripts/quality/docker_ci.sh Executable file → Normal file

@@ -7,7 +7,7 @@ YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
NODE_VERSIONS=("20" "22" "24")
# Use explicit npm patch versions so resolver regressions are caught.
@@ -27,7 +27,7 @@ run_combo() {
local image="node:${node_version}-slim"
docker run --rm \
-v "$SCRIPT_DIR:/src:ro" \
-v "$REPO_ROOT:/src:ro" \
-w /tmp \
"$image" \
bash -lc "
@@ -79,7 +79,7 @@ cleanup() {
trap cleanup EXIT
echo -e "${YELLOW}=== Frontend Docker CI Matrix ===${NC}"
echo -e "${BLUE}Repo:${NC} $SCRIPT_DIR"
echo -e "${BLUE}Repo:${NC} $REPO_ROOT"
echo
for case_spec in "${TEST_CASES[@]}"; do

scripts/e2e.sh → scripts/quality/e2e.sh Executable file → Normal file

@@ -1,8 +1,8 @@
#!/usr/bin/env bash
set -e
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
echo "Starting E2E tests..."
cd "$SCRIPT_DIR/tests/e2e"
cd "$REPO_ROOT/tests/e2e"
npx playwright test "$@"


@@ -6,23 +6,23 @@ GREEN='\033[0;32m'
BLUE='\033[0;34m'
NC='\033[0m'
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
REPO_ROOT="$(cd "$(dirname "$0")/../.." && pwd)"
echo -e "${YELLOW}=== Extended Quality Checks ===${NC}"
echo
echo -e "${BLUE}[all_quality]${NC} Running full lint, typecheck, unit tests, and the standard frontend build..."
"$SCRIPT_DIR/scripts/all_quality.sh"
"$REPO_ROOT/scripts/quality/all_quality.sh"
echo -e "${GREEN}[all_quality]${NC} Passed!"
echo
echo -e "${BLUE}[e2e]${NC} Running end-to-end tests..."
"$SCRIPT_DIR/scripts/e2e.sh" "$@"
"$REPO_ROOT/scripts/quality/e2e.sh" "$@"
echo -e "${GREEN}[e2e]${NC} Passed!"
echo
echo -e "${BLUE}[docker_ci]${NC} Running Docker frontend install/build matrix..."
"$SCRIPT_DIR/scripts/docker_ci.sh"
"$REPO_ROOT/scripts/quality/docker_ci.sh"
echo -e "${GREEN}[docker_ci]${NC} Passed!"
echo


@@ -21,7 +21,8 @@ API_URL = f"https://api.github.com/repos/{REPO}/releases/latest"
PREBUILT_PREFIX = "Remote-Terminal-for-MeshCore/frontend/prebuilt/"
SCRIPT_DIR = Path(__file__).resolve().parent
PREBUILT_DIR = SCRIPT_DIR.parent / "frontend" / "prebuilt"
REPO_ROOT = SCRIPT_DIR.parent.parent
PREBUILT_DIR = REPO_ROOT / "frontend" / "prebuilt"
def fetch_json(url: str) -> dict:
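Because the script moved one directory deeper (`scripts/` → `scripts/setup/`), the old `SCRIPT_DIR.parent` now lands in `scripts/` rather than the repo root, which is why the chain gains a `.parent`. A minimal sketch of the resolution (the absolute path here is hypothetical):

```python
from pathlib import Path

# Hypothetical location of the script after the move into scripts/setup/
script = Path("/srv/remoteterm/scripts/setup/fetch_prebuilt_frontend.py")

script_dir = script.parent            # /srv/remoteterm/scripts/setup
old_guess = script_dir.parent         # /srv/remoteterm/scripts  (no longer the repo root)
repo_root = script_dir.parent.parent  # /srv/remoteterm          (correct)

prebuilt_dir = repo_root / "frontend" / "prebuilt"
print(prebuilt_dir)
```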


@@ -7,7 +7,7 @@
# gymnastics.
#
# Run from anywhere inside the repo:
# bash scripts/install_service.sh
# bash scripts/setup/install_service.sh
set -e
@@ -19,7 +19,7 @@ BOLD='\033[1m'
NC='\033[0m'
SERVICE_NAME="remoteterm"
REPO_DIR="$(cd "$(dirname "$0")/.." && pwd)"
REPO_DIR="$(cd "$(dirname "$0")/../.." && pwd)"
CURRENT_USER="$(id -un)"
SERVICE_FILE="/etc/systemd/system/${SERVICE_NAME}.service"
FRONTEND_MODE="build"
@@ -252,7 +252,7 @@ if [ "$FRONTEND_MODE" = "build" ]; then
)
else
echo -e "${YELLOW}Fetching prebuilt frontend...${NC}"
python3 "$REPO_DIR/scripts/fetch_prebuilt_frontend.py"
python3 "$REPO_DIR/scripts/setup/fetch_prebuilt_frontend.py"
fi
echo
@@ -402,7 +402,7 @@ echo -e " cd frontend && npm install && npm run build && cd .."
echo -e " sudo systemctl restart ${SERVICE_NAME}"
echo
echo -e "${YELLOW}Refresh prebuilt frontend only (skips local build):${NC}"
echo -e " python3 ${REPO_DIR}/scripts/fetch_prebuilt_frontend.py"
echo -e " python3 ${REPO_DIR}/scripts/setup/fetch_prebuilt_frontend.py"
echo -e " sudo systemctl restart ${SERVICE_NAME}"
echo
echo -e "${YELLOW}View live logs (useful for troubleshooting):${NC}"


@@ -1247,8 +1247,8 @@ class TestMigration039:
        applied = await run_migrations(conn)
        assert applied == 8
        assert await get_version(conn) == 46
        assert applied == 9
        assert await get_version(conn) == 47
        cursor = await conn.execute(
            """
@@ -1319,8 +1319,8 @@ class TestMigration039:
        applied = await run_migrations(conn)
        assert applied == 8
        assert await get_version(conn) == 46
        assert applied == 9
        assert await get_version(conn) == 47
        cursor = await conn.execute(
            """
@@ -1386,8 +1386,8 @@ class TestMigration039:
        applied = await run_migrations(conn)
        assert applied == 2
        assert await get_version(conn) == 46
        assert applied == 3
        assert await get_version(conn) == 47
        cursor = await conn.execute(
            """
@@ -1439,8 +1439,8 @@ class TestMigration040:
        applied = await run_migrations(conn)
        assert applied == 7
        assert await get_version(conn) == 46
        assert applied == 8
        assert await get_version(conn) == 47
        await conn.execute(
            """
@@ -1501,8 +1501,8 @@ class TestMigration041:
        applied = await run_migrations(conn)
        assert applied == 6
        assert await get_version(conn) == 46
        assert applied == 7
        assert await get_version(conn) == 47
        await conn.execute(
            """
@@ -1554,8 +1554,8 @@ class TestMigration042:
        applied = await run_migrations(conn)
        assert applied == 5
        assert await get_version(conn) == 46
        assert applied == 6
        assert await get_version(conn) == 47
        await conn.execute(
            """
@@ -1694,8 +1694,8 @@ class TestMigration046:
        applied = await run_migrations(conn)
        assert applied == 1
        assert await get_version(conn) == 46
        assert applied == 2
        assert await get_version(conn) == 47
        cursor = await conn.execute(
            """
@@ -1750,6 +1750,70 @@ class TestMigration046:
await conn.close()
class TestMigration047:
    """Test migration 047: add statistics indexes."""

    @pytest.mark.asyncio
    async def test_adds_statistics_indexes(self):
        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row
        try:
            await set_version(conn, 46)
            await conn.execute("""
                CREATE TABLE contacts (
                    public_key TEXT PRIMARY KEY,
                    name TEXT,
                    type INTEGER DEFAULT 0,
                    last_seen INTEGER
                )
            """)
            await conn.execute("""
                CREATE TABLE messages (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    type TEXT NOT NULL,
                    conversation_key TEXT NOT NULL,
                    received_at INTEGER NOT NULL
                )
            """)
            await conn.execute("""
                CREATE TABLE raw_packets (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    timestamp INTEGER NOT NULL,
                    data BLOB NOT NULL,
                    message_id INTEGER,
                    payload_hash BLOB
                )
            """)
            await conn.commit()

            applied = await run_migrations(conn)
            assert applied == 1
            assert await get_version(conn) == 47

            cursor = await conn.execute(
                """
                SELECT name
                FROM sqlite_master
                WHERE type = 'index'
                  AND name IN (
                      'idx_raw_packets_timestamp',
                      'idx_contacts_type_last_seen',
                      'idx_messages_type_received_conversation'
                  )
                ORDER BY name
                """
            )
            rows = await cursor.fetchall()
            assert [row["name"] for row in rows] == [
                "idx_contacts_type_last_seen",
                "idx_messages_type_received_conversation",
                "idx_raw_packets_timestamp",
            ]
        finally:
            await conn.close()

class TestMigrationPacketHelpers:
    """Test migration-local packet helpers against canonical path validation."""