mirror of
https://github.com/jkingsman/Remote-Terminal-for-MeshCore.git
synced 2026-05-11 12:00:28 +02:00
Compare commits
21 commits (author and date columns are empty in this mirror):

- `9ab4e7a9b0`
- `af76546287`
- `31bd4a0744`
- `1db724073b`
- `4783da8f3e`
- `4b69ec4519`
- `8efbbd97bd`
- `1437e8e48a`
- `5cd8f7e80f`
- `e8c50d0b2a`
- `7f3bb89323`
- `5bfdd0880e`
- `0e9bd59b44`
- `b1cd6e1aa9`
- `56fc589e0b`
- `64502c4ca2`
- `d1f657342a`
- `86a0ac7beb`
- `3b7e2737ee`
- `01158ac69f`
- `485df05372`
```diff
@@ -197,6 +197,7 @@ This message-layer echo/path handling is independent of raw-packet storage dedup
 │ ├── event_handlers.py # Radio events
 │ ├── decoder.py # Packet decryption
 │ ├── websocket.py # Real-time broadcasts
+│ ├── push/ # Web Push notification subsystem (VAPID keys, dispatch, send)
 │ └── fanout/ # Fanout bus: MQTT, bots, webhooks, Apprise, SQS (see fanout/AGENTS_fanout.md)
 ├── frontend/ # React frontend
 │ ├── AGENTS.md # Frontend documentation
```
```diff
@@ -380,6 +381,12 @@ All endpoints are prefixed with `/api` (e.g., `/api/health`).
 | DELETE | `/api/fanout/{id}` | Delete fanout config (stops module) |
 | POST | `/api/fanout/bots/disable-until-restart` | Stop bot fanout modules and keep bots disabled until the process restarts |
 | GET | `/api/statistics` | Aggregated mesh network statistics |
+| GET | `/api/push/vapid-public-key` | VAPID public key for browser push subscription |
+| POST | `/api/push/subscribe` | Register/upsert a push subscription |
+| GET | `/api/push/subscriptions` | List all push subscriptions |
+| PATCH | `/api/push/subscriptions/{id}` | Update subscription label or filter preferences |
+| DELETE | `/api/push/subscriptions/{id}` | Delete a push subscription |
+| POST | `/api/push/subscriptions/{id}/test` | Send a test push notification |
 | WS | `/api/ws` | Real-time updates |
 
 ## Key Concepts
```
```diff
@@ -434,6 +441,17 @@ All external integrations are managed through the fanout bus (`app/fanout/`). Ea
 
 Community MQTT forwards raw packets only. Its derived `path` field, when present on direct packets, is a comma-separated list of hop identifiers as reported by the packet format. Token width therefore varies with the packet's path hash mode; it is intentionally not a flat per-byte rendering.
 
+### Web Push Notifications
+
+Web Push is a standalone subsystem (`app/push/`) that sends browser push notifications for incoming messages even when the browser tab is closed. It is **not** a fanout module — it manages its own per-browser subscriptions, while the set of push-enabled conversations is stored once per server instance.
+
+- **Requires HTTPS** (self-signed certificates work) and outbound internet from the server to reach browser push services (Google FCM, Mozilla autopush).
+- VAPID key pair is auto-generated on first startup and stored in `app_settings`.
+- Each browser subscription is stored in `push_subscriptions` with device identity and delivery state. The set of push-enabled conversations is stored globally in `app_settings.push_conversations`, so all subscribed browsers receive the same configured rooms/DMs.
+- `broadcast_event()` in `websocket.py` dispatches to `push_manager.dispatch_message()` alongside fanout for `message` events.
+- Expired subscriptions (HTTP 404/410 from push service) are auto-deleted.
+- Frontend: service worker (`sw.js`) handles push display and notification click navigation. The `BellRing` icon in `ChatHeader` toggles per-conversation push. Device management lives in Settings > Local.
+
 ### Server-Side Decryption
 
 The server can decrypt packets using stored keys, both in real-time and for historical packets.
```
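The comma-separated hop rendering described for the Community MQTT `path` field can be sketched in Python. The byte width per hash mode is a hypothetical mapping here (the real one lives in `app/path_utils.py`); the point is only that token width tracks the packet's path hash mode rather than being a flat per-byte hex dump:

```python
# Assumed mapping for illustration: hash mode -> bytes per hop identifier.
HASH_MODE_TOKEN_BYTES = {0: 1, 1: 2}

def render_path(path: bytes, hash_mode: int) -> str:
    """Render raw path bytes as comma-separated hop identifiers."""
    width = HASH_MODE_TOKEN_BYTES.get(hash_mode, 1)
    tokens = [path[i:i + width].hex() for i in range(0, len(path), width)]
    return ",".join(tokens)

print(render_path(bytes.fromhex("a1b2c3d4"), 0))  # a1,b2,c3,d4
print(render_path(bytes.fromhex("a1b2c3d4"), 1))  # a1b2,c3d4
```

The same four path bytes render as four one-byte tokens or two two-byte tokens depending on the mode, which is why consumers cannot assume a fixed token width.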
```diff
@@ -1,3 +1,8 @@
+## [3.11.3] - 2026-04-12
+
+* Bugfix: Add icons and screenshots for webmanifest
+* Bugfix: Use incoming DMs, not just outgoing, for recency ranking for preferential radio contact load
+
 ## [3.11.2] - 2026-04-12
 
 * Feature: Unread DMs are always at the top of the DM list no matter what
```
```diff
@@ -50,6 +50,10 @@ app/
 ├── events.py # Typed WS event payload serialization
 ├── websocket.py # WS manager + broadcast helpers
 ├── security.py # Optional app-wide HTTP Basic auth middleware for HTTP + WS
+├── push/ # Web Push notification subsystem
+│ ├── vapid.py # VAPID key generation, storage, caching
+│ ├── send.py # pywebpush wrapper (async via thread executor)
+│ └── manager.py # Push dispatch: filter, build payload, concurrent send
 ├── fanout/ # Fanout bus: MQTT, bots, webhooks, Apprise, SQS (see fanout/AGENTS_fanout.md)
 ├── dependencies.py # Shared FastAPI dependency providers
 ├── path_utils.py # Path hex rendering and hop-width helpers
```
````diff
@@ -71,6 +75,7 @@ app/
 ├── fanout.py
 ├── repeaters.py
 ├── statistics.py
+├── push.py
 └── ws.py
 ```
 
````
```diff
@@ -168,6 +173,17 @@ app/
 - Community MQTT publishes raw packets only, but its derived `path` field for direct packets is emitted as comma-separated hop identifiers, not flat path bytes.
 - See `app/fanout/AGENTS_fanout.md` for full architecture details and event payload shapes.
+
+### Web Push notifications
+
+Web Push is a standalone subsystem in `app/push/`, separate from the fanout module system. It sends browser push notifications for incoming messages even when the tab is closed.
+
+- **Not a fanout module** — Web Push manages per-browser subscriptions (N browsers, each with its own endpoint and delivery state), unlike fanout which is one-config-to-one-destination.
+- **VAPID keys**: auto-generated P-256 key pair on first startup, stored in `app_settings.vapid_private_key` / `vapid_public_key`. Cached in-module by `app/push/vapid.py`.
+- **Dispatch**: `broadcast_event()` in `websocket.py` fires `push_manager.dispatch_message(data)` alongside fanout for `message` events. The manager checks the global `app_settings.push_conversations` list, then sends to all currently registered subscriptions via `pywebpush` (run in a thread executor).
+- **Stale cleanup**: HTTP 404/410 from the push service triggers immediate subscription deletion.
+- **Subscriptions stored** in `push_subscriptions` table with `UNIQUE(endpoint)` for upsert semantics.
+- Requires HTTPS (self-signed OK) and outbound internet to reach browser push services.
 
 ## API Surface (all under `/api`)
 
 ### Health
```
```diff
@@ -258,6 +274,14 @@ app/
 ### Statistics
 - `GET /statistics` — aggregated mesh network stats (entity counts, message/packet splits, activity windows, busiest channels)
+
+### Push
+- `GET /push/vapid-public-key` — VAPID public key for browser `PushManager.subscribe()`
+- `POST /push/subscribe` — register/upsert push subscription (keyed by endpoint URL)
+- `GET /push/subscriptions` — list all push subscriptions
+- `PATCH /push/subscriptions/{id}` — update label or filter preferences
+- `DELETE /push/subscriptions/{id}` — delete subscription
+- `POST /push/subscriptions/{id}/test` — send test notification
 
 ### WebSocket
 - `WS /ws`
```
```diff
@@ -290,7 +314,8 @@ Main tables:
 - `contact_name_history` (tracks name changes over time)
 - `repeater_telemetry_history` (time-series telemetry snapshots for tracked repeaters)
 - `fanout_configs` (MQTT, bot, webhook, Apprise, SQS integration configs)
-- `app_settings`
+- `push_subscriptions` (Web Push browser subscriptions with delivery metadata; UNIQUE on endpoint)
+- `app_settings` (includes `vapid_private_key` and `vapid_public_key` for Web Push VAPID signing)
 
 Contact route state is canonicalized on the backend:
 - stored route inputs: `direct_path`, `direct_path_len`, `direct_path_hash_mode`, `direct_path_updated_at`, plus optional `route_override_*`
```
```diff
@@ -1,4 +1,7 @@
+import asyncio
 import logging
+from collections.abc import AsyncIterator
+from contextlib import asynccontextmanager
 from pathlib import Path
 
 import aiosqlite
```
```diff
@@ -108,7 +111,8 @@ CREATE TABLE IF NOT EXISTS app_settings (
     blocked_names TEXT DEFAULT '[]',
     discovery_blocked_types TEXT DEFAULT '[]',
     tracked_telemetry_repeaters TEXT DEFAULT '[]',
-    auto_resend_channel INTEGER DEFAULT 0
+    auto_resend_channel INTEGER DEFAULT 0,
+    telemetry_interval_hours INTEGER DEFAULT 8
 );
 INSERT OR IGNORE INTO app_settings (id) VALUES (1);
 
```
```diff
@@ -164,9 +168,74 @@ CREATE INDEX IF NOT EXISTS idx_repeater_telemetry_pk_ts
 
 
 class Database:
+    """Single-connection aiosqlite wrapper with coroutine-level serialization.
+
+    Why the lock: aiosqlite runs one ``sqlite3.Connection`` on a background
+    worker thread and serializes statement execution there. But SQLite's
+    ``COMMIT`` fails with ``OperationalError: cannot commit transaction -
+    SQL statements in progress`` whenever *any* cursor on the connection has
+    a live prepared statement (a ``SELECT`` that returned ``SQLITE_ROW`` but
+    hasn't been fully consumed or closed). Under concurrent coroutines, one
+    task's in-flight ``fetchone()`` can still be in ``SQLITE_ROW`` state when
+    another task's ``commit()`` runs on the worker — triggering the error.
+
+    Fix: all DB work goes through ``tx()`` (writes) or ``readonly()`` (reads),
+    both of which acquire ``self._lock``. The lock is non-reentrant (asyncio
+    default) by design — nested ``tx()`` calls are a bug. Repository methods
+    that compose multiple operations factor the raw SQL into private helpers
+    that take a ``conn`` and don't lock; the public method acquires the lock
+    once and calls those helpers.
+
+    Why reads are also locked: a read in ``SQLITE_ROW`` state is precisely
+    the live statement that breaks a concurrent writer's commit.
+    Single-connection aiosqlite cannot safely overlap reads and writes. If
+    we ever split reader/writer connections in the future, ``readonly()``
+    becomes the seam to point at the reader pool.
+    """
+
     def __init__(self, db_path: str):
         self.db_path = db_path
         self._connection: aiosqlite.Connection | None = None
+        self._lock = asyncio.Lock()
+
+    @asynccontextmanager
+    async def tx(self) -> AsyncIterator[aiosqlite.Connection]:
+        """Acquire the connection for a write transaction.
+
+        Commits on clean exit, rolls back on exception. Callers MUST close
+        every cursor opened inside the block (use ``async with conn.execute(...)
+        as cursor:``) so no prepared statement is alive when commit runs.
+
+        The lock serializes concurrent writers AND ensures no reader's cursor
+        is alive during the commit. Nested calls will deadlock — factor shared
+        SQL into helpers that accept ``conn`` and do not re-enter ``tx()``.
+        """
+        async with self._lock:
+            if self._connection is None:
+                raise RuntimeError("Database not connected")
+            conn = self._connection
+            try:
+                yield conn
+            except BaseException:
+                await conn.rollback()
+                raise
+            else:
+                await conn.commit()
+
+    @asynccontextmanager
+    async def readonly(self) -> AsyncIterator[aiosqlite.Connection]:
+        """Acquire the connection for a read. No commit, no rollback.
+
+        Locked for the same reason writes are: on a single connection, an
+        active read statement blocks a concurrent writer's commit. Callers
+        MUST fully consume or close cursors before the block exits (use
+        ``async with conn.execute(...) as cursor:`` + ``fetchall`` /
+        ``fetchone``; avoid holding a cursor across ``await`` on other IO).
+        """
+        async with self._lock:
+            if self._connection is None:
+                raise RuntimeError("Database not connected")
+            yield self._connection
+
     async def connect(self) -> None:
         logger.info("Connecting to database at %s", self.db_path)
```
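The serialization contract in the hunk above (one connection, one `asyncio.Lock`, commit only when no other coroutine holds a statement open) can be demonstrated with a dependency-free miniature. The stdlib synchronous `sqlite3` module stands in for aiosqlite here, so this is a sketch of the locking pattern, not the project's actual class:

```python
import asyncio
import sqlite3
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

class MiniDatabase:
    """Toy stand-in for the Database wrapper: one sqlite3 connection,
    one asyncio.Lock serializing every transaction."""

    def __init__(self) -> None:
        self._conn = sqlite3.connect(":memory:")
        self._lock = asyncio.Lock()

    @asynccontextmanager
    async def tx(self) -> AsyncIterator[sqlite3.Connection]:
        # Commit on clean exit, roll back on exception — same shape as tx().
        async with self._lock:
            try:
                yield self._conn
            except BaseException:
                self._conn.rollback()
                raise
            else:
                self._conn.commit()

async def main() -> list[tuple[int]]:
    db = MiniDatabase()
    async with db.tx() as conn:
        conn.execute("CREATE TABLE t (n INTEGER)")

    async def insert(n: int) -> None:
        # Concurrent writers queue on the lock, so no commit ever runs
        # while another coroutine's statement is still in flight.
        async with db.tx() as conn:
            conn.execute("INSERT INTO t VALUES (?)", (n,))

    await asyncio.gather(*(insert(n) for n in range(5)))
    async with db.tx() as conn:
        return sorted(conn.execute("SELECT n FROM t").fetchall())

rows = asyncio.run(main())
print(rows)  # [(0,), (1,), (2,), (3,), (4,)]
```

Nesting `tx()` inside `tx()` in this miniature deadlocks for the same reason the real docstring warns about: the lock is non-reentrant.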
```diff
@@ -237,7 +237,9 @@ async def on_new_contact(event: "Event") -> None:
     logger.debug("New contact: %s", public_key[:12])
 
     contact_upsert = ContactUpsert.from_radio_dict(public_key.lower(), payload, on_radio=False)
-    contact_upsert.last_seen = int(time.time())
+    # Intentionally do not set last_seen here: NEW_CONTACT fires from the
+    # radio's stored contact DB, not an RF observation. last_seen means
+    # "last time we heard this pubkey on RF".
     await ContactRepository.upsert(contact_upsert)
     promoted_keys = await promote_prefix_contacts_for_contact(
         public_key=public_key,
```
```diff
@@ -144,8 +144,8 @@ Amazon SQS delivery. Config blob:
 - Supports both decoded messages and raw packets via normal scope selection
 
 ### map_upload (map_upload.py)
-Uploads heard repeater and room-server advertisements to map.meshcore.dev. Config blob:
-- `api_url` (optional, default `""`) — upload endpoint; empty falls back to the public map.meshcore.dev API
+Uploads heard repeater and room-server advertisements to map.meshcore.io. Config blob:
+- `api_url` (optional, default `""`) — upload endpoint; empty falls back to the public map.meshcore.io API
 - `dry_run` (bool, default `true`) — when true, logs the payload at INFO level without sending
 - `geofence_enabled` (bool, default `false`) — when true, only uploads nodes within `geofence_radius_km` of the radio's own configured lat/lon
 - `geofence_radius_km` (float, default `0`) — filter radius in kilometres
```
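The geofence options above pair `geofence_radius_km` with a great-circle distance check against the radio's own lat/lon. The module's exact formula is not shown in this mirror; a standard haversine filter is one plausible implementation:

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_geofence(node: tuple[float, float], own: tuple[float, float], radius_km: float) -> bool:
    """True when the node lies inside the configured radius of our radio."""
    return haversine_km(node[0], node[1], own[0], own[1]) <= radius_km

# One degree of latitude is roughly 111 km:
print(within_geofence((1.0, 0.0), (0.0, 0.0), 120.0))  # True
print(within_geofence((2.0, 0.0), (0.0, 0.0), 120.0))  # False
```

Nodes with no valid location are skipped before this check ever runs, per the module docstring.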
```diff
@@ -1,6 +1,7 @@
-"""Fanout module for uploading heard advert packets to map.meshcore.dev.
+"""Fanout module for uploading heard advert packets to map.meshcore.io.
 
-Mirrors the logic of the standalone map.meshcore.dev-uploader project:
+Mirrors the logic of the standalone map.meshcore.dev-uploader project
+(historical name; the live service is now hosted at map.meshcore.io):
 - Listens on raw RF packets via on_raw
 - Filters for ADVERT packets, only processes repeaters (role 2) and rooms (role 3)
 - Skips nodes with no valid location (lat/lon None)
```
```diff
@@ -16,7 +17,7 @@ the raw hex link.
 Config keys
 -----------
 api_url : str, default ""
-    Upload endpoint. Empty string falls back to the public map.meshcore.dev API.
+    Upload endpoint. Empty string falls back to the public map.meshcore.io API.
 dry_run : bool, default True
     When True, log the payload at INFO level instead of sending it.
 geofence_enabled : bool, default False
```
```diff
@@ -46,7 +47,7 @@ from app.services.radio_runtime import radio_runtime
 
 logger = logging.getLogger(__name__)
 
-_DEFAULT_API_URL = "https://map.meshcore.dev/api/v1/uploader/node"
+_DEFAULT_API_URL = "https://map.meshcore.io/api/v1/uploader/node"
 
 # Re-upload guard: skip re-uploading a pubkey seen within this window (AU parity)
 _REUPLOAD_SECONDS = 3600
```
```diff
@@ -135,7 +135,34 @@ def register_frontend_static_routes(app: FastAPI, frontend_dir: Path) -> bool:
         "display_override": ["window-controls-overlay", "standalone", "fullscreen"],
         "theme_color": "#111419",
         "background_color": "#111419",
+        # Icons are PNG-only on purpose. iOS Safari's manifest parser has
+        # historically been unreliable with SVG icons, and Android/Chrome
+        # PWA install flows prefer PNG for the install prompt.
+        #
+        # The "any" purpose entries are what iOS and desktop Chrome use
+        # for the home-screen / install icon. "maskable" entries are
+        # Android-only (adaptive icon with safe-zone crop); iOS does not
+        # apply the safe-zone mask, so a maskable-only icon set would
+        # render with excessive padding.
         "icons": [
+            {
+                "src": f"{base}favicon-96x96.png",
+                "sizes": "96x96",
+                "type": "image/png",
+                "purpose": "any",
+            },
+            {
+                "src": f"{base}apple-touch-icon.png",
+                "sizes": "180x180",
+                "type": "image/png",
+                "purpose": "any",
+            },
+            {
+                "src": f"{base}favicon-256x256.png",
+                "sizes": "256x256",
+                "type": "image/png",
+                "purpose": "any",
+            },
             {
                 "src": f"{base}web-app-manifest-192x192.png",
                 "sizes": "192x192",
```
```diff
@@ -149,6 +176,27 @@ def register_frontend_static_routes(app: FastAPI, frontend_dir: Path) -> bool:
                 "purpose": "maskable",
             },
         ],
+        "screenshots": [
+            {
+                "src": f"{base}screenshot-wide.png",
+                "sizes": "1367x909",
+                "type": "image/png",
+                "form_factor": "wide",
+                "label": "RemoteTerm desktop view",
+            },
+            {
+                "src": f"{base}screenshot-mobile.png",
+                "sizes": "1170x2532",
+                "type": "image/png",
+                "label": "RemoteTerm mobile view",
+            },
+            {
+                "src": f"{base}screenshot-mobile-2.png",
+                "sizes": "750x1334",
+                "type": "image/png",
+                "label": "RemoteTerm mobile conversation",
+            },
+        ],
     }
     return JSONResponse(
         manifest,
```
```diff
@@ -67,6 +67,7 @@ from app.routers import (
     health,
     messages,
     packets,
+    push,
     radio,
     read_state,
     repeaters,
```
```diff
@@ -102,6 +103,14 @@ async def lifespan(app: FastAPI):
     await db.connect()
     logger.info("Database connected")
+
+    # Initialize VAPID keys for Web Push (generates on first run)
+    from app.push.vapid import ensure_vapid_keys
+
+    try:
+        await ensure_vapid_keys()
+    except Exception:
+        logger.warning("Failed to initialize VAPID keys for Web Push", exc_info=True)
 
     # Ensure default channels exist in the database even before the radio
     # connects. Without this, a fresh or disconnected instance would return
     # zero channels from GET /channels until the first successful radio sync.
```
```diff
@@ -185,6 +194,7 @@ app.include_router(packets.router, prefix="/api")
 app.include_router(read_state.router, prefix="/api")
 app.include_router(settings.router, prefix="/api")
 app.include_router(statistics.router, prefix="/api")
+app.include_router(push.router, prefix="/api")
 app.include_router(ws.router, prefix="/api")
 
 # Serve frontend static files in production
```
```diff
@@ -0,0 +1,22 @@
+import logging
+
+import aiosqlite
+
+logger = logging.getLogger(__name__)
+
+
+async def migrate(conn: aiosqlite.Connection) -> None:
+    """Add telemetry_interval_hours integer column to app_settings."""
+    tables_cursor = await conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
+    if "app_settings" not in {row[0] for row in await tables_cursor.fetchall()}:
+        await conn.commit()
+        return
+    col_cursor = await conn.execute("PRAGMA table_info(app_settings)")
+    columns = {row[1] for row in await col_cursor.fetchall()}
+    if "telemetry_interval_hours" not in columns:
+        # Default to 8 hours, matching the previous hard-coded interval
+        # so existing users see no behavior change until they opt in.
+        await conn.execute(
+            "ALTER TABLE app_settings ADD COLUMN telemetry_interval_hours INTEGER DEFAULT 8"
+        )
+    await conn.commit()
```
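The guard pattern this migration uses — check `sqlite_master` for the table, then `PRAGMA table_info` for the column, and only then `ALTER TABLE` — is what makes re-running it safe. The same idiom can be exercised synchronously with the stdlib `sqlite3` module (the helper name and trimmed schema here are illustrative):

```python
import sqlite3

def add_column_if_missing(conn: sqlite3.Connection, table: str, column: str, ddl: str) -> bool:
    """Idempotently add a column; returns True only if the ALTER actually ran."""
    tables = {row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")}
    if table not in tables:
        return False  # nothing to migrate on this instance
    columns = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column in columns:
        return False  # already migrated — re-running is a no-op
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {ddl}")
    conn.commit()
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_settings (id INTEGER PRIMARY KEY)")
print(add_column_if_missing(conn, "app_settings", "telemetry_interval_hours", "INTEGER DEFAULT 8"))  # True
print(add_column_if_missing(conn, "app_settings", "telemetry_interval_hours", "INTEGER DEFAULT 8"))  # False
```

SQLite has no `ADD COLUMN IF NOT EXISTS`, so the `PRAGMA table_info` probe is the standard way to keep such migrations idempotent.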
```diff
@@ -0,0 +1,49 @@
+import logging
+
+import aiosqlite
+
+logger = logging.getLogger(__name__)
+
+
+async def migrate(conn: aiosqlite.Connection) -> None:
+    """Add Web Push support: VAPID keys, push subscriptions table, and global conversation list."""
+
+    # VAPID key pair + global push conversation list in app_settings
+    table_check = await conn.execute(
+        "SELECT name FROM sqlite_master WHERE type='table' AND name='app_settings'"
+    )
+    if await table_check.fetchone():
+        cursor = await conn.execute("PRAGMA table_info(app_settings)")
+        columns = {row[1] for row in await cursor.fetchall()}
+
+        if "vapid_private_key" not in columns:
+            await conn.execute(
+                "ALTER TABLE app_settings ADD COLUMN vapid_private_key TEXT DEFAULT ''"
+            )
+        if "vapid_public_key" not in columns:
+            await conn.execute(
+                "ALTER TABLE app_settings ADD COLUMN vapid_public_key TEXT DEFAULT ''"
+            )
+        if "push_conversations" not in columns:
+            await conn.execute(
+                "ALTER TABLE app_settings ADD COLUMN push_conversations TEXT DEFAULT '[]'"
+            )
+
+    # Push subscriptions — one row per browser/device
+    await conn.execute(
+        """
+        CREATE TABLE IF NOT EXISTS push_subscriptions (
+            id TEXT PRIMARY KEY,
+            endpoint TEXT NOT NULL,
+            p256dh TEXT NOT NULL,
+            auth TEXT NOT NULL,
+            label TEXT NOT NULL DEFAULT '',
+            created_at INTEGER NOT NULL,
+            last_success_at INTEGER,
+            failure_count INTEGER DEFAULT 0,
+            UNIQUE(endpoint)
+        )
+        """
+    )
+
+    await conn.commit()
```
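The `UNIQUE(endpoint)` constraint created above is what gives `POST /push/subscribe` its upsert semantics. With stdlib `sqlite3` standing in for the aiosqlite wrapper (the endpoint URL and the exact update columns are illustrative), the idiom looks like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE push_subscriptions (
        id TEXT PRIMARY KEY,
        endpoint TEXT NOT NULL,
        p256dh TEXT NOT NULL,
        auth TEXT NOT NULL,
        created_at INTEGER NOT NULL,
        UNIQUE(endpoint)
    )
    """
)

def upsert_subscription(sub_id: str, endpoint: str, p256dh: str, auth: str, ts: int) -> None:
    # Re-subscribing from the same browser endpoint refreshes the keys in
    # place instead of creating a duplicate row.
    conn.execute(
        """
        INSERT INTO push_subscriptions (id, endpoint, p256dh, auth, created_at)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT(endpoint) DO UPDATE SET p256dh = excluded.p256dh, auth = excluded.auth
        """,
        (sub_id, endpoint, p256dh, auth, ts),
    )
    conn.commit()

upsert_subscription("a", "https://fcm.example/ep1", "key1", "auth1", 1)
upsert_subscription("b", "https://fcm.example/ep1", "key2", "auth2", 2)  # same endpoint
rows = conn.execute("SELECT id, p256dh FROM push_subscriptions").fetchall()
print(rows)  # [('a', 'key2')] — one row, keys refreshed
```

`ON CONFLICT ... DO UPDATE` requires SQLite 3.24+, which any Python 3.10+ build ships.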
```diff
@@ -842,6 +842,14 @@ class AppSettings(BaseModel):
         default_factory=list,
         description="Public keys of repeaters opted into periodic telemetry collection (max 8)",
     )
+    telemetry_interval_hours: int = Field(
+        default=8,
+        description=(
+            "User-preferred telemetry collection interval in hours. The backend "
+            "clamps this up to the shortest legal interval given the number of "
+            "tracked repeaters so daily checks stay under a 24/day ceiling."
+        ),
+    )
     auto_resend_channel: bool = Field(
         default=False,
         description=(
```
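One reading of the clamping rule in the field description above: if each collection pass polls every tracked repeater once, then with n repeaters an interval of h hours costs n * (24 / h) polls per day, and keeping that under 24 requires h >= n. The backend helper itself is not shown in this diff, so the following is a sketch under that assumed reading:

```python
def effective_interval_hours(preferred: int, tracked_count: int) -> int:
    """Clamp the user's preferred interval so total polls stay <= 24/day.

    Assumed model: each pass polls every tracked repeater once, so
    polls/day = tracked_count * (24 / interval), and the ceiling of 24
    forces interval >= tracked_count. The real clamp lives in the backend.
    """
    if tracked_count <= 0:
        return preferred  # nothing tracked, no ceiling to enforce
    return max(preferred, tracked_count)

print(effective_interval_hours(8, 3))  # 8 — preference already legal
print(effective_interval_hours(2, 6))  # 6 — clamped up (6 * 24/2 = 72 polls/day otherwise)
```

Note the clamp only ever lengthens the interval, matching the description's "clamps this up".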
```diff
@@ -0,0 +1,172 @@
+"""Web Push dispatch manager.
+
+Checks the global push-enabled conversation list (stored in app_settings)
+and sends push notifications to ALL registered devices when a matching
+incoming message arrives.
+"""
+
+import asyncio
+import json
+import logging
+from dataclasses import dataclass
+
+from pywebpush import WebPushException
+
+from app.push.send import send_push
+from app.push.vapid import get_vapid_private_key
+from app.repository.push_subscriptions import PushSubscriptionRepository
+from app.repository.settings import AppSettingsRepository
+
+logger = logging.getLogger(__name__)
+
+_SEND_TIMEOUT = 15  # seconds per push send
+_VAPID_CLAIMS = {"sub": "mailto:noreply@meshcore.local"}
+
+
+def _state_key_for_message(data: dict) -> str:
+    """Derive the conversation state key from a message event payload."""
+    msg_type = data.get("type", "")
+    conversation_key = data.get("conversation_key", "")
+    if msg_type == "PRIV":
+        return f"contact-{conversation_key}"
+    return f"channel-{conversation_key}"
+
+
+def _build_payload(data: dict) -> str:
+    """Build the push notification JSON payload from a message event."""
+    msg_type = data.get("type", "")
+    text = data.get("text", "")
+    sender_name = data.get("sender_name") or ""
+    channel_name = data.get("channel_name") or ""
+
+    if msg_type == "PRIV":
+        title = f"Message from {sender_name}" if sender_name else "New direct message"
+        body = text
+    else:
+        title = channel_name if channel_name else "Channel message"
+        body = text
+
+    conversation_key = data.get("conversation_key", "")
+    state_key = _state_key_for_message(data)
+    if msg_type == "PRIV":
+        url_hash = f"#contact/{conversation_key}"
+    else:
+        url_hash = f"#channel/{conversation_key}"
+
+    return json.dumps(
+        {
+            "title": title,
+            "body": body,
+            # Tag per conversation so different conversations coexist in the
+            # notification tray, while repeated messages in the same
+            # conversation replace each other.
+            "tag": f"meshcore-{state_key}",
+            "url_hash": url_hash,
+        }
+    )
+
+
+def _subscription_info(sub: dict) -> dict:
+    """Build the subscription_info dict that pywebpush expects."""
+    return {
+        "endpoint": sub["endpoint"],
+        "keys": {
+            "p256dh": sub["p256dh"],
+            "auth": sub["auth"],
+        },
+    }
+
+
+@dataclass
+class _SendResult:
+    sub_id: str
+    success: bool = False
+    expired: bool = False
+
+
+class PushManager:
+    async def dispatch_message(self, data: dict) -> None:
+        """Send push notifications for a message event to all devices."""
+        # Don't notify for messages the operator just sent themselves
+        if data.get("outgoing"):
+            return
+
+        # Check the global conversation list
+        state_key = _state_key_for_message(data)
+        try:
+            push_conversations = await AppSettingsRepository.get_push_conversations()
+        except Exception:
+            logger.debug("Push dispatch: failed to load push_conversations", exc_info=True)
+            return
+
+        if state_key not in push_conversations:
+            return
+
+        try:
+            subs = await PushSubscriptionRepository.get_all()
+        except Exception:
+            logger.debug("Push dispatch: failed to load subscriptions", exc_info=True)
+            return
+
+        if not subs:
+            return
+
+        payload = _build_payload(data)
+        vapid_key = get_vapid_private_key()
+        if not vapid_key:
+            logger.debug("Push dispatch: no VAPID key configured, skipping")
+            return
+
+        results = await asyncio.gather(
+            *(self._send_one(sub, payload, vapid_key) for sub in subs),
+            return_exceptions=True,
+        )
+
+        # Batch-update all delivery outcomes in one transaction.
+        success_ids: list[str] = []
+        failure_ids: list[str] = []
+        remove_ids: list[str] = []
+        for r in results:
+            if isinstance(r, _SendResult):
+                if r.expired:
+                    remove_ids.append(r.sub_id)
+                elif r.success:
+                    success_ids.append(r.sub_id)
+                else:
+                    failure_ids.append(r.sub_id)
```
|
||||||
|
if success_ids or failure_ids or remove_ids:
|
||||||
|
try:
|
||||||
|
await PushSubscriptionRepository.batch_record_outcomes(
|
||||||
|
success_ids, failure_ids, remove_ids
|
||||||
|
)
|
||||||
|
except Exception:
|
||||||
|
logger.debug("Push dispatch: failed to record outcomes", exc_info=True)
|
||||||
|
|
||||||
|
async def _send_one(self, sub: dict, payload: str, vapid_key: str) -> _SendResult:
|
||||||
|
sub_id = sub["id"]
|
||||||
|
result = _SendResult(sub_id=sub_id)
|
||||||
|
try:
|
||||||
|
async with asyncio.timeout(_SEND_TIMEOUT):
|
||||||
|
await send_push(
|
||||||
|
subscription_info=_subscription_info(sub),
|
||||||
|
payload=payload,
|
||||||
|
vapid_private_key=vapid_key,
|
||||||
|
vapid_claims=_VAPID_CLAIMS,
|
||||||
|
)
|
||||||
|
result.success = True
|
||||||
|
except WebPushException as e:
|
||||||
|
status = getattr(e, "response", None)
|
||||||
|
status_code = getattr(status, "status_code", 0) if status else 0
|
||||||
|
if status_code in (403, 404, 410):
|
||||||
|
logger.info("Push subscription expired (HTTP %d), removing %s", status_code, sub_id)
|
||||||
|
result.expired = True
|
||||||
|
else:
|
||||||
|
logger.warning("Push send failed for %s: %s", sub_id, e)
|
||||||
|
except TimeoutError:
|
||||||
|
logger.warning("Push send timed out for %s", sub_id)
|
||||||
|
except Exception:
|
||||||
|
logger.debug("Push send error for %s", sub_id, exc_info=True)
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
push_manager = PushManager()
|
||||||
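The payload mapping and per-conversation tagging above can be sketched in isolation. The function below is a hypothetical standalone mirror of `_build_payload` for illustration (the sample field values such as `"K6ABC"` and `"a1b2"` are made up); the event fields it reads (`type`, `conversation_key`, `sender_name`, `channel_name`, `text`) are the ones the real function uses:

```python
import json

def build_payload(data: dict) -> str:
    # Standalone mirror of the mapping above: PRIV events become per-contact
    # direct-message notifications; everything else is a channel message.
    conversation_key = data.get("conversation_key", "")
    if data.get("type", "") == "PRIV":
        sender = data.get("sender_name") or ""
        title = f"Message from {sender}" if sender else "New direct message"
        state_key = f"contact-{conversation_key}"
        url_hash = f"#contact/{conversation_key}"
    else:
        title = data.get("channel_name") or "Channel message"
        state_key = f"channel-{conversation_key}"
        url_hash = f"#channel/{conversation_key}"
    return json.dumps(
        {
            "title": title,
            "body": data.get("text", ""),
            # Per-conversation tag: a new message in the same conversation
            # replaces the previous notification instead of stacking up.
            "tag": f"meshcore-{state_key}",
            "url_hash": url_hash,
        }
    )

payload = json.loads(build_payload(
    {"type": "PRIV", "conversation_key": "a1b2", "sender_name": "K6ABC", "text": "ping"}
))
print(payload["tag"])  # meshcore-contact-a1b2
```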
@@ -0,0 +1,231 @@
"""Thin wrapper around pywebpush for sending push notifications.

Isolates the pywebpush dependency and runs the synchronous send in
a thread executor to avoid blocking the event loop.
"""

import asyncio
import logging
import socket
from typing import Any, cast

import requests
import urllib3.connection
import urllib3.connectionpool
from pywebpush import webpush
from requests.adapters import HTTPAdapter
from requests.exceptions import ConnectionError as RequestsConnectionError
from requests.exceptions import ConnectTimeout as RequestsConnectTimeout
from urllib3.exceptions import ConnectTimeoutError, NameResolutionError, NewConnectionError

logger = logging.getLogger(__name__)

DEFAULT_TIMEOUT = object()
DEFAULT_PUSH_CONNECT_TIMEOUT_SECONDS = 3
IPV4_FALLBACK_CONNECT_TIMEOUT_SECONDS = 10
DEFAULT_PUSH_READ_TIMEOUT_SECONDS = 10


def _create_ipv4_connection(
    address: tuple[str, int],
    timeout: float | None | object = DEFAULT_TIMEOUT,
    source_address: tuple[str, int] | None = None,
    socket_options=None,
) -> socket.socket:
    """Create a socket connection using IPv4 only."""
    host, port = address
    if host.startswith("["):
        host = host.strip("[]")

    err: OSError | None = None
    for res in socket.getaddrinfo(host, port, socket.AF_INET, socket.SOCK_STREAM):
        af, socktype, proto, _, sa = res
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            if socket_options:
                for opt in socket_options:
                    sock.setsockopt(*opt)
            if timeout is not DEFAULT_TIMEOUT:
                sock.settimeout(cast(float | None, timeout))
            if source_address:
                sock.bind(source_address)
            sock.connect(sa)
            return sock
        except OSError as exc:
            err = exc
            if sock is not None:
                sock.close()

    if err is not None:
        raise err
    raise OSError("getaddrinfo returns an empty list")


class IPv4HTTPConnection(urllib3.connection.HTTPConnection):
    """urllib3 HTTP connection that resolves and connects via IPv4 only."""

    def _new_conn(self) -> socket.socket:
        try:
            return _create_ipv4_connection(
                (self._dns_host, self.port),
                self.timeout,
                source_address=self.source_address,
                socket_options=self.socket_options,
            )
        except socket.gaierror as exc:
            raise NameResolutionError(self.host, self, exc) from exc
        except TimeoutError as exc:
            raise ConnectTimeoutError(
                self,
                f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
            ) from exc
        except OSError as exc:
            raise NewConnectionError(self, f"Failed to establish a new connection: {exc}") from exc


class IPv4HTTPSConnection(urllib3.connection.HTTPSConnection):
    """urllib3 HTTPS connection that resolves and connects via IPv4 only."""

    def _new_conn(self) -> socket.socket:
        try:
            return _create_ipv4_connection(
                (self._dns_host, self.port),
                self.timeout,
                source_address=self.source_address,
                socket_options=self.socket_options,
            )
        except socket.gaierror as exc:
            raise NameResolutionError(self.host, self, exc) from exc
        except TimeoutError as exc:
            raise ConnectTimeoutError(
                self,
                f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
            ) from exc
        except OSError as exc:
            raise NewConnectionError(self, f"Failed to establish a new connection: {exc}") from exc


class IPv4HTTPConnectionPool(urllib3.connectionpool.HTTPConnectionPool):
    ConnectionCls = cast(Any, IPv4HTTPConnection)


class IPv4HTTPSConnectionPool(urllib3.connectionpool.HTTPSConnectionPool):
    ConnectionCls = cast(Any, IPv4HTTPSConnection)


def _configure_pool_manager_for_ipv4(manager: Any) -> None:
    manager.pool_classes_by_scheme = manager.pool_classes_by_scheme.copy()
    manager.pool_classes_by_scheme["http"] = IPv4HTTPConnectionPool
    manager.pool_classes_by_scheme["https"] = IPv4HTTPSConnectionPool


class IPv4HTTPAdapter(HTTPAdapter):
    """requests adapter that uses IPv4-only urllib3 connection pools."""

    def init_poolmanager(self, connections, maxsize, block=False, **pool_kwargs):
        super().init_poolmanager(connections, maxsize, block=block, **pool_kwargs)
        _configure_pool_manager_for_ipv4(self.poolmanager)

    def proxy_manager_for(self, *args, **kwargs):
        manager = super().proxy_manager_for(*args, **kwargs)
        _configure_pool_manager_for_ipv4(manager)
        return manager


def _build_default_requests_session() -> requests.Session:
    return requests.Session()


def _build_ipv4_requests_session() -> requests.Session:
    session = requests.Session()
    adapter = IPv4HTTPAdapter()
    session.mount("http://", adapter)
    session.mount("https://", adapter)
    return session


def _send_push_with_session(
    *,
    subscription_info: dict,
    payload: str,
    vapid_private_key: str,
    vapid_claims: dict,
    session: requests.Session,
    connect_timeout_seconds: int,
) -> int:
    response = webpush(
        subscription_info=subscription_info,
        data=payload,
        vapid_private_key=vapid_private_key,
        vapid_claims=vapid_claims,
        content_encoding="aes128gcm",
        timeout=cast(Any, (connect_timeout_seconds, DEFAULT_PUSH_READ_TIMEOUT_SECONDS)),
        requests_session=session,
    )
    return response.status_code  # type: ignore[union-attr]


def _send_push_with_fallback(
    subscription_info: dict,
    payload: str,
    vapid_private_key: str,
    vapid_claims: dict,
) -> int:
    """Send using normal dual-stack resolution, then retry with IPv4-only on connect failures."""
    session = _build_default_requests_session()
    try:
        return _send_push_with_session(
            subscription_info=subscription_info,
            payload=payload,
            vapid_private_key=vapid_private_key,
            vapid_claims=vapid_claims,
            session=session,
            connect_timeout_seconds=DEFAULT_PUSH_CONNECT_TIMEOUT_SECONDS,
        )
    except (RequestsConnectTimeout, RequestsConnectionError) as exc:
        logger.info("Push delivery retrying via IPv4 after initial network failure: %s", exc)
    finally:
        session.close()

    session = _build_ipv4_requests_session()
    try:
        return _send_push_with_session(
            subscription_info=subscription_info,
            payload=payload,
            vapid_private_key=vapid_private_key,
            vapid_claims=vapid_claims,
            session=session,
            connect_timeout_seconds=IPV4_FALLBACK_CONNECT_TIMEOUT_SECONDS,
        )
    finally:
        session.close()


async def send_push(
    subscription_info: dict,
    payload: str,
    vapid_private_key: str,
    vapid_claims: dict,
) -> int:
    """Send an encrypted push notification.

    Args:
        subscription_info: {"endpoint": ..., "keys": {"p256dh": ..., "auth": ...}}
        payload: JSON string to encrypt and send
        vapid_private_key: base64url-encoded raw EC private key scalar
        vapid_claims: {"sub": "mailto:..."} or {"sub": "https://..."}

    Returns:
        HTTP status code from the push service.

    Raises:
        WebPushException: on push service error (caller handles 404/410 cleanup).
    """
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(
        None,
        lambda: _send_push_with_fallback(
            subscription_info, payload, vapid_private_key, vapid_claims
        ),
    )
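The dual-stack-then-IPv4 retry strategy in `_send_push_with_fallback` reduces to a small control-flow pattern: attempt with a short connect timeout, and only on a connection-level failure retry once via IPv4-only resolution with a more generous timeout. The sender callables below are hypothetical stand-ins for illustration (the real code builds `requests.Session` objects and catches `requests` connection errors):

```python
class ConnectFailure(Exception):
    """Stand-in for requests' ConnectTimeout / ConnectionError."""

def send_with_fallback(send_default, send_ipv4_only) -> int:
    # First attempt: normal dual-stack resolution with a short connect
    # timeout, so a broken IPv6 route fails fast.
    try:
        return send_default(connect_timeout=3)
    except ConnectFailure:
        # Second attempt: IPv4-only resolution, longer connect timeout.
        return send_ipv4_only(connect_timeout=10)

attempts = []

def flaky_default(connect_timeout: int) -> int:
    attempts.append(("default", connect_timeout))
    raise ConnectFailure("no route to host (IPv6)")

def ipv4_only(connect_timeout: int) -> int:
    attempts.append(("ipv4", connect_timeout))
    return 201  # typical "created" status from a push service

status = send_with_fallback(flaky_default, ipv4_only)
print(status, attempts)
```

Note the asymmetry in timeouts mirrors the module's constants: 3 s for the fast first try, 10 s for the fallback.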
@@ -0,0 +1,60 @@
"""VAPID key management for Web Push.

Generates a P-256 key pair on first use and caches it in app_settings
via ``AppSettingsRepository``. The public key is served to browsers
for ``PushManager.subscribe()``.
"""

import base64
import logging

from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from py_vapid import Vapid

from app.repository.settings import AppSettingsRepository

logger = logging.getLogger(__name__)

_cached_private_key: str = ""
_cached_public_key: str = ""


async def ensure_vapid_keys() -> tuple[str, str]:
    """Read or generate VAPID keys. Call once at startup after DB connect."""
    global _cached_private_key, _cached_public_key

    private, public = await AppSettingsRepository.get_vapid_keys()
    if private and public:
        _cached_private_key = private
        _cached_public_key = public
        logger.info("VAPID keys loaded from database")
        return _cached_private_key, _cached_public_key

    # Generate new key pair
    vapid = Vapid()
    vapid.generate_keys()

    # Private key as base64url-encoded raw 32-byte EC scalar — the format
    # that pywebpush passes to ``Vapid.from_string()``.
    raw_priv = vapid.private_key.private_numbers().private_value.to_bytes(32, "big")  # type: ignore[union-attr]
    _cached_private_key = base64.urlsafe_b64encode(raw_priv).rstrip(b"=").decode("ascii")

    # Public key as uncompressed P-256 point, base64url-encoded (no padding)
    # for the browser Push API's applicationServerKey
    raw_pub = vapid.public_key.public_bytes(Encoding.X962, PublicFormat.UncompressedPoint)  # type: ignore[union-attr]
    _cached_public_key = base64.urlsafe_b64encode(raw_pub).rstrip(b"=").decode("ascii")

    await AppSettingsRepository.set_vapid_keys(_cached_private_key, _cached_public_key)
    logger.info("Generated and stored new VAPID key pair")

    return _cached_private_key, _cached_public_key


def get_vapid_public_key() -> str:
    """Return the cached VAPID public key (base64url). Must call ensure_vapid_keys() first."""
    return _cached_public_key


def get_vapid_private_key() -> str:
    """Return the cached VAPID private key (base64url). Must call ensure_vapid_keys() first."""
    return _cached_private_key
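The two encodings above have a fixed shape: a 32-byte big-endian private scalar and a 65-byte uncompressed point (`0x04 || X || Y`), both base64url-encoded with the `=` padding stripped. A stdlib-only sketch using dummy byte strings (not real key material, and not the py_vapid generation path the module itself uses):

```python
import base64

def b64url(raw: bytes) -> str:
    # base64url without '=' padding, as expected by the Push API's
    # applicationServerKey and pywebpush's vapid_private_key argument
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

# Dummy stand-ins for real key material:
raw_priv = bytes(range(32))      # 32-byte big-endian EC scalar
raw_pub = b"\x04" + bytes(64)    # uncompressed P-256 point: 0x04 || X || Y

priv_b64 = b64url(raw_priv)
pub_b64 = b64url(raw_pub)
print(len(priv_b64), len(pub_b64))  # 43 87
```

The lengths are deterministic: 32 bytes always encode to 43 characters and 65 bytes to 87, which makes a cheap sanity check when debugging key storage.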
+128
-63
@@ -14,6 +14,7 @@ import logging
 import math
 import time
 from contextlib import asynccontextmanager
+from datetime import UTC, datetime, timedelta
 from typing import Literal
 
 from meshcore import EventType, MeshCore
@@ -36,6 +37,7 @@ from app.services.contact_reconciliation import (
 )
 from app.services.messages import create_fallback_channel_message
 from app.services.radio_runtime import radio_runtime as radio_manager
+from app.telemetry_interval import clamp_telemetry_interval
 from app.websocket import broadcast_error, broadcast_event
 
 logger = logging.getLogger(__name__)
@@ -159,10 +161,10 @@ MIN_ADVERT_INTERVAL = 3600
 # Periodic telemetry collection task handle
 _telemetry_collect_task: asyncio.Task | None = None
 
-# Telemetry collection interval (8 hours)
-TELEMETRY_COLLECT_INTERVAL = 8 * 3600
-
-# Initial delay before the first telemetry collection cycle (let radio settle)
+# Initial delay before the scheduler starts (let radio settle). After this,
+# the loop wakes at each UTC top-of-hour and decides whether to run a cycle
+# based on the user's telemetry_interval_hours preference, clamped up to
+# the shortest-legal interval for the current tracked-repeater count.
 TELEMETRY_COLLECT_INITIAL_DELAY = 60
 
 # Counter to pause polling during repeater operations (supports nested pauses)
@@ -1295,7 +1297,13 @@ async def stop_background_contact_reconciliation() -> None:
 
 
 async def get_contacts_selected_for_radio_sync() -> list[Contact]:
-    """Return the contacts that would be loaded onto the radio right now."""
+    """Return the contacts that would be loaded onto the radio right now.
+
+    Fill order:
+    1. Favorites (up to full capacity)
+    2. Most recently DM-active non-repeaters (sent or received, up to 80% refill target)
+    3. Most recently advertised non-repeaters (up to 80% refill target)
+    """
     app_settings = await AppSettingsRepository.get()
     max_contacts = _effective_radio_capacity(app_settings.max_radio_contacts)
     refill_target, _full_sync_trigger = _compute_radio_contact_limits(max_contacts)
@@ -1315,7 +1323,7 @@ async def get_contacts_selected_for_radio_sync() -> list[Contact]:
             break
 
     if len(selected_contacts) < refill_target:
-        for contact in await ContactRepository.get_recently_contacted_non_repeaters(
+        for contact in await ContactRepository.get_recently_dm_active_non_repeaters(
             limit=max_contacts
         ):
            key = contact.public_key.lower()
@@ -1354,8 +1362,8 @@ async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
 
     Fill order is:
     1. Favorite contacts
-    2. Most recently interacted-with non-repeaters
-    3. Most recently advert-heard non-repeaters without interaction history
+    2. Most recently DM-active non-repeaters (sent or received)
+    3. Most recently advert-heard non-repeaters
 
     Favorite contacts are always reloaded first, up to the configured capacity.
     Additional non-favorite fill stops at the refill target (80% of capacity).
@@ -1489,8 +1497,8 @@ async def sync_recent_contacts_to_radio(force: bool = False, mc: MeshCore | None
     """
     Load contacts to the radio for DM ACK support.
 
-    Fill order is favorites, then recently contacted non-repeaters,
-    then recently advert-heard non-repeaters. Favorites are always reloaded
+    Fill order is favorites, then recently DM-active non-repeaters (sent or
+    received), then recently advert-heard non-repeaters. Favorites are always reloaded
     up to the configured capacity; additional non-favorite fill stops at the
     80% refill target.
     Only runs at most once every CONTACT_SYNC_THROTTLE_SECONDS unless forced.
@@ -1650,62 +1658,122 @@ async def _collect_repeater_telemetry(mc: MeshCore, contact: Contact) -> bool:
     return False
 
 
+async def _run_telemetry_cycle() -> None:
+    """Collect one telemetry sample from every tracked repeater."""
+    if not radio_manager.is_connected:
+        logger.debug("Telemetry collect: radio not connected, skipping cycle")
+        return
+
+    app_settings = await AppSettingsRepository.get()
+    tracked = app_settings.tracked_telemetry_repeaters
+    if not tracked:
+        return
+
+    logger.info("Telemetry collect: starting cycle for %d repeater(s)", len(tracked))
+    collected = 0
+
+    for pub_key in tracked:
+        contact = await ContactRepository.get_by_key(pub_key)
+        if not contact or contact.type != 2:
+            logger.debug(
+                "Telemetry collect: skipping %s (not found or not repeater)",
+                pub_key[:12],
+            )
+            continue
+
+        try:
+            async with radio_manager.radio_operation(
+                "telemetry_collect",
+                blocking=False,
+                suspend_auto_fetch=True,
+            ) as mc:
+                if await _collect_repeater_telemetry(mc, contact):
+                    collected += 1
+        except RadioOperationBusyError:
+            logger.debug(
+                "Telemetry collect: radio busy, skipping %s",
+                pub_key[:12],
+            )
+
+    logger.info(
+        "Telemetry collect: cycle complete, %d/%d successful",
+        collected,
+        len(tracked),
+    )
+
+
+async def _sleep_until_next_utc_top_of_hour() -> None:
+    """Sleep until the next UTC top-of-hour (or a minimum of 1 second)."""
+    now = datetime.now(UTC)
+    next_top = now.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
+    delay = (next_top - now).total_seconds()
+    if delay < 1:
+        delay = 1
+    await asyncio.sleep(delay)
+
+
+async def _maybe_run_scheduled_cycle(now: datetime) -> None:
+    """Evaluate the modulo gate for the given UTC time and run a cycle if due.
+
+    Factored out of the loop so we can also invoke it immediately after the
+    post-boot initial delay — otherwise a restart within the initial-delay
+    window before a scheduled boundary would carry the task past that boundary
+    and skip a due cycle (for 24h cadence users, that's a full day of missed
+    telemetry).
+    """
+    app_settings = await AppSettingsRepository.get()
+    tracked_count = len(app_settings.tracked_telemetry_repeaters)
+    if tracked_count == 0:
+        return
+    effective_hours = clamp_telemetry_interval(app_settings.telemetry_interval_hours, tracked_count)
+    if effective_hours <= 0:
+        return
+    if now.hour % effective_hours != 0:
+        return
+    await _run_telemetry_cycle()
+
+
 async def _telemetry_collect_loop() -> None:
-    """Background task that collects telemetry from tracked repeaters every 8 hours.
+    """Background task that runs tracked-repeater telemetry collection.
 
-    Runs a first cycle after a short initial delay (so newly tracked repeaters
-    get a sample promptly), then sleeps the full interval between subsequent cycles.
+    After an initial post-boot delay we evaluate the modulo gate once
+    (covers the edge case where the initial delay crossed a scheduled
+    boundary on restart). Then we wake at every UTC top-of-hour and
+    evaluate the gate again. A cycle runs only when
+    ``current_utc_hour % effective_interval_hours == 0``, where the
+    effective interval is the user preference clamped up to the shortest
+    legal interval for the current tracked-repeater count. This keeps the
+    total daily check count bounded at ``DAILY_CHECK_CEILING`` (24).
 
-    Acquires the radio lock per-repeater (non-blocking) so manual operations can
+    The loop never updates the stored user preference. If the user picks a
+    short interval and then adds repeaters that make it illegal, they keep
+    their pick stored and we silently use the clamped value until they drop
+    repeaters.
+
+    Radio lock is acquired per-repeater (non-blocking) so manual ops can
     interleave. Failures are logged and skipped.
     """
-    first_run = True
+    try:
+        await asyncio.sleep(TELEMETRY_COLLECT_INITIAL_DELAY)
+    except asyncio.CancelledError:
+        logger.info("Telemetry collect task cancelled before initial delay")
+        return
+
+    # Post-boot boundary check: if the delay carried us into a matching hour
+    # (or we booted exactly at a matching hour), run now rather than waiting
+    # another full cycle.
+    try:
+        await _maybe_run_scheduled_cycle(datetime.now(UTC))
+    except asyncio.CancelledError:
+        logger.info("Telemetry collect task cancelled after initial delay")
+        return
+    except Exception as e:
+        logger.error("Error in post-boot telemetry check: %s", e, exc_info=True)
+
     while True:
         try:
-            delay = TELEMETRY_COLLECT_INITIAL_DELAY if first_run else TELEMETRY_COLLECT_INTERVAL
-            await asyncio.sleep(delay)
-            first_run = False
-
-            if not radio_manager.is_connected:
-                logger.debug("Telemetry collect: radio not connected, skipping cycle")
-                continue
-
-            app_settings = await AppSettingsRepository.get()
-            tracked = app_settings.tracked_telemetry_repeaters
-            if not tracked:
-                continue
-
-            logger.info("Telemetry collect: starting cycle for %d repeater(s)", len(tracked))
-            collected = 0
-
-            for pub_key in tracked:
-                contact = await ContactRepository.get_by_key(pub_key)
-                if not contact or contact.type != 2:
-                    logger.debug(
-                        "Telemetry collect: skipping %s (not found or not repeater)",
-                        pub_key[:12],
-                    )
-                    continue
-
-                try:
-                    async with radio_manager.radio_operation(
-                        "telemetry_collect",
-                        blocking=False,
-                        suspend_auto_fetch=True,
-                    ) as mc:
-                        if await _collect_repeater_telemetry(mc, contact):
-                            collected += 1
-                except RadioOperationBusyError:
-                    logger.debug(
-                        "Telemetry collect: radio busy, skipping %s",
-                        pub_key[:12],
-                    )
-
-            logger.info(
-                "Telemetry collect: cycle complete, %d/%d successful",
-                collected,
-                len(tracked),
-            )
-
+            await _sleep_until_next_utc_top_of_hour()
+            await _maybe_run_scheduled_cycle(datetime.now(UTC))
         except asyncio.CancelledError:
             logger.info("Telemetry collect task cancelled")
@@ -1719,10 +1787,7 @@ def start_telemetry_collect() -> None:
     global _telemetry_collect_task
    if _telemetry_collect_task is None or _telemetry_collect_task.done():
        _telemetry_collect_task = asyncio.create_task(_telemetry_collect_loop())
-        logger.info(
-            "Started periodic telemetry collection (interval: %ds)",
-            TELEMETRY_COLLECT_INTERVAL,
-        )
+        logger.info("Started periodic telemetry collection (UTC-hourly scheduler)")
 
 
 async def stop_telemetry_collect() -> None:
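The modulo gate described in the new docstrings is easy to tabulate: for a given effective interval, cycles fire only at UTC hours divisible by it, which is why the daily check count is bounded at 24. `firing_hours` below is an illustrative helper, not part of the codebase, and the clamp applied by `clamp_telemetry_interval` is assumed to have already produced `effective_hours`:

```python
def firing_hours(effective_hours: int) -> list[int]:
    # UTC hours at which current_utc_hour % effective_interval_hours == 0
    return [h for h in range(24) if h % effective_hours == 0]

print(firing_hours(8))   # [0, 8, 16]  -> three cycles per day
print(firing_hours(24))  # [0]         -> one cycle per day, at 00:00 UTC
print(firing_hours(1))   # every hour  -> the 24-checks-per-day ceiling
```

Because the gate is evaluated against wall-clock UTC hours rather than elapsed sleep time, a restart cannot drift the schedule; at worst the post-boot boundary check runs one extra evaluation.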
+69
-60
@@ -8,31 +8,33 @@ class ChannelRepository:
|
|||||||
@staticmethod
|
@staticmethod
|
||||||
async def upsert(key: str, name: str, is_hashtag: bool = False, on_radio: bool = False) -> None:
|
async def upsert(key: str, name: str, is_hashtag: bool = False, on_radio: bool = False) -> None:
|
||||||
"""Upsert a channel. Key is 32-char hex string."""
|
"""Upsert a channel. Key is 32-char hex string."""
|
||||||
await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
INSERT INTO channels (key, name, is_hashtag, on_radio, flood_scope_override)
|
"""
|
||||||
VALUES (?, ?, ?, ?, NULL)
|
INSERT INTO channels (key, name, is_hashtag, on_radio, flood_scope_override)
|
||||||
ON CONFLICT(key) DO UPDATE SET
|
VALUES (?, ?, ?, ?, NULL)
|
||||||
name = excluded.name,
|
ON CONFLICT(key) DO UPDATE SET
|
||||||
is_hashtag = excluded.is_hashtag,
|
name = excluded.name,
|
||||||
on_radio = excluded.on_radio
|
is_hashtag = excluded.is_hashtag,
|
||||||
""",
|
on_radio = excluded.on_radio
|
||||||
(key.upper(), name, is_hashtag, on_radio),
|
""",
|
||||||
)
|
(key.upper(), name, is_hashtag, on_radio),
|
||||||
await db.conn.commit()
|
):
|
||||||
|
pass
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def get_by_key(key: str) -> Channel | None:
|
async def get_by_key(key: str) -> Channel | None:
|
||||||
"""Get a channel by its key (32-char hex string)."""
|
"""Get a channel by its key (32-char hex string)."""
|
||||||
cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
SELECT key, name, is_hashtag, on_radio, flood_scope_override, path_hash_mode_override, last_read_at, favorite
|
"""
|
||||||
FROM channels
|
SELECT key, name, is_hashtag, on_radio, flood_scope_override, path_hash_mode_override, last_read_at, favorite
|
||||||
WHERE key = ?
|
FROM channels
|
||||||
""",
|
WHERE key = ?
|
||||||
(key.upper(),),
|
""",
|
||||||
)
|
(key.upper(),),
|
||||||
row = await cursor.fetchone()
|
) as cursor:
|
||||||
|
row = await cursor.fetchone()
|
||||||
if row:
|
if row:
|
||||||
return Channel(
|
return Channel(
|
||||||
key=row["key"],
|
key=row["key"],
|
||||||
@@ -48,14 +50,15 @@ class ChannelRepository:
 
     @staticmethod
     async def get_all() -> list[Channel]:
-        cursor = await db.conn.execute(
-            """
-            SELECT key, name, is_hashtag, on_radio, flood_scope_override, path_hash_mode_override, last_read_at, favorite
-            FROM channels
-            ORDER BY name
-            """
-        )
-        rows = await cursor.fetchall()
-
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT key, name, is_hashtag, on_radio, flood_scope_override, path_hash_mode_override, last_read_at, favorite
+                FROM channels
+                ORDER BY name
+                """
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [
             Channel(
                 key=row["key"],
@@ -73,21 +76,23 @@ class ChannelRepository:
     @staticmethod
     async def set_favorite(key: str, value: bool) -> bool:
         """Set or clear the favorite flag for a channel. Returns True if row was found."""
-        cursor = await db.conn.execute(
-            "UPDATE channels SET favorite = ? WHERE key = ?",
-            (1 if value else 0, key.upper()),
-        )
-        await db.conn.commit()
-        return cursor.rowcount > 0
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE channels SET favorite = ? WHERE key = ?",
+                (1 if value else 0, key.upper()),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount > 0
 
     @staticmethod
     async def delete(key: str) -> None:
         """Delete a channel by key."""
-        await db.conn.execute(
-            "DELETE FROM channels WHERE key = ?",
-            (key.upper(),),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                "DELETE FROM channels WHERE key = ?",
+                (key.upper(),),
+            ):
+                pass
 
     @staticmethod
     async def update_last_read_at(key: str, timestamp: int | None = None) -> bool:
@@ -96,35 +101,39 @@ class ChannelRepository:
         Returns True if a row was updated, False if channel not found.
         """
         ts = timestamp if timestamp is not None else int(time.time())
-        cursor = await db.conn.execute(
-            "UPDATE channels SET last_read_at = ? WHERE key = ?",
-            (ts, key.upper()),
-        )
-        await db.conn.commit()
-        return cursor.rowcount > 0
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE channels SET last_read_at = ? WHERE key = ?",
+                (ts, key.upper()),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount > 0
 
     @staticmethod
     async def update_flood_scope_override(key: str, flood_scope_override: str | None) -> bool:
         """Set or clear a channel's flood-scope override."""
-        cursor = await db.conn.execute(
-            "UPDATE channels SET flood_scope_override = ? WHERE key = ?",
-            (flood_scope_override, key.upper()),
-        )
-        await db.conn.commit()
-        return cursor.rowcount > 0
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE channels SET flood_scope_override = ? WHERE key = ?",
+                (flood_scope_override, key.upper()),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount > 0
 
     @staticmethod
     async def update_path_hash_mode_override(key: str, path_hash_mode_override: int | None) -> bool:
         """Set or clear a channel's path hash mode override."""
-        cursor = await db.conn.execute(
-            "UPDATE channels SET path_hash_mode_override = ? WHERE key = ?",
-            (path_hash_mode_override, key.upper()),
-        )
-        await db.conn.commit()
-        return cursor.rowcount > 0
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE channels SET path_hash_mode_override = ? WHERE key = ?",
+                (path_hash_mode_override, key.upper()),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount > 0
 
     @staticmethod
     async def mark_all_read(timestamp: int) -> None:
         """Mark all channels as read at the given timestamp."""
-        await db.conn.execute("UPDATE channels SET last_read_at = ?", (timestamp,))
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute("UPDATE channels SET last_read_at = ?", (timestamp,)):
+                pass
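Every hunk above rewrites an ad-hoc `await db.conn.execute(...)` + `await db.conn.commit()` pair into an `async with db.tx()` (writes) or `async with db.readonly()` (reads) block. The real helpers live elsewhere in this repository; the sketch below is only an illustration of the pattern, with the `Database` class, the synchronous `sqlite3` stand-in, and the `:memory:` path all assumed for the example:

```python
import asyncio
import sqlite3
from contextlib import asynccontextmanager


class Database:
    """Hypothetical stand-in for the repo's ``db`` object (names assumed)."""

    def __init__(self, path: str = ":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.row_factory = sqlite3.Row

    @asynccontextmanager
    async def tx(self):
        # One transaction per block: commit on success, roll back on error,
        # so callers never sprinkle explicit commit() calls around.
        try:
            yield self.conn
            self.conn.commit()
        except Exception:
            self.conn.rollback()
            raise

    @asynccontextmanager
    async def readonly(self):
        # Reads need no commit; a separate entry point keeps intent explicit.
        yield self.conn


db = Database()


async def main():
    async with db.tx() as conn:
        conn.execute("CREATE TABLE channels (key TEXT PRIMARY KEY, name TEXT)")
        conn.execute("INSERT INTO channels VALUES (?, ?)", ("ABC", "general"))
    async with db.readonly() as conn:
        row = conn.execute("SELECT name FROM channels WHERE key = ?", ("ABC",)).fetchone()
    print(row["name"])


asyncio.run(main())
```

The diff additionally uses `async with conn.execute(...) as cursor:` so cursors are closed deterministically; `aiosqlite` supports that form natively, whereas this sketch keeps the stdlib-only synchronous calls inside the blocks.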
+467 −356
@@ -61,66 +61,72 @@ class ContactRepository:
             )
         )
 
-        await db.conn.execute(
-            """
-            INSERT INTO contacts (public_key, name, type, flags, direct_path, direct_path_len,
-                                  direct_path_hash_mode, direct_path_updated_at,
-                                  route_override_path, route_override_len,
-                                  route_override_hash_mode,
-                                  last_advert, lat, lon, last_seen,
-                                  on_radio, last_contacted, first_seen)
-            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
-            ON CONFLICT(public_key) DO UPDATE SET
-                name = COALESCE(excluded.name, contacts.name),
-                type = CASE WHEN excluded.type = 0 THEN contacts.type ELSE excluded.type END,
-                flags = excluded.flags,
-                direct_path = COALESCE(excluded.direct_path, contacts.direct_path),
-                direct_path_len = COALESCE(excluded.direct_path_len, contacts.direct_path_len),
-                direct_path_hash_mode = COALESCE(
-                    excluded.direct_path_hash_mode, contacts.direct_path_hash_mode
-                ),
-                direct_path_updated_at = COALESCE(
-                    excluded.direct_path_updated_at, contacts.direct_path_updated_at
-                ),
-                route_override_path = COALESCE(
-                    excluded.route_override_path, contacts.route_override_path
-                ),
-                route_override_len = COALESCE(
-                    excluded.route_override_len, contacts.route_override_len
-                ),
-                route_override_hash_mode = COALESCE(
-                    excluded.route_override_hash_mode, contacts.route_override_hash_mode
-                ),
-                last_advert = COALESCE(excluded.last_advert, contacts.last_advert),
-                lat = COALESCE(excluded.lat, contacts.lat),
-                lon = COALESCE(excluded.lon, contacts.lon),
-                last_seen = excluded.last_seen,
-                on_radio = COALESCE(excluded.on_radio, contacts.on_radio),
-                last_contacted = COALESCE(excluded.last_contacted, contacts.last_contacted),
-                first_seen = COALESCE(contacts.first_seen, excluded.first_seen)
-            """,
-            (
-                contact_row.public_key.lower(),
-                contact_row.name,
-                contact_row.type,
-                contact_row.flags,
-                direct_path,
-                direct_path_len,
-                direct_path_hash_mode,
-                contact_row.direct_path_updated_at,
-                route_override_path,
-                route_override_len,
-                route_override_hash_mode,
-                contact_row.last_advert,
-                contact_row.lat,
-                contact_row.lon,
-                contact_row.last_seen if contact_row.last_seen is not None else int(time.time()),
-                contact_row.on_radio,
-                contact_row.last_contacted,
-                contact_row.first_seen,
-            ),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                INSERT INTO contacts (public_key, name, type, flags, direct_path, direct_path_len,
+                                      direct_path_hash_mode, direct_path_updated_at,
+                                      route_override_path, route_override_len,
+                                      route_override_hash_mode,
+                                      last_advert, lat, lon, last_seen,
+                                      on_radio, last_contacted, first_seen)
+                VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+                ON CONFLICT(public_key) DO UPDATE SET
+                    name = COALESCE(excluded.name, contacts.name),
+                    type = CASE WHEN excluded.type = 0 THEN contacts.type ELSE excluded.type END,
+                    flags = excluded.flags,
+                    direct_path = COALESCE(excluded.direct_path, contacts.direct_path),
+                    direct_path_len = COALESCE(excluded.direct_path_len, contacts.direct_path_len),
+                    direct_path_hash_mode = COALESCE(
+                        excluded.direct_path_hash_mode, contacts.direct_path_hash_mode
+                    ),
+                    direct_path_updated_at = COALESCE(
+                        excluded.direct_path_updated_at, contacts.direct_path_updated_at
+                    ),
+                    route_override_path = COALESCE(
+                        excluded.route_override_path, contacts.route_override_path
+                    ),
+                    route_override_len = COALESCE(
+                        excluded.route_override_len, contacts.route_override_len
+                    ),
+                    route_override_hash_mode = COALESCE(
+                        excluded.route_override_hash_mode, contacts.route_override_hash_mode
+                    ),
+                    last_advert = COALESCE(excluded.last_advert, contacts.last_advert),
+                    lat = COALESCE(excluded.lat, contacts.lat),
+                    lon = COALESCE(excluded.lon, contacts.lon),
+                    last_seen = CASE
+                        WHEN excluded.last_seen IS NULL THEN contacts.last_seen
+                        WHEN contacts.last_seen IS NULL THEN excluded.last_seen
+                        WHEN excluded.last_seen > contacts.last_seen THEN excluded.last_seen
+                        ELSE contacts.last_seen
+                    END,
+                    on_radio = COALESCE(excluded.on_radio, contacts.on_radio),
+                    last_contacted = COALESCE(excluded.last_contacted, contacts.last_contacted),
+                    first_seen = COALESCE(contacts.first_seen, excluded.first_seen)
+                """,
+                (
+                    contact_row.public_key.lower(),
+                    contact_row.name,
+                    contact_row.type,
+                    contact_row.flags,
+                    direct_path,
+                    direct_path_len,
+                    direct_path_hash_mode,
+                    contact_row.direct_path_updated_at,
+                    route_override_path,
+                    route_override_len,
+                    route_override_hash_mode,
+                    contact_row.last_advert,
+                    contact_row.lat,
+                    contact_row.lon,
+                    contact_row.last_seen,
+                    contact_row.on_radio,
+                    contact_row.last_contacted,
+                    contact_row.first_seen,
+                ),
+            ):
+                pass
 
     @staticmethod
     def _row_to_contact(row) -> Contact:
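The upsert above leans on SQLite's `excluded.*` pseudo-table plus `COALESCE` so that a partial update (a row carrying NULLs) never wipes previously stored fields. A minimal, self-contained illustration of that semantics, with a simplified two-column `contacts` table assumed for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (public_key TEXT PRIMARY KEY, name TEXT, lat REAL)")


def upsert(key, name, lat):
    # COALESCE(excluded.col, contacts.col): a NULL in the incoming row
    # keeps the stored value instead of overwriting it.
    conn.execute(
        """
        INSERT INTO contacts (public_key, name, lat) VALUES (?, ?, ?)
        ON CONFLICT(public_key) DO UPDATE SET
            name = COALESCE(excluded.name, contacts.name),
            lat = COALESCE(excluded.lat, contacts.lat)
        """,
        (key, name, lat),
    )


upsert("aa", "Alice", 37.5)
upsert("aa", None, None)  # partial update: existing fields survive
print(conn.execute("SELECT name, lat FROM contacts").fetchone())  # ('Alice', 37.5)
```

The same mechanism explains the diff's `last_seen = CASE ... END` clause: where `COALESCE` would let any non-NULL incoming value win, the CASE form additionally refuses to move `last_seen` backwards.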
@@ -178,10 +184,11 @@ class ContactRepository:
 
     @staticmethod
     async def get_by_key(public_key: str) -> Contact | None:
-        cursor = await db.conn.execute(
-            "SELECT * FROM contacts WHERE public_key = ?", (public_key.lower(),)
-        )
-        row = await cursor.fetchone()
-
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM contacts WHERE public_key = ?", (public_key.lower(),)
+            ) as cursor:
+                row = await cursor.fetchone()
         return ContactRepository._row_to_contact(row) if row else None
 
     @staticmethod
@@ -195,11 +202,12 @@ class ContactRepository:
         exact = await ContactRepository.get_by_key(normalized_prefix)
         if exact:
             return exact
-        cursor = await db.conn.execute(
-            "SELECT * FROM contacts WHERE public_key LIKE ? ORDER BY public_key LIMIT 2",
-            (f"{normalized_prefix}%",),
-        )
-        rows = list(await cursor.fetchall())
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM contacts WHERE public_key LIKE ? ORDER BY public_key LIMIT 2",
+                (f"{normalized_prefix}%",),
+            ) as cursor:
+                rows = list(await cursor.fetchall())
         if len(rows) != 1:
             return None
         return ContactRepository._row_to_contact(rows[0])
@@ -207,11 +215,12 @@ class ContactRepository:
     @staticmethod
     async def _get_prefix_matches(prefix: str, limit: int = 2) -> list[Contact]:
         """Get contacts matching a key prefix, up to limit."""
-        cursor = await db.conn.execute(
-            "SELECT * FROM contacts WHERE public_key LIKE ? ORDER BY public_key LIMIT ?",
-            (f"{prefix.lower()}%", limit),
-        )
-        rows = list(await cursor.fetchall())
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM contacts WHERE public_key LIKE ? ORDER BY public_key LIMIT ?",
+                (f"{prefix.lower()}%", limit),
+            ) as cursor:
+                rows = list(await cursor.fetchall())
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
@@ -237,8 +246,9 @@ class ContactRepository:
     @staticmethod
     async def get_by_name(name: str) -> list[Contact]:
         """Get all contacts with the given exact name."""
-        cursor = await db.conn.execute("SELECT * FROM contacts WHERE name = ?", (name,))
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute("SELECT * FROM contacts WHERE name = ?", (name,)) as cursor:
+                rows = await cursor.fetchall()
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
@@ -254,8 +264,9 @@ class ContactRepository:
         normalized = [p.lower() for p in prefixes]
         conditions = " OR ".join(["public_key LIKE ?"] * len(normalized))
         params = [f"{p}%" for p in normalized]
-        cursor = await db.conn.execute(f"SELECT * FROM contacts WHERE {conditions}", params)
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(f"SELECT * FROM contacts WHERE {conditions}", params) as cursor:
+                rows = await cursor.fetchall()
         # Group by which prefix each row matches
         prefix_to_rows: dict[str, list] = {p: [] for p in normalized}
         for row in rows:
@@ -272,41 +283,67 @@ class ContactRepository:
 
     @staticmethod
     async def get_all(limit: int = 100, offset: int = 0) -> list[Contact]:
-        cursor = await db.conn.execute(
-            "SELECT * FROM contacts ORDER BY COALESCE(name, public_key) LIMIT ? OFFSET ?",
-            (limit, offset),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM contacts ORDER BY COALESCE(name, public_key) LIMIT ? OFFSET ?",
+                (limit, offset),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
     async def get_recently_contacted_non_repeaters(limit: int = 200) -> list[Contact]:
         """Get recently interacted-with non-repeater contacts."""
-        cursor = await db.conn.execute(
-            """
-            SELECT * FROM contacts
-            WHERE type != 2 AND last_contacted IS NOT NULL AND length(public_key) = 64
-            ORDER BY last_contacted DESC
-            LIMIT ?
-            """,
-            (limit,),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT * FROM contacts
+                WHERE type != 2 AND last_contacted IS NOT NULL AND length(public_key) = 64
+                ORDER BY last_contacted DESC
+                LIMIT ?
+                """,
+                (limit,),
+            ) as cursor:
+                rows = await cursor.fetchall()
+        return [ContactRepository._row_to_contact(row) for row in rows]
+
+    @staticmethod
+    async def get_recently_dm_active_non_repeaters(limit: int = 200) -> list[Contact]:
+        """Get non-repeater contacts with the most recent DM activity (sent or received)."""
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT c.*
+                FROM contacts c
+                INNER JOIN (
+                    SELECT conversation_key, MAX(received_at) AS last_dm
+                    FROM messages
+                    WHERE type = 'PRIV'
+                    GROUP BY conversation_key
+                ) m ON c.public_key = m.conversation_key
+                WHERE c.type != 2 AND length(c.public_key) = 64
+                ORDER BY m.last_dm DESC
+                LIMIT ?
+                """,
+                (limit,),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
     async def get_recently_advertised_non_repeaters(limit: int = 200) -> list[Contact]:
         """Get recently advert-heard non-repeater contacts."""
-        cursor = await db.conn.execute(
-            """
-            SELECT * FROM contacts
-            WHERE type != 2 AND last_advert IS NOT NULL AND length(public_key) = 64
-            ORDER BY last_advert DESC
-            LIMIT ?
-            """,
-            (limit,),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT * FROM contacts
+                WHERE type != 2 AND last_advert IS NOT NULL AND length(public_key) = 64
+                ORDER BY last_advert DESC
+                LIMIT ?
+                """,
+                (limit,),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
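The new `get_recently_dm_active_non_repeaters` query is a groupwise-max join: collapse `messages` to one newest-DM timestamp per conversation, then join so contacts sort by that timestamp. A stripped-down sketch of that shape (short keys, no `type`/key-length filters, schema assumed for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE contacts (public_key TEXT PRIMARY KEY);
    CREATE TABLE messages (conversation_key TEXT, type TEXT, received_at INTEGER);
    INSERT INTO contacts VALUES ('a'), ('b');
    INSERT INTO messages VALUES
        ('a', 'PRIV', 100), ('a', 'PRIV', 300), ('b', 'PRIV', 200);
    """
)

# Groupwise-max subquery: one (conversation, latest DM) row per contact,
# joined back so ORDER BY can use the aggregate.
rows = conn.execute(
    """
    SELECT c.public_key
    FROM contacts c
    INNER JOIN (
        SELECT conversation_key, MAX(received_at) AS last_dm
        FROM messages
        WHERE type = 'PRIV'
        GROUP BY conversation_key
    ) m ON c.public_key = m.conversation_key
    ORDER BY m.last_dm DESC
    """
).fetchall()
print([r[0] for r in rows])  # ['a', 'b']
```

The INNER JOIN doubles as a filter: contacts with no PRIV messages simply drop out, which is why the real method needs no `last_dm IS NOT NULL` guard.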
@@ -317,27 +354,44 @@ class ContactRepository:
         path_hash_mode: int | None = None,
         updated_at: int | None = None,
     ) -> None:
+        """Persist a learned direct route for a contact.
+
+        Both callers (the RF PATH packet processor and the firmware PATH_UPDATE
+        event handler) are RF-backed: firmware ``onContactPathUpdated`` only
+        fires from ``onContactPathRecv`` during RF PATH packet reception. So
+        this method also advances ``last_seen`` monotonically. Never moves
+        ``last_seen`` backwards if an out-of-order arrival lands with an older
+        timestamp.
+        """
         normalized_path, normalized_path_len, normalized_hash_mode = normalize_contact_route(
             path,
             path_len,
             path_hash_mode,
         )
         ts = updated_at if updated_at is not None else int(time.time())
-        await db.conn.execute(
-            """UPDATE contacts SET direct_path = ?, direct_path_len = ?,
-               direct_path_hash_mode = COALESCE(?, direct_path_hash_mode),
-               direct_path_updated_at = ?,
-               last_seen = ? WHERE public_key = ?""",
-            (
-                normalized_path,
-                normalized_path_len,
-                normalized_hash_mode,
-                ts,
-                ts,
-                public_key.lower(),
-            ),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                """UPDATE contacts SET direct_path = ?, direct_path_len = ?,
+                   direct_path_hash_mode = COALESCE(?, direct_path_hash_mode),
+                   direct_path_updated_at = ?,
+                   last_seen = CASE
+                       WHEN last_seen IS NULL THEN ?
+                       WHEN ? > last_seen THEN ?
+                       ELSE last_seen
+                   END
+                   WHERE public_key = ?""",
+                (
+                    normalized_path,
+                    normalized_path_len,
+                    normalized_hash_mode,
+                    ts,
+                    ts,
+                    ts,
+                    ts,
+                    public_key.lower(),
+                ),
+            ):
+                pass
 
     @staticmethod
     async def set_routing_override(
@@ -351,65 +405,71 @@ class ContactRepository:
             path_len,
             path_hash_mode,
         )
-        await db.conn.execute(
-            """
-            UPDATE contacts
-            SET route_override_path = ?, route_override_len = ?, route_override_hash_mode = ?
-            WHERE public_key = ?
-            """,
-            (
-                normalized_path,
-                normalized_len,
-                normalized_hash_mode,
-                public_key.lower(),
-            ),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                UPDATE contacts
+                SET route_override_path = ?, route_override_len = ?, route_override_hash_mode = ?
+                WHERE public_key = ?
+                """,
+                (
+                    normalized_path,
+                    normalized_len,
+                    normalized_hash_mode,
+                    public_key.lower(),
+                ),
+            ):
+                pass
 
     @staticmethod
     async def clear_routing_override(public_key: str) -> None:
-        await db.conn.execute(
-            """
-            UPDATE contacts
-            SET route_override_path = NULL,
-                route_override_len = NULL,
-                route_override_hash_mode = NULL
-            WHERE public_key = ?
-            """,
-            (public_key.lower(),),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                UPDATE contacts
+                SET route_override_path = NULL,
+                    route_override_len = NULL,
+                    route_override_hash_mode = NULL
+                WHERE public_key = ?
+                """,
+                (public_key.lower(),),
+            ):
+                pass
 
     @staticmethod
     async def clear_on_radio_except(keep_keys: list[str]) -> None:
         """Set on_radio=False for all contacts NOT in keep_keys."""
-        if not keep_keys:
-            await db.conn.execute("UPDATE contacts SET on_radio = 0 WHERE on_radio = 1")
-        else:
-            placeholders = ",".join("?" * len(keep_keys))
-            await db.conn.execute(
-                f"UPDATE contacts SET on_radio = 0 WHERE on_radio = 1 AND public_key NOT IN ({placeholders})",
-                keep_keys,
-            )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            if not keep_keys:
+                async with conn.execute("UPDATE contacts SET on_radio = 0 WHERE on_radio = 1"):
+                    pass
+            else:
+                placeholders = ",".join("?" * len(keep_keys))
+                async with conn.execute(
+                    f"UPDATE contacts SET on_radio = 0 WHERE on_radio = 1 AND public_key NOT IN ({placeholders})",
+                    keep_keys,
+                ):
+                    pass
 
     @staticmethod
     async def get_favorites() -> list[Contact]:
         """Return all contacts marked as favorite."""
-        cursor = await db.conn.execute(
-            "SELECT * FROM contacts WHERE favorite = 1 AND LENGTH(public_key) = 64"
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM contacts WHERE favorite = 1 AND LENGTH(public_key) = 64"
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [ContactRepository._row_to_contact(row) for row in rows]
 
     @staticmethod
     async def set_favorite(public_key: str, value: bool) -> None:
         """Set or clear the favorite flag for a contact."""
-        await db.conn.execute(
-            "UPDATE contacts SET favorite = ? WHERE public_key = ?",
-            (1 if value else 0, public_key.lower()),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE contacts SET favorite = ? WHERE public_key = ?",
+                (1 if value else 0, public_key.lower()),
+            ):
+                pass
 
     @staticmethod
     async def delete(public_key: str) -> None:
@@ -417,18 +477,53 @@ class ContactRepository:
         # contact_name_history and contact_advert_paths cascade via FK.
         # Messages are intentionally preserved so history re-surfaces
         # if the contact is re-added later.
-        await db.conn.execute("DELETE FROM contacts WHERE public_key = ?", (normalized,))
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute("DELETE FROM contacts WHERE public_key = ?", (normalized,)):
+                pass
 
     @staticmethod
     async def update_last_contacted(public_key: str, timestamp: int | None = None) -> None:
-        """Update the last_contacted timestamp for a contact."""
+        """Update the last_contacted timestamp for a contact.
+
+        ``last_contacted`` tracks the most recent direct-conversation activity
+        with this contact in either direction (incoming or outgoing DM). It is
+        the field that powers "recent conversations" ordering on the frontend.
+
+        It deliberately does not touch ``last_seen``: ``last_seen`` is reserved
+        for actual RF reception from the contact, and outgoing sends are not
+        evidence that we heard from them. RF observations from DM ingest update
+        ``last_seen`` via :meth:`touch_last_seen` on incoming DMs only.
+        """
         ts = timestamp if timestamp is not None else int(time.time())
-        await db.conn.execute(
-            "UPDATE contacts SET last_contacted = ?, last_seen = ? WHERE public_key = ?",
-            (ts, ts, public_key.lower()),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE contacts SET last_contacted = ? WHERE public_key = ?",
+                (ts, public_key.lower()),
+            ):
+                pass
+
+    @staticmethod
+    async def touch_last_seen(public_key: str, timestamp: int) -> None:
+        """Monotonically bump last_seen for a contact from an RF observation.
+
+        Never moves last_seen backwards; a no-op if the contact row does not
+        exist. Use this from packet-ingest paths that have attributed a packet
+        to a specific contact pubkey (advert, incoming DM, decrypted PATH, etc.).
+        """
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                UPDATE contacts
+                SET last_seen = CASE
+                    WHEN last_seen IS NULL THEN ?
+                    WHEN ? > last_seen THEN ?
+                    ELSE last_seen
+                END
+                WHERE public_key = ?
+                """,
+                (timestamp, timestamp, timestamp, public_key.lower()),
+            ):
+                pass
 
     @staticmethod
     async def update_last_read_at(public_key: str, timestamp: int | None = None) -> bool:
@@ -437,22 +532,25 @@ class ContactRepository:
         Returns True if a row was updated, False if contact not found.
         """
         ts = timestamp if timestamp is not None else int(time.time())
-        cursor = await db.conn.execute(
-            "UPDATE contacts SET last_read_at = ? WHERE public_key = ?",
-            (ts, public_key.lower()),
-        )
-        await db.conn.commit()
-        return cursor.rowcount > 0
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE contacts SET last_read_at = ? WHERE public_key = ?",
+                (ts, public_key.lower()),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount > 0
 
     @staticmethod
     async def promote_prefix_placeholders(full_key: str) -> list[str]:
         """Promote prefix-only placeholder contacts to a resolved full key.
 
         Returns the placeholder public keys that were merged into the full key.
+        All operations for the promotion happen inside one ``db.tx()`` so
+        partial promotions never leak to readers between steps.
         """
 
-        async def migrate_child_rows(old_key: str, new_key: str) -> None:
-            await db.conn.execute(
+        async def migrate_child_rows(conn, old_key: str, new_key: str) -> None:
+            async with conn.execute(
                 """
                 INSERT INTO contact_name_history (public_key, name, first_seen, last_seen)
                 SELECT ?, name, first_seen, last_seen
@@ -463,8 +561,9 @@ class ContactRepository:
                     last_seen = MAX(contact_name_history.last_seen, excluded.last_seen)
                 """,
                 (new_key, old_key),
-            )
-            await db.conn.execute(
+            ):
+                pass
+            async with conn.execute(
                 """
                 INSERT INTO contact_advert_paths
                     (public_key, path_hex, path_len, first_seen, last_seen, heard_count)
@@ -477,132 +576,138 @@ class ContactRepository:
                     heard_count = contact_advert_paths.heard_count + excluded.heard_count
                 """,
                 (new_key, old_key),
-            )
-            await db.conn.execute(
+            ):
+                pass
+            async with conn.execute(
                 "DELETE FROM contact_name_history WHERE public_key = ?",
                 (old_key,),
-            )
-            await db.conn.execute(
+            ):
+                pass
+            async with conn.execute(
                 "DELETE FROM contact_advert_paths WHERE public_key = ?",
                 (old_key,),
-            )
+            ):
+                pass
 
         normalized_full_key = full_key.lower()
-        cursor = await db.conn.execute(
-            """
-            SELECT public_key, last_seen, last_contacted, first_seen, last_read_at
-            FROM contacts
-            WHERE length(public_key) < 64
-              AND ? LIKE public_key || '%'
-            ORDER BY length(public_key) DESC, public_key
-            """,
-            (normalized_full_key,),
-        )
-        rows = list(await cursor.fetchall())
-        if not rows:
-            return []
-
         promoted_keys: list[str] = []
-        for row in rows:
-            old_key = row["public_key"]
-            if old_key == normalized_full_key:
-                continue
-
-            match_cursor = await db.conn.execute(
-                """
-                SELECT COUNT(*) AS match_count
-                FROM contacts
-                WHERE length(public_key) = 64
-                  AND public_key LIKE ? || '%'
-                """,
-                (old_key,),
-            )
-            match_row = await match_cursor.fetchone()
-            match_count = match_row["match_count"] if match_row is not None else 0
-            if match_count != 1:
-                logger.warning(
-                    "Skipping prefix promotion for %s: %d full-key contacts match (expected 1)",
-                    old_key,
-                    match_count,
-                )
-                continue
-
-            await migrate_child_rows(old_key, normalized_full_key)
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                SELECT public_key, last_seen, last_contacted, first_seen, last_read_at
+                FROM contacts
+                WHERE length(public_key) < 64
+                  AND ? LIKE public_key || '%'
+                ORDER BY length(public_key) DESC, public_key
+                """,
+                (normalized_full_key,),
+            ) as cursor:
+                rows = list(await cursor.fetchall())
+            if not rows:
+                return []
+
+            for row in rows:
+                old_key = row["public_key"]
+                if old_key == normalized_full_key:
|
||||||
|
continue
|
||||||
|
|
||||||
# Merge timestamp metadata from the old prefix contact into the
|
async with conn.execute(
|
||||||
# full-key contact (which all callers guarantee already exists),
|
"""
|
||||||
# then delete the prefix placeholder.
|
SELECT COUNT(*) AS match_count
|
||||||
await db.conn.execute(
|
FROM contacts
|
||||||
"""
|
WHERE length(public_key) = 64
|
||||||
UPDATE contacts
|
AND public_key LIKE ? || '%'
|
||||||
SET last_seen = CASE
|
""",
|
||||||
WHEN contacts.last_seen IS NULL THEN ?
|
(old_key,),
|
||||||
WHEN ? IS NULL THEN contacts.last_seen
|
) as match_cursor:
|
||||||
WHEN ? > contacts.last_seen THEN ?
|
match_row = await match_cursor.fetchone()
|
||||||
ELSE contacts.last_seen
|
match_count = match_row["match_count"] if match_row is not None else 0
|
||||||
END,
|
if match_count != 1:
|
||||||
last_contacted = CASE
|
logger.warning(
|
||||||
WHEN contacts.last_contacted IS NULL THEN ?
|
"Skipping prefix promotion for %s: %d full-key contacts match (expected 1)",
|
||||||
WHEN ? IS NULL THEN contacts.last_contacted
|
old_key,
|
||||||
WHEN ? > contacts.last_contacted THEN ?
|
match_count,
|
||||||
ELSE contacts.last_contacted
|
)
|
||||||
END,
|
continue
|
||||||
first_seen = CASE
|
|
||||||
WHEN contacts.first_seen IS NULL THEN ?
|
|
||||||
WHEN ? IS NULL THEN contacts.first_seen
|
|
||||||
WHEN ? < contacts.first_seen THEN ?
|
|
||||||
ELSE contacts.first_seen
|
|
||||||
END,
|
|
||||||
last_read_at = CASE
|
|
||||||
WHEN contacts.last_read_at IS NULL THEN ?
|
|
||||||
WHEN ? IS NULL THEN contacts.last_read_at
|
|
||||||
WHEN ? > contacts.last_read_at THEN ?
|
|
||||||
ELSE contacts.last_read_at
|
|
||||||
END
|
|
||||||
WHERE public_key = ?
|
|
||||||
""",
|
|
||||||
(
|
|
||||||
row["last_seen"],
|
|
||||||
row["last_seen"],
|
|
||||||
row["last_seen"],
|
|
||||||
row["last_seen"],
|
|
||||||
row["last_contacted"],
|
|
||||||
row["last_contacted"],
|
|
||||||
row["last_contacted"],
|
|
||||||
row["last_contacted"],
|
|
||||||
row["first_seen"],
|
|
||||||
row["first_seen"],
|
|
||||||
row["first_seen"],
|
|
||||||
row["first_seen"],
|
|
||||||
row["last_read_at"],
|
|
||||||
row["last_read_at"],
|
|
||||||
row["last_read_at"],
|
|
||||||
row["last_read_at"],
|
|
||||||
normalized_full_key,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
await db.conn.execute("DELETE FROM contacts WHERE public_key = ?", (old_key,))
|
|
||||||
|
|
||||||
promoted_keys.append(old_key)
|
await migrate_child_rows(conn, old_key, normalized_full_key)
|
||||||
|
|
||||||
|
# Merge timestamp metadata from the old prefix contact into the
|
||||||
|
# full-key contact (which all callers guarantee already exists),
|
||||||
|
# then delete the prefix placeholder.
|
||||||
|
async with conn.execute(
|
||||||
|
"""
|
||||||
|
UPDATE contacts
|
||||||
|
SET last_seen = CASE
|
||||||
|
WHEN contacts.last_seen IS NULL THEN ?
|
||||||
|
WHEN ? IS NULL THEN contacts.last_seen
|
||||||
|
WHEN ? > contacts.last_seen THEN ?
|
||||||
|
ELSE contacts.last_seen
|
||||||
|
END,
|
||||||
|
last_contacted = CASE
|
||||||
|
WHEN contacts.last_contacted IS NULL THEN ?
|
||||||
|
WHEN ? IS NULL THEN contacts.last_contacted
|
||||||
|
WHEN ? > contacts.last_contacted THEN ?
|
||||||
|
ELSE contacts.last_contacted
|
||||||
|
END,
|
||||||
|
first_seen = CASE
|
||||||
|
WHEN contacts.first_seen IS NULL THEN ?
|
||||||
|
WHEN ? IS NULL THEN contacts.first_seen
|
||||||
|
WHEN ? < contacts.first_seen THEN ?
|
||||||
|
ELSE contacts.first_seen
|
||||||
|
END,
|
||||||
|
last_read_at = CASE
|
||||||
|
WHEN contacts.last_read_at IS NULL THEN ?
|
||||||
|
WHEN ? IS NULL THEN contacts.last_read_at
|
||||||
|
WHEN ? > contacts.last_read_at THEN ?
|
||||||
|
ELSE contacts.last_read_at
|
||||||
|
END
|
||||||
|
WHERE public_key = ?
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
row["last_seen"],
|
||||||
|
row["last_seen"],
|
||||||
|
row["last_seen"],
|
||||||
|
row["last_seen"],
|
||||||
|
row["last_contacted"],
|
||||||
|
row["last_contacted"],
|
||||||
|
row["last_contacted"],
|
||||||
|
row["last_contacted"],
|
||||||
|
row["first_seen"],
|
||||||
|
row["first_seen"],
|
||||||
|
row["first_seen"],
|
||||||
|
row["first_seen"],
|
||||||
|
row["last_read_at"],
|
||||||
|
row["last_read_at"],
|
||||||
|
row["last_read_at"],
|
||||||
|
row["last_read_at"],
|
||||||
|
normalized_full_key,
|
||||||
|
),
|
||||||
|
):
|
||||||
|
pass
|
||||||
|
async with conn.execute("DELETE FROM contacts WHERE public_key = ?", (old_key,)):
|
||||||
|
pass
|
||||||
|
|
||||||
|
promoted_keys.append(old_key)
|
||||||
|
|
||||||
await db.conn.commit()
|
|
||||||
return promoted_keys
|
return promoted_keys
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def mark_all_read(timestamp: int) -> None:
|
async def mark_all_read(timestamp: int) -> None:
|
||||||
"""Mark all contacts as read at the given timestamp."""
|
"""Mark all contacts as read at the given timestamp."""
|
||||||
await db.conn.execute("UPDATE contacts SET last_read_at = ?", (timestamp,))
|
async with db.tx() as conn:
|
||||||
await db.conn.commit()
|
async with conn.execute("UPDATE contacts SET last_read_at = ?", (timestamp,)):
|
||||||
|
pass
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def get_by_pubkey_first_byte(hex_byte: str) -> list[Contact]:
|
async def get_by_pubkey_first_byte(hex_byte: str) -> list[Contact]:
|
||||||
"""Get contacts whose public key starts with the given hex byte (2 chars)."""
|
"""Get contacts whose public key starts with the given hex byte (2 chars)."""
|
||||||
cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
"SELECT * FROM contacts WHERE substr(public_key, 1, 2) = ?",
|
async with conn.execute(
|
||||||
(hex_byte.lower(),),
|
"SELECT * FROM contacts WHERE substr(public_key, 1, 2) = ?",
|
||||||
)
|
(hex_byte.lower(),),
|
||||||
rows = await cursor.fetchall()
|
) as cursor:
|
||||||
|
rows = await cursor.fetchall()
|
||||||
return [ContactRepository._row_to_contact(row) for row in rows]
|
return [ContactRepository._row_to_contact(row) for row in rows]
|
||||||
|
|
||||||
|
|
||||||
@@ -641,71 +746,75 @@ class ContactAdvertPathRepository:
|
|||||||
normalized_path = path_hex.lower()
|
normalized_path = path_hex.lower()
|
||||||
path_len = hop_count if hop_count is not None else len(normalized_path) // 2
|
path_len = hop_count if hop_count is not None else len(normalized_path) // 2
|
||||||
|
|
||||||
await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
INSERT INTO contact_advert_paths
|
"""
|
||||||
(public_key, path_hex, path_len, first_seen, last_seen, heard_count)
|
INSERT INTO contact_advert_paths
|
||||||
VALUES (?, ?, ?, ?, ?, 1)
|
(public_key, path_hex, path_len, first_seen, last_seen, heard_count)
|
||||||
ON CONFLICT(public_key, path_hex, path_len) DO UPDATE SET
|
VALUES (?, ?, ?, ?, ?, 1)
|
||||||
last_seen = MAX(contact_advert_paths.last_seen, excluded.last_seen),
|
ON CONFLICT(public_key, path_hex, path_len) DO UPDATE SET
|
||||||
heard_count = contact_advert_paths.heard_count + 1
|
last_seen = MAX(contact_advert_paths.last_seen, excluded.last_seen),
|
||||||
""",
|
heard_count = contact_advert_paths.heard_count + 1
|
||||||
(normalized_key, normalized_path, path_len, timestamp, timestamp),
|
""",
|
||||||
)
|
(normalized_key, normalized_path, path_len, timestamp, timestamp),
|
||||||
|
):
|
||||||
|
pass
|
||||||
|
|
||||||
# Keep only the N most recent unique paths per contact.
|
# Keep only the N most recent unique paths per contact.
|
||||||
await db.conn.execute(
|
async with conn.execute(
|
||||||
"""
|
"""
|
||||||
DELETE FROM contact_advert_paths
|
DELETE FROM contact_advert_paths
|
||||||
WHERE public_key = ?
|
WHERE public_key = ?
|
||||||
AND id NOT IN (
|
AND id NOT IN (
|
||||||
SELECT id
|
SELECT id
|
||||||
FROM contact_advert_paths
|
FROM contact_advert_paths
|
||||||
WHERE public_key = ?
|
WHERE public_key = ?
|
||||||
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
||||||
LIMIT ?
|
LIMIT ?
|
||||||
)
|
)
|
||||||
""",
|
""",
|
||||||
(normalized_key, normalized_key, max_paths),
|
(normalized_key, normalized_key, max_paths),
|
||||||
)
|
):
|
||||||
await db.conn.commit()
|
pass
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def get_recent_for_contact(public_key: str, limit: int = 10) -> list[ContactAdvertPath]:
|
async def get_recent_for_contact(public_key: str, limit: int = 10) -> list[ContactAdvertPath]:
|
||||||
cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
SELECT path_hex, path_len, first_seen, last_seen, heard_count
|
"""
|
||||||
FROM contact_advert_paths
|
SELECT path_hex, path_len, first_seen, last_seen, heard_count
|
||||||
WHERE public_key = ?
|
FROM contact_advert_paths
|
||||||
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
WHERE public_key = ?
|
||||||
LIMIT ?
|
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
||||||
""",
|
LIMIT ?
|
||||||
(public_key.lower(), limit),
|
""",
|
||||||
)
|
(public_key.lower(), limit),
|
||||||
rows = await cursor.fetchall()
|
) as cursor:
|
||||||
|
rows = await cursor.fetchall()
|
||||||
return [ContactAdvertPathRepository._row_to_path(row) for row in rows]
|
return [ContactAdvertPathRepository._row_to_path(row) for row in rows]
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def get_recent_for_all_contacts(
|
async def get_recent_for_all_contacts(
|
||||||
limit_per_contact: int = 10,
|
limit_per_contact: int = 10,
|
||||||
) -> list[ContactAdvertPathSummary]:
|
) -> list[ContactAdvertPathSummary]:
|
||||||
cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
SELECT public_key, path_hex, path_len, first_seen, last_seen, heard_count
|
"""
|
||||||
FROM (
|
SELECT public_key, path_hex, path_len, first_seen, last_seen, heard_count
|
||||||
SELECT *,
|
FROM (
|
||||||
ROW_NUMBER() OVER (
|
SELECT *,
|
||||||
PARTITION BY public_key
|
ROW_NUMBER() OVER (
|
||||||
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
PARTITION BY public_key
|
||||||
) AS rn
|
ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
||||||
FROM contact_advert_paths
|
) AS rn
|
||||||
)
|
FROM contact_advert_paths
|
||||||
WHERE rn <= ?
|
)
|
||||||
ORDER BY public_key ASC, last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
WHERE rn <= ?
|
||||||
""",
|
ORDER BY public_key ASC, last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
|
||||||
(limit_per_contact,),
|
""",
|
||||||
)
|
(limit_per_contact,),
|
||||||
rows = await cursor.fetchall()
|
) as cursor:
|
||||||
|
rows = await cursor.fetchall()
|
||||||
|
|
||||||
grouped: dict[str, list[ContactAdvertPath]] = {}
|
grouped: dict[str, list[ContactAdvertPath]] = {}
|
||||||
for row in rows:
|
for row in rows:
|
||||||
@@ -727,29 +836,31 @@ class ContactNameHistoryRepository:
|
|||||||
@staticmethod
|
@staticmethod
|
||||||
async def record_name(public_key: str, name: str, timestamp: int) -> None:
|
async def record_name(public_key: str, name: str, timestamp: int) -> None:
|
||||||
"""Record a name observation. Upserts: updates last_seen if name already known."""
|
"""Record a name observation. Upserts: updates last_seen if name already known."""
|
||||||
await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
INSERT INTO contact_name_history (public_key, name, first_seen, last_seen)
|
"""
|
||||||
VALUES (?, ?, ?, ?)
|
INSERT INTO contact_name_history (public_key, name, first_seen, last_seen)
|
||||||
ON CONFLICT(public_key, name) DO UPDATE SET
|
VALUES (?, ?, ?, ?)
|
||||||
last_seen = MAX(contact_name_history.last_seen, excluded.last_seen)
|
ON CONFLICT(public_key, name) DO UPDATE SET
|
||||||
""",
|
last_seen = MAX(contact_name_history.last_seen, excluded.last_seen)
|
||||||
(public_key.lower(), name, timestamp, timestamp),
|
""",
|
||||||
)
|
(public_key.lower(), name, timestamp, timestamp),
|
||||||
await db.conn.commit()
|
):
|
||||||
|
pass
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def get_history(public_key: str) -> list[ContactNameHistory]:
|
async def get_history(public_key: str) -> list[ContactNameHistory]:
|
||||||
cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
SELECT name, first_seen, last_seen
|
"""
|
||||||
FROM contact_name_history
|
SELECT name, first_seen, last_seen
|
||||||
WHERE public_key = ?
|
FROM contact_name_history
|
||||||
ORDER BY last_seen DESC
|
WHERE public_key = ?
|
||||||
""",
|
ORDER BY last_seen DESC
|
||||||
(public_key.lower(),),
|
""",
|
||||||
)
|
(public_key.lower(),),
|
||||||
rows = await cursor.fetchall()
|
) as cursor:
|
||||||
|
rows = await cursor.fetchall()
|
||||||
return [
|
return [
|
||||||
ContactNameHistory(
|
ContactNameHistory(
|
||||||
name=row["name"], first_seen=row["first_seen"], last_seen=row["last_seen"]
|
name=row["name"], first_seen=row["first_seen"], last_seen=row["last_seen"]
|
||||||
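The repositories above move from ad-hoc `db.conn.execute(...)` followed by `db.conn.commit()` to `async with db.tx() as conn:` blocks. The `db.tx()` helper itself is not part of this diff; the following is a minimal sketch of such a helper, assuming a commit-on-success / rollback-on-error contract, and substituting the stdlib `sqlite3` module for the project's async driver:

```python
import asyncio
import contextlib
import sqlite3


class Database:
    """Hypothetical stand-in for the project's ``db`` object (its real
    implementation is not shown in this diff)."""

    def __init__(self, path: str = ":memory:") -> None:
        self.conn = sqlite3.connect(path)
        self.conn.row_factory = sqlite3.Row
        self._lock = asyncio.Lock()

    @contextlib.asynccontextmanager
    async def tx(self):
        # Serialize writers and commit once at the end, so partial writes
        # never become visible between steps; roll back on any error.
        async with self._lock:
            try:
                yield self.conn
                self.conn.commit()
            except BaseException:
                self.conn.rollback()
                raise


async def main() -> int:
    db = Database()
    db.conn.execute("CREATE TABLE contacts (public_key TEXT PRIMARY KEY, name TEXT)")
    async with db.tx() as conn:
        conn.execute("INSERT INTO contacts VALUES (?, ?)", ("ab12", "alice"))
    return db.conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]


print(asyncio.run(main()))
```

Under this contract, the `pass` bodies seen throughout the diff make sense: the statement runs on entry to the `async with`, and the single commit happens when `tx()` exits.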
+61 -44
@@ -6,6 +6,8 @@ import time
 import uuid
 from typing import Any

+import aiosqlite
+
 from app.database import db

 logger = logging.getLogger(__name__)
@@ -31,26 +33,37 @@ def _row_to_dict(row: Any) -> dict[str, Any]:
     return result


+async def _get_in_conn(conn: aiosqlite.Connection, config_id: str) -> dict[str, Any] | None:
+    """Fetch a config using an already-acquired connection.
+
+    Used by ``create`` and ``update`` to return the freshly-written row
+    without re-entering the non-reentrant DB lock.
+    """
+    async with conn.execute("SELECT * FROM fanout_configs WHERE id = ?", (config_id,)) as cursor:
+        row = await cursor.fetchone()
+    if row is None:
+        return None
+    return _row_to_dict(row)
+
+
 class FanoutConfigRepository:
     """CRUD operations for fanout_configs table."""

     @staticmethod
     async def get_all() -> list[dict[str, Any]]:
         """Get all fanout configs ordered by sort_order."""
-        cursor = await db.conn.execute(
-            "SELECT * FROM fanout_configs ORDER BY sort_order, created_at"
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM fanout_configs ORDER BY sort_order, created_at"
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [_row_to_dict(row) for row in rows]

     @staticmethod
     async def get(config_id: str) -> dict[str, Any] | None:
         """Get a single fanout config by ID."""
-        cursor = await db.conn.execute("SELECT * FROM fanout_configs WHERE id = ?", (config_id,))
-        row = await cursor.fetchone()
-        if row is None:
-            return None
-        return _row_to_dict(row)
+        async with db.readonly() as conn:
+            return await _get_in_conn(conn, config_id)

     @staticmethod
     async def create(
@@ -65,39 +78,41 @@ class FanoutConfigRepository:
         new_id = config_id or str(uuid.uuid4())
         now = int(time.time())

-        # Get next sort_order
-        cursor = await db.conn.execute(
-            "SELECT COALESCE(MAX(sort_order), -1) + 1 FROM fanout_configs"
-        )
-        row = await cursor.fetchone()
-        sort_order = row[0] if row else 0
+        async with db.tx() as conn:
+            # Determine next sort_order under the same lock as the insert,
+            # so two concurrent ``create()`` calls cannot collide.
+            async with conn.execute(
+                "SELECT COALESCE(MAX(sort_order), -1) + 1 FROM fanout_configs"
+            ) as cursor:
+                row = await cursor.fetchone()
+                sort_order = row[0] if row else 0

-        await db.conn.execute(
+            async with conn.execute(
                 """
                 INSERT INTO fanout_configs (id, type, name, enabled, config, scope, sort_order, created_at)
                 VALUES (?, ?, ?, ?, ?, ?, ?, ?)
                 """,
                 (
                     new_id,
                     config_type,
                     name,
                     1 if enabled else 0,
                     json.dumps(config),
                     json.dumps(scope),
                     sort_order,
                     now,
                 ),
-        )
-        await db.conn.commit()
+            ):
+                pass

-        result = await FanoutConfigRepository.get(new_id)
+            result = await _get_in_conn(conn, new_id)
         assert result is not None
         return result

     @staticmethod
     async def update(config_id: str, **fields: Any) -> dict[str, Any] | None:
         """Update a fanout config. Only provided fields are updated."""
-        updates = []
+        updates: list[str] = []
         params: list[Any] = []

         for field in ("name", "enabled", "config", "scope", "sort_order"):
@@ -115,23 +130,25 @@ class FanoutConfigRepository:

         params.append(config_id)
         query = f"UPDATE fanout_configs SET {', '.join(updates)} WHERE id = ?"
-        await db.conn.execute(query, params)
-        await db.conn.commit()
-        return await FanoutConfigRepository.get(config_id)
+        async with db.tx() as conn:
+            async with conn.execute(query, params):
+                pass
+            return await _get_in_conn(conn, config_id)

     @staticmethod
     async def delete(config_id: str) -> None:
         """Delete a fanout config."""
-        await db.conn.execute("DELETE FROM fanout_configs WHERE id = ?", (config_id,))
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute("DELETE FROM fanout_configs WHERE id = ?", (config_id,)):
+                pass
         _configs_cache.pop(config_id, None)

     @staticmethod
     async def get_enabled() -> list[dict[str, Any]]:
         """Get all enabled fanout configs."""
-        cursor = await db.conn.execute(
-            "SELECT * FROM fanout_configs WHERE enabled = 1 ORDER BY sort_order, created_at"
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM fanout_configs WHERE enabled = 1 ORDER BY sort_order, created_at"
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [_row_to_dict(row) for row in rows]
|
|||||||
+391
-346
@@ -89,32 +89,34 @@ class MessageRepository:
|
|||||||
# Normalize sender_key to lowercase so queries can match without LOWER().
|
# Normalize sender_key to lowercase so queries can match without LOWER().
|
||||||
normalized_sender_key = sender_key.lower() if sender_key else sender_key
|
normalized_sender_key = sender_key.lower() if sender_key else sender_key
|
||||||
|
|
||||||
cursor = await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""
|
async with conn.execute(
|
||||||
INSERT OR IGNORE INTO messages (type, conversation_key, text, sender_timestamp,
|
"""
|
||||||
received_at, paths, txt_type, signature, outgoing,
|
INSERT OR IGNORE INTO messages (type, conversation_key, text, sender_timestamp,
|
||||||
sender_name, sender_key)
|
received_at, paths, txt_type, signature, outgoing,
|
||||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
sender_name, sender_key)
|
||||||
""",
|
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||||
(
|
""",
|
||||||
msg_type,
|
(
|
||||||
conversation_key,
|
msg_type,
|
||||||
text,
|
conversation_key,
|
||||||
sender_timestamp,
|
text,
|
||||||
received_at,
|
sender_timestamp,
|
||||||
paths_json,
|
received_at,
|
||||||
txt_type,
|
paths_json,
|
||||||
signature,
|
txt_type,
|
||||||
outgoing,
|
signature,
|
||||||
sender_name,
|
outgoing,
|
||||||
normalized_sender_key,
|
sender_name,
|
||||||
),
|
normalized_sender_key,
|
||||||
)
|
),
|
||||||
await db.conn.commit()
|
) as cursor:
|
||||||
|
rowcount = cursor.rowcount
|
||||||
|
lastrowid = cursor.lastrowid
|
||||||
# rowcount is 0 if INSERT was ignored due to UNIQUE constraint violation
|
# rowcount is 0 if INSERT was ignored due to UNIQUE constraint violation
|
||||||
if cursor.rowcount == 0:
|
if rowcount == 0:
|
||||||
return None
|
return None
|
||||||
return cursor.lastrowid
|
return lastrowid
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def add_path(
|
async def add_path(
|
||||||
@@ -142,17 +144,20 @@ class MessageRepository:
|
|||||||
if snr is not None:
|
if snr is not None:
|
||||||
entry["snr"] = snr
|
entry["snr"] = snr
|
||||||
new_entry = json.dumps(entry)
|
new_entry = json.dumps(entry)
|
||||||
await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""UPDATE messages SET paths = json_insert(
|
async with conn.execute(
|
||||||
COALESCE(paths, '[]'), '$[#]', json(?)
|
"""UPDATE messages SET paths = json_insert(
|
||||||
) WHERE id = ?""",
|
COALESCE(paths, '[]'), '$[#]', json(?)
|
||||||
(new_entry, message_id),
|
) WHERE id = ?""",
|
||||||
)
|
(new_entry, message_id),
|
||||||
await db.conn.commit()
|
):
|
||||||
|
pass
|
||||||
|
|
||||||
# Read back the full list for the return value
|
# Read back the full list for the return value, same transaction.
|
||||||
cursor = await db.conn.execute("SELECT paths FROM messages WHERE id = ?", (message_id,))
|
async with conn.execute(
|
||||||
row = await cursor.fetchone()
|
"SELECT paths FROM messages WHERE id = ?", (message_id,)
|
||||||
|
) as cursor:
|
||||||
|
row = await cursor.fetchone()
|
||||||
if not row or not row["paths"]:
|
if not row or not row["paths"]:
|
||||||
return []
|
return []
|
||||||
|
|
||||||
@@ -171,23 +176,24 @@ class MessageRepository:
|
|||||||
only a prefix as conversation_key are updated to use the full key.
|
only a prefix as conversation_key are updated to use the full key.
|
||||||
"""
|
"""
|
||||||
lower_key = full_key.lower()
|
lower_key = full_key.lower()
|
||||||
cursor = await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""UPDATE messages SET conversation_key = ?,
|
async with conn.execute(
|
||||||
sender_key = CASE
|
"""UPDATE messages SET conversation_key = ?,
|
||||||
WHEN sender_key IS NOT NULL AND length(sender_key) < 64
|
sender_key = CASE
|
||||||
AND ? LIKE sender_key || '%'
|
WHEN sender_key IS NOT NULL AND length(sender_key) < 64
|
||||||
THEN ? ELSE sender_key END
|
AND ? LIKE sender_key || '%'
|
||||||
WHERE type = 'PRIV' AND length(conversation_key) < 64
|
THEN ? ELSE sender_key END
|
||||||
AND ? LIKE conversation_key || '%'
|
WHERE type = 'PRIV' AND length(conversation_key) < 64
|
||||||
AND (
|
AND ? LIKE conversation_key || '%'
|
||||||
SELECT COUNT(*) FROM contacts
|
AND (
|
||||||
WHERE length(public_key) = 64
|
SELECT COUNT(*) FROM contacts
|
||||||
AND public_key LIKE messages.conversation_key || '%'
|
WHERE length(public_key) = 64
|
||||||
) = 1""",
|
AND public_key LIKE messages.conversation_key || '%'
|
||||||
(lower_key, lower_key, lower_key, lower_key),
|
) = 1""",
|
||||||
)
|
(lower_key, lower_key, lower_key, lower_key),
|
||||||
await db.conn.commit()
|
) as cursor:
|
||||||
return cursor.rowcount
|
rowcount = cursor.rowcount
|
||||||
|
return rowcount
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def backfill_channel_sender_key(public_key: str, name: str) -> int:
|
async def backfill_channel_sender_key(public_key: str, name: str) -> int:
|
||||||
@@ -197,21 +203,22 @@ class MessageRepository:
|
|||||||
any channel messages with a matching sender_name but no sender_key
|
any channel messages with a matching sender_name but no sender_key
|
||||||
are updated to associate them with this contact's public key.
|
are updated to associate them with this contact's public key.
|
||||||
"""
|
"""
|
||||||
cursor = await db.conn.execute(
|
async with db.tx() as conn:
|
||||||
"""UPDATE messages SET sender_key = ?
|
async with conn.execute(
|
||||||
WHERE type = 'CHAN' AND sender_name = ? AND sender_key IS NULL
|
"""UPDATE messages SET sender_key = ?
|
||||||
AND (
|
WHERE type = 'CHAN' AND sender_name = ? AND sender_key IS NULL
|
||||||
SELECT COUNT(*) FROM contacts
|
AND (
|
||||||
WHERE name = ?
|
SELECT COUNT(*) FROM contacts
|
||||||
) = 1
|
WHERE name = ?
|
||||||
AND EXISTS (
|
) = 1
|
||||||
SELECT 1 FROM contacts
|
AND EXISTS (
|
||||||
WHERE public_key = ? AND name = ?
|
SELECT 1 FROM contacts
|
||||||
)""",
|
WHERE public_key = ? AND name = ?
|
||||||
(public_key.lower(), name, name, public_key.lower(), name),
|
)""",
|
||||||
)
|
(public_key.lower(), name, name, public_key.lower(), name),
|
||||||
await db.conn.commit()
|
) as cursor:
|
||||||
return cursor.rowcount
|
rowcount = cursor.rowcount
|
||||||
|
return rowcount
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def _normalize_conversation_key(conversation_key: str) -> tuple[str, str]:
|
def _normalize_conversation_key(conversation_key: str) -> tuple[str, str]:
|
||||||
@@ -462,8 +469,9 @@ class MessageRepository:
|
|||||||
query += " OFFSET ?"
|
query += " OFFSET ?"
|
||||||
params.append(offset)
|
params.append(offset)
|
||||||
|
|
||||||
cursor = await db.conn.execute(query, params)
|
async with db.readonly() as conn:
|
||||||
rows = await cursor.fetchall()
|
async with conn.execute(query, params) as cursor:
|
||||||
|
rows = await cursor.fetchall()
|
||||||
return [MessageRepository._row_to_message(row) for row in rows]
|
return [MessageRepository._row_to_message(row) for row in rows]
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
@@ -501,51 +509,54 @@ class MessageRepository:
|
|||||||
where_sql = " AND ".join(["1=1", *where_parts])
|
where_sql = " AND ".join(["1=1", *where_parts])
|
||||||
|
|
||||||
# 1. Get the target message (must satisfy filters if provided)
|
# 1. Get the target message (must satisfy filters if provided)
|
||||||
target_cursor = await db.conn.execute(
|
async with db.readonly() as conn:
|
||||||
f"SELECT {MessageRepository._message_select('messages')} "
|
async with conn.execute(
|
||||||
f"FROM messages WHERE id = ? AND {where_sql}",
|
f"SELECT {MessageRepository._message_select('messages')} "
|
||||||
(message_id, *base_params),
|
f"FROM messages WHERE id = ? AND {where_sql}",
|
||||||
)
|
(message_id, *base_params),
|
||||||
target_row = await target_cursor.fetchone()
|
) as target_cursor:
|
||||||
if not target_row:
|
-        target_row = await target_cursor.fetchone()
-        if not target_row:
-            return [], False, False
+            target_row = await target_cursor.fetchone()
+        if not target_row:
+            return [], False, False
 
         target = MessageRepository._row_to_message(target_row)
 
         # 2. Get context_size+1 messages before target (DESC)
         before_query = f"""
             SELECT {MessageRepository._message_select("messages")} FROM messages WHERE {where_sql}
             AND (received_at < ? OR (received_at = ? AND id < ?))
             ORDER BY received_at DESC, id DESC LIMIT ?
         """
         before_params = [
             *base_params,
             target.received_at,
             target.received_at,
             target.id,
             context_size + 1,
         ]
-        before_cursor = await db.conn.execute(before_query, before_params)
-        before_rows = list(await before_cursor.fetchall())
+        async with conn.execute(before_query, before_params) as before_cursor:
+            before_rows = list(await before_cursor.fetchall())
 
         has_older = len(before_rows) > context_size
-        before_messages = [MessageRepository._row_to_message(r) for r in before_rows[:context_size]]
+        before_messages = [
+            MessageRepository._row_to_message(r) for r in before_rows[:context_size]
+        ]
 
         # 3. Get context_size+1 messages after target (ASC)
         after_query = f"""
             SELECT {MessageRepository._message_select("messages")} FROM messages WHERE {where_sql}
             AND (received_at > ? OR (received_at = ? AND id > ?))
             ORDER BY received_at ASC, id ASC LIMIT ?
         """
         after_params = [
             *base_params,
             target.received_at,
             target.received_at,
             target.id,
             context_size + 1,
         ]
-        after_cursor = await db.conn.execute(after_query, after_params)
-        after_rows = list(await after_cursor.fetchall())
+        async with conn.execute(after_query, after_params) as after_cursor:
+            after_rows = list(await after_cursor.fetchall())
 
         has_newer = len(after_rows) > context_size
         after_messages = [MessageRepository._row_to_message(r) for r in after_rows[:context_size]]
@@ -556,21 +567,29 @@ class MessageRepository:
 
     @staticmethod
     async def increment_ack_count(message_id: int) -> int:
-        """Increment ack count and return the new value."""
-        cursor = await db.conn.execute(
-            "UPDATE messages SET acked = acked + 1 WHERE id = ? RETURNING acked", (message_id,)
-        )
-        row = await cursor.fetchone()
-        await db.conn.commit()
+        """Increment ack count and return the new value.
+
+        NOTE: ``RETURNING`` leaves the prepared statement active until the
+        row is fetched, so we MUST consume it inside the ``async with``
+        block. Without that, the commit at the end of ``db.tx()`` fails
+        with ``cannot commit transaction - SQL statements in progress``.
+        """
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE messages SET acked = acked + 1 WHERE id = ? RETURNING acked",
+                (message_id,),
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["acked"] if row else 1
 
     @staticmethod
     async def get_ack_and_paths(message_id: int) -> tuple[int, list[MessagePath] | None]:
         """Get the current ack count and paths for a message."""
-        cursor = await db.conn.execute(
-            "SELECT acked, paths FROM messages WHERE id = ?", (message_id,)
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT acked, paths FROM messages WHERE id = ?", (message_id,)
+            ) as cursor:
+                row = await cursor.fetchone()
         if not row:
             return 0, None
         return row["acked"], MessageRepository._parse_paths(row["paths"])
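The fetch-before-commit ordering that the new docstring warns about can be sketched with the stdlib `sqlite3` module as a synchronous stand-in for this repo's aiosqlite-backed `db.tx()` helper (a minimal illustration, not the project's code; `RETURNING` needs SQLite >= 3.35):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, acked INTEGER DEFAULT 0)")
conn.execute("INSERT INTO messages (id) VALUES (1)")
conn.commit()

# RETURNING turns the UPDATE into a row-producing statement; the row must
# be consumed while the statement is still open, before committing.
cur = conn.execute(
    "UPDATE messages SET acked = acked + 1 WHERE id = ? RETURNING acked", (1,)
)
row = cur.fetchone()  # fetch first...
conn.commit()         # ...then commit
print(row[0])  # -> 1
```

Reversing the last two steps is exactly the failure mode the docstring describes: committing while a statement still has unfetched rows.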
@@ -578,11 +597,12 @@ class MessageRepository:
     @staticmethod
     async def get_by_id(message_id: int) -> "Message | None":
         """Look up a message by its ID."""
-        cursor = await db.conn.execute(
-            f"SELECT {MessageRepository._message_select('messages')} FROM messages WHERE id = ?",
-            (message_id,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                f"SELECT {MessageRepository._message_select('messages')} FROM messages WHERE id = ?",
+                (message_id,),
+            ) as cursor:
+                row = await cursor.fetchone()
         if not row:
             return None
 
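The recurring `async with db.readonly() as conn:` pattern in these hunks can be approximated with a small lock-guarded handle. This is a guess at the shape of the helper, not the actual `app.database` implementation, using stdlib `sqlite3` in place of aiosqlite:

```python
import asyncio
import sqlite3
from contextlib import asynccontextmanager


class Database:
    """Sketch of a lock-guarded DB handle; the real app.database.db is
    assumed to expose readonly()/tx() context managers over one shared
    connection (assumption, not confirmed by the diff)."""

    def __init__(self) -> None:
        self.conn = sqlite3.connect(":memory:")
        self.conn.row_factory = sqlite3.Row
        self._lock = asyncio.Lock()

    @asynccontextmanager
    async def readonly(self):
        # Serialize all access to the single connection.
        async with self._lock:
            yield self.conn


async def main() -> int:
    db = Database()
    db.conn.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY)")
    db.conn.execute("INSERT INTO messages DEFAULT VALUES")
    async with db.readonly() as conn:
        row = conn.execute("SELECT COUNT(*) AS cnt FROM messages").fetchone()
    return row["cnt"]


count = asyncio.run(main())
print(count)  # -> 1
```

The point of the migration is visible here: every query acquires the lock for exactly the duration of its fetch, instead of sharing one module-global `db.conn` with ad-hoc commits.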
@@ -591,11 +611,14 @@ class MessageRepository:
     @staticmethod
     async def delete_by_id(message_id: int) -> None:
         """Delete a message row by ID."""
-        await db.conn.execute(
-            "UPDATE raw_packets SET message_id = NULL WHERE message_id = ?", (message_id,)
-        )
-        await db.conn.execute("DELETE FROM messages WHERE id = ?", (message_id,))
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE raw_packets SET message_id = NULL WHERE message_id = ?",
+                (message_id,),
+            ):
+                pass
+            async with conn.execute("DELETE FROM messages WHERE id = ?", (message_id,)):
+                pass
 
     @staticmethod
     async def get_by_content(
@@ -618,8 +641,9 @@ class MessageRepository:
             query += " AND outgoing = ?"
             params.append(1 if outgoing else 0)
         query += " ORDER BY id ASC"
-        cursor = await db.conn.execute(query, params)
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(query, params) as cursor:
+                row = await cursor.fetchone()
         if not row:
             return None
 
@@ -653,76 +677,6 @@ class MessageRepository:
         )
         blocked_sql = f" AND {blocked_clause}" if blocked_clause else ""
 
-        # Channel unreads
-        cursor = await db.conn.execute(
-            f"""
-            SELECT m.conversation_key,
-                   COUNT(*) as unread_count,
-                   SUM(CASE
-                       WHEN ? <> '' AND INSTR(LOWER(m.text), LOWER(?)) > 0 THEN 1
-                       ELSE 0
-                   END) > 0 as has_mention
-            FROM messages m
-            JOIN channels c ON m.conversation_key = c.key
-            WHERE m.type = 'CHAN' AND m.outgoing = 0
-              AND m.received_at > COALESCE(c.last_read_at, 0)
-              {blocked_sql}
-            GROUP BY m.conversation_key
-            """,
-            (mention_token or "", mention_token or "", *blocked_params),
-        )
-        rows = await cursor.fetchall()
-        for row in rows:
-            state_key = f"channel-{row['conversation_key']}"
-            counts[state_key] = row["unread_count"]
-            if mention_token and row["has_mention"]:
-                mention_flags[state_key] = True
-
-        # Contact unreads
-        cursor = await db.conn.execute(
-            f"""
-            SELECT m.conversation_key,
-                   COUNT(*) as unread_count,
-                   SUM(CASE
-                       WHEN ? <> '' AND INSTR(LOWER(m.text), LOWER(?)) > 0 THEN 1
-                       ELSE 0
-                   END) > 0 as has_mention
-            FROM messages m
-            LEFT JOIN contacts ct ON m.conversation_key = ct.public_key
-            WHERE m.type = 'PRIV' AND m.outgoing = 0
-              AND m.received_at > COALESCE(ct.last_read_at, 0)
-              {blocked_sql}
-            GROUP BY m.conversation_key
-            """,
-            (mention_token or "", mention_token or "", *blocked_params),
-        )
-        rows = await cursor.fetchall()
-        for row in rows:
-            state_key = f"contact-{row['conversation_key']}"
-            counts[state_key] = row["unread_count"]
-            if mention_token and row["has_mention"]:
-                mention_flags[state_key] = True
-
-        cursor = await db.conn.execute(
-            """
-            SELECT key, last_read_at
-            FROM channels
-            """
-        )
-        rows = await cursor.fetchall()
-        for row in rows:
-            last_read_ats[f"channel-{row['key']}"] = row["last_read_at"]
-
-        cursor = await db.conn.execute(
-            """
-            SELECT public_key, last_read_at
-            FROM contacts
-            """
-        )
-        rows = await cursor.fetchall()
-        for row in rows:
-            last_read_ats[f"contact-{row['public_key']}"] = row["last_read_at"]
-
         # Last message times for all conversations (including read ones),
         # excluding blocked incoming traffic so refresh matches live WS behavior.
         last_time_clause, last_time_params = MessageRepository._build_blocked_incoming_clause(
@@ -730,20 +684,94 @@ class MessageRepository:
         )
         last_time_where_sql = f"WHERE {last_time_clause}" if last_time_clause else ""
 
-        cursor = await db.conn.execute(
-            f"""
-            SELECT type, conversation_key, MAX(received_at) as last_message_time
-            FROM messages
-            {last_time_where_sql}
-            GROUP BY type, conversation_key
-            """,
-            last_time_params,
-        )
-        rows = await cursor.fetchall()
-        for row in rows:
-            prefix = "channel" if row["type"] == "CHAN" else "contact"
-            state_key = f"{prefix}-{row['conversation_key']}"
-            last_message_times[state_key] = row["last_message_time"]
+        # Single readonly acquisition for all 5 queries — they form one logical
+        # snapshot, and holding the lock for the batch is cheaper than acquiring
+        # it 5 times.
+        async with db.readonly() as conn:
+            # Channel unreads
+            async with conn.execute(
+                f"""
+                SELECT m.conversation_key,
+                       COUNT(*) as unread_count,
+                       SUM(CASE
+                           WHEN ? <> '' AND INSTR(LOWER(m.text), LOWER(?)) > 0 THEN 1
+                           ELSE 0
+                       END) > 0 as has_mention
+                FROM messages m
+                JOIN channels c ON m.conversation_key = c.key
+                WHERE m.type = 'CHAN' AND m.outgoing = 0
+                  AND m.received_at > COALESCE(c.last_read_at, 0)
+                  {blocked_sql}
+                GROUP BY m.conversation_key
+                """,
+                (mention_token or "", mention_token or "", *blocked_params),
+            ) as cursor:
+                rows = await cursor.fetchall()
+            for row in rows:
+                state_key = f"channel-{row['conversation_key']}"
+                counts[state_key] = row["unread_count"]
+                if mention_token and row["has_mention"]:
+                    mention_flags[state_key] = True
+
+            # Contact unreads
+            async with conn.execute(
+                f"""
+                SELECT m.conversation_key,
+                       COUNT(*) as unread_count,
+                       SUM(CASE
+                           WHEN ? <> '' AND INSTR(LOWER(m.text), LOWER(?)) > 0 THEN 1
+                           ELSE 0
+                       END) > 0 as has_mention
+                FROM messages m
+                LEFT JOIN contacts ct ON m.conversation_key = ct.public_key
+                WHERE m.type = 'PRIV' AND m.outgoing = 0
+                  AND m.received_at > COALESCE(ct.last_read_at, 0)
+                  {blocked_sql}
+                GROUP BY m.conversation_key
+                """,
+                (mention_token or "", mention_token or "", *blocked_params),
+            ) as cursor:
+                rows = await cursor.fetchall()
+            for row in rows:
+                state_key = f"contact-{row['conversation_key']}"
+                counts[state_key] = row["unread_count"]
+                if mention_token and row["has_mention"]:
+                    mention_flags[state_key] = True
+
+            async with conn.execute(
+                """
+                SELECT key, last_read_at
+                FROM channels
+                """
+            ) as cursor:
+                rows = await cursor.fetchall()
+            for row in rows:
+                last_read_ats[f"channel-{row['key']}"] = row["last_read_at"]
+
+            async with conn.execute(
+                """
+                SELECT public_key, last_read_at
+                FROM contacts
+                """
+            ) as cursor:
+                rows = await cursor.fetchall()
+            for row in rows:
+                last_read_ats[f"contact-{row['public_key']}"] = row["last_read_at"]
+
+            async with conn.execute(
+                f"""
+                SELECT type, conversation_key, MAX(received_at) as last_message_time
+                FROM messages
+                {last_time_where_sql}
+                GROUP BY type, conversation_key
+                """,
+                last_time_params,
+            ) as cursor:
+                rows = await cursor.fetchall()
+            for row in rows:
+                prefix = "channel" if row["type"] == "CHAN" else "contact"
+                state_key = f"{prefix}-{row['conversation_key']}"
+                last_message_times[state_key] = row["last_message_time"]
 
         # Only include last_read_ats for conversations that actually have messages.
         # Without this filter, every contact heard via advertisement (even without
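The unread-count query's mention flag relies on SQLite evaluating `SUM(CASE ... INSTR(LOWER(...)) ...) > 0` per group. A minimal stdlib `sqlite3` sketch of just that aggregation (toy schema and data, not the app's tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (conversation_key TEXT, text TEXT)")
conn.executemany(
    "INSERT INTO messages VALUES (?, ?)",
    [("chan1", "hi @Alice"), ("chan1", "no mention here"), ("chan2", "ping @alice!")],
)
rows = conn.execute(
    """
    SELECT conversation_key,
           COUNT(*) AS unread_count,
           SUM(CASE WHEN ? <> '' AND INSTR(LOWER(text), LOWER(?)) > 0 THEN 1
               ELSE 0 END) > 0 AS has_mention
    FROM messages
    GROUP BY conversation_key
    """,
    ("@alice", "@alice"),
).fetchall()
# One row per conversation: total unreads plus a 0/1 case-insensitive
# "was my token mentioned anywhere" flag computed in the same pass.
result = {key: (cnt, mention) for key, cnt, mention in rows}
print(result)  # -> {'chan1': (2, 1), 'chan2': (1, 1)}
```

The `? <> ''` guard mirrors the diff's `mention_token or ""` binding: with an empty token, `INSTR` is never consulted and `has_mention` stays 0.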
@@ -760,41 +788,45 @@ class MessageRepository:
     @staticmethod
     async def count_dm_messages(contact_key: str) -> int:
         """Count total DM messages for a contact."""
-        cursor = await db.conn.execute(
-            "SELECT COUNT(*) as cnt FROM messages WHERE type = 'PRIV' AND conversation_key = ?",
-            (contact_key.lower(),),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT COUNT(*) as cnt FROM messages WHERE type = 'PRIV' AND conversation_key = ?",
+                (contact_key.lower(),),
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["cnt"] if row else 0
 
     @staticmethod
     async def count_channel_messages_by_sender(sender_key: str) -> int:
         """Count channel messages sent by a specific contact."""
-        cursor = await db.conn.execute(
-            "SELECT COUNT(*) as cnt FROM messages WHERE type = 'CHAN' AND sender_key = ?",
-            (sender_key.lower(),),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT COUNT(*) as cnt FROM messages WHERE type = 'CHAN' AND sender_key = ?",
+                (sender_key.lower(),),
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["cnt"] if row else 0
 
     @staticmethod
     async def count_channel_messages_by_sender_name(sender_name: str) -> int:
         """Count channel messages attributed to a display name."""
-        cursor = await db.conn.execute(
-            "SELECT COUNT(*) as cnt FROM messages WHERE type = 'CHAN' AND sender_name = ?",
-            (sender_name,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT COUNT(*) as cnt FROM messages WHERE type = 'CHAN' AND sender_name = ?",
+                (sender_name,),
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["cnt"] if row else 0
 
     @staticmethod
     async def get_first_channel_message_by_sender_name(sender_name: str) -> int | None:
         """Get the earliest stored channel message timestamp for a display name."""
-        cursor = await db.conn.execute(
-            "SELECT MIN(received_at) AS first_seen FROM messages WHERE type = 'CHAN' AND sender_name = ?",
-            (sender_name,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT MIN(received_at) AS first_seen FROM messages WHERE type = 'CHAN' AND sender_name = ?",
+                (sender_name,),
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["first_seen"] if row and row["first_seen"] is not None else None
 
     @staticmethod
@@ -813,68 +845,76 @@ class MessageRepository:
         t_48h = now - 172800
         t_7d = now - 604800
 
-        cursor = await db.conn.execute(
-            """
-            SELECT COUNT(*) AS all_time,
-                   SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_1h,
-                   SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_24h,
-                   SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_48h,
-                   SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_7d,
-                   MIN(received_at) AS first_message_at,
-                   COUNT(DISTINCT sender_key) AS unique_sender_count
-            FROM messages WHERE type = 'CHAN' AND conversation_key = ?
-            """,
-            (t_1h, t_24h, t_48h, t_7d, conversation_key),
-        )
-        row = await cursor.fetchone()
-        assert row is not None  # Aggregate query always returns a row
-
-        message_counts = {
-            "last_1h": row["last_1h"] or 0,
-            "last_24h": row["last_24h"] or 0,
-            "last_48h": row["last_48h"] or 0,
-            "last_7d": row["last_7d"] or 0,
-            "all_time": row["all_time"] or 0,
-        }
-
-        cursor2 = await db.conn.execute(
-            """
-            SELECT COALESCE(sender_name, sender_key, 'Unknown') AS display_name,
-                   sender_key, COUNT(*) AS cnt
-            FROM messages
-            WHERE type = 'CHAN' AND conversation_key = ?
-              AND received_at >= ? AND sender_key IS NOT NULL
-            GROUP BY sender_key ORDER BY cnt DESC LIMIT 5
-            """,
-            (conversation_key, t_24h),
-        )
-        top_rows = await cursor2.fetchall()
-        top_senders = [
-            {
-                "sender_name": r["display_name"],
-                "sender_key": r["sender_key"],
-                "message_count": r["cnt"],
-            }
-            for r in top_rows
-        ]
-
-        # Path hash width distribution for last 24h (in-Python parse of raw packet envelopes)
-        cursor3 = await db.conn.execute(
-            """
-            SELECT rp.data FROM raw_packets rp
-            JOIN messages m ON rp.message_id = m.id
-            WHERE m.type = 'CHAN' AND m.conversation_key = ?
-              AND rp.timestamp >= ?
-            """,
-            (conversation_key, t_24h),
-        )
-        rows3 = await cursor3.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT COUNT(*) AS all_time,
+                       SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_1h,
+                       SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_24h,
+                       SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_48h,
+                       SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_7d,
+                       MIN(received_at) AS first_message_at,
+                       COUNT(DISTINCT sender_key) AS unique_sender_count
+                FROM messages WHERE type = 'CHAN' AND conversation_key = ?
+                """,
+                (t_1h, t_24h, t_48h, t_7d, conversation_key),
+            ) as cursor:
+                row = await cursor.fetchone()
+                assert row is not None  # Aggregate query always returns a row
+
+            message_counts = {
+                "last_1h": row["last_1h"] or 0,
+                "last_24h": row["last_24h"] or 0,
+                "last_48h": row["last_48h"] or 0,
+                "last_7d": row["last_7d"] or 0,
+                "all_time": row["all_time"] or 0,
+            }
+
+            async with conn.execute(
+                """
+                SELECT COALESCE(sender_name, sender_key, 'Unknown') AS display_name,
+                       sender_key, COUNT(*) AS cnt
+                FROM messages
+                WHERE type = 'CHAN' AND conversation_key = ?
+                  AND received_at >= ? AND sender_key IS NOT NULL
+                GROUP BY sender_key ORDER BY cnt DESC LIMIT 5
+                """,
+                (conversation_key, t_24h),
+            ) as cursor:
+                top_rows = await cursor.fetchall()
+            top_senders = [
+                {
+                    "sender_name": r["display_name"],
+                    "sender_key": r["sender_key"],
+                    "message_count": r["cnt"],
+                }
+                for r in top_rows
+            ]
+
+            # Path hash width distribution for last 24h: fetch raw rows under
+            # the lock, then release BEFORE the CPU-bound in-Python envelope
+            # parse. Parsing can iterate thousands of rows and previously held
+            # the DB lock for the whole traversal — blocking every other repo
+            # caller on a Pi. Keep the lock only for the fetch.
+            async with conn.execute(
+                """
+                SELECT rp.data FROM raw_packets rp
+                JOIN messages m ON rp.message_id = m.id
+                WHERE m.type = 'CHAN' AND m.conversation_key = ?
+                  AND rp.timestamp >= ?
+                """,
+                (conversation_key, t_24h),
+            ) as cursor:
+                rows3 = await cursor.fetchall()
+            first_message_at = row["first_message_at"]
+            unique_sender_count = row["unique_sender_count"] or 0
 
         path_hash_width_24h = bucket_path_hash_widths(rows3)
 
         return {
             "message_counts": message_counts,
-            "first_message_at": row["first_message_at"],
-            "unique_sender_count": row["unique_sender_count"] or 0,
+            "first_message_at": first_message_at,
+            "unique_sender_count": unique_sender_count,
             "top_senders_24h": top_senders,
             "path_hash_width_24h": path_hash_width_24h,
         }
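The statistics query above computes all the rolling windows in one table scan via `SUM(CASE WHEN received_at >= ? ...)`. A toy stdlib `sqlite3` version of that single-pass windowing (illustrative data, not the app's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (received_at INTEGER)")

now = 1_700_000_000
# Three messages: 30 minutes ago, 2 hours ago, 3 days ago.
conn.executemany(
    "INSERT INTO messages VALUES (?)",
    [(now - 1800,), (now - 7200,), (now - 259200,)],
)
t_1h, t_24h = now - 3600, now - 86400

# One scan produces the total plus each time-window count.
row = conn.execute(
    """
    SELECT COUNT(*) AS all_time,
           SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_1h,
           SUM(CASE WHEN received_at >= ? THEN 1 ELSE 0 END) AS last_24h
    FROM messages
    """,
    (t_1h, t_24h),
).fetchone()
print(row)  # -> (3, 1, 2)
```

Each extra window costs one more `SUM(CASE ...)` column rather than a separate query, which is why the diff keeps a single aggregate statement even after wrapping it in `db.readonly()`.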
@@ -882,14 +922,15 @@ class MessageRepository:
     @staticmethod
    async def count_channels_with_incoming_messages() -> int:
         """Count distinct channel conversations with at least one incoming message."""
-        cursor = await db.conn.execute(
-            """
-            SELECT COUNT(DISTINCT conversation_key) AS cnt
-            FROM messages
-            WHERE type = 'CHAN' AND outgoing = 0
-            """
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT COUNT(DISTINCT conversation_key) AS cnt
+                FROM messages
+                WHERE type = 'CHAN' AND outgoing = 0
+                """
+            ) as cursor:
+                row = await cursor.fetchone()
         return int(row["cnt"]) if row and row["cnt"] is not None else 0
 
     @staticmethod
@@ -898,20 +939,21 @@ class MessageRepository:
 
         Returns list of (channel_key, channel_name, message_count) tuples.
         """
-        cursor = await db.conn.execute(
-            """
-            SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
-                   COUNT(*) AS cnt
-            FROM messages m
-            LEFT JOIN channels c ON m.conversation_key = c.key
-            WHERE m.type = 'CHAN' AND m.sender_key = ?
-            GROUP BY m.conversation_key
-            ORDER BY cnt DESC
-            LIMIT ?
-            """,
-            (sender_key.lower(), limit),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
+                       COUNT(*) AS cnt
+                FROM messages m
+                LEFT JOIN channels c ON m.conversation_key = c.key
+                WHERE m.type = 'CHAN' AND m.sender_key = ?
+                GROUP BY m.conversation_key
+                ORDER BY cnt DESC
+                LIMIT ?
+                """,
+                (sender_key.lower(), limit),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [(row["conversation_key"], row["channel_name"], row["cnt"]) for row in rows]
 
     @staticmethod
@@ -919,34 +961,36 @@ class MessageRepository:
         sender_name: str, limit: int = 5
     ) -> list[tuple[str, str, int]]:
         """Get channels where a display name has sent the most messages."""
-        cursor = await db.conn.execute(
-            """
-            SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
-                   COUNT(*) AS cnt
-            FROM messages m
-            LEFT JOIN channels c ON m.conversation_key = c.key
-            WHERE m.type = 'CHAN' AND m.sender_name = ?
-            GROUP BY m.conversation_key
-            ORDER BY cnt DESC
-            LIMIT ?
-            """,
-            (sender_name, limit),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
+                       COUNT(*) AS cnt
+                FROM messages m
+                LEFT JOIN channels c ON m.conversation_key = c.key
+                WHERE m.type = 'CHAN' AND m.sender_name = ?
+                GROUP BY m.conversation_key
+                ORDER BY cnt DESC
+                LIMIT ?
+                """,
+                (sender_name, limit),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [(row["conversation_key"], row["channel_name"], row["cnt"]) for row in rows]
 
     @staticmethod
     async def _get_activity_hour_buckets(where_sql: str, params: list[Any]) -> dict[int, int]:
-        cursor = await db.conn.execute(
-            f"""
-            SELECT received_at / 3600 AS hour_bucket, COUNT(*) AS cnt
-            FROM messages
-            WHERE {where_sql}
-            GROUP BY hour_bucket
-            """,
-            params,
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                f"""
+                SELECT received_at / 3600 AS hour_bucket, COUNT(*) AS cnt
+                FROM messages
+                WHERE {where_sql}
+                GROUP BY hour_bucket
+                """,
+                params,
+            ) as cursor:
+                rows = await cursor.fetchall()
         return {int(row["hour_bucket"]): row["cnt"] for row in rows}
 
     @staticmethod
@@ -1000,16 +1044,17 @@ class MessageRepository:
         current_day_start = (now // 86400) * 86400
         start = current_day_start - (weeks - 1) * bucket_seconds
 
-        cursor = await db.conn.execute(
-            f"""
-            SELECT (received_at - ?) / ? AS bucket_idx, COUNT(*) AS cnt
-            FROM messages
-            WHERE {where_sql} AND received_at >= ?
-            GROUP BY bucket_idx
-            """,
-            [start, bucket_seconds, *params, start],
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                f"""
+                SELECT (received_at - ?) / ? AS bucket_idx, COUNT(*) AS cnt
+                FROM messages
+                WHERE {where_sql} AND received_at >= ?
+                GROUP BY bucket_idx
+                """,
+                [start, bucket_seconds, *params, start],
+            ) as cursor:
+                rows = await cursor.fetchall()
         counts = {int(row["bucket_idx"]): row["cnt"] for row in rows}
 
         return [
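The bucket arithmetic in that query is plain integer division: `(received_at - start) / bucket_seconds` yields a zero-based bucket index. The same mapping in Python (toy timestamps, one-week buckets as in the hunk):

```python
# Integer division maps each timestamp into a week-wide bucket index
# relative to `start`, mirroring the SQL `(received_at - ?) / ?` above.
bucket_seconds = 7 * 86400
start = 1_700_000_000

timestamps = [start, start + 86400, start + 8 * 86400]  # day 0, day 1, day 8
buckets = [(t - start) // bucket_seconds for t in timestamps]
print(buckets)  # -> [0, 0, 1]
```

For non-negative operands, SQLite's `/` on integers truncates exactly like Python's `//` here, so the SQL and Python views of a bucket agree; the `received_at >= ?` filter keeps negative offsets (which would round differently) out of the query.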
@@ -0,0 +1,162 @@
+"""Repository for push_subscriptions table."""
+
+import logging
+import time
+import uuid
+from typing import Any
+
+from app.database import db
+
+logger = logging.getLogger(__name__)
+
+# Auto-delete subscriptions that have failed this many times consecutively
+# without any successful delivery in between.
+MAX_CONSECUTIVE_FAILURES = 15
+
+
+def _row_to_dict(row: Any) -> dict[str, Any]:
+    return {
+        "id": row["id"],
+        "endpoint": row["endpoint"],
+        "p256dh": row["p256dh"],
+        "auth": row["auth"],
+        "label": row["label"] or "",
+        "created_at": row["created_at"] or 0,
+        "last_success_at": row["last_success_at"],
+        "failure_count": row["failure_count"] or 0,
+    }
+
+
+class PushSubscriptionRepository:
+    @staticmethod
+    async def create(
+        endpoint: str,
+        p256dh: str,
+        auth: str,
+        label: str = "",
+    ) -> dict[str, Any]:
+        """Create or upsert a push subscription (keyed by endpoint)."""
+        sub_id = str(uuid.uuid4())
+        now = int(time.time())
+
+        async with db.tx() as conn:
+            await conn.execute(
+                """
+                INSERT INTO push_subscriptions
+                    (id, endpoint, p256dh, auth, label, created_at, failure_count)
+                VALUES (?, ?, ?, ?, ?, ?, 0)
+                ON CONFLICT(endpoint) DO UPDATE SET
+                    p256dh = excluded.p256dh,
+                    auth = excluded.auth,
+                    label = CASE WHEN excluded.label != '' THEN excluded.label
+                                 ELSE push_subscriptions.label END,
+                    failure_count = 0
+                """,
+                (sub_id, endpoint, p256dh, auth, label, now),
+            )
+            async with conn.execute(
+                "SELECT * FROM push_subscriptions WHERE endpoint = ?", (endpoint,)
+            ) as cursor:
+                row = await cursor.fetchone()
+
+        return _row_to_dict(row) if row else {"id": sub_id}  # type: ignore[arg-type]
+
+    @staticmethod
+    async def get(subscription_id: str) -> dict[str, Any] | None:
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM push_subscriptions WHERE id = ?", (subscription_id,)
+            ) as cursor:
+                row = await cursor.fetchone()
+        return _row_to_dict(row) if row else None
+
+    @staticmethod
+    async def get_by_endpoint(endpoint: str) -> dict[str, Any] | None:
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM push_subscriptions WHERE endpoint = ?", (endpoint,)
+            ) as cursor:
+                row = await cursor.fetchone()
+        return _row_to_dict(row) if row else None
+
+    @staticmethod
+    async def get_all() -> list[dict[str, Any]]:
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT * FROM push_subscriptions ORDER BY created_at DESC"
+            ) as cursor:
+                rows = await cursor.fetchall()
+        return [_row_to_dict(row) for row in rows]
+
+    @staticmethod
+    async def update(subscription_id: str, **fields: Any) -> dict[str, Any] | None:
+        updates: list[str] = []
+        params: list[Any] = []
+
+        if "label" in fields:
+            updates.append("label = ?")
+            params.append(fields["label"])
+
+        if not updates:
+            return await PushSubscriptionRepository.get(subscription_id)
+
+        params.append(subscription_id)
+        async with db.tx() as conn:
+            await conn.execute(
+                f"UPDATE push_subscriptions SET {', '.join(updates)} WHERE id = ?",
+                params,
+            )
+            async with conn.execute(
+                "SELECT * FROM push_subscriptions WHERE id = ?", (subscription_id,)
+            ) as cursor:
+                row = await cursor.fetchone()
+        return _row_to_dict(row) if row else None
+
+    @staticmethod
+    async def delete(subscription_id: str) -> bool:
+        async with db.tx() as conn:
+            async with conn.execute(
+                "DELETE FROM push_subscriptions WHERE id = ?", (subscription_id,)
+            ) as cursor:
+                return cursor.rowcount > 0
+
+    @staticmethod
+    async def delete_by_endpoint(endpoint: str) -> bool:
+        async with db.tx() as conn:
+            async with conn.execute(
+                "DELETE FROM push_subscriptions WHERE endpoint = ?", (endpoint,)
+            ) as cursor:
+                return cursor.rowcount > 0
+
+    @staticmethod
+    async def batch_record_outcomes(
+        success_ids: list[str], failure_ids: list[str], remove_ids: list[str]
+    ) -> None:
+        """Batch-update delivery outcomes in a single transaction."""
+        now = int(time.time())
+        async with db.tx() as conn:
+            if remove_ids:
+                placeholders = ",".join("?" for _ in remove_ids)
+                await conn.execute(
+                    f"DELETE FROM push_subscriptions WHERE id IN ({placeholders})",
+                    remove_ids,
+                )
+            if success_ids:
+                placeholders = ",".join("?" for _ in success_ids)
+                await conn.execute(
+                    f"UPDATE push_subscriptions SET last_success_at = ?, failure_count = 0 "
+                    f"WHERE id IN ({placeholders})",
+                    [now, *success_ids],
+                )
+            if failure_ids:
+                placeholders = ",".join("?" for _ in failure_ids)
+                await conn.execute(
+                    f"UPDATE push_subscriptions SET failure_count = failure_count + 1 "
|
||||||
|
f"WHERE id IN ({placeholders})",
|
||||||
|
failure_ids,
|
||||||
|
)
|
||||||
|
# Evict subscriptions that have exceeded the failure threshold
|
||||||
|
await conn.execute(
|
||||||
|
"DELETE FROM push_subscriptions WHERE failure_count >= ?",
|
||||||
|
(MAX_CONSECUTIVE_FAILURES,),
|
||||||
|
)
|
||||||
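The batched-outcome pattern above (one transaction; a generated "?" placeholder per id, since SQLite has no array parameters; eviction only after failures are incremented) can be sketched with the stdlib sqlite3 module. This is a simplified sketch: the table name `subs`, its columns, and the threshold value 3 are illustrative stand-ins for the real `push_subscriptions` schema and `MAX_CONSECUTIVE_FAILURES`, and synchronous sqlite3 stands in for aiosqlite with `db.tx()`.

```python
import sqlite3
import time

def record_outcomes(conn, success_ids, failure_ids):
    """One transaction, one statement per outcome class; placeholders
    are generated to match the number of ids in each list."""
    now = int(time.time())
    with conn:  # commits on success, rolls back on error
        if success_ids:
            ph = ",".join("?" for _ in success_ids)
            conn.execute(
                f"UPDATE subs SET last_success_at = ?, failure_count = 0 WHERE id IN ({ph})",
                [now, *success_ids],
            )
        if failure_ids:
            ph = ",".join("?" for _ in failure_ids)
            conn.execute(
                f"UPDATE subs SET failure_count = failure_count + 1 WHERE id IN ({ph})",
                failure_ids,
            )
            # Evict anything that has now crossed the failure threshold
            conn.execute("DELETE FROM subs WHERE failure_count >= ?", (3,))

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE subs (id TEXT PRIMARY KEY, last_success_at INTEGER, "
    "failure_count INTEGER DEFAULT 0)"
)
conn.executemany("INSERT INTO subs (id, failure_count) VALUES (?, ?)", [("a", 0), ("b", 2)])
record_outcomes(conn, success_ids=["a"], failure_ids=["b"])
remaining = [r[0] for r in conn.execute("SELECT id FROM subs ORDER BY id")]
print(remaining)  # "b" reached the threshold and was evicted
```

Because the UPDATEs and the eviction DELETE share one transaction, a crash mid-way cannot leave a subscription half-updated.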
@@ -34,65 +34,85 @@ class RawPacketRepository:
         # For malformed packets, hash the full data
         payload_hash = sha256(data).digest()
 
-        cursor = await db.conn.execute(
-            "INSERT OR IGNORE INTO raw_packets (timestamp, data, payload_hash) VALUES (?, ?, ?)",
-            (ts, data, payload_hash),
-        )
-        await db.conn.commit()
-
-        if cursor.rowcount > 0:
-            assert cursor.lastrowid is not None
-            return (cursor.lastrowid, True)
-
-        # Duplicate payload — look up the existing row.
-        cursor = await db.conn.execute(
-            "SELECT id FROM raw_packets WHERE payload_hash = ?", (payload_hash,)
-        )
-        existing = await cursor.fetchone()
-        assert existing is not None
-        return (existing["id"], False)
+        async with db.tx() as conn:
+            async with conn.execute(
+                "INSERT OR IGNORE INTO raw_packets (timestamp, data, payload_hash) VALUES (?, ?, ?)",
+                (ts, data, payload_hash),
+            ) as cursor:
+                rowcount = cursor.rowcount
+                lastrowid = cursor.lastrowid
+
+            if rowcount > 0:
+                assert lastrowid is not None
+                return (lastrowid, True)
+
+            # Duplicate payload — look up the existing row (same transaction).
+            async with conn.execute(
+                "SELECT id FROM raw_packets WHERE payload_hash = ?", (payload_hash,)
+            ) as cursor:
+                existing = await cursor.fetchone()
+            assert existing is not None
+            return (existing["id"], False)
 
     @staticmethod
     async def get_undecrypted_count() -> int:
         """Get count of undecrypted packets (those without a linked message)."""
-        cursor = await db.conn.execute(
-            "SELECT COUNT(*) as count FROM raw_packets WHERE message_id IS NULL"
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT COUNT(*) as count FROM raw_packets WHERE message_id IS NULL"
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["count"] if row else 0
 
     @staticmethod
     async def get_oldest_undecrypted() -> int | None:
         """Get timestamp of oldest undecrypted packet, or None if none exist."""
-        cursor = await db.conn.execute(
-            "SELECT MIN(timestamp) as oldest FROM raw_packets WHERE message_id IS NULL"
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT MIN(timestamp) as oldest FROM raw_packets WHERE message_id IS NULL"
+            ) as cursor:
+                row = await cursor.fetchone()
         return row["oldest"] if row and row["oldest"] is not None else None
 
+    @staticmethod
+    async def _stream_undecrypted_rows(
+        batch_size: int,
+    ) -> AsyncIterator[tuple[int, bytes, int]]:
+        """Internal: keyset-paginated scan of every undecrypted raw packet.
+
+        Yields ``(id, data, timestamp)`` for each row across all batches.
+        Lock is acquired per batch only — concurrent writes can interleave
+        at batch boundaries rather than being blocked for the full scan.
+        Each batch opens a fresh cursor and consumes it fully with
+        ``fetchall()`` before releasing, so no prepared statement is alive
+        at a yield boundary.
+
+        ``last_id`` advances per row, not per yield, so external filters
+        (see ``stream_undecrypted_text_messages``) that drop rows do not
+        cause a re-scan of skipped IDs.
+        """
+        last_id = -1
+        while True:
+            async with db.readonly() as conn:
+                async with conn.execute(
+                    "SELECT id, data, timestamp FROM raw_packets "
+                    "WHERE message_id IS NULL AND id > ? ORDER BY id ASC LIMIT ?",
+                    (last_id, batch_size),
+                ) as cursor:
+                    rows = await cursor.fetchall()
+            if not rows:
+                return
+            for row in rows:
+                last_id = row["id"]
+                yield (row["id"], bytes(row["data"]), row["timestamp"])
+
     @staticmethod
     async def stream_all_undecrypted(
         batch_size: int = UNDECRYPTED_PACKET_BATCH_SIZE,
     ) -> AsyncIterator[tuple[int, bytes, int]]:
-        """Yield all undecrypted packets as (id, data, timestamp) in bounded batches.
-
-        Uses keyset pagination so each batch is a fresh query with a fully
-        consumed cursor — no open statement held across yield boundaries.
-        """
-        last_id = -1
-        while True:
-            cursor = await db.conn.execute(
-                "SELECT id, data, timestamp FROM raw_packets "
-                "WHERE message_id IS NULL AND id > ? ORDER BY id ASC LIMIT ?",
-                (last_id, batch_size),
-            )
-            rows = await cursor.fetchall()
-            await cursor.close()
-            if not rows:
-                break
-            for row in rows:
-                last_id = row["id"]
-                yield (row["id"], bytes(row["data"]), row["timestamp"])
+        """Yield all undecrypted packets as (id, data, timestamp) in bounded batches."""
+        async for row in RawPacketRepository._stream_undecrypted_rows(batch_size):
+            yield row
 
     @staticmethod
     async def stream_undecrypted_text_messages(
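The keyset pagination introduced by `_stream_undecrypted_rows` (fresh query per batch, `id > last_id ORDER BY id LIMIT n`, cursor fully consumed before the next batch) can be sketched synchronously with stdlib sqlite3. The schema here is a simplified stand-in for the real `raw_packets` table, and the sync generator stands in for the async one.

```python
import sqlite3

def stream_undecrypted(conn, batch_size=2):
    """Keyset-paginated scan: each batch is a standalone query, so no
    cursor stays open across yield boundaries; last_id advances per row."""
    last_id = -1
    while True:
        rows = conn.execute(
            "SELECT id, data FROM raw_packets "
            "WHERE message_id IS NULL AND id > ? ORDER BY id ASC LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            return
        for row_id, data in rows:
            last_id = row_id
            yield (row_id, data)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_packets (id INTEGER PRIMARY KEY, data BLOB, message_id INTEGER)"
)
conn.executemany(
    "INSERT INTO raw_packets (data, message_id) VALUES (?, ?)",
    [(b"a", None), (b"b", 7), (b"c", None), (b"d", None)],
)
ids = [pid for pid, _ in stream_undecrypted(conn)]
print(ids)  # only rows with message_id IS NULL, in id order
```

Unlike OFFSET pagination, resuming from `last_id` stays correct even if earlier rows are deleted between batches.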
@@ -100,26 +120,15 @@ class RawPacketRepository:
     ) -> AsyncIterator[tuple[int, bytes, int]]:
         """Yield undecrypted TEXT_MESSAGE packets in bounded-size batches.
 
-        Uses keyset pagination so each batch is a fresh query with a fully
-        consumed cursor — no open statement held across yield boundaries.
+        Filters the shared scan to rows whose payload parses as a text
+        message. Non-matching rows still advance the keyset cursor so they
+        aren't re-fetched on subsequent batches.
         """
-        last_id = -1
-        while True:
-            cursor = await db.conn.execute(
-                "SELECT id, data, timestamp FROM raw_packets "
-                "WHERE message_id IS NULL AND id > ? ORDER BY id ASC LIMIT ?",
-                (last_id, batch_size),
-            )
-            rows = await cursor.fetchall()
-            await cursor.close()
-            if not rows:
-                break
-            for row in rows:
-                last_id = row["id"]
-                data = bytes(row["data"])
-                payload_type = get_packet_payload_type(data)
-                if payload_type == PayloadType.TEXT_MESSAGE:
-                    yield (row["id"], data, row["timestamp"])
+        async for packet_id, data, timestamp in RawPacketRepository._stream_undecrypted_rows(
+            batch_size
+        ):
+            if get_packet_payload_type(data) == PayloadType.TEXT_MESSAGE:
+                yield (packet_id, data, timestamp)
 
     @staticmethod
     async def count_undecrypted_text_messages(
@@ -136,20 +145,22 @@ class RawPacketRepository:
     @staticmethod
     async def mark_decrypted(packet_id: int, message_id: int) -> None:
         """Link a raw packet to its decrypted message."""
-        await db.conn.execute(
-            "UPDATE raw_packets SET message_id = ? WHERE id = ?",
-            (message_id, packet_id),
-        )
-        await db.conn.commit()
+        async with db.tx() as conn:
+            async with conn.execute(
+                "UPDATE raw_packets SET message_id = ? WHERE id = ?",
+                (message_id, packet_id),
+            ):
+                pass
 
     @staticmethod
     async def get_linked_message_id(packet_id: int) -> int | None:
         """Return the linked message ID for a raw packet, if any."""
-        cursor = await db.conn.execute(
-            "SELECT message_id FROM raw_packets WHERE id = ?",
-            (packet_id,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT message_id FROM raw_packets WHERE id = ?",
+                (packet_id,),
+            ) as cursor:
+                row = await cursor.fetchone()
         if not row:
             return None
         return row["message_id"]
@@ -157,11 +168,12 @@ class RawPacketRepository:
     @staticmethod
    async def get_by_id(packet_id: int) -> tuple[int, bytes, int, int | None] | None:
         """Return a raw packet row as (id, data, timestamp, message_id)."""
-        cursor = await db.conn.execute(
-            "SELECT id, data, timestamp, message_id FROM raw_packets WHERE id = ?",
-            (packet_id,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT id, data, timestamp, message_id FROM raw_packets WHERE id = ?",
+                (packet_id,),
+            ) as cursor:
+                row = await cursor.fetchone()
         if not row:
             return None
         return (row["id"], bytes(row["data"]), row["timestamp"], row["message_id"])
@@ -170,16 +182,20 @@ class RawPacketRepository:
     async def prune_old_undecrypted(max_age_days: int) -> int:
         """Delete undecrypted packets older than max_age_days. Returns count deleted."""
         cutoff = int(time.time()) - (max_age_days * 86400)
-        cursor = await db.conn.execute(
-            "DELETE FROM raw_packets WHERE message_id IS NULL AND timestamp < ?",
-            (cutoff,),
-        )
-        await db.conn.commit()
-        return cursor.rowcount
+        async with db.tx() as conn:
+            async with conn.execute(
+                "DELETE FROM raw_packets WHERE message_id IS NULL AND timestamp < ?",
+                (cutoff,),
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount
 
     @staticmethod
     async def purge_linked_to_messages() -> int:
         """Delete raw packets that are already linked to a stored message."""
-        cursor = await db.conn.execute("DELETE FROM raw_packets WHERE message_id IS NOT NULL")
-        await db.conn.commit()
-        return cursor.rowcount
+        async with db.tx() as conn:
+            async with conn.execute(
+                "DELETE FROM raw_packets WHERE message_id IS NOT NULL"
+            ) as cursor:
+                rowcount = cursor.rowcount
+        return rowcount
@@ -21,51 +21,54 @@ class RepeaterTelemetryRepository:
         data: dict,
     ) -> None:
         """Insert a telemetry history row and prune stale entries."""
-        await db.conn.execute(
-            """
-            INSERT INTO repeater_telemetry_history
-            (public_key, timestamp, data)
-            VALUES (?, ?, ?)
-            """,
-            (public_key, timestamp, json.dumps(data)),
-        )
-
-        # Prune entries older than 30 days
         cutoff = int(time.time()) - _MAX_AGE_SECONDS
-        await db.conn.execute(
-            "DELETE FROM repeater_telemetry_history WHERE public_key = ? AND timestamp < ?",
-            (public_key, cutoff),
-        )
+        async with db.tx() as conn:
+            async with conn.execute(
+                """
+                INSERT INTO repeater_telemetry_history
+                (public_key, timestamp, data)
+                VALUES (?, ?, ?)
+                """,
+                (public_key, timestamp, json.dumps(data)),
+            ):
+                pass
 
-        # Cap at _MAX_ENTRIES_PER_REPEATER (keep newest)
-        await db.conn.execute(
-            """
-            DELETE FROM repeater_telemetry_history
-            WHERE public_key = ? AND id NOT IN (
-                SELECT id FROM repeater_telemetry_history
-                WHERE public_key = ?
-                ORDER BY timestamp DESC
-                LIMIT ?
-            )
-            """,
-            (public_key, public_key, _MAX_ENTRIES_PER_REPEATER),
-        )
-
-        await db.conn.commit()
+            # Prune entries older than 30 days
+            async with conn.execute(
+                "DELETE FROM repeater_telemetry_history WHERE public_key = ? AND timestamp < ?",
+                (public_key, cutoff),
+            ):
+                pass
+
+            # Cap at _MAX_ENTRIES_PER_REPEATER (keep newest)
+            async with conn.execute(
+                """
+                DELETE FROM repeater_telemetry_history
+                WHERE public_key = ? AND id NOT IN (
+                    SELECT id FROM repeater_telemetry_history
+                    WHERE public_key = ?
+                    ORDER BY timestamp DESC
+                    LIMIT ?
+                )
+                """,
+                (public_key, public_key, _MAX_ENTRIES_PER_REPEATER),
+            ):
+                pass
 
     @staticmethod
     async def get_history(public_key: str, since_timestamp: int) -> list[dict]:
         """Return telemetry rows for a repeater since a given timestamp, ordered ASC."""
-        cursor = await db.conn.execute(
-            """
-            SELECT timestamp, data
-            FROM repeater_telemetry_history
-            WHERE public_key = ? AND timestamp >= ?
-            ORDER BY timestamp ASC
-            """,
-            (public_key, since_timestamp),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT timestamp, data
+                FROM repeater_telemetry_history
+                WHERE public_key = ? AND timestamp >= ?
+                ORDER BY timestamp ASC
+                """,
+                (public_key, since_timestamp),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [
             {
                 "timestamp": row["timestamp"],
@@ -77,17 +80,18 @@ class RepeaterTelemetryRepository:
     @staticmethod
     async def get_latest(public_key: str) -> dict | None:
         """Return the most recent telemetry row for a repeater, or None."""
-        cursor = await db.conn.execute(
-            """
-            SELECT timestamp, data
-            FROM repeater_telemetry_history
-            WHERE public_key = ?
-            ORDER BY timestamp DESC
-            LIMIT 1
-            """,
-            (public_key,),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT timestamp, data
+                FROM repeater_telemetry_history
+                WHERE public_key = ?
+                ORDER BY timestamp DESC
+                LIMIT 1
+                """,
+                (public_key,),
+            ) as cursor:
+                row = await cursor.fetchone()
         if row is None:
             return None
         return {
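The insert-then-prune pattern above (cap each repeater's history at the N newest rows by deleting everything whose id is not in a `SELECT ... ORDER BY timestamp DESC LIMIT ?` subquery, all in one transaction) can be sketched with stdlib sqlite3. The table name `history` and the cap of 3 are illustrative stand-ins for `repeater_telemetry_history` and `_MAX_ENTRIES_PER_REPEATER`.

```python
import sqlite3

MAX_ENTRIES = 3

def insert_and_prune(conn, key, ts, data):
    """Insert a history row, then keep only the newest MAX_ENTRIES rows
    for that key; both statements share one transaction."""
    with conn:
        conn.execute(
            "INSERT INTO history (public_key, timestamp, data) VALUES (?, ?, ?)",
            (key, ts, data),
        )
        conn.execute(
            """
            DELETE FROM history
            WHERE public_key = ? AND id NOT IN (
                SELECT id FROM history
                WHERE public_key = ?
                ORDER BY timestamp DESC
                LIMIT ?
            )
            """,
            (key, key, MAX_ENTRIES),
        )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE history (id INTEGER PRIMARY KEY, public_key TEXT, "
    "timestamp INTEGER, data TEXT)"
)
for ts in range(5):
    insert_and_prune(conn, "repeater-1", ts, "{}")
timestamps = [
    r[0]
    for r in conn.execute(
        "SELECT timestamp FROM history WHERE public_key = 'repeater-1' ORDER BY timestamp"
    )
]
print(timestamps)  # only the 3 newest rows survive
```

Pruning on every insert keeps the cap enforced without a separate maintenance job.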
+319 -137
@@ -3,9 +3,12 @@ import logging
 import time
 from typing import Any
 
+import aiosqlite
+
 from app.database import db
 from app.models import AppSettings
 from app.path_utils import bucket_path_hash_widths
+from app.telemetry_interval import DEFAULT_TELEMETRY_INTERVAL_HOURS
 
 logger = logging.getLogger(__name__)
 
@@ -16,25 +19,34 @@ SECONDS_7D = 604800
 
 
 class AppSettingsRepository:
-    """Repository for app_settings table (single-row pattern)."""
+    """Repository for app_settings table (single-row pattern).
+
+    Public methods acquire the DB lock exactly once. ``toggle_*`` helpers that
+    need a read-modify-write do so inside a single ``db.tx()`` — the internal
+    ``_get_in_conn`` / ``_apply_updates`` helpers run under the caller's
+    already-held lock and must NEVER call ``db.tx()`` or ``db.readonly()``.
+    """
 
     @staticmethod
-    async def get() -> AppSettings:
-        """Get the current app settings.
+    async def _get_in_conn(conn: aiosqlite.Connection) -> AppSettings:
+        """Load settings using an already-acquired connection.
 
-        Always returns settings - creates default row if needed (migration handles initial row).
+        Used by the public ``get()`` and by multi-step operations
+        (``toggle_blocked_key``, ``toggle_blocked_name``) to avoid re-entering
+        the non-reentrant DB lock.
         """
-        cursor = await db.conn.execute(
+        async with conn.execute(
             """
             SELECT max_radio_contacts, auto_decrypt_dm_on_advert,
                    last_message_times,
                    advert_interval, last_advert_time, flood_scope,
                    blocked_keys, blocked_names, discovery_blocked_types,
-                   tracked_telemetry_repeaters, auto_resend_channel
+                   tracked_telemetry_repeaters, auto_resend_channel,
+                   telemetry_interval_hours
             FROM app_settings WHERE id = 1
             """
-        )
-        row = await cursor.fetchone()
+        ) as cursor:
+            row = await cursor.fetchone()
 
         if not row:
             # Should not happen after migration, but handle gracefully
@@ -91,6 +103,16 @@ class AppSettingsRepository:
         except (KeyError, TypeError):
             auto_resend_channel = False
 
+        # Parse telemetry_interval_hours (migration adds the column with
+        # default=8, but guard against older rows / partial migrations).
+        try:
+            raw_interval = row["telemetry_interval_hours"]
+            telemetry_interval_hours = (
+                int(raw_interval) if raw_interval is not None else DEFAULT_TELEMETRY_INTERVAL_HOURS
+            )
+        except (KeyError, TypeError, ValueError):
+            telemetry_interval_hours = DEFAULT_TELEMETRY_INTERVAL_HOURS
+
         return AppSettings(
             max_radio_contacts=row["max_radio_contacts"],
             auto_decrypt_dm_on_advert=bool(row["auto_decrypt_dm_on_advert"]),
@@ -103,10 +125,13 @@ class AppSettingsRepository:
             discovery_blocked_types=discovery_blocked_types,
             tracked_telemetry_repeaters=tracked_telemetry_repeaters,
             auto_resend_channel=auto_resend_channel,
+            telemetry_interval_hours=telemetry_interval_hours,
         )
 
     @staticmethod
-    async def update(
+    async def _apply_updates(
+        conn: aiosqlite.Connection,
+        *,
         max_radio_contacts: int | None = None,
         auto_decrypt_dm_on_advert: bool | None = None,
         last_message_times: dict[str, int] | None = None,
@@ -118,9 +143,14 @@ class AppSettingsRepository:
         discovery_blocked_types: list[int] | None = None,
         tracked_telemetry_repeaters: list[str] | None = None,
         auto_resend_channel: bool | None = None,
-    ) -> AppSettings:
-        """Update app settings. Only provided fields are updated."""
-        updates = []
+        telemetry_interval_hours: int | None = None,
+    ) -> None:
+        """Apply field updates using an already-acquired connection.
+
+        Emits a single UPDATE statement inside the caller's transaction. Does
+        NOT commit — the caller's ``db.tx()`` handles that.
+        """
+        updates: list[str] = []
         params: list[Any] = []
 
         if max_radio_contacts is not None:
@@ -167,49 +197,186 @@ class AppSettingsRepository:
|
|||||||
updates.append("auto_resend_channel = ?")
|
updates.append("auto_resend_channel = ?")
|
||||||
params.append(1 if auto_resend_channel else 0)
|
params.append(1 if auto_resend_channel else 0)
|
||||||
|
|
||||||
|
if telemetry_interval_hours is not None:
|
||||||
|
updates.append("telemetry_interval_hours = ?")
|
||||||
|
params.append(telemetry_interval_hours)
|
||||||
|
|
||||||
if updates:
|
if updates:
|
||||||
query = f"UPDATE app_settings SET {', '.join(updates)} WHERE id = 1"
|
query = f"UPDATE app_settings SET {', '.join(updates)} WHERE id = 1"
|
||||||
await db.conn.execute(query, params)
|
async with conn.execute(query, params):
|
||||||
await db.conn.commit()
|
pass
|
||||||
|
|
||||||
return await AppSettingsRepository.get()
|
@staticmethod
|
||||||
|
async def get() -> AppSettings:
|
||||||
|
"""Get the current app settings.
|
||||||
|
|
||||||
|
Always returns settings - creates default row if needed (migration handles initial row).
|
||||||
|
"""
|
||||||
|
async with db.readonly() as conn:
|
||||||
|
return await AppSettingsRepository._get_in_conn(conn)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
async def update(
|
||||||
|
max_radio_contacts: int | None = None,
|
||||||
|
auto_decrypt_dm_on_advert: bool | None = None,
|
||||||
|
last_message_times: dict[str, int] | None = None,
|
||||||
|
advert_interval: int | None = None,
|
||||||
|
last_advert_time: int | None = None,
|
||||||
|
flood_scope: str | None = None,
|
||||||
|
blocked_keys: list[str] | None = None,
|
||||||
|
blocked_names: list[str] | None = None,
|
||||||
|
discovery_blocked_types: list[int] | None = None,
|
||||||
|
tracked_telemetry_repeaters: list[str] | None = None,
|
||||||
|
auto_resend_channel: bool | None = None,
|
||||||
|
telemetry_interval_hours: int | None = None,
|
||||||
|
) -> AppSettings:
|
||||||
|
"""Update app settings. Only provided fields are updated."""
|
||||||
|
async with db.tx() as conn:
|
||||||
|
await AppSettingsRepository._apply_updates(
|
||||||
|
conn,
|
||||||
|
max_radio_contacts=max_radio_contacts,
|
||||||
|
auto_decrypt_dm_on_advert=auto_decrypt_dm_on_advert,
|
||||||
|
last_message_times=last_message_times,
|
||||||
|
advert_interval=advert_interval,
|
||||||
|
last_advert_time=last_advert_time,
|
||||||
|
flood_scope=flood_scope,
|
||||||
|
blocked_keys=blocked_keys,
|
||||||
|
blocked_names=blocked_names,
|
||||||
|
discovery_blocked_types=discovery_blocked_types,
|
||||||
|
tracked_telemetry_repeaters=tracked_telemetry_repeaters,
|
||||||
|
auto_resend_channel=auto_resend_channel,
|
||||||
|
telemetry_interval_hours=telemetry_interval_hours,
|
||||||
|
)
|
||||||
|
return await AppSettingsRepository._get_in_conn(conn)
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def toggle_blocked_key(key: str) -> AppSettings:
|
async def toggle_blocked_key(key: str) -> AppSettings:
|
||||||
"""Toggle a public key in the blocked list. Keys are normalized to lowercase."""
|
"""Toggle a public key in the blocked list. Keys are normalized to lowercase.
|
||||||
|
|
||||||
|
Read-modify-write is atomic under a single ``db.tx()`` lock — two
|
||||||
|
concurrent toggles for the same key cannot produce an inconsistent
|
||||||
|
intermediate state.
|
||||||
|
"""
|
||||||
normalized = key.lower()
|
normalized = key.lower()
|
||||||
settings = await AppSettingsRepository.get()
|
async with db.tx() as conn:
|
||||||
if normalized in settings.blocked_keys:
|
settings = await AppSettingsRepository._get_in_conn(conn)
|
||||||
new_keys = [k for k in settings.blocked_keys if k != normalized]
|
if normalized in settings.blocked_keys:
|
||||||
else:
|
new_keys = [k for k in settings.blocked_keys if k != normalized]
|
||||||
new_keys = settings.blocked_keys + [normalized]
|
else:
|
||||||
return await AppSettingsRepository.update(blocked_keys=new_keys)
|
new_keys = settings.blocked_keys + [normalized]
|
||||||
|
await AppSettingsRepository._apply_updates(conn, blocked_keys=new_keys)
|
||||||
|
return await AppSettingsRepository._get_in_conn(conn)
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
async def toggle_blocked_name(name: str) -> AppSettings:
|
async def toggle_blocked_name(name: str) -> AppSettings:
|
||||||
"""Toggle a display name in the blocked list."""
|
"""Toggle a display name in the blocked list.
|
||||||
settings = await AppSettingsRepository.get()
|
|
||||||
if name in settings.blocked_names:
|
Same atomicity guarantee as ``toggle_blocked_key``.
|
||||||
new_names = [n for n in settings.blocked_names if n != name]
|
"""
|
||||||
else:
|
async with db.tx() as conn:
|
||||||
new_names = settings.blocked_names + [name]
|
settings = await AppSettingsRepository._get_in_conn(conn)
|
||||||
return await AppSettingsRepository.update(blocked_names=new_names)
|
if name in settings.blocked_names:
|
||||||
|
new_names = [n for n in settings.blocked_names if n != name]
|
||||||
|
else:
|
||||||
|
new_names = settings.blocked_names + [name]
|
||||||
|
await AppSettingsRepository._apply_updates(conn, blocked_names=new_names)
|
||||||
|
return await AppSettingsRepository._get_in_conn(conn)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
async def get_vapid_keys() -> tuple[str, str]:
|
||||||
|
"""Return (private_key_pem, public_key_b64url) from app_settings.
|
||||||
|
|
||||||
|
These are internal-only columns not exposed via the AppSettings model.
|
||||||
|
"""
|
||||||
|
async with db.readonly() as conn:
|
||||||
|
async with conn.execute(
|
||||||
|
"SELECT vapid_private_key, vapid_public_key FROM app_settings WHERE id = 1"
|
||||||
|
) as cursor:
|
||||||
|
row = await cursor.fetchone()
|
||||||
|
+        if row and row["vapid_private_key"] and row["vapid_public_key"]:
+            return row["vapid_private_key"], row["vapid_public_key"]
+        return "", ""
+
+    @staticmethod
+    async def set_vapid_keys(private_key: str, public_key: str) -> None:
+        """Persist auto-generated VAPID key pair to app_settings."""
+        async with db.tx() as conn:
+            await conn.execute(
+                "UPDATE app_settings SET vapid_private_key = ?, vapid_public_key = ? WHERE id = 1",
+                (private_key, public_key),
+            )
+
+    @staticmethod
+    async def get_push_conversations() -> list[str]:
+        """Return the global list of push-enabled conversation state keys.
+
+        Internal-only column, not exposed via the AppSettings model.
+        """
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT push_conversations FROM app_settings WHERE id = 1"
+            ) as cursor:
+                row = await cursor.fetchone()
+        if row and row["push_conversations"]:
+            try:
+                return json.loads(row["push_conversations"])
+            except (json.JSONDecodeError, TypeError):
+                return []
+        return []
+
+    @staticmethod
+    async def set_push_conversations(conversations: list[str]) -> list[str]:
+        """Replace the global push-enabled conversation list."""
+        async with db.tx() as conn:
+            await conn.execute(
+                "UPDATE app_settings SET push_conversations = ? WHERE id = 1",
+                (json.dumps(conversations),),
+            )
+        return conversations
+
+    @staticmethod
+    async def toggle_push_conversation(key: str) -> list[str]:
+        """Add or remove a conversation state key from the global push list.
+
+        Atomic read-modify-write under a single ``db.tx()`` lock.
+        """
+        async with db.tx() as conn:
+            async with conn.execute(
+                "SELECT push_conversations FROM app_settings WHERE id = 1"
+            ) as cursor:
+                row = await cursor.fetchone()
+            current: list[str] = []
+            if row and row["push_conversations"]:
+                try:
+                    current = json.loads(row["push_conversations"])
+                except (json.JSONDecodeError, TypeError):
+                    current = []
+            if key in current:
+                current = [k for k in current if k != key]
+            else:
+                current.append(key)
+            await conn.execute(
+                "UPDATE app_settings SET push_conversations = ? WHERE id = 1",
+                (json.dumps(current),),
+            )
+            return current
 class StatisticsRepository:
     @staticmethod
     async def get_database_message_totals() -> dict[str, int]:
         """Return message totals needed by lightweight debug surfaces."""
-        cursor = await db.conn.execute(
-            """
-            SELECT
-                SUM(CASE WHEN type = 'PRIV' THEN 1 ELSE 0 END) AS total_dms,
-                SUM(CASE WHEN type = 'CHAN' THEN 1 ELSE 0 END) AS total_channel_messages,
-                SUM(CASE WHEN outgoing = 1 THEN 1 ELSE 0 END) AS total_outgoing
-            FROM messages
-            """
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT
+                    SUM(CASE WHEN type = 'PRIV' THEN 1 ELSE 0 END) AS total_dms,
+                    SUM(CASE WHEN type = 'CHAN' THEN 1 ELSE 0 END) AS total_channel_messages,
+                    SUM(CASE WHEN outgoing = 1 THEN 1 ELSE 0 END) AS total_outgoing
+                FROM messages
+                """
+            ) as cursor:
+                row = await cursor.fetchone()
         assert row is not None
         return {
             "total_dms": row["total_dms"] or 0,
@@ -222,18 +389,19 @@ class StatisticsRepository:
         """Get time-windowed counts for contacts/repeaters heard."""
         now = int(time.time())
         op = "!=" if exclude else "="
-        cursor = await db.conn.execute(
-            f"""
-            SELECT
-                SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_hour,
-                SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_24_hours,
-                SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_week
-            FROM contacts
-            WHERE type {op} ? AND last_seen IS NOT NULL
-            """,
-            (now - SECONDS_1H, now - SECONDS_24H, now - SECONDS_7D, contact_type),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                f"""
+                SELECT
+                    SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_hour,
+                    SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_24_hours,
+                    SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_week
+                FROM contacts
+                WHERE type {op} ? AND last_seen IS NOT NULL
+                """,
+                (now - SECONDS_1H, now - SECONDS_24H, now - SECONDS_7D, contact_type),
+            ) as cursor:
+                row = await cursor.fetchone()
         assert row is not None  # Aggregate query always returns a row
         return {
             "last_hour": row["last_hour"] or 0,
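The `SUM(CASE WHEN last_seen >= ? ...)` pattern in the hunk above computes all three time windows in a single table scan instead of three separate `COUNT(*)` queries. A runnable sqlite3 sketch of the same shape (toy table and timestamps, not the project's real schema):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE contacts (type INTEGER, last_seen INTEGER)")

now = int(time.time())
# one contact heard 10 minutes ago, one 2 hours ago, one 3 days ago
conn.executemany(
    "INSERT INTO contacts VALUES (?, ?)",
    [(1, now - 600), (1, now - 7200), (1, now - 3 * 86400)],
)

row = conn.execute(
    """
    SELECT
        SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_hour,
        SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_24_hours,
        SUM(CASE WHEN last_seen >= ? THEN 1 ELSE 0 END) AS last_week
    FROM contacts
    WHERE type != ? AND last_seen IS NOT NULL
    """,
    (now - 3600, now - 86400, now - 7 * 86400, 2),
).fetchone()

print(row["last_hour"], row["last_24_hours"], row["last_week"])  # 1 2 3
```

Each row contributes to every window it falls inside, so the counts are cumulative (hour ⊆ 24h ⊆ week).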
@@ -249,24 +417,25 @@ class StatisticsRepository:
         the old UPPER(...) join and aggregate per known channel directly.
         """
         now = int(time.time())
-        cursor = await db.conn.execute(
-            """
-            WITH known AS (
-                SELECT conversation_key, MAX(received_at) AS last_received_at
-                FROM messages
-                WHERE type = 'CHAN'
-                  AND conversation_key IN (SELECT key FROM channels)
-                GROUP BY conversation_key
-            )
-            SELECT
-                SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_hour,
-                SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_24_hours,
-                SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_week
-            FROM known
-            """,
-            (now - SECONDS_1H, now - SECONDS_24H, now - SECONDS_7D),
-        )
-        row = await cursor.fetchone()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                WITH known AS (
+                    SELECT conversation_key, MAX(received_at) AS last_received_at
+                    FROM messages
+                    WHERE type = 'CHAN'
+                      AND conversation_key IN (SELECT key FROM channels)
+                    GROUP BY conversation_key
+                )
+                SELECT
+                    SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_hour,
+                    SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_24_hours,
+                    SUM(CASE WHEN last_received_at >= ? THEN 1 ELSE 0 END) AS last_week
+                FROM known
+                """,
+                (now - SECONDS_1H, now - SECONDS_24H, now - SECONDS_7D),
+            ) as cursor:
+                row = await cursor.fetchone()
         assert row is not None
         return {
             "last_hour": row["last_hour"] or 0,
@@ -280,92 +449,105 @@ class StatisticsRepository:
         now = int(time.time())
         cutoff = now - SECONDS_72H
         # Bucket timestamps to the start of each hour
-        cursor = await db.conn.execute(
-            """
-            SELECT (timestamp / 3600) * 3600 AS hour_ts, COUNT(*) AS count
-            FROM raw_packets
-            WHERE timestamp >= ?
-            GROUP BY hour_ts
-            ORDER BY hour_ts
-            """,
-            (cutoff,),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                """
+                SELECT (timestamp / 3600) * 3600 AS hour_ts, COUNT(*) AS count
+                FROM raw_packets
+                WHERE timestamp >= ?
+                GROUP BY hour_ts
+                ORDER BY hour_ts
+                """,
+                (cutoff,),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return [{"timestamp": row["hour_ts"], "count": row["count"]} for row in rows]
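The `(timestamp / 3600) * 3600` expression relies on SQLite's integer division to floor each epoch timestamp to the top of its hour, which makes `GROUP BY` produce one bucket per hour. A small sqlite3 sketch with toy timestamps:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE raw_packets (timestamp INTEGER)")

# three packets inside hour 0 (0..3599 s), one inside hour 2 (7200..10799 s)
conn.executemany(
    "INSERT INTO raw_packets VALUES (?)", [(10,), (1800,), (3599,), (7201,)]
)

rows = conn.execute(
    """
    SELECT (timestamp / 3600) * 3600 AS hour_ts, COUNT(*) AS count
    FROM raw_packets
    WHERE timestamp >= ?
    GROUP BY hour_ts
    ORDER BY hour_ts
    """,
    (0,),
).fetchall()

print([(r["hour_ts"], r["count"]) for r in rows])  # [(0, 3), (7200, 1)]
```

Hours with no packets simply produce no row, so the frontend has to fill gaps itself if it wants a dense 72-hour series.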
 
     @staticmethod
     async def _path_hash_width_24h() -> dict[str, int | float]:
         """Count parsed raw packets from the last 24h by hop hash width."""
         now = int(time.time())
-        cursor = await db.conn.execute(
-            "SELECT data FROM raw_packets WHERE timestamp >= ?",
-            (now - SECONDS_24H,),
-        )
-        rows = await cursor.fetchall()
+        async with db.readonly() as conn:
+            async with conn.execute(
+                "SELECT data FROM raw_packets WHERE timestamp >= ?",
+                (now - SECONDS_24H,),
+            ) as cursor:
+                rows = await cursor.fetchall()
         return bucket_path_hash_widths(rows)
 
     @staticmethod
     async def get_all() -> dict:
-        """Aggregate all statistics from existing tables."""
+        """Aggregate all statistics from existing tables.
+
+        Each helper acquires its own lock; there's no requirement that the
+        whole snapshot be atomic. If we ever wanted a consistent snapshot
+        we'd batch all queries into a single ``db.readonly()`` and use
+        ``_in_conn`` helpers, but statistics are intentionally approximate.
+        """
         now = int(time.time())
 
-        # Top 5 busiest channels in last 24h
-        cursor = await db.conn.execute(
-            """
-            SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
-                   COUNT(*) AS message_count
-            FROM messages m
-            LEFT JOIN channels c ON m.conversation_key = c.key
-            WHERE m.type = 'CHAN' AND m.received_at >= ?
-            GROUP BY m.conversation_key
-            ORDER BY COUNT(*) DESC
-            LIMIT 5
-            """,
-            (now - SECONDS_24H,),
-        )
-        rows = await cursor.fetchall()
-        busiest_channels_24h = [
-            {
-                "channel_key": row["conversation_key"],
-                "channel_name": row["channel_name"],
-                "message_count": row["message_count"],
-            }
-            for row in rows
-        ]
+        async with db.readonly() as conn:
+            # Top 5 busiest channels in last 24h
+            async with conn.execute(
+                """
+                SELECT m.conversation_key, COALESCE(c.name, m.conversation_key) AS channel_name,
+                       COUNT(*) AS message_count
+                FROM messages m
+                LEFT JOIN channels c ON m.conversation_key = c.key
+                WHERE m.type = 'CHAN' AND m.received_at >= ?
+                GROUP BY m.conversation_key
+                ORDER BY COUNT(*) DESC
+                LIMIT 5
+                """,
+                (now - SECONDS_24H,),
+            ) as cursor:
+                rows = await cursor.fetchall()
+            busiest_channels_24h = [
+                {
+                    "channel_key": row["conversation_key"],
+                    "channel_name": row["channel_name"],
+                    "message_count": row["message_count"],
+                }
+                for row in rows
+            ]
 
-        # Entity counts
-        cursor = await db.conn.execute("SELECT COUNT(*) AS cnt FROM contacts WHERE type != 2")
-        row = await cursor.fetchone()
-        assert row is not None
-        contact_count: int = row["cnt"]
+            # Entity counts
+            async with conn.execute(
+                "SELECT COUNT(*) AS cnt FROM contacts WHERE type != 2"
+            ) as cursor:
+                row = await cursor.fetchone()
+                assert row is not None
+                contact_count: int = row["cnt"]
 
-        cursor = await db.conn.execute("SELECT COUNT(*) AS cnt FROM contacts WHERE type = 2")
-        row = await cursor.fetchone()
-        assert row is not None
-        repeater_count: int = row["cnt"]
+            async with conn.execute(
+                "SELECT COUNT(*) AS cnt FROM contacts WHERE type = 2"
+            ) as cursor:
+                row = await cursor.fetchone()
+                assert row is not None
+                repeater_count: int = row["cnt"]
 
-        cursor = await db.conn.execute("SELECT COUNT(*) AS cnt FROM channels")
-        row = await cursor.fetchone()
-        assert row is not None
-        channel_count: int = row["cnt"]
+            async with conn.execute("SELECT COUNT(*) AS cnt FROM channels") as cursor:
+                row = await cursor.fetchone()
+                assert row is not None
+                channel_count: int = row["cnt"]
 
-        # Packet split
-        cursor = await db.conn.execute(
-            """
-            SELECT COUNT(*) AS total,
-                   SUM(CASE WHEN message_id IS NOT NULL THEN 1 ELSE 0 END) AS decrypted
-            FROM raw_packets
-            """
-        )
-        pkt_row = await cursor.fetchone()
-        assert pkt_row is not None
-        total_packets = pkt_row["total"] or 0
-        decrypted_packets = pkt_row["decrypted"] or 0
-        undecrypted_packets = total_packets - decrypted_packets
+            # Packet split
+            async with conn.execute(
+                """
+                SELECT COUNT(*) AS total,
+                       SUM(CASE WHEN message_id IS NOT NULL THEN 1 ELSE 0 END) AS decrypted
+                FROM raw_packets
+                """
+            ) as cursor:
+                pkt_row = await cursor.fetchone()
+                assert pkt_row is not None
+                total_packets = pkt_row["total"] or 0
+                decrypted_packets = pkt_row["decrypted"] or 0
+                undecrypted_packets = total_packets - decrypted_packets
 
+        # These each acquire their own lock. The snapshot isn't atomic across
+        # them — fine for stats, which are approximate by nature.
         message_totals = await StatisticsRepository.get_database_message_totals()
 
-        # Activity windows
         contacts_heard = await StatisticsRepository._activity_counts(contact_type=2, exclude=True)
         repeaters_heard = await StatisticsRepository._activity_counts(contact_type=2)
         known_channels_active = await StatisticsRepository._known_channels_active()
@@ -0,0 +1,164 @@
+"""Web Push subscription management endpoints."""
+
+import asyncio
+import json
+import logging
+
+from fastapi import APIRouter, HTTPException
+from pydantic import BaseModel, Field
+from pywebpush import WebPushException
+
+from app.push.send import send_push
+from app.push.vapid import get_vapid_private_key, get_vapid_public_key
+from app.repository.push_subscriptions import PushSubscriptionRepository
+from app.repository.settings import AppSettingsRepository
+
+logger = logging.getLogger(__name__)
+
+router = APIRouter(prefix="/push", tags=["push"])
+
+
+# ── Request/response models ─────────────────────────────────────────────
+
+
+class VapidPublicKeyResponse(BaseModel):
+    public_key: str
+
+
+class PushSubscribeRequest(BaseModel):
+    endpoint: str = Field(min_length=1)
+    p256dh: str = Field(min_length=1)
+    auth: str = Field(min_length=1)
+    label: str = ""
+
+
+class PushSubscriptionUpdate(BaseModel):
+    label: str | None = None
+
+
+class PushConversationToggle(BaseModel):
+    key: str = Field(min_length=1)
+
+
+# ── Endpoints ────────────────────────────────────────────────────────────
+
+
+@router.get("/vapid-public-key", response_model=VapidPublicKeyResponse)
+async def vapid_public_key() -> VapidPublicKeyResponse:
+    """Return the VAPID public key for browser PushManager.subscribe()."""
+    key = get_vapid_public_key()
+    if not key:
+        raise HTTPException(status_code=503, detail="VAPID keys not initialized")
+    return VapidPublicKeyResponse(public_key=key)
+
+
+@router.post("/subscribe")
+async def subscribe(body: PushSubscribeRequest) -> dict:
+    """Register or update a push subscription (device). Upserts by endpoint."""
+    sub = await PushSubscriptionRepository.create(
+        endpoint=body.endpoint,
+        p256dh=body.p256dh,
+        auth=body.auth,
+        label=body.label,
+    )
+    return sub
+
+
+@router.get("/subscriptions")
+async def list_subscriptions() -> list[dict]:
+    """List all push subscriptions (devices)."""
+    return await PushSubscriptionRepository.get_all()
+
+
+@router.patch("/subscriptions/{subscription_id}")
+async def update_subscription(subscription_id: str, body: PushSubscriptionUpdate) -> dict:
+    """Update a subscription's label."""
+    existing = await PushSubscriptionRepository.get(subscription_id)
+    if not existing:
+        raise HTTPException(status_code=404, detail="Subscription not found")
+
+    updates = {}
+    if body.label is not None:
+        updates["label"] = body.label
+
+    result = await PushSubscriptionRepository.update(subscription_id, **updates)
+    return result or existing
+
+
+@router.delete("/subscriptions/{subscription_id}")
+async def unsubscribe(subscription_id: str) -> dict:
+    """Delete a push subscription (device)."""
+    deleted = await PushSubscriptionRepository.delete(subscription_id)
+    if not deleted:
+        raise HTTPException(status_code=404, detail="Subscription not found")
+    return {"deleted": True}
+
+
+@router.post("/subscriptions/{subscription_id}/test")
+async def test_push(subscription_id: str) -> dict:
+    """Send a test notification to a subscription."""
+    sub = await PushSubscriptionRepository.get(subscription_id)
+    if not sub:
+        raise HTTPException(status_code=404, detail="Subscription not found")
+
+    vapid_key = get_vapid_private_key()
+    if not vapid_key:
+        raise HTTPException(status_code=503, detail="VAPID keys not initialized")
+
+    payload = json.dumps(
+        {
+            "title": "RemoteTerm Test",
+            "body": "Push notifications are working!",
+            "tag": "meshcore-test",
+            "url_hash": "",
+        }
+    )
+
+    try:
+        async with asyncio.timeout(15):
+            await send_push(
+                subscription_info={
+                    "endpoint": sub["endpoint"],
+                    "keys": {"p256dh": sub["p256dh"], "auth": sub["auth"]},
+                },
+                payload=payload,
+                vapid_private_key=vapid_key,
+                vapid_claims={"sub": "mailto:noreply@meshcore.local"},
+            )
+        return {"status": "sent"}
+    except TimeoutError:
+        raise HTTPException(status_code=504, detail="Push delivery timed out") from None
+    except WebPushException as e:
+        status_code = getattr(getattr(e, "response", None), "status_code", 0)
+        if status_code in (403, 404, 410):
+            logger.info(
+                "Test push: subscription stale (HTTP %d), removing %s",
+                status_code,
+                subscription_id,
+            )
+            await PushSubscriptionRepository.delete(subscription_id)
+            raise HTTPException(
+                status_code=410,
+                detail="Subscription is stale (VAPID key mismatch or expired). "
+                "Re-enable push from a conversation header.",
+            ) from None
+        logger.warning("Test push failed: %s", e)
+        raise HTTPException(status_code=502, detail=f"Push delivery failed: {e}") from None
+    except Exception as e:
+        logger.warning("Test push failed: %s", e)
+        raise HTTPException(status_code=502, detail=f"Push delivery failed: {e}") from None
+
+
+# ── Global push conversation management ──────────────────────────────────
+
+
+@router.get("/conversations")
+async def get_push_conversations() -> list[str]:
+    """Return the global list of push-enabled conversation state keys."""
+    return await AppSettingsRepository.get_push_conversations()
+
+
+@router.post("/conversations/toggle")
+async def toggle_push_conversation(body: PushConversationToggle) -> list[str]:
+    """Add or remove a conversation from the global push list."""
+    return await AppSettingsRepository.toggle_push_conversation(body.key)
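The test endpoint above treats HTTP 403/404/410 from the push service as an unrecoverable endpoint (gone, expired, or signed with the wrong VAPID key) and prunes the subscription, while any other failure is reported but leaves the subscription in place. That policy can be isolated as a pure function (a sketch for illustration; the route inlines the same checks):

```python
def classify_push_failure(status_code: int) -> str:
    """Map a push-service HTTP status to an action on the stored subscription."""
    # 404/410: the endpoint no longer exists; 403: VAPID key mismatch.
    # All three mean future sends can never succeed, so drop the row.
    if status_code in (403, 404, 410):
        return "delete"
    # Anything else (e.g. 429 rate limit, 5xx) may be transient: keep it.
    return "keep"

print(classify_push_failure(410))  # delete
print(classify_push_failure(500))  # keep
```

Pruning on 410 matters in practice: browsers rotate endpoints, and without cleanup the subscription table accumulates dead rows that fail on every dispatch.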
@@ -8,6 +8,13 @@ from pydantic import BaseModel, Field
 from app.models import CONTACT_TYPE_REPEATER, AppSettings
 from app.region_scope import normalize_region_scope
 from app.repository import AppSettingsRepository, ChannelRepository, ContactRepository
+from app.telemetry_interval import (
+    DEFAULT_TELEMETRY_INTERVAL_HOURS,
+    TELEMETRY_INTERVAL_OPTIONS_HOURS,
+    clamp_telemetry_interval,
+    legal_interval_options,
+    next_run_timestamp_utc,
+)
 
 logger = logging.getLogger(__name__)
 router = APIRouter(prefix="/settings", tags=["settings"])
@@ -57,6 +64,15 @@ class AppSettingsUpdate(BaseModel):
         default=None,
         description="Auto-resend channel messages once if no echo heard within 2 seconds",
     )
+    telemetry_interval_hours: int | None = Field(
+        default=None,
+        description=(
+            "Preferred tracked-repeater telemetry interval in hours. "
+            f"Must be one of {list(TELEMETRY_INTERVAL_OPTIONS_HOURS)}. "
+            "Effective interval is clamped up to the shortest legal value "
+            "based on the current tracked-repeater count."
+        ),
+    )
 
 
 class BlockKeyRequest(BaseModel):
@@ -82,6 +98,29 @@ class TrackedTelemetryRequest(BaseModel):
     public_key: str = Field(description="Public key of the repeater to toggle tracking")
 
 
+class TelemetrySchedule(BaseModel):
+    """Surface of telemetry scheduling derivations for the UI.
+
+    ``preferred_hours`` is the stored user choice. ``effective_hours`` is the
+    value the scheduler actually uses (preferred, clamped up to the shortest
+    legal interval given the current tracked-repeater count). ``options``
+    lists the subset of the menu that is legal at the current count; the UI
+    should hide anything not in this list. ``next_run_at`` is the Unix
+    timestamp (seconds, UTC) of the next scheduled cycle, or ``None`` when
+    no repeaters are tracked (nothing to schedule).
+    """
+
+    preferred_hours: int = Field(description="User's saved telemetry interval preference")
+    effective_hours: int = Field(description="Scheduler's clamped interval")
+    options: list[int] = Field(description="Legal interval choices at the current count")
+    tracked_count: int = Field(description="Number of repeaters currently tracked")
+    max_tracked: int = Field(description="Maximum number of repeaters that can be tracked")
+    next_run_at: int | None = Field(
+        default=None,
+        description="Unix timestamp (UTC seconds) of the next scheduled cycle",
+    )
+
+
 class TrackedTelemetryResponse(BaseModel):
     tracked_telemetry_repeaters: list[str] = Field(
         description="Current list of tracked repeater public keys"
@@ -89,6 +128,24 @@ class TrackedTelemetryResponse(BaseModel):
     names: dict[str, str] = Field(
         description="Map of public key to display name for tracked repeaters"
     )
+    schedule: TelemetrySchedule = Field(description="Current scheduling state")
+
+
+def _build_schedule(tracked_count: int, preferred_hours: int | None) -> TelemetrySchedule:
+    pref = (
+        preferred_hours
+        if preferred_hours in TELEMETRY_INTERVAL_OPTIONS_HOURS
+        else DEFAULT_TELEMETRY_INTERVAL_HOURS
+    )
+    effective = clamp_telemetry_interval(pref, tracked_count)
+    return TelemetrySchedule(
+        preferred_hours=pref,
+        effective_hours=effective,
+        options=legal_interval_options(tracked_count),
+        tracked_count=tracked_count,
+        max_tracked=MAX_TRACKED_TELEMETRY_REPEATERS,
+        next_run_at=next_run_timestamp_utc(effective) if tracked_count > 0 else None,
+    )
+
+
 @router.get("", response_model=AppSettings)
@@ -136,6 +193,20 @@ async def update_settings(update: AppSettingsUpdate) -> AppSettings:
     if update.auto_resend_channel is not None:
         kwargs["auto_resend_channel"] = update.auto_resend_channel
 
+    # Telemetry interval preference. Invalid values fall back to default
+    # rather than 400-ing so a stale client can't brick settings saves.
+    if update.telemetry_interval_hours is not None:
+        raw_interval = update.telemetry_interval_hours
+        if raw_interval not in TELEMETRY_INTERVAL_OPTIONS_HOURS:
+            logger.warning(
+                "telemetry_interval_hours=%r is not in the menu; defaulting to %d",
+                raw_interval,
+                DEFAULT_TELEMETRY_INTERVAL_HOURS,
+            )
+            raw_interval = DEFAULT_TELEMETRY_INTERVAL_HOURS
+        logger.info("Updating telemetry_interval_hours to %d", raw_interval)
+        kwargs["telemetry_interval_hours"] = raw_interval
+
     # Flood scope
     flood_scope_changed = False
     if update.flood_scope is not None:
@@ -229,6 +300,7 @@ async def toggle_tracked_telemetry(request: TrackedTelemetryRequest) -> TrackedT
     return TrackedTelemetryResponse(
         tracked_telemetry_repeaters=new_list,
         names=await _resolve_names(new_list),
+        schedule=_build_schedule(len(new_list), settings.telemetry_interval_hours),
     )
 
     # Validate it's a repeater
@@ -255,4 +327,20 @@ async def toggle_tracked_telemetry(request: TrackedTelemetryRequest) -> TrackedT
     return TrackedTelemetryResponse(
         tracked_telemetry_repeaters=new_list,
         names=await _resolve_names(new_list),
+        schedule=_build_schedule(len(new_list), settings.telemetry_interval_hours),
+    )
+
+
+@router.get("/tracked-telemetry/schedule", response_model=TelemetrySchedule)
+async def get_telemetry_schedule() -> TelemetrySchedule:
+    """Return the current telemetry scheduling derivation.
+
+    The UI uses this to render the interval dropdown (legal options),
+    surface saved-vs-effective when they differ, and show the next-run-at
+    timestamp so users know when the next cycle will fire.
+    """
+    app_settings = await AppSettingsRepository.get()
+    return _build_schedule(
+        len(app_settings.tracked_telemetry_repeaters),
+        app_settings.telemetry_interval_hours,
     )
|||||||
@@ -252,6 +252,11 @@ async def _store_direct_message(
|
|||||||
|
|
||||||
if update_last_contacted_key:
|
if update_last_contacted_key:
|
||||||
await contact_repository.update_last_contacted(update_last_contacted_key, received_at)
|
await contact_repository.update_last_contacted(update_last_contacted_key, received_at)
|
||||||
|
# Incoming DMs are direct RF evidence that this contact transmitted;
|
||||||
|
# outgoing DMs are our own send and must not bump the contact's
|
||||||
|
# last_seen.
|
||||||
|
if not outgoing:
|
||||||
|
await contact_repository.touch_last_seen(update_last_contacted_key, received_at)
|
||||||
|
|
||||||
return message
|
return message
|
||||||
|
|
||||||
|
|||||||
@@ -0,0 +1,88 @@
+"""Shared math for the tracked-repeater telemetry scheduler.
+
+The app enforces a ceiling of 24 repeater status checks per 24 hours across
+all tracked repeaters. With N repeaters tracked, the shortest legal interval
+is ``24 // floor(24 / N)`` hours. Longer intervals (``12`` or ``24``) are
+always legal at any N and are offered as user choices on top of the derived
+shortest-legal value.
+
+The user picks an interval via settings. The scheduler uses
+``clamp_telemetry_interval`` to push that pick up to the shortest legal
+interval if the user has added repeaters that invalidated their choice.
+The stored preference is *not* mutated on clamp — users get their pick back
+if they later drop repeaters.
+"""
+
+from datetime import UTC, datetime, timedelta
+
+# Daily check budget: total number of repeater status checks we allow
+# across all tracked repeaters per 24-hour window.
+DAILY_CHECK_CEILING = 24
+
+# Menu of interval values shown to users. The derivation-based options
+# (1..8) are filtered per current repeater count via
+# ``legal_interval_options``; 12 and 24 are always legal.
+TELEMETRY_INTERVAL_OPTIONS_HOURS: tuple[int, ...] = (1, 2, 3, 4, 6, 8, 12, 24)
+
+DEFAULT_TELEMETRY_INTERVAL_HOURS = 8
+
+
+def shortest_legal_interval_hours(n_tracked: int) -> int:
+    """Return the shortest interval (hours) that keeps under the daily ceiling.
+
+    With ``N`` repeaters, each full cycle costs ``N`` checks. We're capped at
+    ``DAILY_CHECK_CEILING`` checks/day, so the maximum cycles/day is
+    ``floor(24 / N)`` and the resulting interval is ``24 // cycles_per_day``.
+    For ``N == 0`` we return the default so the math still terminates, though
+    the scheduler skips empty-tracked cycles regardless.
+    """
+    if n_tracked <= 0:
+        return DEFAULT_TELEMETRY_INTERVAL_HOURS
+    cycles_per_day = DAILY_CHECK_CEILING // n_tracked
+    if cycles_per_day <= 0:
+        # Would exceed ceiling even at 24h cadence; fall back to 24h.
+        return 24
+    return 24 // cycles_per_day
+
+
+def clamp_telemetry_interval(preferred_hours: int, n_tracked: int) -> int:
+    """Return the effective interval: max of user preference and shortest legal.
+
+    Unrecognized values fall back to the default.
+    """
+    if preferred_hours not in TELEMETRY_INTERVAL_OPTIONS_HOURS:
+        preferred_hours = DEFAULT_TELEMETRY_INTERVAL_HOURS
+    shortest = shortest_legal_interval_hours(n_tracked)
+    return max(preferred_hours, shortest)
+
+
+def legal_interval_options(n_tracked: int) -> list[int]:
+    """Return the subset of the interval menu that is legal for a given N."""
+    shortest = shortest_legal_interval_hours(n_tracked)
+    return [h for h in TELEMETRY_INTERVAL_OPTIONS_HOURS if h >= shortest]
+
+
+def next_run_timestamp_utc(effective_hours: int, now: datetime | None = None) -> int:
+    """Return Unix timestamp for the next UTC top-of-hour where
+    ``hour % effective_hours == 0``.
+
+    Returns the next matching hour strictly in the future (never ``now``
+    itself, even if ``now`` lies exactly on a matching boundary).
+    """
+    if effective_hours <= 0:
+        effective_hours = DEFAULT_TELEMETRY_INTERVAL_HOURS
+    if now is None:
+        now = datetime.now(UTC)
+    else:
+        now = now.astimezone(UTC)
+
+    # Round down to the top of the current hour, then always advance at
+    # least one hour (so "now" never matches) and keep stepping until the
+    # hour satisfies the modulo.
+    candidate = now.replace(minute=0, second=0, microsecond=0)
+    candidate = candidate + timedelta(hours=1)
+    while candidate.hour % effective_hours != 0:
+        candidate = candidate + timedelta(hours=1)
+    return int(candidate.timestamp())
@@ -108,6 +108,10 @@ def broadcast_event(event_type: str, data: dict, *, realtime: bool = True) -> No

     if event_type == "message":
         asyncio.create_task(fanout_manager.broadcast_message(data))
+
+        from app.push.manager import push_manager
+
+        asyncio.create_task(push_manager.dispatch_message(data))
     elif event_type == "raw_packet":
         asyncio.create_task(fanout_manager.broadcast_raw(data))
     elif event_type == "contact":
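The hunk above adds a second fire-and-forget consumer beside the fanout bus. A toy sketch of that dispatch pattern, with hypothetical recorder objects standing in for the real `fanout_manager` / `push_manager`:

```python
import asyncio

class Recorder:
    """Hypothetical stand-in for fanout_manager / push_manager."""
    def __init__(self) -> None:
        self.calls: list[dict] = []

    async def dispatch(self, data: dict) -> None:
        self.calls.append(data)

fanout = Recorder()
push = Recorder()

def broadcast_event(event_type: str, data: dict) -> None:
    # create_task() schedules the handlers without blocking the caller,
    # so a slow push provider can never stall the broadcast path.
    if event_type == "message":
        asyncio.create_task(fanout.dispatch(data))
        asyncio.create_task(push.dispatch(data))
    elif event_type == "raw_packet":
        asyncio.create_task(fanout.dispatch(data))

async def main() -> None:
    broadcast_event("message", {"text": "hi"})
    broadcast_event("raw_packet", {"hex": "00"})
    await asyncio.sleep(0)  # yield once so the queued tasks run

asyncio.run(main())
print(len(fanout.calls), len(push.calls))  # 2 1
```

The design choice mirrored here: the message path fans out to every consumer concurrently, and each consumer owns its own error handling.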
@@ -57,6 +57,7 @@ frontend/src/
 │ ├── useConversationRouter.ts # URL hash → active conversation routing
 │ ├── useContactsAndChannels.ts # Contact/channel loading, creation, deletion
 │ ├── useBrowserNotifications.ts # Per-conversation browser notification preferences + dispatch
+│ ├── usePushSubscription.ts # Web Push subscription lifecycle, per-conversation filters
 │ ├── useFaviconBadge.ts # Browser tab unread badge state
 │ ├── useRawPacketStatsSession.ts # Session-scoped packet-feed stats history
 │ └── useRememberedServerPassword.ts # Browser-local repeater/room password persistence
@@ -429,6 +430,17 @@ The `SearchView` component (`components/SearchView.tsx`) provides full-text sear
 - **Bidirectional pagination**: After jumping mid-history, `hasNewerMessages` enables forward pagination via `fetchNewerMessages`. The scroll-to-bottom button calls `jumpToBottom` (re-fetches latest page) instead of just scrolling.
 - **WS message suppression**: When `hasNewerMessages` is true, incoming WS messages for the active conversation are not added to the message list (the user is viewing historical context, not the latest page).
+
+## Web Push Notifications
+
+Web Push allows notifications even when the browser tab is closed. Requires HTTPS (self-signed OK).
+
+- **Service worker**: `frontend/public/sw.js` handles `push` events (show notification) and `notificationclick` (focus/open tab, navigate via `url_hash`). Registered in `main.tsx` on secure contexts only.
+- **`usePushSubscription` hook**: manages the full subscription lifecycle — subscribe (register SW → `PushManager.subscribe()` → POST to backend), unsubscribe, global push-conversation toggles, device listing, and deletion.
+- **ChatHeader integration**: `BellRing` icon (amber when active) appears next to the existing desktop notification `Bell` on secure contexts. First click subscribes the browser and enables push for that conversation; subsequent clicks toggle the conversation on/off.
+- **Settings > Local**: `PushDeviceManagement` component shows subscription status, lists all registered devices with test/delete buttons. Uses `usePushSubscription` hook directly.
+  - Auto-generates device labels from User-Agent (e.g., "Chrome on macOS").
+- `PushSubscriptionInfo` type in `types.ts`; API methods in `api.ts`.

 ## Styling

 UI styling is mostly utility-class driven (Tailwind-style classes in JSX) plus shared globals in `index.css` and `styles.css`.
+2 -4
@@ -15,10 +15,8 @@
     <link rel="apple-touch-icon" sizes="180x180" href="./apple-touch-icon.png" />
     <link rel="manifest" href="./site.webmanifest" crossorigin="use-credentials" />
     <script>
-      // Register minimal service worker for PWA installability.
-      if ('serviceWorker' in navigator) {
-        navigator.serviceWorker.register('./sw.js').catch(function() {});
-      }
+      // Service worker registration moved to main.tsx (requires isSecureContext
+      // for Web Push). Do not duplicate here.

       // Start critical data fetches before React/Vite JS loads.
       // Must be in <head> BEFORE the module script so the browser queues these
Generated
+2 -2
@@ -1,12 +1,12 @@
 {
   "name": "remoteterm-meshcore-frontend",
-  "version": "3.11.0",
+  "version": "3.11.3",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "remoteterm-meshcore-frontend",
-      "version": "3.11.0",
+      "version": "3.11.3",
       "dependencies": {
         "@codemirror/lang-python": "^6.2.1",
         "@codemirror/theme-one-dark": "^6.1.3",
@@ -1,7 +1,7 @@
 {
   "name": "remoteterm-meshcore-frontend",
   "private": true,
-  "version": "3.11.2",
+  "version": "3.11.3",
   "type": "module",
   "scripts": {
     "dev": "vite",
Binary file not shown.
After Width: | Height: | Size: 122 KiB
Binary file not shown.
After Width: | Height: | Size: 426 KiB
Binary file not shown.
After Width: | Height: | Size: 109 KiB
+53 -5
@@ -1,12 +1,60 @@
-// Minimal service worker required for PWA installability.
-// No caching — this app is network-dependent. All fetches pass through.
+/* Service worker for PWA installability and Web Push notifications. */

-self.addEventListener("install", function () {
+self.addEventListener("install", () => {
   self.skipWaiting();
 });

-self.addEventListener("activate", function (event) {
+self.addEventListener("activate", (event) => {
   event.waitUntil(self.clients.claim());
 });

-self.addEventListener("fetch", function () {});
+// No-op fetch handler — required for PWA installability criteria.
+// We don't cache anything; the app always fetches from the network.
+self.addEventListener("fetch", () => {});
+
+self.addEventListener("push", (event) => {
+  let data = {};
+  try {
+    data = event.data ? event.data.json() : {};
+  } catch {
+    data = { title: "New message", body: event.data?.text() || "" };
+  }
+
+  const title = data.title || "New message";
+  const options = {
+    body: data.body || "",
+    icon: "./favicon-256x256.png",
+    badge: "./favicon-96x96.png",
+    tag: data.tag || "meshcore-push",
+    data: { url_hash: data.url_hash || "" },
+  };
+
+  event.waitUntil(self.registration.showNotification(title, options));
+});
+
+self.addEventListener("notificationclick", (event) => {
+  event.notification.close();
+  const urlHash = event.notification.data?.url_hash || "";
+  // Use the SW registration scope as the base URL so subpath deployments
+  // (e.g. archworks.co/meshcore/) navigate correctly.
+  const base = self.registration.scope;
+
+  event.waitUntil(
+    clients
+      .matchAll({ type: "window", includeUncontrolled: true })
+      .then((windowClients) => {
+        // Focus an existing tab if one is open
+        for (const client of windowClients) {
+          if (client.url.startsWith(base)) {
+            client.focus();
+            if (urlHash) {
+              client.navigate(base + urlHash);
+            }
+            return;
+          }
+        }
+        // Otherwise open a new tab
+        return clients.openWindow(base + (urlHash || ""));
+      })
+  );
+});
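The `push` handler in sw.js parses a small JSON payload. A backend sending side would therefore emit something shaped like this sketch — the field names (`title`, `body`, `tag`, `url_hash`) match what the handler reads, while the key format used inside `tag`/`url_hash` is an illustrative assumption:

```python
import json

def build_push_payload(title: str, body: str, conversation_key: str) -> bytes:
    """Encode the fields sw.js reads: title/body for the notification,
    tag to collapse repeat alerts, url_hash for notificationclick deep-links."""
    return json.dumps({
        "title": title,
        "body": body,
        "tag": f"meshcore-push-{conversation_key}",   # assumed tag scheme
        "url_hash": f"#{conversation_key}",           # assumed hash scheme
    }).encode("utf-8")

payload = json.loads(build_push_payload("Alice", "hello", "contact:ab12"))
print(payload["tag"])       # meshcore-push-contact:ab12
print(payload["url_hash"])  # #contact:ab12
```

Because the handler falls back to `event.data.text()` on a JSON parse failure, malformed payloads still surface as a generic "New message" notification rather than being dropped.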
@@ -22,6 +22,7 @@ import { toast } from './components/ui/sonner';
 import { AppShell } from './components/AppShell';
 import type { MessageInputHandle } from './components/MessageInput';
 import { DistanceUnitProvider } from './contexts/DistanceUnitContext';
+import { usePush } from './contexts/PushSubscriptionContext';
 import { messageContainsMention } from './utils/messageParser';
 import { getStateKey } from './utils/conversationState';
 import type { BulkCreateHashtagChannelsResult, Conversation, Message, RawPacket } from './types';
@@ -99,6 +100,7 @@ export function App() {
     toggleConversationNotifications,
     notifyIncomingMessage,
   } = useBrowserNotifications();
+  const pushSubscription = usePush();
   const { rawPacketStatsSession, recordRawPacketObservation } = useRawPacketStatsSession();
   const {
     showNewMessage,
@@ -588,6 +590,7 @@ export function App() {
       onDeleteChannel: handleDeleteChannel,
       onSetChannelFloodScopeOverride: handleSetChannelFloodScopeOverride,
       onSetChannelPathHashModeOverride: handleSetChannelPathHashModeOverride,
+      onSelectConversation: handleSelectConversationWithTargetReset,
       onOpenContactInfo: handleOpenContactInfo,
       onOpenChannelInfo: handleOpenChannelInfo,
       onSenderClick: handleSenderClick,
@@ -614,6 +617,36 @@ export function App() {
           );
         }
       },
+      pushSupported: pushSubscription.isSupported,
+      pushSubscribed: pushSubscription.isSubscribed,
+      pushEnabledForConversation:
+        activeConversation?.type === 'contact' || activeConversation?.type === 'channel'
+          ? pushSubscription.isConversationPushEnabled(
+              getStateKey(activeConversation.type, activeConversation.id)
+            )
+          : false,
+      onTogglePush: async () => {
+        if (
+          !activeConversation ||
+          (activeConversation.type !== 'contact' && activeConversation.type !== 'channel')
+        )
+          return;
+        const key = getStateKey(activeConversation.type, activeConversation.id);
+        const pushEnabled = pushSubscription.isConversationPushEnabled(key);
+
+        if (!pushEnabled && !pushSubscription.isSubscribed) {
+          const subscriptionId = await pushSubscription.subscribe();
+          if (!subscriptionId) {
+            return;
+          }
+        }
+
+        await pushSubscription.toggleConversation(key);
+      },
+      onOpenPushSettings: () => {
+        setSettingsSection('local');
+        if (!showSettings) handleToggleSettingsView();
+      },
       trackedTelemetryRepeaters: appSettings?.tracked_telemetry_repeaters ?? [],
       onToggleTrackedTelemetry: handleToggleTrackedTelemetry,
       repeaterAutoLoginKey,
@@ -647,6 +680,7 @@ export function App() {
       onToggleBlockedKey: handleBlockKey,
       onToggleBlockedName: handleBlockName,
       contacts,
+      channels,
       onBulkDeleteContacts: (deletedKeys: string[]) => {
         const keySet = new Set(deletedKeys.map((k) => k.toLowerCase()));
         setContacts((prev) => prev.filter((c) => !keySet.has(c.public_key.toLowerCase())));
@@ -22,6 +22,7 @@ import type {
   RadioTraceResponse,
   RadioDiscoveryTarget,
   PathDiscoveryResponse,
+  PushSubscriptionInfo,
   ResendChannelMessageResponse,
   RepeaterAclResponse,
   RepeaterAdvertIntervalsResponse,
@@ -33,6 +34,7 @@ import type {
   RepeaterRadioSettingsResponse,
   RepeaterStatusResponse,
   TelemetryHistoryEntry,
+  TelemetrySchedule,
   TrackedTelemetryResponse,
   StatisticsResponse,
   TraceResponse,
@@ -332,6 +334,8 @@ export const api = {
       body: JSON.stringify({ public_key: publicKey }),
     }),

+  getTelemetrySchedule: () => fetchJson<TelemetrySchedule>('/settings/tracked-telemetry/schedule'),
+
   // Favorites
   toggleFavorite: (type: 'channel' | 'contact', id: string) =>
     fetchJson<{ type: string; id: string; favorite: boolean }>('/settings/favorites/toggle', {
@@ -438,4 +442,28 @@ export const api = {
     fetchJson<RepeaterLppTelemetryResponse>(`/contacts/${publicKey}/room/lpp-telemetry`, {
       method: 'POST',
     }),
+
+  // Push Notifications
+  getVapidPublicKey: () => fetchJson<{ public_key: string }>('/push/vapid-public-key'),
+  pushSubscribe: (subscription: {
+    endpoint: string;
+    p256dh: string;
+    auth: string;
+    label?: string;
+  }) =>
+    fetchJson<PushSubscriptionInfo>('/push/subscribe', {
+      method: 'POST',
+      body: JSON.stringify(subscription),
+    }),
+  getPushSubscriptions: () => fetchJson<PushSubscriptionInfo[]>('/push/subscriptions'),
+  deletePushSubscription: (id: string) =>
+    fetchJson<{ deleted: boolean }>(`/push/subscriptions/${id}`, { method: 'DELETE' }),
+  testPushSubscription: (id: string) =>
+    fetchJson<{ status: string }>(`/push/subscriptions/${id}/test`, { method: 'POST' }),
+  getPushConversations: () => fetchJson<string[]>('/push/conversations'),
+  togglePushConversation: (key: string) =>
+    fetchJson<string[]>('/push/conversations/toggle', {
+      method: 'POST',
+      body: JSON.stringify({ key }),
+    }),
 };
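The `togglePushConversation` endpoint returns the full list of push-enabled conversation keys, so the client can replace its local state wholesale. A minimal in-memory sketch of the semantics the frontend assumes (the server-side storage details are not shown in this diff):

```python
enabled: set[str] = set()

def toggle_push_conversation(key: str) -> list[str]:
    # Add the key if absent, remove it if present; echo back the full
    # enabled set so the caller never has to diff individual toggles.
    if key in enabled:
        enabled.remove(key)
    else:
        enabled.add(key)
    return sorted(enabled)

print(toggle_push_conversation("contact:ab12"))  # ['contact:ab12']
print(toggle_push_conversation("channel:0"))     # ['channel:0', 'contact:ab12']
print(toggle_push_conversation("contact:ab12"))  # ['channel:0']
```

Returning the whole set (rather than a per-key boolean) keeps multiple open tabs and devices trivially consistent after any toggle.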
@@ -1,4 +1,4 @@
-import { useEffect, useState } from 'react';
+import { useEffect, useRef, useState } from 'react';
 import { Bell, ChevronsLeftRight, Globe2, Info, Route, Star, Trash2 } from 'lucide-react';
 import { toast } from './ui/sonner';
 import { DirectTraceIcon } from './DirectTraceIcon';
@@ -26,6 +26,11 @@ interface ChatHeaderProps {
   onTrace: () => void;
   onPathDiscovery: (publicKey: string) => Promise<PathDiscoveryResponse>;
   onToggleNotifications: () => void;
+  pushSupported?: boolean;
+  pushSubscribed?: boolean;
+  pushEnabledForConversation?: boolean;
+  onTogglePush?: () => void;
+  onOpenPushSettings?: () => void;
   onToggleFavorite: (type: 'channel' | 'contact', id: string) => void;
   onSetChannelFloodScopeOverride?: (key: string, floodScopeOverride: string) => void;
   onSetChannelPathHashModeOverride?: (key: string, pathHashModeOverride: number | null) => void;
@@ -46,6 +51,11 @@ export function ChatHeader({
   onTrace,
   onPathDiscovery,
   onToggleNotifications,
+  pushSupported,
+  pushSubscribed,
+  pushEnabledForConversation,
+  onTogglePush,
+  onOpenPushSettings,
   onToggleFavorite,
   onSetChannelFloodScopeOverride,
   onSetChannelPathHashModeOverride,
@@ -58,14 +68,29 @@ export function ChatHeader({
   const [pathDiscoveryOpen, setPathDiscoveryOpen] = useState(false);
   const [channelOverrideOpen, setChannelOverrideOpen] = useState(false);
   const [pathHashModeOverrideOpen, setPathHashModeOverrideOpen] = useState(false);
+  const [notifDropdownOpen, setNotifDropdownOpen] = useState(false);
+  const notifDropdownRef = useRef<HTMLDivElement>(null);

   useEffect(() => {
     setShowKey(false);
     setPathDiscoveryOpen(false);
     setChannelOverrideOpen(false);
     setPathHashModeOverrideOpen(false);
+    setNotifDropdownOpen(false);
   }, [conversation.id]);

+  // Close notification dropdown on outside click
+  useEffect(() => {
+    if (!notifDropdownOpen) return;
+    const handler = (e: MouseEvent) => {
+      if (notifDropdownRef.current && !notifDropdownRef.current.contains(e.target as Node)) {
+        setNotifDropdownOpen(false);
+      }
+    };
+    document.addEventListener('mousedown', handler);
+    return () => document.removeEventListener('mousedown', handler);
+  }, [notifDropdownOpen]);

   const activeChannel =
     conversation.type === 'channel'
       ? channels.find((channel) => channel.key === conversation.id)
@@ -288,34 +313,94 @@ export function ChatHeader({
             <DirectTraceIcon className="h-4 w-4 text-muted-foreground" />
           </button>
         )}
-        {notificationsSupported && !activeContactIsRoomServer && (
-          <button
-            className="flex items-center gap-1 rounded px-1 py-1 hover:bg-accent text-lg leading-none transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
-            onClick={onToggleNotifications}
-            title={
-              notificationsEnabled
-                ? 'Disable desktop notifications for this conversation'
-                : notificationsPermission === 'denied'
-                  ? 'Notifications blocked by the browser'
-                  : 'Enable desktop notifications for this conversation'
-            }
-            aria-label={
-              notificationsEnabled
-                ? 'Disable notifications for this conversation'
-                : 'Enable notifications for this conversation'
-            }
-          >
-            <Bell
-              className={`h-4 w-4 ${notificationsEnabled ? 'text-status-connected' : 'text-muted-foreground'}`}
-              fill={notificationsEnabled ? 'currentColor' : 'none'}
-              aria-hidden="true"
-            />
-            {notificationsEnabled && (
-              <span className="hidden md:inline text-[0.6875rem] font-medium text-status-connected">
-                Notifications On
-              </span>
-            )}
-          </button>
+        {(notificationsSupported || pushSupported) && !activeContactIsRoomServer && (
+          <div className="relative" ref={notifDropdownRef}>
+            <button
+              className="p-1 rounded hover:bg-accent text-lg leading-none transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
+              onClick={() => setNotifDropdownOpen((v) => !v)}
+              title="Notification settings"
+              aria-label="Notification settings"
+              aria-expanded={notifDropdownOpen}
+            >
+              <Bell
+                className={cn(
+                  'h-4 w-4',
+                  notificationsEnabled || pushEnabledForConversation
+                    ? 'text-primary'
+                    : 'text-muted-foreground'
+                )}
+                fill={notificationsEnabled || pushEnabledForConversation ? 'currentColor' : 'none'}
+                aria-hidden="true"
+              />
+            </button>
+            {notifDropdownOpen && (
+              <div className="absolute right-[-4.5rem] sm:right-0 top-full z-50 mt-1 w-[calc(100vw-2rem)] sm:w-72 max-w-72 rounded-md border border-border bg-popover p-3 shadow-lg space-y-3">
+                {notificationsSupported && (
+                  <label className="flex items-start gap-2.5 cursor-pointer group">
+                    <input
+                      type="checkbox"
+                      className="mt-0.5 accent-primary h-4 w-4 shrink-0"
+                      checked={notificationsEnabled}
+                      disabled={notificationsPermission === 'denied'}
+                      onChange={onToggleNotifications}
+                    />
+                    <div className="min-w-0">
+                      <span className="text-sm font-medium text-foreground block leading-tight">
+                        Desktop notifications (legacy)
+                      </span>
+                      <span className="text-xs text-muted-foreground leading-snug block mt-0.5">
+                        {notificationsPermission === 'denied'
+                          ? 'Blocked by browser — check site permissions'
+                          : 'Alerts while this tab is open'}
+                      </span>
+                    </div>
+                  </label>
+                )}
+                {pushSupported && onTogglePush && (
+                  <>
+                    <label className="flex items-start gap-2.5 cursor-pointer group">
+                      <input
+                        type="checkbox"
+                        className="mt-0.5 accent-primary h-4 w-4 shrink-0"
+                        checked={!!pushEnabledForConversation}
+                        onChange={onTogglePush}
+                      />
+                      <div className="min-w-0">
+                        <span className="text-sm font-medium text-foreground block leading-tight">
+                          Web Push (beta testing)
+                        </span>
+                        <span className="text-xs text-muted-foreground leading-snug block mt-0.5">
+                          {pushSubscribed
+                            ? 'Alerts even when the browser is closed'
+                            : 'Alerts even when the browser is closed. Requires HTTPS.'}
+                        </span>
+                      </div>
+                    </label>
+                    <span className="text-xs text-muted-foreground leading-snug block mt-0.5">
+                      All notification types require a trusted HTTPS context. Depending on your
+                      browser, a snakeoil certificate may not be sufficient.
+                    </span>
+                    {onOpenPushSettings && (
+                      <p className="text-xs text-muted-foreground leading-snug mt-1.5">
+                        Manage Web Push enabled devices in{' '}
+                        <button
+                          type="button"
+                          onClick={() => {
+                            setNotifDropdownOpen(false);
+                            onOpenPushSettings();
+                          }}
+                          className="text-primary hover:underline transition-colors focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
+                        >
+                          Settings → Local
+                        </button>
+                        .
+                      </p>
+                    )}
+                  </>
+                )}
+              </div>
+            )}
+          </div>
         )}
         {conversation.type === 'channel' && onSetChannelFloodScopeOverride && (
           <button
@@ -3,10 +3,9 @@ import { useMemo, useState } from 'react';
 import type { Contact, PathDiscoveryResponse, PathDiscoveryRoute } from '../types';
 import {
   findContactsByPrefix,
+  formatForcedRouteSummary,
+  formatLearnedRouteSummary,
   formatRouteLabel,
-  getDirectContactRoute,
-  getEffectiveContactRoute,
-  hasRoutingOverride,
   parsePathHops,
 } from '../utils/pathUtils';
 import { Button } from './ui/button';
@@ -99,30 +98,9 @@ export function ContactPathDiscoveryModal({
   const [error, setError] = useState<string | null>(null);
   const [result, setResult] = useState<PathDiscoveryResponse | null>(null);

-  const effectiveRoute = useMemo(() => getEffectiveContactRoute(contact), [contact]);
-  const directRoute = useMemo(() => getDirectContactRoute(contact), [contact]);
-  const hasForcedRoute = hasRoutingOverride(contact);
-  const learnedRouteSummary = useMemo(() => {
-    if (!directRoute) {
-      return 'Flood';
-    }
-    const hops = parsePathHops(directRoute.path, directRoute.path_len);
-    return hops.length > 0
-      ? `${formatRouteLabel(directRoute.path_len, true)} (${hops.join(' -> ')})`
-      : formatRouteLabel(directRoute.path_len, true);
-  }, [directRoute]);
-  const forcedRouteSummary = useMemo(() => {
-    if (!hasForcedRoute) {
-      return null;
-    }
-    if (effectiveRoute.pathLen === -1) {
-      return 'Flood';
-    }
-    const hops = parsePathHops(effectiveRoute.path, effectiveRoute.pathLen);
-    return hops.length > 0
-      ? `${formatRouteLabel(effectiveRoute.pathLen, true)} (${hops.join(' -> ')})`
-      : formatRouteLabel(effectiveRoute.pathLen, true);
-  }, [effectiveRoute, hasForcedRoute]);
+  const learnedRouteSummary = useMemo(() => formatLearnedRouteSummary(contact), [contact]);
+  const forcedRouteSummary = useMemo(() => formatForcedRouteSummary(contact), [contact]);
+  const hasForcedRoute = forcedRouteSummary !== null;

   const forwardChain = result
     ? renderRouteNodes(
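The refactor above replaces two duplicated `useMemo` bodies with shared `formatLearnedRouteSummary` / `formatForcedRouteSummary` helpers in `pathUtils`. The summarizing rule they encode, sketched in Python (the exact label text produced by `formatRouteLabel` is an assumption; `"N hop(s)"` here is illustrative):

```python
def format_route_summary(path_len: int, hops: list[str]) -> str:
    # path_len of -1 (or no learned route at all) means flood routing;
    # otherwise show a hop-count label, plus the hop list when known.
    if path_len == -1:
        return "Flood"
    label = f"{path_len} hop{'' if path_len == 1 else 's'}"
    return f"{label} ({' -> '.join(hops)})" if hops else label

print(format_route_summary(-1, []))           # Flood
print(format_route_summary(2, ["a1", "b2"]))  # 2 hops (a1 -> b2)
print(format_route_summary(1, []))            # 1 hop
```

Centralizing this means the path-discovery modal and the routing-override modal can no longer drift apart in how they describe the same route.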
@@ -3,10 +3,9 @@ import { useEffect, useMemo, useState } from 'react';
 import { api } from '../api';
 import type { Contact } from '../types';
 import {
-  formatRouteLabel,
+  formatForcedRouteSummary,
+  formatLearnedRouteSummary,
   formatRoutingOverrideInput,
-  getDirectContactRoute,
-  hasRoutingOverride,
 } from '../utils/pathUtils';
 import { Button } from './ui/button';
 import {
@@ -28,18 +27,6 @@ interface ContactRoutingOverrideModalProps {
   onError: (message: string) => void;
 }

-function summarizeLearnedRoute(contact: Contact): string {
-  return formatRouteLabel(getDirectContactRoute(contact)?.path_len ?? -1, true);
-}
-
-function summarizeForcedRoute(contact: Contact): string | null {
-  if (!hasRoutingOverride(contact)) {
-    return null;
-  }
-  const routeOverrideLen = contact.route_override_len;
-  return routeOverrideLen == null ? null : formatRouteLabel(routeOverrideLen, true);
-}
-
 export function ContactRoutingOverrideModal({
   open,
   onClose,
@@ -59,7 +46,8 @@ export function ContactRoutingOverrideModal({
     setError(null);
   }, [contact, open]);

-  const forcedRouteSummary = useMemo(() => summarizeForcedRoute(contact), [contact]);
+  const learnedRouteSummary = useMemo(() => formatLearnedRouteSummary(contact), [contact]);
+  const forcedRouteSummary = useMemo(() => formatForcedRouteSummary(contact), [contact]);

   const saveRoute = async (value: string) => {
     setSaving(true);
@@ -98,7 +86,7 @@ export function ContactRoutingOverrideModal({
         <div className="rounded-md border border-border bg-muted/20 p-3 text-sm">
           <div className="font-medium">{contact.name || contact.public_key.slice(0, 12)}</div>
           <div className="mt-1 text-muted-foreground">
-            Current learned route: {summarizeLearnedRoute(contact)}
+            Current learned route: {learnedRouteSummary}
           </div>
           {forcedRouteSummary && (
             <div className="mt-1 text-destructive">
@@ -20,7 +20,11 @@ import type {
|
|||||||
} from '../types';
|
} from '../types';
|
||||||
import type { RawPacketStatsSessionState } from '../utils/rawPacketStats';
|
import type { RawPacketStatsSessionState } from '../utils/rawPacketStats';
|
||||||
import { CONTACT_TYPE_REPEATER, CONTACT_TYPE_ROOM } from '../types';
|
import { CONTACT_TYPE_REPEATER, CONTACT_TYPE_ROOM } from '../types';
|
||||||
import { isPrefixOnlyContact, isUnknownFullKeyContact } from '../utils/pubkey';
|
import {
|
||||||
|
getContactDisplayName,
|
||||||
|
isPrefixOnlyContact,
|
||||||
|
isUnknownFullKeyContact,
|
||||||
|
} from '../utils/pubkey';
|
||||||
|
|
||||||
const RepeaterDashboard = lazy(() =>
|
const RepeaterDashboard = lazy(() =>
|
||||||
import('./RepeaterDashboard').then((m) => ({ default: m.RepeaterDashboard }))
|
import('./RepeaterDashboard').then((m) => ({ default: m.RepeaterDashboard }))
|
||||||
@@ -65,6 +69,7 @@ interface ConversationPaneProps {
|
|||||||
channelKey: string,
|
channelKey: string,
|
||||||
pathHashModeOverride: number | null
|
pathHashModeOverride: number | null
|
||||||
) => Promise<void>;
|
) => Promise<void>;
|
||||||
|
onSelectConversation: (conversation: Conversation) => void;
|
||||||
onOpenContactInfo: (publicKey: string, fromChannel?: boolean) => void;
|
onOpenContactInfo: (publicKey: string, fromChannel?: boolean) => void;
|
||||||
onOpenChannelInfo: (channelKey: string) => void;
|
onOpenChannelInfo: (channelKey: string) => void;
|
||||||
onSenderClick: (sender: string) => void;
|
onSenderClick: (sender: string) => void;
|
||||||
@@ -77,6 +82,11 @@ interface ConversationPaneProps {
|
|||||||
onDismissUnreadMarker: () => void;
|
onDismissUnreadMarker: () => void;
|
||||||
onSendMessage: (text: string) => Promise<void>;
|
onSendMessage: (text: string) => Promise<void>;
|
||||||
onToggleNotifications: () => void;
|
onToggleNotifications: () => void;
|
||||||
|
pushSupported?: boolean;
|
||||||
|
pushSubscribed?: boolean;
|
||||||
|
pushEnabledForConversation?: boolean;
|
||||||
|
onTogglePush?: () => void;
|
||||||
|
onOpenPushSettings?: () => void;
|
||||||
trackedTelemetryRepeaters: string[];
|
trackedTelemetryRepeaters: string[];
|
||||||
onToggleTrackedTelemetry: (publicKey: string) => Promise<void>;
|
onToggleTrackedTelemetry: (publicKey: string) => Promise<void>;
|
||||||
repeaterAutoLoginKey: string | null;
|
repeaterAutoLoginKey: string | null;
|
||||||
@@ -137,6 +147,7 @@ export function ConversationPane({
|
|||||||
onDeleteChannel,
|
onDeleteChannel,
|
||||||
onSetChannelFloodScopeOverride,
|
onSetChannelFloodScopeOverride,
|
||||||
onSetChannelPathHashModeOverride,
|
onSetChannelPathHashModeOverride,
|
||||||
|
onSelectConversation,
|
||||||
onOpenContactInfo,
|
onOpenContactInfo,
|
||||||
onOpenChannelInfo,
|
onOpenChannelInfo,
|
||||||
onSenderClick,
|
onSenderClick,
|
||||||
@@ -149,6 +160,11 @@ export function ConversationPane({
|
|||||||
onDismissUnreadMarker,
|
onDismissUnreadMarker,
|
||||||
onSendMessage,
|
onSendMessage,
|
||||||
onToggleNotifications,
|
onToggleNotifications,
|
||||||
|
pushSupported,
|
||||||
|
pushSubscribed,
|
||||||
|
pushEnabledForConversation,
|
||||||
|
onTogglePush,
|
||||||
|
onOpenPushSettings,
|
||||||
trackedTelemetryRepeaters,
|
trackedTelemetryRepeaters,
|
||||||
onToggleTrackedTelemetry,
|
onToggleTrackedTelemetry,
|
||||||
repeaterAutoLoginKey,
|
repeaterAutoLoginKey,
|
||||||
@@ -197,6 +213,17 @@ export function ConversationPane({
|
|||||||
focusedKey={activeConversation.mapFocusKey}
|
focusedKey={activeConversation.mapFocusKey}
|
||||||
rawPackets={rawPackets}
|
rawPackets={rawPackets}
|
||||||
config={config}
|
config={config}
|
||||||
|
onSelectContact={(contact) =>
|
||||||
|
onSelectConversation({
|
||||||
|
type: 'contact',
|
||||||
|
id: contact.public_key,
|
||||||
|
name: getContactDisplayName(
|
||||||
|
contact.name,
|
||||||
|
contact.public_key,
|
||||||
|
contact.last_advert
|
||||||
|
),
|
||||||
|
})
|
||||||
|
}
|
||||||
/>
|
/>
|
||||||
</Suspense>
|
</Suspense>
|
||||||
</div>
|
</div>
|
||||||
@@ -271,6 +298,11 @@ export function ConversationPane({
|
|||||||
notificationsSupported={notificationsSupported}
|
notificationsSupported={notificationsSupported}
|
||||||
notificationsEnabled={notificationsEnabled}
|
notificationsEnabled={notificationsEnabled}
|
||||||
notificationsPermission={notificationsPermission}
|
notificationsPermission={notificationsPermission}
|
||||||
|
pushSupported={pushSupported}
|
||||||
|
pushSubscribed={pushSubscribed}
|
||||||
|
pushEnabledForConversation={pushEnabledForConversation}
|
||||||
|
onTogglePush={onTogglePush}
|
||||||
|
onOpenPushSettings={onOpenPushSettings}
|
||||||
onTrace={onTrace}
|
onTrace={onTrace}
|
||||||
onPathDiscovery={onPathDiscovery}
|
onPathDiscovery={onPathDiscovery}
|
||||||
onToggleNotifications={onToggleNotifications}
|
onToggleNotifications={onToggleNotifications}
|
||||||
|
|||||||
@@ -1,5 +1,14 @@
|
|||||||
import { Fragment, useEffect, useState, useMemo, useRef, useCallback } from 'react';
|
import { Fragment, useEffect, useState, useMemo, useRef, useCallback } from 'react';
|
||||||
import { MapContainer, TileLayer, CircleMarker, Popup, useMap, Polyline } from 'react-leaflet';
|
import {
|
||||||
|
MapContainer,
|
||||||
|
TileLayer,
|
||||||
|
CircleMarker,
|
||||||
|
Popup,
|
||||||
|
useMap,
|
||||||
|
useMapEvents,
|
||||||
|
Polyline,
|
||||||
|
LayersControl,
|
||||||
|
} from 'react-leaflet';
|
||||||
import type { LatLngBoundsExpression, CircleMarker as LeafletCircleMarker } from 'leaflet';
|
import type { LatLngBoundsExpression, CircleMarker as LeafletCircleMarker } from 'leaflet';
|
||||||
import L from 'leaflet';
|
import L from 'leaflet';
|
||||||
import 'leaflet/dist/leaflet.css';
|
import 'leaflet/dist/leaflet.css';
|
||||||
@@ -21,29 +30,132 @@ interface MapViewProps {
|
|||||||
focusedKey?: string | null;
|
focusedKey?: string | null;
|
||||||
rawPackets?: RawPacket[];
|
rawPackets?: RawPacket[];
|
||||||
config?: RadioConfig | null;
|
config?: RadioConfig | null;
|
||||||
|
/** When provided, the contact name in each popup becomes a clickable link
|
||||||
|
* that opens the conversation for that contact (DM, repeater, or room). */
|
||||||
|
onSelectContact?: (contact: Contact) => void;
|
||||||
}
|
}
|
||||||
|
|
||||||
// --- Tile layer presets ---
|
// --- Tile layer presets ---
|
||||||
const TILE_LIGHT = {
|
// Every provider here is free and works without an API key. Attribution strings
|
||||||
url: 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png',
|
// follow each provider's requirements; do not remove them. If you add a new
|
||||||
attribution: '© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a>',
|
// provider, verify its terms of service (especially for Esri / Google-style
|
||||||
background: '#1a1a2e',
|
// satellite tiles) before committing.
|
||||||
};
|
interface TileLayerPreset {
|
||||||
const TILE_DARK = {
|
id: string;
|
||||||
url: 'https://{s}.basemaps.cartocdn.com/dark_all/{z}/{x}/{y}{r}.png',
|
label: string;
|
||||||
attribution:
|
url: string;
|
||||||
'© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> © <a href="https://carto.com/">CARTO</a>',
|
attribution: string;
|
||||||
background: '#0d0d0d',
|
background: string;
|
||||||
};
|
/** Highest zoom the provider publishes tiles at. When the layer is active,
|
||||||
|
* the map's zoom ceiling is tightened to this value via
|
||||||
|
* `MaxZoomByActiveLayer` so the user cannot zoom into a grey void. */
|
||||||
|
maxZoom?: number;
|
||||||
|
}
|
||||||
|
|
||||||
function getSavedDarkMap(): boolean {
|
// Global zoom bounds for the MapContainer itself. These are pinned to the
|
||||||
|
// container so Leaflet's internal tile-range math never has to guess when
|
||||||
|
// layers swap in/out via LayersControl. Without this, an initial-mount race
|
||||||
|
// between MapContainer layout and LayersControl.BaseLayer addition has been
|
||||||
|
// observed to throw "Attempted to load an infinite number of tiles".
|
||||||
|
const MAP_MIN_ZOOM = 2;
|
||||||
|
const MAP_MAX_ZOOM = 19;
|
||||||
|
|
||||||
|
const TILE_LAYERS: readonly TileLayerPreset[] = [
|
||||||
|
{
|
||||||
|
id: 'light',
|
||||||
|
label: 'Light (OpenStreetMap)',
|
||||||
|
url: 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png',
|
||||||
|
attribution: '© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a>',
|
||||||
|
background: '#1a1a2e',
|
||||||
|
maxZoom: 19,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'dark',
|
||||||
|
label: 'Dark (CARTO)',
|
||||||
|
url: 'https://{s}.basemaps.cartocdn.com/dark_all/{z}/{x}/{y}{r}.png',
|
||||||
|
attribution:
|
||||||
|
'© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> © <a href="https://carto.com/">CARTO</a>',
|
||||||
|
background: '#0d0d0d',
|
||||||
|
maxZoom: 19,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'topographic',
|
||||||
|
label: 'Topographic (OpenTopoMap)',
|
||||||
|
url: 'https://{s}.tile.opentopomap.org/{z}/{x}/{y}.png',
|
||||||
|
attribution:
|
||||||
|
'Map data: © <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors, <a href="http://viewfinderpanoramas.org">SRTM</a> | Map style: © <a href="https://opentopomap.org">OpenTopoMap</a> (<a href="https://creativecommons.org/licenses/by-sa/3.0/">CC-BY-SA</a>)',
|
||||||
|
background: '#a3b3bc',
|
||||||
|
maxZoom: 17,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'satellite',
|
||||||
|
label: 'Satellite (Esri)',
|
||||||
|
url: 'https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}',
|
||||||
|
attribution:
|
||||||
|
'Tiles © <a href="https://www.esri.com/">Esri</a> — Source: Esri, Maxar, Earthstar Geographics, and the GIS User Community',
|
||||||
|
background: '#1a1f2e',
|
||||||
|
// Esri's tile service advertises LODs up to 23 and returns HTTP 200 for
|
||||||
|
// every tile request, but the underlying imagery is only high-resolution
|
||||||
|
// up to ~18 in most developed areas and shallower in rural regions. We
|
||||||
|
// cap at 18 rather than 19 so users don't zoom into visibly-empty or
|
||||||
|
// severely-upscaled tiles. Remote regions may still be sparse at 18.
|
||||||
|
maxZoom: 18,
|
||||||
|
},
|
||||||
|
] as const;
|
||||||
|
|
||||||
|
const MAP_LAYER_STORAGE_KEY = 'remoteterm-map-layer';
|
||||||
|
const LEGACY_DARK_MAP_STORAGE_KEY = 'remoteterm-dark-map';
|
||||||
|
|
||||||
|
function getSavedLayerId(): string {
|
||||||
try {
|
try {
|
||||||
return localStorage.getItem('remoteterm-dark-map') === 'true';
|
const stored = localStorage.getItem(MAP_LAYER_STORAGE_KEY);
|
||||||
|
if (stored && TILE_LAYERS.some((l) => l.id === stored)) return stored;
|
||||||
|
// Legacy migration: boolean dark-map flag predates multi-layer support.
|
||||||
|
const legacyDark = localStorage.getItem(LEGACY_DARK_MAP_STORAGE_KEY) === 'true';
|
||||||
|
return legacyDark ? 'dark' : 'light';
|
||||||
} catch {
|
} catch {
|
||||||
return false;
|
return 'light';
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Leaflet-internal companion component: listens for base-layer changes driven
|
||||||
|
* by Leaflet's own LayersControl UI and pipes the selection back to React.
|
||||||
|
* Kept separate so the persistence/state logic stays out of the render tree.
|
||||||
|
*/
|
||||||
|
function LayerChangeWatcher({ onChange }: { onChange: (name: string) => void }) {
|
||||||
|
useMapEvents({
|
||||||
|
baselayerchange: (event) => {
|
||||||
|
if (event.name) onChange(event.name);
|
||||||
|
},
|
||||||
|
});
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Enforces the active layer's zoom ceiling on the underlying Leaflet map.
|
||||||
|
*
|
||||||
|
* Leaflet's `map.getMaxZoom()` prefers `options.maxZoom` (set on MapContainer)
|
||||||
|
* over per-layer `maxZoom`, so a per-TileLayer cap is silently ignored unless
|
||||||
|
* we push it down to the map itself. We do that here whenever the active
|
||||||
|
* layer changes, and clamp the current zoom if the user happened to be zoomed
|
||||||
|
* past the new cap at the moment of the switch.
|
||||||
|
*
|
||||||
|
* The MapContainer's fixed `minZoom`/`maxZoom` remain the absolute hull that
|
||||||
|
* prevents the "Attempted to load an infinite number of tiles" race during
|
||||||
|
* initial mount (see `MAP_MIN_ZOOM`/`MAP_MAX_ZOOM` below).
|
||||||
|
*/
|
||||||
|
function MaxZoomByActiveLayer({ maxZoom }: { maxZoom: number }) {
|
||||||
|
const map = useMap();
|
||||||
|
useEffect(() => {
|
||||||
|
map.setMaxZoom(maxZoom);
|
||||||
|
if (map.getZoom() > maxZoom) {
|
||||||
|
map.setZoom(maxZoom);
|
||||||
|
}
|
||||||
|
}, [map, maxZoom]);
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
const MAP_RECENCY_COLORS = {
|
const MAP_RECENCY_COLORS = {
|
||||||
recent: '#06b6d4',
|
recent: '#06b6d4',
|
||||||
today: '#2563eb',
|
today: '#2563eb',
|
||||||
@@ -379,20 +491,43 @@ function ParticleOverlay({ particles }: { particles: MapParticle[] }) {
|
|||||||
|
|
||||||
// --- Main component ---
|
// --- Main component ---
|
||||||
|
|
||||||
export function MapView({ contacts, focusedKey, rawPackets, config }: MapViewProps) {
|
export function MapView({
|
||||||
|
contacts,
|
||||||
|
focusedKey,
|
||||||
|
rawPackets,
|
||||||
|
config,
|
||||||
|
onSelectContact,
|
||||||
|
}: MapViewProps) {
|
||||||
const [sevenDaysAgo] = useState(() => Date.now() / 1000 - 7 * 24 * 60 * 60);
|
const [sevenDaysAgo] = useState(() => Date.now() / 1000 - 7 * 24 * 60 * 60);
|
||||||
const [darkMap, setDarkMap] = useState(getSavedDarkMap);
|
const [selectedLayerId, setSelectedLayerId] = useState<string>(getSavedLayerId);
|
||||||
const tile = darkMap ? TILE_DARK : TILE_LIGHT;
|
const activeLayer = TILE_LAYERS.find((l) => l.id === selectedLayerId) ?? TILE_LAYERS[0];
|
||||||
|
|
||||||
// Sync with settings changes from other components
|
// Sync layer selection across tabs and windows.
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const onStorage = (e: StorageEvent) => {
|
const onStorage = (e: StorageEvent) => {
|
||||||
if (e.key === 'remoteterm-dark-map') setDarkMap(e.newValue === 'true');
|
if (e.key !== MAP_LAYER_STORAGE_KEY) return;
|
||||||
|
const next = e.newValue ?? '';
|
||||||
|
if (TILE_LAYERS.some((l) => l.id === next)) {
|
||||||
|
setSelectedLayerId(next);
|
||||||
|
}
|
||||||
};
|
};
|
||||||
window.addEventListener('storage', onStorage);
|
window.addEventListener('storage', onStorage);
|
||||||
return () => window.removeEventListener('storage', onStorage);
|
return () => window.removeEventListener('storage', onStorage);
|
||||||
}, []);
|
}, []);
|
||||||
|
|
||||||
|
const handleLayerChange = useCallback((layerName: string) => {
|
||||||
|
const match = TILE_LAYERS.find((l) => l.label === layerName);
|
||||||
|
if (!match) return;
|
||||||
|
setSelectedLayerId(match.id);
|
||||||
|
try {
|
||||||
|
localStorage.setItem(MAP_LAYER_STORAGE_KEY, match.id);
|
||||||
|
// Clear the legacy key so a future downgrade-rollback doesn't revert us.
|
||||||
|
localStorage.removeItem(LEGACY_DARK_MAP_STORAGE_KEY);
|
||||||
|
} catch {
|
||||||
|
// localStorage may be disabled; selection stays in memory only.
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
const [showPackets, setShowPackets] = useState(false);
|
const [showPackets, setShowPackets] = useState(false);
|
||||||
const [discoveryMode, setDiscoveryMode] = useState(false);
|
const [discoveryMode, setDiscoveryMode] = useState(false);
|
||||||
const [discoveredKeys, setDiscoveredKeys] = useState<Set<string>>(new Set());
|
const [discoveredKeys, setDiscoveredKeys] = useState<Set<string>>(new Set());
|
||||||
@@ -674,10 +809,12 @@ export function MapView({ contacts, focusedKey, rawPackets, config }: MapViewPro
|
|||||||
|
|
||||||
return (
|
return (
|
||||||
<div className="flex flex-col h-full">
|
<div className="flex flex-col h-full">
|
||||||
{/* Info bar */}
|
{/* Info bar: stacks vertically on narrow viewports (info label, legend
|
||||||
<div className="px-4 py-2 bg-muted/50 text-xs text-muted-foreground flex items-center justify-between">
|
row, controls row) so nothing truncates; flattens to a single row
|
||||||
|
with right-aligned cluster at md and up. */}
|
||||||
|
<div className="px-4 py-2 bg-muted/50 text-xs text-muted-foreground flex flex-col gap-1 md:flex-row md:items-center md:justify-between md:gap-3">
|
||||||
<span>{infoLabel}</span>
|
<span>{infoLabel}</span>
|
||||||
<div className="flex items-center gap-3">
|
<div className="flex flex-wrap items-center gap-x-3 gap-y-1 md:justify-end">
|
||||||
{!showPackets && (
|
{!showPackets && (
|
||||||
<>
|
<>
|
||||||
<span className="flex items-center gap-1">
|
<span className="flex items-center gap-1">
|
||||||
@@ -758,7 +895,7 @@ export function MapView({ contacts, focusedKey, rawPackets, config }: MapViewPro
|
|||||||
/>{' '}
|
/>{' '}
|
||||||
repeater
|
repeater
|
||||||
</span>
|
</span>
|
||||||
<label className="flex items-center gap-1.5 cursor-pointer ml-2">
|
<label className="flex items-center gap-1.5 cursor-pointer">
|
||||||
<input
|
<input
|
||||||
type="checkbox"
|
type="checkbox"
|
||||||
checked={showPackets}
|
checked={showPackets}
|
||||||
@@ -791,10 +928,28 @@ export function MapView({ contacts, focusedKey, rawPackets, config }: MapViewPro
|
|||||||
<MapContainer
|
<MapContainer
|
||||||
center={[20, 0]}
|
center={[20, 0]}
|
||||||
zoom={2}
|
zoom={2}
|
||||||
|
minZoom={MAP_MIN_ZOOM}
|
||||||
|
maxZoom={MAP_MAX_ZOOM}
|
||||||
className="h-full w-full"
|
className="h-full w-full"
|
||||||
style={{ background: tile.background }}
|
style={{ background: activeLayer.background }}
|
||||||
>
|
>
|
||||||
<TileLayer key={tile.url} attribution={tile.attribution} url={tile.url} />
|
<LayersControl position="topright" collapsed={false}>
|
||||||
|
{TILE_LAYERS.map((layer) => (
|
||||||
|
<LayersControl.BaseLayer
|
||||||
|
key={layer.id}
|
||||||
|
name={layer.label}
|
||||||
|
checked={layer.id === selectedLayerId}
|
||||||
|
>
|
||||||
|
<TileLayer
|
||||||
|
url={layer.url}
|
||||||
|
attribution={layer.attribution}
|
||||||
|
maxZoom={layer.maxZoom}
|
||||||
|
/>
|
||||||
|
</LayersControl.BaseLayer>
|
||||||
|
))}
|
||||||
|
</LayersControl>
|
||||||
|
<LayerChangeWatcher onChange={handleLayerChange} />
|
||||||
|
<MaxZoomByActiveLayer maxZoom={activeLayer.maxZoom ?? MAP_MAX_ZOOM} />
|
||||||
<MapBoundsHandler contacts={mappableContacts} focusedContact={focusedContact} />
|
<MapBoundsHandler contacts={mappableContacts} focusedContact={focusedContact} />
|
||||||
|
|
||||||
{/* Faint route lines for active packet paths */}
|
{/* Faint route lines for active packet paths */}
|
||||||
@@ -839,7 +994,21 @@ export function MapView({ contacts, focusedKey, rawPackets, config }: MapViewPro
|
|||||||
🛜
|
🛜
|
||||||
</span>
|
</span>
|
||||||
)}
|
)}
|
||||||
{displayName}
|
{onSelectContact ? (
|
||||||
|
<button
|
||||||
|
type="button"
|
||||||
|
className="p-0 bg-transparent border-0 font-inherit text-primary underline hover:text-primary/80 cursor-pointer"
|
||||||
|
onClick={(event) => {
|
||||||
|
event.stopPropagation();
|
||||||
|
onSelectContact(contact);
|
||||||
|
}}
|
||||||
|
title={`Open conversation with ${displayName}`}
|
||||||
|
>
|
||||||
|
{displayName}
|
||||||
|
</button>
|
||||||
|
) : (
|
||||||
|
displayName
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
<div className="text-xs text-gray-500 mt-1">Last heard: {lastHeardLabel}</div>
|
<div className="text-xs text-gray-500 mt-1">Last heard: {lastHeardLabel}</div>
|
||||||
<div className="text-xs text-gray-400 mt-1 font-mono">
|
<div className="text-xs text-gray-400 mt-1 font-mono">
|
||||||
|
|||||||
@@ -9,7 +9,7 @@ import {
|
|||||||
type ReactNode,
|
type ReactNode,
|
||||||
} from 'react';
|
} from 'react';
|
||||||
import type { Channel, Contact, Message, MessagePath, RadioConfig, RawPacket } from '../types';
|
import type { Channel, Contact, Message, MessagePath, RadioConfig, RawPacket } from '../types';
|
||||||
import { CONTACT_TYPE_REPEATER, CONTACT_TYPE_ROOM } from '../types';
|
import { CONTACT_TYPE_ROOM } from '../types';
|
||||||
import { api } from '../api';
|
import { api } from '../api';
|
||||||
import {
|
import {
|
||||||
findLinkedChannelReferences,
|
findLinkedChannelReferences,
|
||||||
@@ -808,12 +808,13 @@ export function MessageList({
|
|||||||
{sortedMessages.map((msg, index) => {
|
{sortedMessages.map((msg, index) => {
|
||||||
// For DMs, look up contact; for channel messages, use parsed sender
|
// For DMs, look up contact; for channel messages, use parsed sender
|
||||||
const contact = msg.type === 'PRIV' ? getContact(msg.conversation_key) : null;
|
const contact = msg.type === 'PRIV' ? getContact(msg.conversation_key) : null;
|
||||||
const isRepeater = contact?.type === CONTACT_TYPE_REPEATER;
|
|
||||||
const isRoomServer = contact?.type === CONTACT_TYPE_ROOM;
|
const isRoomServer = contact?.type === CONTACT_TYPE_ROOM;
|
||||||
|
|
||||||
// Skip sender parsing for repeater messages (CLI responses often have colons)
|
// Only parse "sender: text" prefix for channel messages — DMs never carry
|
||||||
|
// an in-text sender prefix, so parsing them would incorrectly strip
|
||||||
|
// user text that happens to contain a colon (e.g. "TEST1: TEST2").
|
||||||
const { sender, content } =
|
const { sender, content } =
|
||||||
isRepeater || (isRoomServer && msg.type === 'PRIV')
|
msg.type === 'PRIV'
|
||||||
? { sender: null, content: msg.text }
|
? { sender: null, content: msg.text }
|
||||||
: parseSenderFromText(msg.text);
|
: parseSenderFromText(msg.text);
|
||||||
const directSenderName =
|
const directSenderName =
|
||||||
@@ -845,7 +846,8 @@ export function MessageList({
|
|||||||
isCorruptChannelMessage
|
isCorruptChannelMessage
|
||||||
);
|
);
|
||||||
const prevMsg = sortedMessages[index - 1];
|
const prevMsg = sortedMessages[index - 1];
|
||||||
const prevParsedSender = prevMsg ? parseSenderFromText(prevMsg.text).sender : null;
|
const prevParsedSender =
|
||||||
|
prevMsg && prevMsg.type === 'CHAN' ? parseSenderFromText(prevMsg.text).sender : null;
|
||||||
const prevSenderKey = prevMsg
|
const prevSenderKey = prevMsg
|
||||||
? getSenderKey(
|
? getSenderKey(
|
||||||
prevMsg,
|
prevMsg,
|
||||||
|
|||||||
@@ -2,6 +2,7 @@ import { useState, useEffect, type ReactNode } from 'react';
|
|||||||
import type {
|
import type {
|
||||||
AppSettings,
|
AppSettings,
|
||||||
AppSettingsUpdate,
|
AppSettingsUpdate,
|
||||||
|
Channel,
|
||||||
Contact,
|
Contact,
|
||||||
HealthStatus,
|
HealthStatus,
|
||||||
RadioAdvertMode,
|
RadioAdvertMode,
|
||||||
@@ -49,6 +50,7 @@ interface SettingsModalBaseProps {
|
|||||||
onToggleBlockedKey?: (key: string) => void;
|
onToggleBlockedKey?: (key: string) => void;
|
||||||
onToggleBlockedName?: (name: string) => void;
|
onToggleBlockedName?: (name: string) => void;
|
||||||
contacts?: Contact[];
|
contacts?: Contact[];
|
||||||
|
channels?: Channel[];
|
||||||
onBulkDeleteContacts?: (deletedKeys: string[]) => void;
|
onBulkDeleteContacts?: (deletedKeys: string[]) => void;
|
||||||
trackedTelemetryRepeaters?: string[];
|
trackedTelemetryRepeaters?: string[];
|
||||||
onToggleTrackedTelemetry?: (publicKey: string) => Promise<void>;
|
onToggleTrackedTelemetry?: (publicKey: string) => Promise<void>;
|
||||||
@@ -86,6 +88,7 @@ export function SettingsModal(props: SettingsModalProps) {
|
|||||||
onToggleBlockedKey,
|
onToggleBlockedKey,
|
||||||
onToggleBlockedName,
|
onToggleBlockedName,
|
||||||
contacts,
|
contacts,
|
||||||
|
channels,
|
||||||
onBulkDeleteContacts,
|
onBulkDeleteContacts,
|
||||||
trackedTelemetryRepeaters,
|
trackedTelemetryRepeaters,
|
||||||
onToggleTrackedTelemetry,
|
onToggleTrackedTelemetry,
|
||||||
@@ -228,6 +231,8 @@ export function SettingsModal(props: SettingsModalProps) {
|
|||||||
{isSectionVisible('local') && (
|
{isSectionVisible('local') && (
|
||||||
<SettingsLocalSection
|
<SettingsLocalSection
|
||||||
onLocalLabelChange={onLocalLabelChange}
|
onLocalLabelChange={onLocalLabelChange}
|
||||||
|
contacts={contacts}
|
||||||
|
channels={channels}
|
||||||
className={sectionContentClass}
|
className={sectionContentClass}
|
||||||
/>
|
/>
|
||||||
)}
|
)}
|
||||||
|
|||||||
@@ -12,13 +12,21 @@ import type { HealthStatus, RadioConfig } from '../types';
|
|||||||
import { api } from '../api';
|
import { api } from '../api';
|
||||||
import { toast } from './ui/sonner';
|
import { toast } from './ui/sonner';
|
||||||
import { handleKeyboardActivate } from '../utils/a11y';
|
import { handleKeyboardActivate } from '../utils/a11y';
|
||||||
import { applyTheme, getSavedTheme, THEME_CHANGE_EVENT } from '../utils/theme';
|
import { applyTheme, getEffectiveTheme, THEME_CHANGE_EVENT } from '../utils/theme';
|
||||||
import {
|
import {
|
||||||
BATTERY_DISPLAY_CHANGE_EVENT,
|
BATTERY_DISPLAY_CHANGE_EVENT,
|
||||||
getShowBatteryPercent,
|
getShowBatteryPercent,
|
||||||
getShowBatteryVoltage,
|
getShowBatteryVoltage,
|
||||||
mvToPercent,
|
mvToPercent,
|
||||||
} from '../utils/batteryDisplay';
|
} from '../utils/batteryDisplay';
|
||||||
|
import {
|
||||||
|
STATUS_DOT_PULSE_CHANGE_EVENT,
|
||||||
|
STATUS_DOT_PULSE_DURATION_MS,
|
||||||
|
STATUS_DOT_PULSE_PACKET_EVENT,
|
||||||
|
getStatusDotPulseEnabled,
|
||||||
|
pulseColorFor,
|
||||||
|
type StatusDotPulseKind,
|
||||||
|
} from '../utils/statusDotPulse';
|
||||||
import { cn } from '@/lib/utils';
|
import { cn } from '@/lib/utils';
|
||||||
|
|
||||||
interface StatusBarProps {
|
interface StatusBarProps {
|
||||||
@@ -84,17 +92,71 @@ export function StatusBar({
|
|||||||
? 'Radio OK'
|
? 'Radio OK'
|
||||||
: 'Radio Disconnected';
|
: 'Radio Disconnected';
|
||||||
const [reconnecting, setReconnecting] = useState(false);
|
const [reconnecting, setReconnecting] = useState(false);
|
||||||
const [currentTheme, setCurrentTheme] = useState(getSavedTheme);
|
// Track the *effective* theme (follow-os is resolved to original/light) so the
|
||||||
|
// toggle icon and action match what the user currently sees rendered.
|
||||||
|
const [currentTheme, setCurrentTheme] = useState(getEffectiveTheme);
|
||||||
|
const [pulseEnabled, setPulseEnabled] = useState(getStatusDotPulseEnabled);
|
||||||
|
const [pulseKind, setPulseKind] = useState<StatusDotPulseKind | null>(null);
|
||||||
|
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const handleThemeChange = (event: Event) => {
|
const handler = () => setPulseEnabled(getStatusDotPulseEnabled());
|
||||||
const themeId = (event as CustomEvent<string>).detail;
|
window.addEventListener(STATUS_DOT_PULSE_CHANGE_EVENT, handler);
|
||||||
setCurrentTheme(typeof themeId === 'string' && themeId ? themeId : getSavedTheme());
|
return () => window.removeEventListener(STATUS_DOT_PULSE_CHANGE_EVENT, handler);
|
||||||
};
|
}, []);
|
||||||
|
|
||||||
window.addEventListener(THEME_CHANGE_EVENT, handleThemeChange as EventListener);
|
useEffect(() => {
|
||||||
|
if (!pulseEnabled) {
|
||||||
|
setPulseKind(null);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
let timer: number | null = null;
|
||||||
|
const handler = (event: Event) => {
|
||||||
|
const kind = (event as CustomEvent<StatusDotPulseKind>).detail;
|
||||||
|
setPulseKind(kind);
|
||||||
|
if (timer !== null) {
|
||||||
|
window.clearTimeout(timer);
|
||||||
|
}
|
||||||
|
timer = window.setTimeout(() => {
|
||||||
|
setPulseKind(null);
|
||||||
|
timer = null;
|
||||||
|
}, STATUS_DOT_PULSE_DURATION_MS);
|
||||||
|
};
|
||||||
|
window.addEventListener(STATUS_DOT_PULSE_PACKET_EVENT, handler);
|
||||||
return () => {
|
return () => {
|
||||||
window.removeEventListener(THEME_CHANGE_EVENT, handleThemeChange as EventListener);
|
window.removeEventListener(STATUS_DOT_PULSE_PACKET_EVENT, handler);
|
||||||
|
if (timer !== null) {
|
||||||
|
window.clearTimeout(timer);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}, [pulseEnabled]);
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
const syncEffective = () => setCurrentTheme(getEffectiveTheme());
|
||||||
|
window.addEventListener(THEME_CHANGE_EVENT, syncEffective);
|
||||||
|
|
||||||
|
// When saved theme is "follow-os", OS appearance changes alter the effective
|
||||||
|
// theme without firing a THEME_CHANGE_EVENT, so also watch matchMedia.
|
||||||
|
const mql =
|
||||||
|
typeof window.matchMedia === 'function'
|
||||||
|
? window.matchMedia('(prefers-color-scheme: light)')
|
||||||
|
: null;
|
||||||
|
if (mql) {
|
||||||
|
if (typeof mql.addEventListener === 'function') {
|
||||||
|
mql.addEventListener('change', syncEffective);
|
||||||
|
} else if (typeof (mql as MediaQueryList).addListener === 'function') {
|
||||||
|
(mql as MediaQueryList).addListener(syncEffective);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
window.removeEventListener(THEME_CHANGE_EVENT, syncEffective);
|
||||||
|
if (mql) {
|
||||||
|
if (typeof mql.removeEventListener === 'function') {
|
||||||
|
mql.removeEventListener('change', syncEffective);
|
||||||
|
} else if (typeof (mql as MediaQueryList).removeListener === 'function') {
|
||||||
|
(mql as MediaQueryList).removeListener(syncEffective);
|
||||||
|
}
|
||||||
|
}
|
||||||
};
|
};
|
||||||
}, []);
|
}, []);
|
||||||
|
|
||||||
@@ -154,9 +216,12 @@ export function StatusBar({
|
|||||||
radioState === 'initializing' || radioState === 'connecting'
|
radioState === 'initializing' || radioState === 'connecting'
|
||||||
? 'bg-warning'
|
? 'bg-warning'
|
||||||
: connected
|
: connected
|
||||||
? 'bg-status-connected shadow-[0_0_6px_hsl(var(--status-connected)/0.5)]'
|
? pulseKind
|
||||||
|
? ''
|
||||||
|
: 'bg-status-connected shadow-[0_0_6px_hsl(var(--status-connected)/0.5)]'
|
||||||
: 'bg-status-disconnected'
|
: 'bg-status-disconnected'
|
||||||
)}
|
)}
|
||||||
|
style={connected && pulseKind ? { backgroundColor: pulseColorFor(pulseKind) } : undefined}
|
||||||
aria-hidden="true"
|
aria-hidden="true"
|
||||||
/>
|
/>
|
||||||
<span className="hidden lg:inline text-muted-foreground">{statusLabel}</span>
|
<span className="hidden lg:inline text-muted-foreground">{statusLabel}</span>
|
||||||
@@ -1,7 +1,23 @@
+import type { ReactNode } from 'react';
 import { Separator } from '../ui/separator';
 import { RepeaterPane, NotFetched, KvRow, formatDuration } from './repeaterPaneShared';
 import type { RepeaterStatusResponse, PaneState } from '../../types';
 
+function Secondary({ children }: { children: ReactNode }) {
+  return <span className="ml-1.5 font-normal text-muted-foreground">{children}</span>;
+}
+
+function formatAirtimePercent(airtimeSec: number, uptimeSec: number): string | null {
+  if (uptimeSec <= 0) return null;
+  return `${((airtimeSec / uptimeSec) * 100).toFixed(2)}%`;
+}
+
+function formatPerMinute(count: number, uptimeSec: number): string | null {
+  if (uptimeSec <= 0) return null;
+  const rate = (count * 60) / uptimeSec;
+  return rate >= 10 ? rate.toFixed(0) : rate.toFixed(1);
+}
+
 export function TelemetryPane({
   data,
   state,
@@ -13,6 +29,11 @@ export function TelemetryPane({
   onRefresh: () => void;
   disabled?: boolean;
 }) {
+  const txPct = data ? formatAirtimePercent(data.airtime_seconds, data.uptime_seconds) : null;
+  const rxPct = data ? formatAirtimePercent(data.rx_airtime_seconds, data.uptime_seconds) : null;
+  const rxPerMin = data ? formatPerMinute(data.packets_received, data.uptime_seconds) : null;
+  const txPerMin = data ? formatPerMinute(data.packets_sent, data.uptime_seconds) : null;
+
   return (
     <RepeaterPane title="Telemetry" state={state} onRefresh={onRefresh} disabled={disabled}>
       {!data ? (
@@ -21,8 +42,24 @@ export function TelemetryPane({
         <div className="space-y-2">
           <KvRow label="Battery" value={`${data.battery_volts.toFixed(3)}V`} />
           <KvRow label="Uptime" value={formatDuration(data.uptime_seconds)} />
-          <KvRow label="TX Airtime" value={formatDuration(data.airtime_seconds)} />
-          <KvRow label="RX Airtime" value={formatDuration(data.rx_airtime_seconds)} />
+          <KvRow
+            label="TX Airtime"
+            value={
+              <>
+                {formatDuration(data.airtime_seconds)}
+                {txPct && <Secondary>({txPct})</Secondary>}
+              </>
+            }
+          />
+          <KvRow
+            label="RX Airtime"
+            value={
+              <>
+                {formatDuration(data.rx_airtime_seconds)}
+                {rxPct && <Secondary>({rxPct})</Secondary>}
+              </>
+            }
+          />
           <Separator className="my-1" />
           <KvRow label="Noise Floor" value={`${data.noise_floor_dbm} dBm`} />
           <KvRow label="Last RSSI" value={`${data.last_rssi_dbm} dBm`} />
@@ -30,7 +67,17 @@ export function TelemetryPane({
           <Separator className="my-1" />
           <KvRow
             label="Packets"
-            value={`${data.packets_received.toLocaleString()} rx / ${data.packets_sent.toLocaleString()} tx`}
+            value={
+              <>
+                {data.packets_received.toLocaleString()} rx / {data.packets_sent.toLocaleString()}{' '}
+                tx
+                {rxPerMin && txPerMin && (
+                  <Secondary>
+                    (avg {rxPerMin} rx/min / {txPerMin} tx/min)
+                  </Secondary>
+                )}
+              </>
+            }
           />
           <KvRow
             label="Flood"
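The two helpers this diff introduces are pure functions, so they can be exercised outside React. This sketch reproduces them as they appear in the diff and shows the derived strings for a sample telemetry snapshot (the numeric inputs are made-up illustration values, not real repeater data):

```typescript
// Reproduced from the TelemetryPane diff above: derive duty-cycle and
// packet-rate display strings from raw telemetry counters.
function formatAirtimePercent(airtimeSec: number, uptimeSec: number): string | null {
  if (uptimeSec <= 0) return null; // guard against divide-by-zero on fresh boots
  return `${((airtimeSec / uptimeSec) * 100).toFixed(2)}%`;
}

function formatPerMinute(count: number, uptimeSec: number): string | null {
  if (uptimeSec <= 0) return null;
  const rate = (count * 60) / uptimeSec;
  // One decimal place below 10/min, whole numbers above, to keep the row compact.
  return rate >= 10 ? rate.toFixed(0) : rate.toFixed(1);
}

console.log(formatAirtimePercent(36, 3600)); // 1.00%
console.log(formatPerMinute(600, 3600));     // 10
console.log(formatPerMinute(90, 3600));      // 1.5
```

The `null` returns are what lets the JSX render the plain duration with no `<Secondary>` suffix when uptime has not been reported yet.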
@@ -15,6 +15,7 @@ import type {
   Contact,
   HealthStatus,
   TelemetryHistoryEntry,
+  TelemetrySchedule,
 } from '../../types';
 
 export function SettingsDatabaseSection({
@@ -54,19 +55,45 @@ export function SettingsDatabaseSection({
   const [discoveryBlockedTypes, setDiscoveryBlockedTypes] = useState<number[]>([]);
   const [bulkDeleteOpen, setBulkDeleteOpen] = useState(false);
 
-  const [busy, setBusy] = useState(false);
-  const [error, setError] = useState<string | null>(null);
-
   const [latestTelemetry, setLatestTelemetry] = useState<
     Record<string, TelemetryHistoryEntry | null>
   >({});
   const telemetryFetchedRef = useRef(false);
 
+  const [schedule, setSchedule] = useState<TelemetrySchedule | null>(null);
+  const [intervalDraft, setIntervalDraft] = useState<number>(appSettings.telemetry_interval_hours);
+
+  // Serialization chain for every auto-persisted control on this page.
+  // Without this, rapid successive toggles (or mixed dropdown + checkbox
+  // interactions) can dispatch overlapping PATCHes that land out of order
+  // on HTTP/2 — a stale write then wins, reverting the user's last click.
+  // Each call awaits the previous one before sending its request, so the
+  // server sees updates in the order the user made them.
+  const saveChainRef = useRef<Promise<void>>(Promise.resolve());
+
   useEffect(() => {
     setAutoDecryptOnAdvert(appSettings.auto_decrypt_dm_on_advert);
     setDiscoveryBlockedTypes(appSettings.discovery_blocked_types ?? []);
+    setIntervalDraft(appSettings.telemetry_interval_hours);
   }, [appSettings]);
 
+  // Re-fetch the scheduler derivation whenever the tracked list changes or
+  // the stored preference changes. Cheap: single GET, no radio lock.
+  useEffect(() => {
+    let cancelled = false;
+    api
+      .getTelemetrySchedule()
+      .then((s) => {
+        if (!cancelled) setSchedule(s);
+      })
+      .catch(() => {
+        // Non-critical: dropdown falls back to the unfiltered menu.
+      });
+    return () => {
+      cancelled = true;
+    };
+  }, [trackedTelemetryRepeaters.length, appSettings.telemetry_interval_hours]);
+
   useEffect(() => {
     if (trackedTelemetryRepeaters.length === 0 || telemetryFetchedRef.current) return;
     telemetryFetchedRef.current = true;
@@ -132,28 +159,26 @@ export function SettingsDatabaseSection({
     }
   };
 
-  const handleSave = async () => {
-    setBusy(true);
-    setError(null);
-
-    try {
-      const update: AppSettingsUpdate = { auto_decrypt_dm_on_advert: autoDecryptOnAdvert };
-      const currentBlocked = appSettings.discovery_blocked_types ?? [];
-      if (
-        discoveryBlockedTypes.length !== currentBlocked.length ||
-        discoveryBlockedTypes.some((t) => !currentBlocked.includes(t))
-      ) {
-        update.discovery_blocked_types = discoveryBlockedTypes;
-      }
-      await onSaveAppSettings(update);
-      toast.success('Database settings saved');
-    } catch (err) {
-      console.error('Failed to save database settings:', err);
-      setError(err instanceof Error ? err.message : 'Failed to save');
-      toast.error('Failed to save settings');
-    } finally {
-      setBusy(false);
-    }
+  /**
+   * Apply an AppSettings PATCH after any already-queued saves finish, and
+   * revert local state if the save fails. Every auto-persist control on
+   * this page routes through here so the user-visible order of clicks is
+   * the order the backend sees, regardless of network reordering.
+   */
+  const persistAppSettings = (update: AppSettingsUpdate, revert: () => void): Promise<void> => {
+    const chained = saveChainRef.current.then(async () => {
+      try {
+        await onSaveAppSettings(update);
+      } catch (err) {
+        console.error('Failed to save database settings:', err);
+        revert();
+        toast.error('Failed to save setting', {
+          description: err instanceof Error ? err.message : 'Unknown error',
+        });
+      }
+    });
+    saveChainRef.current = chained;
+    return chained;
   };
 
   return (
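The `saveChainRef` pattern above is independent of React, so its ordering guarantee can be demonstrated in isolation. This is a minimal sketch under assumed names: `sendPatch` stands in for `onSaveAppSettings`, and the decreasing timeouts simulate a transport where later requests would otherwise finish first:

```typescript
const applied: number[] = [];
let chain: Promise<void> = Promise.resolve();

// Stand-in for onSaveAppSettings. The first call has the longest latency,
// so without serialization the patches would land in reverse order.
async function sendPatch(value: number): Promise<void> {
  await new Promise((r) => setTimeout(r, 30 - value * 10));
  applied.push(value);
}

function persist(value: number, revert: () => void): Promise<void> {
  const chained = chain.then(async () => {
    try {
      await sendPatch(value);
    } catch {
      revert(); // roll the optimistic UI state back on failure
    }
  });
  chain = chained; // the next persist() call waits on this one
  return chained;
}

async function demo(): Promise<number[]> {
  void persist(1, () => {});
  void persist(2, () => {});
  await persist(3, () => {});
  return applied; // arrives as [1, 2, 3] despite inverted latencies
}
```

Note the catch also swallows the error after reverting, which is what keeps one failed save from poisoning the rest of the chain.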
@@ -249,7 +274,14 @@ export function SettingsDatabaseSection({
             <input
               type="checkbox"
               checked={autoDecryptOnAdvert}
-              onChange={(e) => setAutoDecryptOnAdvert(e.target.checked)}
+              onChange={(e) => {
+                const next = e.target.checked;
+                const prev = autoDecryptOnAdvert;
+                setAutoDecryptOnAdvert(next);
+                void persistAppSettings({ auto_decrypt_dm_on_advert: next }, () =>
+                  setAutoDecryptOnAdvert(prev)
+                );
+              }}
               className="w-4 h-4 rounded border-input accent-primary"
             />
             <span className="text-sm">Auto-decrypt historical DMs when new contact advertises</span>
@@ -266,10 +298,61 @@ export function SettingsDatabaseSection({
         <div className="space-y-3">
           <Label className="text-base">Tracked Repeater Telemetry</Label>
           <p className="text-xs text-muted-foreground">
-            Repeaters opted into automatic telemetry collection are polled every 8 hours. Up to 8
-            repeaters may be tracked at a time ({trackedTelemetryRepeaters.length} / 8 slots used).
+            Repeaters opted into automatic telemetry collection are polled on a scheduled interval. To
+            limit mesh traffic, the app caps telemetry at 24 checks per day across all tracked
+            repeaters — so fewer tracked repeaters allows shorter intervals, and more tracked
+            repeaters forces longer ones. Up to {schedule?.max_tracked ?? 8} repeaters may be tracked
+            at once ({trackedTelemetryRepeaters.length} / {schedule?.max_tracked ?? 8} slots used).
           </p>
+
+          {/* Interval picker. Legal options depend on current tracked count;
+              we list only those. If the saved preference is no longer legal,
+              the effective interval is shown below so the user knows what the
+              scheduler is actually using. */}
+          <div className="space-y-1.5">
+            <Label htmlFor="telemetry-interval" className="text-sm">
+              Collection interval
+            </Label>
+            <div className="flex items-center gap-2">
+              <select
+                id="telemetry-interval"
+                value={intervalDraft}
+                onChange={(e) => {
+                  const nextValue = Number(e.target.value);
+                  if (!Number.isFinite(nextValue) || nextValue === intervalDraft) return;
+                  const prevValue = intervalDraft;
+                  setIntervalDraft(nextValue);
+                  void persistAppSettings({ telemetry_interval_hours: nextValue }, () =>
+                    setIntervalDraft(prevValue)
+                  );
+                }}
+                className="h-9 px-3 rounded-md border border-input bg-background text-sm ring-offset-background focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2"
+              >
+                {(schedule?.options ?? [1, 2, 3, 4, 6, 8, 12, 24]).map((hrs) => (
+                  <option key={hrs} value={hrs}>
+                    Every {hrs} hour{hrs === 1 ? '' : 's'} ({Math.floor(24 / hrs)} check
+                    {Math.floor(24 / hrs) === 1 ? '' : 's'}/day)
+                  </option>
+                ))}
+              </select>
+            </div>
+            {schedule && schedule.effective_hours !== schedule.preferred_hours && (
+              <p className="text-xs text-warning">
+                Saved preference is {schedule.preferred_hours} hour
+                {schedule.preferred_hours === 1 ? '' : 's'}, but the scheduler is using{' '}
+                {schedule.effective_hours} hours because {schedule.tracked_count} repeater
+                {schedule.tracked_count === 1 ? '' : 's'}{' '}
+                {schedule.tracked_count === 1 ? 'is' : 'are'} tracked. Your preference will be
+                restored if you drop back to a supported count.
+              </p>
+            )}
+            {schedule?.next_run_at != null && (
+              <p className="text-xs text-muted-foreground">
+                Next run at {formatTime(schedule.next_run_at)} (UTC top of hour).
+              </p>
+            )}
+          </div>
+
           {trackedTelemetryRepeaters.length === 0 ? (
             <p className="text-sm text-muted-foreground italic">
               No repeaters are being tracked. Enable tracking from a repeater's dashboard.
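The new help text states the constraint driving the interval dropdown: at most 24 telemetry checks per day across all tracked repeaters. The real derivation lives server-side (the `TelemetrySchedule` the component fetches), so the following is only a hypothetical sketch of that rule; `legalIntervals` and `ALL_INTERVALS` are illustrative names, not the backend's:

```typescript
// Hypothetical sketch of the cap described above: an interval of H hours
// yields floor(24 / H) runs per day, and each run polls every tracked
// repeater, so H is legal when (24 / H) * trackedCount <= 24. This is why
// more tracked repeaters force longer intervals.
const ALL_INTERVALS = [1, 2, 3, 4, 6, 8, 12, 24]; // hours; all divide 24 evenly

function legalIntervals(trackedCount: number): number[] {
  return ALL_INTERVALS.filter((h) => (24 / h) * trackedCount <= 24);
}

console.log(legalIntervals(1)); // [1, 2, 3, 4, 6, 8, 12, 24] — every option
console.log(legalIntervals(8)); // [8, 12, 24] — only the long intervals remain
```

Under this rule the numbers in the UI are self-consistent: 8 tracked repeaters at the 8-hour interval is exactly 8 × 3 = 24 checks per day, the stated maximum.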
@@ -341,16 +424,6 @@ export function SettingsDatabaseSection({
           )}
         </div>
 
-        {error && (
-          <div className="text-sm text-destructive" role="alert">
-            {error}
-          </div>
-        )}
-
-        <Button onClick={handleSave} disabled={busy} className="w-full">
-          {busy ? 'Saving...' : 'Save Settings'}
-        </Button>
-
         <Separator />
 
         {/* ── Contact Management ── */}
@@ -380,11 +453,14 @@ export function SettingsDatabaseSection({
                     <input
                       type="checkbox"
                       checked={checked}
-                      onChange={() =>
-                        setDiscoveryBlockedTypes((prev) =>
-                          checked ? prev.filter((t) => t !== typeCode) : [...prev, typeCode]
-                        )
-                      }
+                      onChange={() => {
+                        const prev = discoveryBlockedTypes;
+                        const next = checked ? prev.filter((t) => t !== typeCode) : [...prev, typeCode];
+                        setDiscoveryBlockedTypes(next);
+                        void persistAppSettings({ discovery_blocked_types: next }, () =>
+                          setDiscoveryBlockedTypes(prev)
+                        );
+                      }}
                       className="rounded border-input"
                     />
                     {label}
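The checkbox handler above relies on an immutable add/remove toggle so React sees a new array reference (and so the previous array survives intact for the revert callback). A generic sketch of that toggle, with `toggleType` as an illustrative name:

```typescript
// Remove the entry if present, append it otherwise — always returning a
// fresh array rather than mutating the previous state in place.
function toggleType(prev: number[], typeCode: number): number[] {
  return prev.includes(typeCode)
    ? prev.filter((t) => t !== typeCode)
    : [...prev, typeCode];
}

console.log(toggleType([1, 2], 2)); // [1]
console.log(toggleType([1], 3));    // [1, 3]
```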
@@ -325,7 +325,7 @@ const CREATE_INTEGRATION_DEFINITIONS: readonly CreateIntegrationDefinition[] = [
     label: 'Map Upload',
     section: 'Community Sharing',
     description:
-      'Upload repeaters and room servers to map.meshcore.dev or a compatible map API endpoint.',
+      'Upload repeaters and room servers to map.meshcore.io or a compatible map API endpoint.',
     defaultName: 'Map Upload',
     nameMode: 'counted',
     defaults: {
@@ -1668,12 +1668,12 @@ function MapUploadConfigEditor({
       <p className="text-xs text-muted-foreground">
         Automatically upload heard repeater and room server advertisements to{' '}
         <a
-          href="https://map.meshcore.dev"
+          href="https://map.meshcore.io"
           target="_blank"
           rel="noopener noreferrer"
           className="underline hover:text-foreground"
         >
-          map.meshcore.dev
+          map.meshcore.io
         </a>
         . Requires the radio's private key to be available (firmware must have{' '}
         <code>ENABLE_PRIVATE_KEY_EXPORT=1</code>). Only raw RF packets are shared — never
@@ -1710,12 +1710,12 @@ function MapUploadConfigEditor({
       <Input
         id="fanout-map-api-url"
         type="url"
-        placeholder="https://map.meshcore.dev/api/v1/uploader/node"
+        placeholder="https://map.meshcore.io/api/v1/uploader/node"
         value={(config.api_url as string) || ''}
        onChange={(e) => onChange({ ...config, api_url: e.target.value })}
       />
       <p className="text-xs text-muted-foreground">
-        Leave blank to use the default <code>map.meshcore.dev</code> endpoint.
+        Leave blank to use the default <code>map.meshcore.io</code> endpoint.
       </p>
     </div>
 
@@ -1811,6 +1811,162 @@ function getFilterKeys(filter: unknown): string[] {
   return [];
 }
 
+const MAX_SCOPE_PILL_DISPLAY = 32;
+
+interface PillsSearchListItem {
+  key: string;
+  label: string;
+  /** Optional trailing monospace hint (e.g. pubkey prefix) */
+  trailing?: string;
+}
+
+/**
+ * Search-and-pills picker for the generic fanout scope selector.
+ * Shows selected items as removable pills (up to MAX_SCOPE_PILL_DISPLAY),
+ * a search input, and a scrollable list of filtered items with checkboxes.
+ * When more than MAX_SCOPE_PILL_DISPLAY items are selected, the pill row
+ * collapses to a single informational badge to keep the interface clean.
+ */
+function PillsSearchList({
+  label,
+  labelSuffix,
+  items,
+  selectedKeys,
+  onToggle,
+  onAll,
+  onNone,
+  searchPlaceholder,
+  emptyItemsMessage,
+}: {
+  label: string;
+  labelSuffix: string;
+  items: PillsSearchListItem[];
+  selectedKeys: string[];
+  onToggle: (key: string) => void;
+  onAll: () => void;
+  onNone: () => void;
+  searchPlaceholder: string;
+  emptyItemsMessage: string;
+}) {
+  const [search, setSearch] = useState('');
+  const searchLower = search.toLowerCase().trim();
+
+  const filtered = useMemo(() => {
+    const matches = items.filter((it) => {
+      if (!searchLower) return true;
+      return (
+        it.label.toLowerCase().includes(searchLower) || it.key.toLowerCase().startsWith(searchLower)
+      );
+    });
+    // Selected items sort to top (mirrors the Home Assistant tracked-contacts picker)
+    return matches.sort((a, b) => {
+      const aSel = selectedKeys.includes(a.key) ? 0 : 1;
+      const bSel = selectedKeys.includes(b.key) ? 0 : 1;
+      if (aSel !== bSel) return aSel - bSel;
+      return a.label.localeCompare(b.label);
+    });
+  }, [items, searchLower, selectedKeys]);
+
+  const selectedDetails = useMemo(
+    () => items.filter((it) => selectedKeys.includes(it.key)),
+    [items, selectedKeys]
+  );
+  const overPillLimit = selectedDetails.length > MAX_SCOPE_PILL_DISPLAY;
+
+  return (
+    <div className="space-y-1">
+      <div className="flex items-center justify-between">
+        <Label className="text-xs">
+          {label} <span className="text-muted-foreground font-normal">({labelSuffix})</span>
+        </Label>
+        <span className="flex gap-1">
+          <button
+            type="button"
+            className="text-xs text-muted-foreground hover:text-foreground transition-colors"
+            onClick={onAll}
+          >
+            All
+          </button>
+          <span className="text-xs text-muted-foreground">/</span>
+          <button
+            type="button"
+            className="text-xs text-muted-foreground hover:text-foreground transition-colors"
+            onClick={onNone}
+          >
+            None
+          </button>
+        </span>
+      </div>
+
+      {selectedDetails.length > 0 && (
+        <div className="flex flex-wrap gap-1.5">
+          {overPillLimit ? (
+            <span className="inline-flex items-center text-[0.6875rem] px-2 py-0.5 rounded-full bg-muted text-muted-foreground">
+              >{MAX_SCOPE_PILL_DISPLAY} selections made; hiding selection preview to keep the
+              interface clean
+            </span>
+          ) : (
+            selectedDetails.map((it) => (
+              <span
+                key={it.key}
+                className="inline-flex items-center gap-1 text-[0.6875rem] px-2 py-0.5 rounded-full bg-primary/10 text-primary"
+              >
+                {it.label}
+                <button
+                  type="button"
+                  className="ml-0.5 hover:text-destructive transition-colors"
+                  onClick={() => onToggle(it.key)}
+                  aria-label={`Remove ${it.label}`}
+                >
+                  ×
+                </button>
+              </span>
+            ))
+          )}
+        </div>
+      )}
+
+      {items.length === 0 ? (
+        <p className="text-xs text-muted-foreground italic">{emptyItemsMessage}</p>
+      ) : (
+        <>
+          <Input
+            type="text"
+            placeholder={searchPlaceholder}
+            value={search}
+            onChange={(e) => setSearch(e.target.value)}
+            className="h-8 text-sm"
+          />
+          <div className="max-h-48 overflow-y-auto space-y-1 rounded border border-border p-2">
+            {filtered.length === 0 ? (
+              <p className="text-xs text-muted-foreground italic py-1">
+                No {label.toLowerCase()} match “{search}”
+              </p>
+            ) : (
+              filtered.map((it) => (
+                <label key={it.key} className="flex items-center gap-2 cursor-pointer text-sm">
+                  <input
+                    type="checkbox"
+                    checked={selectedKeys.includes(it.key)}
+                    onChange={() => onToggle(it.key)}
+                    className="h-3.5 w-3.5 rounded border-input accent-primary"
+                  />
+                  <span className="truncate">{it.label}</span>
+                  {it.trailing && (
+                    <span className="text-[0.625rem] text-muted-foreground ml-auto font-mono shrink-0">
+                      {it.trailing}
+                    </span>
+                  )}
+                </label>
+              ))
+            )}
+          </div>
+        </>
+      )}
+    </div>
+  );
+}
+
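The filtering and ordering logic inside `PillsSearchList`'s `useMemo` is self-contained, so it can be sketched without React. This mirrors the component's comparator (selected items float to the top, alphabetical within each group) with an assumed standalone function name, `filterAndSort`:

```typescript
interface Item {
  key: string;
  label: string;
}

// Mirrors the useMemo above: substring match on label or prefix match on
// key, then selected-first ordering with localeCompare as the tiebreaker.
function filterAndSort(items: Item[], searchLower: string, selectedKeys: string[]): Item[] {
  const matches = items.filter(
    (it) =>
      !searchLower ||
      it.label.toLowerCase().includes(searchLower) ||
      it.key.toLowerCase().startsWith(searchLower)
  );
  return matches.sort((a, b) => {
    const aSel = selectedKeys.includes(a.key) ? 0 : 1;
    const bSel = selectedKeys.includes(b.key) ? 0 : 1;
    if (aSel !== bSel) return aSel - bSel; // selected items sort to top
    return a.label.localeCompare(b.label);
  });
}

const items: Item[] = [
  { key: 'c1', label: 'Zulu' },
  { key: 'c2', label: 'Alpha' },
  { key: 'c3', label: 'Mike' },
];
console.log(filterAndSort(items, '', ['c1']).map((i) => i.label)); // ['Zulu', 'Alpha', 'Mike']
```

Sorting the filtered copy (rather than `items`) keeps the source list untouched, which is what lets the pill row and checkbox list disagree about order without fighting each other.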
 function ScopeSelector({
   scope,
   onChange,
@@ -1920,9 +2076,6 @@ function ScopeSelector({
       selectedContacts.length >= filteredContacts.length);
   const showEmptyScopeWarning = messagesEffectivelyNone && !rawEnabled;
 
-  const isChannelChecked = (key: string) => selectedChannels.includes(key);
-  const isContactChecked = (key: string) => selectedContacts.includes(key);
-
   const listHint =
     mode === 'only'
       ? 'Newly added channels or contacts will not be automatically included.'
@@ -1976,107 +2129,51 @@ function ScopeSelector({
       <p className="text-xs text-muted-foreground">{listHint}</p>
 
       {channels.length > 0 && (
-        <div className="space-y-1">
-          <div className="flex items-center justify-between">
-            <Label className="text-xs">
-              Channels{' '}
-              <span className="text-muted-foreground font-normal">({checkboxLabel})</span>
-            </Label>
-            <span className="flex gap-1">
-              <button
-                type="button"
-                className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-                onClick={() =>
-                  onChange({
-                    ...scope,
-                    messages: buildMessages(
-                      channels.map((ch) => ch.key),
-                      selectedContacts
-                    ),
-                  })
-                }
-              >
-                All
-              </button>
-              <span className="text-xs text-muted-foreground">/</span>
-              <button
-                type="button"
-                className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-                onClick={() =>
-                  onChange({ ...scope, messages: buildMessages([], selectedContacts) })
-                }
-              >
-                None
-              </button>
-            </span>
-          </div>
-          <div className="max-h-32 overflow-y-auto border border-input rounded-md p-2 space-y-1">
-            {channels.map((ch) => (
-              <label key={ch.key} className="flex items-center gap-2 cursor-pointer">
-                <input
-                  type="checkbox"
-                  checked={isChannelChecked(ch.key)}
-                  onChange={() => toggleChannel(ch.key)}
-                  className="h-3.5 w-3.5 rounded border-input accent-primary"
-                />
-                <span className="text-sm truncate">{ch.name}</span>
-              </label>
-            ))}
-          </div>
-        </div>
+        <PillsSearchList
+          label="Channels"
+          labelSuffix={checkboxLabel}
+          items={channels.map((ch) => ({ key: ch.key, label: ch.name }))}
+          selectedKeys={selectedChannels}
+          onToggle={toggleChannel}
+          onAll={() =>
+            onChange({
+              ...scope,
+              messages: buildMessages(
+                channels.map((ch) => ch.key),
+                selectedContacts
+              ),
+            })
+          }
+          onNone={() => onChange({ ...scope, messages: buildMessages([], selectedContacts) })}
+          searchPlaceholder={`Search ${channels.length} channel${channels.length === 1 ? '' : 's'}...`}
+          emptyItemsMessage="No channels available."
+        />
       )}
 
       {filteredContacts.length > 0 && (
-        <div className="space-y-1">
-          <div className="flex items-center justify-between">
-            <Label className="text-xs">
-              Contacts{' '}
-              <span className="text-muted-foreground font-normal">({checkboxLabel})</span>
-            </Label>
-            <span className="flex gap-1">
-              <button
-                type="button"
-                className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-                onClick={() =>
-                  onChange({
-                    ...scope,
-                    messages: buildMessages(
-                      selectedChannels,
-                      filteredContacts.map((c) => c.public_key)
-                    ),
-                  })
-                }
-              >
-                All
-              </button>
-              <span className="text-xs text-muted-foreground">/</span>
-              <button
-                type="button"
-                className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-                onClick={() =>
-                  onChange({ ...scope, messages: buildMessages(selectedChannels, []) })
-                }
-              >
-                None
-              </button>
-            </span>
-          </div>
-          <div className="max-h-32 overflow-y-auto border border-input rounded-md p-2 space-y-1">
-            {filteredContacts.map((c) => (
-              <label key={c.public_key} className="flex items-center gap-2 cursor-pointer">
-                <input
-                  type="checkbox"
-                  checked={isContactChecked(c.public_key)}
-                  onChange={() => toggleContact(c.public_key)}
-                  className="h-3.5 w-3.5 rounded border-input accent-primary"
-                />
-                <span className="text-sm truncate">
-                  {c.name || c.public_key.substring(0, 12) + '...'}
-                </span>
-              </label>
-            ))}
-          </div>
-        </div>
+        <PillsSearchList
+          label="Contacts"
+          labelSuffix={checkboxLabel}
+          items={filteredContacts.map((c) => ({
+            key: c.public_key,
+            label: c.name || c.public_key.slice(0, 12),
+            trailing: c.public_key.slice(0, 12),
+          }))}
+          selectedKeys={selectedContacts}
+          onToggle={toggleContact}
+          onAll={() =>
+            onChange({
+              ...scope,
+              messages: buildMessages(
+                selectedChannels,
+                filteredContacts.map((c) => c.public_key)
+              ),
+            })
+          }
+          onNone={() => onChange({ ...scope, messages: buildMessages(selectedChannels, []) })}
+          searchPlaceholder={`Search ${filteredContacts.length} contact${filteredContacts.length === 1 ? '' : 's'}...`}
+          emptyItemsMessage="No contacts available."
+        />
       )}
     </>
   )}
@@ -1,5 +1,9 @@
-import { useState } from 'react';
-import { ChevronRight, Logs, MessageSquare, Send, Settings } from 'lucide-react';
+import { useState, useEffect } from 'react';
+import { ChevronRight, Logs, MessageSquare, Send, Settings, X } from 'lucide-react';
+import { toast } from '../ui/sonner';
+import { usePush } from '../../contexts/PushSubscriptionContext';
+import type { Channel, Contact } from '../../types';
+import { getContactDisplayName } from '../../utils/pubkey';
 import { Button } from '../ui/button';
 import { Input } from '../ui/input';
 import { Label } from '../ui/label';
@@ -35,30 +39,198 @@ import {
   getShowBatteryVoltage,
   setShowBatteryVoltage as saveBatteryVoltage,
 } from '../../utils/batteryDisplay';
+import {
+  STATUS_DOT_PULSE_CHANGE_EVENT,
+  getStatusDotPulseEnabled,
+  setStatusDotPulseEnabled as saveStatusDotPulse,
+} from '../../utils/statusDotPulse';
+
+/** Resolve a state key like "contact-abc123" or "channel-def456" to a display name. */
+function resolveConversationName(
+  stateKey: string,
+  contacts: Contact[],
+  channels: Channel[]
+): string {
+  if (stateKey.startsWith('contact-')) {
+    const pubkey = stateKey.slice('contact-'.length);
+    const contact = contacts.find((c) => c.public_key === pubkey);
+    return contact ? getContactDisplayName(contact.name, contact.public_key) : pubkey.slice(0, 12);
+  }
+  if (stateKey.startsWith('channel-')) {
+    const key = stateKey.slice('channel-'.length);
+    const channel = channels.find((c) => c.key === key);
+    if (channel?.name) return channel.name.startsWith('#') ? channel.name : `#${channel.name}`;
+    return `#${key.slice(0, 12)}`;
+  }
+  return stateKey;
+}
+
+function PushDeviceManagement({
+  contacts = [],
+  channels = [],
+}: {
+  contacts?: Contact[];
+  channels?: Channel[];
+}) {
+  const {
+    isSupported,
+    allSubscriptions,
+    pushConversations,
+    loading,
+    subscribe,
+    currentSubscriptionId,
+    toggleConversation,
+    deleteSubscription,
+    testPush,
+    refreshSubscriptions,
+  } = usePush();
+
+  useEffect(() => {
+    refreshSubscriptions();
+  }, [refreshSubscriptions]);
+
+  if (!isSupported) {
+    return (
+      <div className="space-y-3">
+        <Label>Web Push Notifications</Label>
+        <p className="text-sm text-muted-foreground">
+          {window.isSecureContext
+            ? 'Push notifications are not supported by this browser.'
+            : 'Web Push requires HTTPS. Access RemoteTerm over HTTPS (self-signed certificates work) to enable push notifications.'}
+        </p>
+      </div>
+    );
+  }
+
+  return (
+    <div className="space-y-4">
+      <div className="space-y-1">
+        <Label>Web Push Notifications</Label>
+        <p className="text-sm text-muted-foreground">
+          Receive notifications even when the browser is closed. Use the bell icon in any
+          conversation header to enable push for that contact or channel, or subscribe this browser
+          to receive notifications for all push-enabled conversations.
+        </p>
+        <p className="text-sm text-muted-foreground">
+          The set of channels or DMs that trigger push notifications are global per-install (i.e.
+          all devices that register for Web Push will have the same set of channels/DMs that trigger
+          notifications). Subscribing or unsubscribing a particular browser only controls whether
+          that browser receives notifications for the configured set of channels/DMs.
+        </p>
+      </div>
+
+      {!currentSubscriptionId && (
+        <Button variant="outline" size="sm" onClick={() => void subscribe()} disabled={loading}>
+          {loading ? 'Subscribing...' : 'Subscribe This Browser'}
+        </Button>
+      )}
+
+      {pushConversations.length > 0 && (
+        <div className="space-y-2">
+          <span className="text-[0.625rem] uppercase tracking-wider text-muted-foreground font-medium">
+            Push-enabled conversations
+          </span>
+          <div className="flex flex-wrap gap-1.5">
+            {pushConversations.map((key) => (
+              <span
+                key={key}
+                className="inline-flex items-center gap-1 rounded-full bg-muted px-2.5 py-1 text-sm"
+              >
+                {resolveConversationName(key, contacts, channels)}
+                <button
+                  type="button"
+                  onClick={() => void toggleConversation(key)}
+                  className="rounded-full p-0.5 hover:bg-accent transition-colors"
+                  title="Remove"
+                  aria-label={`Remove ${resolveConversationName(key, contacts, channels)} from push`}
|
||||||
|
>
|
||||||
|
<X className="h-3.5 w-3.5" />
|
||||||
|
</button>
|
||||||
|
</span>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{allSubscriptions.length > 0 && (
|
||||||
|
<div className="space-y-2">
|
||||||
|
<span className="text-[0.625rem] uppercase tracking-wider text-muted-foreground font-medium">
|
||||||
|
Registered Devices
|
||||||
|
</span>
|
||||||
|
<div className="mt-2 space-y-2">
|
||||||
|
{allSubscriptions.map((sub) => (
|
||||||
|
<div
|
||||||
|
key={sub.id}
|
||||||
|
className="flex items-center justify-between gap-3 rounded-md border border-border px-3 py-2"
|
||||||
|
>
|
||||||
|
<div className="min-w-0 flex-1">
|
||||||
|
<div className="flex items-center gap-2 overflow-hidden">
|
||||||
|
<span className="truncate text-sm font-medium">
|
||||||
|
{sub.label || 'Unknown device'}
|
||||||
|
</span>
|
||||||
|
{sub.id === currentSubscriptionId && (
|
||||||
|
<span className="shrink-0 rounded bg-primary/10 px-1.5 py-0.5 text-[0.625rem] font-medium text-primary">
|
||||||
|
Current device
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
<span className="text-xs text-muted-foreground">
|
||||||
|
{sub.last_success_at
|
||||||
|
? `Last push: ${new Date(sub.last_success_at * 1000).toLocaleDateString()}`
|
||||||
|
: 'Never pushed'}
|
||||||
|
{sub.failure_count > 0 && ` · ${sub.failure_count} failures`}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<div className="flex gap-1">
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
size="sm"
|
||||||
|
className="h-8 text-sm"
|
||||||
|
onClick={() => void testPush(sub.id)}
|
||||||
|
>
|
||||||
|
Test
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
size="sm"
|
||||||
|
className="h-8 text-sm text-destructive hover:text-destructive"
|
||||||
|
onClick={() => {
|
||||||
|
void deleteSubscription(sub.id).then(() => toast.success('Device removed'));
|
||||||
|
}}
|
||||||
|
>
|
||||||
|
Unsubscribe this device
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
export function SettingsLocalSection({
|
export function SettingsLocalSection({
|
||||||
onLocalLabelChange,
|
onLocalLabelChange,
|
||||||
|
contacts,
|
||||||
|
channels,
|
||||||
className,
|
className,
|
||||||
}: {
|
}: {
|
||||||
onLocalLabelChange?: (label: LocalLabel) => void;
|
onLocalLabelChange?: (label: LocalLabel) => void;
|
||||||
|
contacts?: Contact[];
|
||||||
|
channels?: Channel[];
|
||||||
className?: string;
|
className?: string;
|
||||||
}) {
|
}) {
|
||||||
const { distanceUnit, setDistanceUnit } = useDistanceUnit();
|
const { distanceUnit, setDistanceUnit } = useDistanceUnit();
|
||||||
const [reopenLastConversation, setReopenLastConversation] = useState(
|
const [reopenLastConversation, setReopenLastConversation] = useState(
|
||||||
getReopenLastConversationEnabled
|
getReopenLastConversationEnabled
|
||||||
);
|
);
|
||||||
const [darkMap, setDarkMap] = useState(() => {
|
|
||||||
try {
|
|
||||||
return localStorage.getItem('remoteterm-dark-map') === 'true';
|
|
||||||
} catch {
|
|
||||||
return false;
|
|
||||||
}
|
|
||||||
});
|
|
||||||
const [localLabelText, setLocalLabelText] = useState(() => getLocalLabel().text);
|
const [localLabelText, setLocalLabelText] = useState(() => getLocalLabel().text);
|
||||||
const [localLabelColor, setLocalLabelColor] = useState(() => getLocalLabel().color);
|
const [localLabelColor, setLocalLabelColor] = useState(() => getLocalLabel().color);
|
||||||
const [autoFocusInput, setAutoFocusInput] = useState(getAutoFocusInputEnabled);
|
const [autoFocusInput, setAutoFocusInput] = useState(getAutoFocusInputEnabled);
|
||||||
const [batteryPercent, setBatteryPercent] = useState(getShowBatteryPercent);
|
const [batteryPercent, setBatteryPercent] = useState(getShowBatteryPercent);
|
||||||
const [batteryVoltage, setBatteryVoltage] = useState(getShowBatteryVoltage);
|
const [batteryVoltage, setBatteryVoltage] = useState(getShowBatteryVoltage);
|
||||||
|
const [statusDotPulse, setStatusDotPulse] = useState(getStatusDotPulseEnabled);
|
||||||
const [fontScale, setFontScale] = useState(getSavedFontScale);
|
const [fontScale, setFontScale] = useState(getSavedFontScale);
|
||||||
const [fontScaleSlider, setFontScaleSlider] = useState(getSavedFontScale);
|
const [fontScaleSlider, setFontScaleSlider] = useState(getSavedFontScale);
|
||||||
const [fontScaleInput, setFontScaleInput] = useState(() => String(getSavedFontScale()));
|
const [fontScaleInput, setFontScaleInput] = useState(() => String(getSavedFontScale()));
|
||||||
@@ -178,24 +350,6 @@ export function SettingsLocalSection({
|
|||||||
<span className="text-sm">Reopen to last viewed channel/conversation</span>
|
<span className="text-sm">Reopen to last viewed channel/conversation</span>
|
||||||
</label>
|
</label>
|
||||||
|
|
||||||
<label className="flex items-center gap-3 cursor-pointer">
|
|
||||||
<input
|
|
||||||
type="checkbox"
|
|
||||||
checked={darkMap}
|
|
||||||
onChange={(e) => {
|
|
||||||
const v = e.target.checked;
|
|
||||||
setDarkMap(v);
|
|
||||||
try {
|
|
||||||
localStorage.setItem('remoteterm-dark-map', String(v));
|
|
||||||
} catch {
|
|
||||||
// localStorage may be disabled
|
|
||||||
}
|
|
||||||
}}
|
|
||||||
className="w-4 h-4 rounded border-input accent-primary"
|
|
||||||
/>
|
|
||||||
<span className="text-sm">Dark mode map tiles</span>
|
|
||||||
</label>
|
|
||||||
|
|
||||||
<label className="flex items-center gap-3 cursor-pointer">
|
<label className="flex items-center gap-3 cursor-pointer">
|
||||||
<input
|
<input
|
||||||
type="checkbox"
|
type="checkbox"
|
||||||
@@ -247,6 +401,24 @@ export function SettingsLocalSection({
|
|||||||
</p>
|
</p>
|
||||||
)}
|
)}
|
||||||
|
|
||||||
|
<label className="flex items-center gap-3 cursor-pointer">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
checked={statusDotPulse}
|
||||||
|
onChange={(e) => {
|
||||||
|
const v = e.target.checked;
|
||||||
|
setStatusDotPulse(v);
|
||||||
|
saveStatusDotPulse(v);
|
||||||
|
window.dispatchEvent(new Event(STATUS_DOT_PULSE_CHANGE_EVENT));
|
||||||
|
}}
|
||||||
|
className="w-4 h-4 rounded border-input accent-primary"
|
||||||
|
/>
|
||||||
|
<span className="text-sm">
|
||||||
|
Glitter status dot as packets arrive (blue = channel, purple = DM, cyan = advert, dark
|
||||||
|
green = other)
|
||||||
|
</span>
|
||||||
|
</label>
|
||||||
|
|
||||||
<div className="space-y-3">
|
<div className="space-y-3">
|
||||||
<Label htmlFor="font-scale-input">Relative Font Size</Label>
|
<Label htmlFor="font-scale-input">Relative Font Size</Label>
|
||||||
<div className="flex flex-col gap-3 sm:flex-row sm:items-center">
|
<div className="flex flex-col gap-3 sm:flex-row sm:items-center">
|
||||||
@@ -324,6 +496,10 @@ export function SettingsLocalSection({
|
|||||||
</p>
|
</p>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
<Separator />
|
||||||
|
|
||||||
|
<PushDeviceManagement contacts={contacts} channels={channels} />
|
||||||
</div>
|
</div>
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|||||||
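The state-key convention above ("contact-" or "channel-" prefix plus an identifier) can be exercised in isolation. This is a hedged standalone sketch, not the component's code: `Contact`/`Channel` are pared down to the fields the lookup needs, and the unknown-contact fallback is simplified to the raw name (the real component goes through `getContactDisplayName`).

```typescript
// Simplified shapes for illustration only.
interface Contact {
  public_key: string;
  name: string;
}
interface Channel {
  key: string;
  name?: string;
}

function resolveName(stateKey: string, contacts: Contact[], channels: Channel[]): string {
  if (stateKey.startsWith('contact-')) {
    const pubkey = stateKey.slice('contact-'.length);
    const contact = contacts.find((c) => c.public_key === pubkey);
    // Fall back to a truncated pubkey when the contact is unknown.
    return contact ? contact.name : pubkey.slice(0, 12);
  }
  if (stateKey.startsWith('channel-')) {
    const key = stateKey.slice('channel-'.length);
    const channel = channels.find((c) => c.key === key);
    // Normalize channel names to a leading '#'.
    if (channel?.name) return channel.name.startsWith('#') ? channel.name : `#${channel.name}`;
    return `#${key.slice(0, 12)}`;
  }
  // Unrecognized keys pass through unchanged.
  return stateKey;
}

const contacts: Contact[] = [{ public_key: 'abc123', name: 'Alice' }];
const channels: Channel[] = [{ key: 'def456', name: 'Esperance' }];
const resolved = resolveName('channel-def456', contacts, channels);
```

An unknown key such as `misc-key` resolves to itself, which is why the settings chips can always render something even for stale entries.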
@@ -56,15 +56,68 @@ interface SheetContentProps
   hideCloseButton?: boolean;
 }
+
+// Safe-area insets for each sheet side. Sheets are position:fixed and escape
+// body padding, so without this they render under the iOS status bar/home
+// indicator when the app is installed as a PWA.
+//
+// NOTE: these inline styles override the matching sides of the `p-6` default
+// in sheetVariants. All current consumers pass `p-0`; future sheets that want
+// the default padding should compose explicit per-side padding in their own
+// className rather than relying on the `p-6` shorthand being preserved.
+type SheetSide = Exclude<VariantProps<typeof sheetVariants>['side'], null | undefined>;
+
+const sheetSafeAreaStyles: Record<SheetSide, React.CSSProperties> = {
+  top: {
+    paddingTop: 'var(--safe-area-top)',
+    paddingLeft: 'var(--safe-area-left)',
+    paddingRight: 'var(--safe-area-right)',
+  },
+  bottom: {
+    paddingBottom: 'var(--safe-area-bottom)',
+    paddingLeft: 'var(--safe-area-left)',
+    paddingRight: 'var(--safe-area-right)',
+  },
+  left: {
+    paddingTop: 'var(--safe-area-top)',
+    paddingLeft: 'var(--safe-area-left)',
+    paddingBottom: 'var(--safe-area-bottom)',
+  },
+  right: {
+    paddingTop: 'var(--safe-area-top)',
+    paddingRight: 'var(--safe-area-right)',
+    paddingBottom: 'var(--safe-area-bottom)',
+  },
+};
+
 const SheetContent = React.forwardRef<
   React.ElementRef<typeof SheetPrimitive.Content>,
   SheetContentProps
->(({ side = 'right', className, children, hideCloseButton = false, ...props }, ref) => (
+>(({ side = 'right', className, children, hideCloseButton = false, style, ...props }, ref) => (
   <SheetPortal>
     <SheetOverlay />
-    <SheetPrimitive.Content ref={ref} className={cn(sheetVariants({ side }), className)} {...props}>
+    <SheetPrimitive.Content
+      ref={ref}
+      className={cn(sheetVariants({ side }), className)}
+      style={{ ...sheetSafeAreaStyles[side as SheetSide], ...style }}
+      {...props}
+    >
       {!hideCloseButton && (
-        <SheetPrimitive.Close className="absolute right-4 top-4 rounded-sm opacity-70 ring-offset-background transition-opacity hover:opacity-100 focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 disabled:pointer-events-none data-[state=open]:bg-secondary">
+        <SheetPrimitive.Close
+          // Absolute positioning is measured from the containing block's
+          // padding edge, so the safe-area padding on SheetContent does not
+          // push this button down. We offset `top` by safe-area-top manually
+          // for sheets that pin to the viewport top (top/left/right). Bottom
+          // sheets start mid-viewport, so no adjustment is needed there.
+          style={
+            side === 'bottom'
+              ? undefined
+              : {
+                  top: 'calc(var(--safe-area-top) + 1rem)',
+                  right: 'calc(var(--safe-area-right) + 1rem)',
+                }
+          }
+          className="absolute right-4 top-4 rounded-sm opacity-70 ring-offset-background transition-opacity hover:opacity-100 focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 disabled:pointer-events-none data-[state=open]:bg-secondary"
+        >
           <X className="h-4 w-4" />
           <span className="sr-only">Close</span>
         </SheetPrimitive.Close>
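The `style={{ ...sheetSafeAreaStyles[side], ...style }}` merge in the hunk above relies on JavaScript spread ordering: the caller's `style` prop is spread last, so any side a consumer sets explicitly wins over the safe-area default. A minimal sketch of that precedence, with placeholder values rather than the component's real objects:

```typescript
// Later spreads overwrite earlier keys; untouched keys survive.
const safeArea: Record<string, string> = {
  paddingTop: 'var(--safe-area-top)',
  paddingRight: 'var(--safe-area-right)',
};
const callerStyle: Record<string, string> = { paddingTop: '0px' };

// Mirrors SheetContent: defaults first, caller's style last.
const merged = { ...safeArea, ...callerStyle };
```

Here `merged.paddingTop` is the caller's `'0px'`, while `merged.paddingRight` keeps the safe-area default.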
@@ -0,0 +1,35 @@
+import { createContext, useContext, type ReactNode } from 'react';
+import { usePushSubscription, type PushSubscriptionState } from '../hooks/usePushSubscription';
+
+const noopAsync = async () => {};
+const noopAsyncNull = async () => null;
+
+const defaultState: PushSubscriptionState = {
+  isSupported: false,
+  isSubscribed: false,
+  currentSubscriptionId: null,
+  allSubscriptions: [],
+  pushConversations: [],
+  loading: false,
+  subscribe: noopAsyncNull,
+  unsubscribe: noopAsync,
+  toggleConversation: noopAsync,
+  isConversationPushEnabled: () => false,
+  deleteSubscription: noopAsync,
+  testPush: noopAsync,
+  refreshSubscriptions: async () => [],
+  refreshConversations: noopAsync,
+};
+
+const PushSubscriptionContext = createContext<PushSubscriptionState>(defaultState);
+
+export function PushSubscriptionProvider({ children }: { children: ReactNode }) {
+  const push = usePushSubscription();
+  return (
+    <PushSubscriptionContext.Provider value={push}>{children}</PushSubscriptionContext.Provider>
+  );
+}
+
+export function usePush(): PushSubscriptionState {
+  return useContext(PushSubscriptionContext);
+}
@@ -0,0 +1,277 @@
+import { useState, useEffect, useCallback, useRef } from 'react';
+import { toast } from '../components/ui/sonner';
+import { api } from '../api';
+import type { PushSubscriptionInfo } from '../types';
+
+function generateLabel(): string {
+  const ua = navigator.userAgent;
+  if (/Firefox/i.test(ua)) {
+    if (/Android/i.test(ua)) return 'Firefox on Android';
+    if (/Mac/i.test(ua)) return 'Firefox on macOS';
+    if (/Windows/i.test(ua)) return 'Firefox on Windows';
+    if (/Linux/i.test(ua)) return 'Firefox on Linux';
+    return 'Firefox';
+  }
+  if (/Chrome/i.test(ua) && !/Edg/i.test(ua)) {
+    if (/Android/i.test(ua)) return 'Chrome on Android';
+    if (/CrOS/i.test(ua)) return 'Chrome on ChromeOS';
+    if (/Mac/i.test(ua)) return 'Chrome on macOS';
+    if (/Windows/i.test(ua)) return 'Chrome on Windows';
+    if (/Linux/i.test(ua)) return 'Chrome on Linux';
+    return 'Chrome';
+  }
+  if (/Edg/i.test(ua)) return 'Edge';
+  if (/Safari/i.test(ua)) {
+    if (/iPhone|iPad/i.test(ua)) return 'Safari on iOS';
+    return 'Safari on macOS';
+  }
+  return 'Browser';
+}
+
+function urlBase64ToUint8Array(base64String: string): Uint8Array {
+  const padding = '='.repeat((4 - (base64String.length % 4)) % 4);
+  const base64 = (base64String + padding).replace(/-/g, '+').replace(/_/g, '/');
+  const raw = atob(base64);
+  const arr = new Uint8Array(raw.length);
+  for (let i = 0; i < raw.length; i++) arr[i] = raw.charCodeAt(i);
+  return arr;
+}
+
+function uint8ArraysEqual(a: Uint8Array | null, b: Uint8Array): boolean {
+  if (!a || a.length !== b.length) return false;
+  for (let i = 0; i < a.length; i++) {
+    if (a[i] !== b[i]) return false;
+  }
+  return true;
+}
+
+function getApplicationServerKeyBytes(
+  key: ArrayBuffer | ArrayBufferView | null | undefined
+): Uint8Array | null {
+  if (!key) return null;
+  if (ArrayBuffer.isView(key)) {
+    return new Uint8Array(key.buffer, key.byteOffset, key.byteLength);
+  }
+  return new Uint8Array(key);
+}
+
+export interface PushSubscriptionState {
+  isSupported: boolean;
+  isSubscribed: boolean;
+  currentSubscriptionId: string | null;
+  allSubscriptions: PushSubscriptionInfo[];
+  /** Global list of push-enabled conversation state keys (device-independent). */
+  pushConversations: string[];
+  loading: boolean;
+  subscribe: () => Promise<string | null>;
+  unsubscribe: () => Promise<void>;
+  /** Toggle a conversation in the global push list (device-independent). */
+  toggleConversation: (conversationKey: string) => Promise<void>;
+  isConversationPushEnabled: (conversationKey: string) => boolean;
+  deleteSubscription: (subscriptionId: string) => Promise<void>;
+  testPush: (subscriptionId: string) => Promise<void>;
+  refreshSubscriptions: () => Promise<PushSubscriptionInfo[]>;
+  refreshConversations: () => Promise<void>;
+}
+
+export function usePushSubscription(): PushSubscriptionState {
+  const [isSupported, setIsSupported] = useState(false);
+  const [currentSubscriptionId, setCurrentSubscriptionId] = useState<string | null>(null);
+  const [allSubscriptions, setAllSubscriptions] = useState<PushSubscriptionInfo[]>([]);
+  const [pushConversations, setPushConversations] = useState<string[]>([]);
+  const [loading, setLoading] = useState(false);
+  const vapidKeyRef = useRef<string | null>(null);
+
+  const reconcileCurrentSubscription = useCallback(
+    (subs: PushSubscriptionInfo[], endpoint: string | null) => {
+      setAllSubscriptions(subs);
+      if (!endpoint) {
+        setCurrentSubscriptionId(null);
+        return;
+      }
+      const match = subs.find((sub) => sub.endpoint === endpoint);
+      setCurrentSubscriptionId(match?.id ?? null);
+    },
+    []
+  );
+
+  useEffect(() => {
+    const supported =
+      window.isSecureContext &&
+      'serviceWorker' in navigator &&
+      'PushManager' in window &&
+      'Notification' in window;
+    setIsSupported(supported);
+
+    if (supported) {
+      // Always load all registered devices so Settings can manage them even
+      // when this particular browser isn't subscribed.
+      const subsPromise = api.getPushSubscriptions().catch(() => [] as PushSubscriptionInfo[]);
+
+      // Check if THIS browser has an active push subscription and match it
+      // to a backend record.
+      navigator.serviceWorker.ready
+        .then((reg) => reg.pushManager.getSubscription())
+        .then(async (sub) => {
+          const existing = await subsPromise;
+          reconcileCurrentSubscription(existing, sub?.endpoint ?? null);
+        })
+        .catch(() => {});
+
+      // Load global conversation list
+      api
+        .getPushConversations()
+        .then(setPushConversations)
+        .catch(() => {});
+    }
+  }, [reconcileCurrentSubscription]);
+
+  const refreshSubscriptions = useCallback(async () => {
+    try {
+      const subs = await api.getPushSubscriptions();
+      const reg = await navigator.serviceWorker.ready;
+      const sub = await reg.pushManager.getSubscription();
+      reconcileCurrentSubscription(subs, sub?.endpoint ?? null);
+      return subs;
+    } catch {
+      return [];
+    }
+  }, [reconcileCurrentSubscription]);
+
+  const refreshConversations = useCallback(async () => {
+    try {
+      const convos = await api.getPushConversations();
+      setPushConversations(convos);
+    } catch {
+      // best effort
+    }
+  }, []);
+
+  const subscribe = useCallback(async (): Promise<string | null> => {
+    if (!isSupported) return null;
+    setLoading(true);
+    try {
+      const resp = await api.getVapidPublicKey();
+      vapidKeyRef.current = resp.public_key;
+      const vapidKeyBytes = urlBase64ToUint8Array(resp.public_key);
+
+      const reg = await navigator.serviceWorker.ready;
+      let pushSub = await reg.pushManager.getSubscription();
+      const existingKeyBytes = getApplicationServerKeyBytes(pushSub?.options?.applicationServerKey);
+      const requiresRecreate =
+        pushSub !== null && !uint8ArraysEqual(existingKeyBytes, vapidKeyBytes);
+
+      if (requiresRecreate) {
+        await pushSub!.unsubscribe();
+        pushSub = null;
+      }
+
+      if (!pushSub) {
+        pushSub = await reg.pushManager.subscribe({
+          userVisibleOnly: true,
+          applicationServerKey: vapidKeyBytes.buffer as ArrayBuffer,
+        });
+      }
+
+      const json = pushSub.toJSON();
+      const result = await api.pushSubscribe({
+        endpoint: json.endpoint!,
+        p256dh: json.keys!.p256dh!,
+        auth: json.keys!.auth!,
+        label: generateLabel(),
+      });
+
+      setCurrentSubscriptionId(result.id);
+      await refreshSubscriptions();
+      return result.id;
+    } catch (err) {
+      console.error('Push subscribe failed:', err);
+      toast.error('Failed to enable push notifications', {
+        description: err instanceof Error ? err.message : 'Check that notifications are allowed',
+      });
+      return null;
+    } finally {
+      setLoading(false);
+    }
+  }, [isSupported, refreshSubscriptions]);
+
+  const unsubscribe = useCallback(async () => {
+    setLoading(true);
+    try {
+      const reg = await navigator.serviceWorker.ready;
+      const pushSub = await reg.pushManager.getSubscription();
+      if (pushSub) await pushSub.unsubscribe();
+
+      if (currentSubscriptionId) {
+        await api.deletePushSubscription(currentSubscriptionId).catch(() => {});
+      }
+
+      setCurrentSubscriptionId(null);
+      await refreshSubscriptions();
+    } catch (err) {
+      console.error('Push unsubscribe failed:', err);
+    } finally {
+      setLoading(false);
+    }
+  }, [currentSubscriptionId, refreshSubscriptions]);
+
+  const toggleConversation = useCallback(async (conversationKey: string) => {
+    try {
+      const updated = await api.togglePushConversation(conversationKey);
+      setPushConversations(updated);
+    } catch {
+      toast.error('Failed to update push preferences');
+    }
+  }, []);
+
+  const isConversationPushEnabled = useCallback(
+    (conversationKey: string): boolean => {
+      return pushConversations.includes(conversationKey);
+    },
+    [pushConversations]
+  );
+
+  const deleteSubscription = useCallback(
+    async (subscriptionId: string) => {
+      await api.deletePushSubscription(subscriptionId);
+      if (subscriptionId === currentSubscriptionId) {
+        setCurrentSubscriptionId(null);
+        try {
+          const reg = await navigator.serviceWorker.ready;
+          const pushSub = await reg.pushManager.getSubscription();
+          if (pushSub) await pushSub.unsubscribe();
+        } catch {
+          // best effort
+        }
+      }
+      await refreshSubscriptions();
+    },
+    [currentSubscriptionId, refreshSubscriptions]
+  );
+
+  const testPush = useCallback(async (subscriptionId: string) => {
+    try {
+      await api.testPushSubscription(subscriptionId);
+      toast.success('Test notification sent');
+    } catch {
+      toast.error('Test notification failed');
+    }
+  }, []);
+
+  return {
+    isSupported,
+    isSubscribed: !!currentSubscriptionId,
+    currentSubscriptionId,
+    allSubscriptions,
+    pushConversations,
+    loading,
+    subscribe,
+    unsubscribe,
+    toggleConversation,
+    isConversationPushEnabled,
+    deleteSubscription,
+    testPush,
+    refreshSubscriptions,
+    refreshConversations,
+  };
+}
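The `urlBase64ToUint8Array` helper in the hook above exists because `PushManager.subscribe` wants the VAPID public key as raw bytes, while the server hands it out base64url-encoded (no `=` padding, `-`/`_` instead of `+`/`/`). The sketch below copies the helper's body so it can be run in isolation; it assumes a runtime with a global `atob` (browsers, or Node.js 16+).

```typescript
// Re-pad to a multiple of 4 chars, map the URL-safe alphabet back to
// standard base64, decode with atob, then copy char codes into a byte array.
function urlBase64ToUint8Array(base64String: string): Uint8Array {
  const padding = '='.repeat((4 - (base64String.length % 4)) % 4);
  const base64 = (base64String + padding).replace(/-/g, '+').replace(/_/g, '/');
  const raw = atob(base64);
  const arr = new Uint8Array(raw.length);
  for (let i = 0; i < raw.length; i++) arr[i] = raw.charCodeAt(i);
  return arr;
}

// 'AQID' is base64 for the bytes 0x01 0x02 0x03 (no url-safe chars, no padding).
const bytes = urlBase64ToUint8Array('AQID');
```

This same byte form is also what the hook compares against `subscription.options.applicationServerKey` to decide whether an existing browser subscription was created under a different VAPID key and must be recreated.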
@@ -12,6 +12,7 @@ import { getStateKey } from '../utils/conversationState';
 import { mergeContactIntoList } from '../utils/contactMerge';
 import { getContactDisplayName } from '../utils/pubkey';
 import { appendRawPacketUnique } from '../utils/rawPacketIdentity';
+import { emitStatusDotPulse } from '../utils/statusDotPulse';
 import type {
   Channel,
   Contact,
@@ -253,6 +254,7 @@ export function useRealtimeAppState({
     },
     onRawPacket: (packet: RawPacket) => {
       recordRawPacketObservation?.(packet);
+      emitStatusDotPulse(packet.payload_type);
       setRawPackets((prev) => appendRawPacketUnique(prev, packet, maxRawPackets));
     },
     onMessageAcked: (
@@ -4,15 +4,25 @@ import { App } from './App';
 import './index.css';
 import './themes.css';
 import './styles.css';
-import { getSavedTheme, applyTheme } from './utils/theme';
+import { getSavedTheme, applyTheme, initFollowOSListener } from './utils/theme';
 import { applyFontScale, getSavedFontScale } from './utils/fontScale';
+import { PushSubscriptionProvider } from './contexts/PushSubscriptionContext';
 
 // Apply saved theme before first render
 applyTheme(getSavedTheme());
+// Re-apply when the OS color-scheme preference changes, if on "Follow OS".
+initFollowOSListener();
 applyFontScale(getSavedFontScale());
 
 createRoot(document.getElementById('root')!).render(
   <StrictMode>
-    <App />
+    <PushSubscriptionProvider>
+      <App />
+    </PushSubscriptionProvider>
   </StrictMode>
 );
+
+// Register service worker for Web Push (requires secure context)
+if ('serviceWorker' in navigator && window.isSecureContext) {
+  navigator.serviceWorker.register('./sw.js').catch(() => {});
+}
@@ -29,6 +29,13 @@ const mocks = vi.hoisted(() => ({
|
|||||||
success: vi.fn(),
|
success: vi.fn(),
|
||||||
error: vi.fn(),
|
error: vi.fn(),
|
||||||
},
|
},
|
||||||
|
push: {
|
||||||
|
isSupported: false,
|
||||||
|
isSubscribed: false,
|
||||||
|
subscribe: vi.fn<() => Promise<string | null>>(async () => null),
|
||||||
|
toggleConversation: vi.fn(async () => {}),
|
||||||
|
isConversationPushEnabled: vi.fn(() => false),
|
||||||
|
},
|
||||||
hookFns: {
|
hookFns: {
|
||||||
fetchOlderMessages: vi.fn(async () => {}),
|
fetchOlderMessages: vi.fn(async () => {}),
|
||||||
observeMessage: vi.fn(() => ({ added: false, activeConversation: false })),
|
observeMessage: vi.fn(() => ({ added: false, activeConversation: false })),
|
||||||
@@ -51,6 +58,25 @@ vi.mock('../useWebSocket', () => ({
|
|||||||
useWebSocket: vi.fn(),
|
useWebSocket: vi.fn(),
|
||||||
}));
|
}));
|
||||||
|
|
||||||
|
vi.mock('../contexts/PushSubscriptionContext', () => ({
|
||||||
|
usePush: () => ({
|
||||||
|
isSupported: mocks.push.isSupported,
|
||||||
|
isSubscribed: mocks.push.isSubscribed,
|
||||||
|
currentSubscriptionId: mocks.push.isSubscribed ? 'sub-1' : null,
|
||||||
|
allSubscriptions: [],
|
||||||
|
pushConversations: [],
|
||||||
|
loading: false,
|
||||||
|
subscribe: mocks.push.subscribe,
|
||||||
|
unsubscribe: vi.fn(async () => {}),
|
||||||
|
toggleConversation: mocks.push.toggleConversation,
|
||||||
|
isConversationPushEnabled: mocks.push.isConversationPushEnabled,
|
||||||
|
deleteSubscription: vi.fn(async () => {}),
|
||||||
|
testPush: vi.fn(async () => {}),
|
||||||
|
refreshSubscriptions: vi.fn(async () => []),
|
||||||
|
refreshConversations: vi.fn(async () => {}),
|
||||||
|
}),
|
||||||
|
}));
|
||||||
|
|
||||||
vi.mock('../hooks', async (importOriginal) => {
|
vi.mock('../hooks', async (importOriginal) => {
|
||||||
const actual = await importOriginal<typeof import('../hooks')>();
|
const actual = await importOriginal<typeof import('../hooks')>();
|
||||||
return {
|
return {
|
||||||
@@ -209,6 +235,10 @@ const publicChannel = {
|
|||||||
describe('App favorite toggle flow', () => {
|
describe('App favorite toggle flow', () => {
|
||||||
beforeEach(() => {
|
beforeEach(() => {
|
||||||
vi.clearAllMocks();
|
vi.clearAllMocks();
|
||||||
|
mocks.push.isSupported = false;
|
||||||
|
mocks.push.isSubscribed = false;
|
||||||
|
mocks.push.subscribe.mockResolvedValue(null);
|
||||||
|
mocks.push.isConversationPushEnabled.mockReturnValue(false);
|
||||||
|
|
||||||
mocks.api.getRadioConfig.mockResolvedValue(baseConfig);
|
mocks.api.getRadioConfig.mockResolvedValue(baseConfig);
|
||||||
mocks.api.getSettings.mockResolvedValue({ ...baseSettings });
|
mocks.api.getSettings.mockResolvedValue({ ...baseSettings });
|
||||||
@@ -313,4 +343,44 @@ describe('App favorite toggle flow', () => {
       expect(screen.queryByTestId('settings-modal-section')).not.toBeInTheDocument();
     });
   });
+
+  it('subscribes this browser before enabling web push for a conversation', async () => {
+    mocks.push.isSupported = true;
+    mocks.push.isSubscribed = false;
+    mocks.push.subscribe.mockResolvedValue('sub-1');
+
+    render(<App />);
+
+    await waitFor(() => {
+      expect(screen.getByRole('button', { name: 'Notification settings' })).toBeInTheDocument();
+    });
+
+    fireEvent.click(screen.getByRole('button', { name: 'Notification settings' }));
+    fireEvent.click(screen.getByRole('checkbox', { name: /web push/i }));
+
+    await waitFor(() => {
+      expect(mocks.push.subscribe).toHaveBeenCalledTimes(1);
+      expect(mocks.push.toggleConversation).toHaveBeenCalledWith(`channel-${publicChannel.key}`);
+    });
+  });
+
+  it('does not enable web push when subscription setup fails', async () => {
+    mocks.push.isSupported = true;
+    mocks.push.isSubscribed = false;
+    mocks.push.subscribe.mockResolvedValue(null);
+
+    render(<App />);
+
+    await waitFor(() => {
+      expect(screen.getByRole('button', { name: 'Notification settings' })).toBeInTheDocument();
+    });
+
+    fireEvent.click(screen.getByRole('button', { name: 'Notification settings' }));
+    fireEvent.click(screen.getByRole('checkbox', { name: /web push/i }));
+
+    await waitFor(() => {
+      expect(mocks.push.subscribe).toHaveBeenCalledTimes(1);
+    });
+    expect(mocks.push.toggleConversation).not.toHaveBeenCalled();
+  });
 });
@@ -150,7 +150,7 @@ describe('ChatHeader key visibility', () => {
     expect(screen.getAllByText('#Esperance')).toHaveLength(2);
   });
 
-  it('shows enabled notification state and toggles when clicked', () => {
+  it('shows filled bell when notifications are enabled and toggles via dropdown', () => {
     const conversation: Conversation = { type: 'contact', id: '11'.repeat(32), name: 'Alice' };
     const onToggleNotifications = vi.fn();
 
@@ -164,12 +164,40 @@ describe('ChatHeader key visibility', () => {
       />
     );
 
-    fireEvent.click(screen.getByText('Notifications On'));
+    // Bell button should be present; open the dropdown
+    const bellBtn = screen.getByRole('button', { name: 'Notification settings' });
+    fireEvent.click(bellBtn);
 
-    expect(screen.getByText('Notifications On')).toBeInTheDocument();
+    // Desktop notifications checkbox should be checked
+    const checkbox = screen.getByRole('checkbox', { name: /desktop notifications/i });
+    expect(checkbox).toBeChecked();
+
+    // Toggling calls the handler
+    fireEvent.click(checkbox);
     expect(onToggleNotifications).toHaveBeenCalledTimes(1);
   });
+
+  it('keeps desktop notifications available when web push is also supported', () => {
+    const conversation: Conversation = { type: 'contact', id: '13'.repeat(32), name: 'Alice' };
+
+    render(
+      <ChatHeader
+        {...baseProps}
+        conversation={conversation}
+        channels={[]}
+        pushSupported
+        pushSubscribed
+        pushEnabledForConversation
+        onTogglePush={vi.fn()}
+      />
+    );
+
+    fireEvent.click(screen.getByRole('button', { name: 'Notification settings' }));
+
+    expect(screen.getByRole('checkbox', { name: /desktop notifications/i })).toBeInTheDocument();
+    expect(screen.getByRole('checkbox', { name: /web push/i })).toBeInTheDocument();
+  });
 
   it('hides trace and notification controls for room-server contacts', () => {
     const pubKey = '41'.repeat(32);
     const contact: Contact = {
@@ -198,9 +226,7 @@ describe('ChatHeader key visibility', () => {
 
     expect(screen.queryByRole('button', { name: 'Path Discovery' })).not.toBeInTheDocument();
     expect(screen.queryByRole('button', { name: 'Direct Trace' })).not.toBeInTheDocument();
-    expect(
-      screen.queryByRole('button', { name: 'Enable notifications for this conversation' })
-    ).not.toBeInTheDocument();
+    expect(screen.queryByRole('button', { name: 'Notification settings' })).not.toBeInTheDocument();
   });
 
   it('hides the delete button for the canonical Public channel', () => {
@@ -145,6 +145,7 @@ function createProps(overrides: Partial<React.ComponentProps<typeof Conversation
   onDeleteContact: vi.fn(async () => {}),
   onDeleteChannel: vi.fn(async () => {}),
   onSetChannelFloodScopeOverride: vi.fn(async () => {}),
+  onSelectConversation: vi.fn(),
   onOpenContactInfo: vi.fn(),
   onOpenChannelInfo: vi.fn(),
   onSenderClick: vi.fn(),
@@ -1,26 +1,43 @@
 import { forwardRef } from 'react';
-import { render, screen } from '@testing-library/react';
+import { fireEvent, render, screen } from '@testing-library/react';
 import { describe, expect, it, vi } from 'vitest';
 
 import { MapView } from '../components/MapView';
 import type { Contact } from '../types';
 
-vi.mock('react-leaflet', () => ({
-  MapContainer: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
-  TileLayer: () => null,
-  CircleMarker: forwardRef<
-    HTMLDivElement,
-    { children: React.ReactNode; pathOptions?: { fillColor?: string } }
-  >(({ children, pathOptions }, ref) => (
-    <div ref={ref} data-fill-color={pathOptions?.fillColor}>
-      {children}
-    </div>
-  )),
-  Popup: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
-  useMap: () => ({
-    setView: vi.fn(),
-    fitBounds: vi.fn(),
-  }),
-}));
+vi.mock('react-leaflet', () => {
+  const BaseLayer = ({
+    children,
+  }: {
+    children: React.ReactNode;
+    name: string;
+    checked?: boolean;
+  }) => <div>{children}</div>;
+  const LayersControlMock = ({ children }: { children: React.ReactNode }) => <div>{children}</div>;
+  (LayersControlMock as unknown as { BaseLayer: typeof BaseLayer }).BaseLayer = BaseLayer;
+  return {
+    MapContainer: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
+    TileLayer: () => null,
+    CircleMarker: forwardRef<
+      HTMLDivElement,
+      { children: React.ReactNode; pathOptions?: { fillColor?: string } }
+    >(({ children, pathOptions }, ref) => (
+      <div ref={ref} data-fill-color={pathOptions?.fillColor}>
+        {children}
+      </div>
+    )),
+    Popup: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
+    Polyline: () => null,
+    LayersControl: LayersControlMock,
+    useMap: () => ({
+      setView: vi.fn(),
+      fitBounds: vi.fn(),
+      setMaxZoom: vi.fn(),
+      setZoom: vi.fn(),
+      getZoom: vi.fn(() => 2),
+    }),
+    useMapEvents: () => null,
+  };
+});
 
 describe('MapView', () => {
   it('renders a never-heard fallback for a focused contact without last_seen', () => {
@@ -54,6 +71,68 @@ describe('MapView', () => {
     expect(screen.getByText('Last heard: Never heard by this server')).toBeInTheDocument();
   });
 
+  it('invokes onSelectContact when the popup name is clicked', () => {
+    const contact: Contact = {
+      public_key: 'cc'.repeat(32),
+      name: 'Clickable',
+      type: 1,
+      flags: 0,
+      direct_path: null,
+      direct_path_len: -1,
+      direct_path_hash_mode: -1,
+      route_override_path: null,
+      route_override_len: null,
+      route_override_hash_mode: null,
+      last_advert: null,
+      lat: 42,
+      lon: -72,
+      last_seen: Math.floor(Date.now() / 1000),
+      on_radio: false,
+      favorite: false,
+      last_contacted: null,
+      last_read_at: null,
+      first_seen: null,
+    };
+    const onSelectContact = vi.fn();
+
+    render(<MapView contacts={[contact]} onSelectContact={onSelectContact} />);
+
+    const link = screen.getByRole('button', { name: 'Clickable' });
+    expect(link).toHaveAttribute('title', 'Open conversation with Clickable');
+    fireEvent.click(link);
+
+    expect(onSelectContact).toHaveBeenCalledWith(contact);
+  });
+
+  it('renders the popup name as plain text when no onSelectContact is provided', () => {
+    const contact: Contact = {
+      public_key: 'dd'.repeat(32),
+      name: 'Static',
+      type: 1,
+      flags: 0,
+      direct_path: null,
+      direct_path_len: -1,
+      direct_path_hash_mode: -1,
+      route_override_path: null,
+      route_override_len: null,
+      route_override_hash_mode: null,
+      last_advert: null,
+      lat: 42,
+      lon: -72,
+      last_seen: Math.floor(Date.now() / 1000),
+      on_radio: false,
+      favorite: false,
+      last_contacted: null,
+      last_read_at: null,
+      first_seen: null,
+    };
+
+    render(<MapView contacts={[contact]} />);
+
+    expect(screen.queryByRole('button', { name: /open conversation with static/i })).toBeNull();
+    expect(screen.getByText('Static')).toBeInTheDocument();
+  });
+
   it('keeps the 7-day cutoff stable for the lifetime of the mounted map', () => {
     vi.useFakeTimers();
     try {
@@ -220,6 +220,24 @@ describe('MessageList channel sender rendering', () => {
     expect(onChannelReferenceClick).toHaveBeenCalledWith('#ops-room');
   });
 
+  it('does not strip colon-prefixed text in direct messages (issue #198)', () => {
+    render(
+      <MessageList
+        messages={[
+          createMessage({
+            type: 'PRIV',
+            conversation_key: 'ab'.repeat(32),
+            text: 'TEST1: TEST2',
+          }),
+        ]}
+        contacts={[]}
+        loading={false}
+      />
+    );
+
+    expect(screen.getByText('TEST1: TEST2')).toBeInTheDocument();
+  });
+
   it('renders and dismisses an unread marker at the first unread message boundary', async () => {
     const user = userEvent.setup();
     const messages = [
@@ -1,4 +1,4 @@
-import { act, fireEvent, render, screen, waitFor } from '@testing-library/react';
+import { fireEvent, render, screen, waitFor } from '@testing-library/react';
 import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
 
 import { SettingsModal } from '../components/SettingsModal';
@@ -70,6 +70,7 @@ const baseSettings: AppSettings = {
   discovery_blocked_types: [],
   tracked_telemetry_repeaters: [],
   auto_resend_channel: false,
+  telemetry_interval_hours: 8,
 };
 
 function renderModal(overrides?: {
@@ -442,52 +443,86 @@ describe('SettingsModal', () => {
     expect(screen.getByText('iPhone')).toBeInTheDocument();
   });
 
-  it('clears stale errors when switching external desktop sections', async () => {
+  it('reverts checkbox state when auto-persist fails on the database section', async () => {
+    // Auto-persist replaced the old "Save Settings" button on this section.
+    // The risk is now: a toggle gets applied optimistically, the PATCH fails,
+    // and we're left with the UI out of sync with saved state. Verify the
+    // revert-on-error path keeps the checkbox consistent with the server.
     const onSaveAppSettings = vi.fn(async () => {
       throw new Error('Save failed');
     });
 
-    const { view } = renderModal({
+    renderModal({
       externalSidebarNav: true,
       desktopSection: 'database',
       onSaveAppSettings,
     });
 
-    fireEvent.click(screen.getByRole('button', { name: 'Save Settings' }));
+    const checkbox = screen.getByRole('checkbox', {
+      name: /Auto-decrypt historical DMs/i,
+    }) as HTMLInputElement;
+    const initialChecked = checkbox.checked;
+
+    fireEvent.click(checkbox);
+
     await waitFor(() => {
-      expect(screen.getByText('Save failed')).toBeInTheDocument();
+      expect(onSaveAppSettings).toHaveBeenCalled();
+    });
+    await waitFor(() => {
+      expect(checkbox.checked).toBe(initialChecked);
+    });
+  });
+
+  it('serializes rapid auto-persist clicks so stale writes cannot win', async () => {
+    // Regression test for a race where rapid consecutive checkbox toggles
+    // fire overlapping PATCHes that can land out of order. The page now
+    // chains saves through a single promise, so the server sees them in
+    // the order the user clicked. This test hand-controls resolution
+    // order to force the "stale write" scenario if serialization were off.
+
+    const deferred: { resolve: () => void }[] = [];
+    const callOrder: number[] = [];
+
+    const onSaveAppSettings = vi.fn(async (_update: unknown) => {
+      const index = deferred.length;
+      callOrder.push(index);
+      await new Promise<void>((res) => {
+        deferred.push({ resolve: res });
+      });
     });
 
-    await act(async () => {
-      view.rerender(
-        <SettingsModal
-          open
-          externalSidebarNav
-          desktopSection="fanout"
-          config={baseConfig}
-          health={baseHealth}
-          appSettings={baseSettings}
-          onClose={vi.fn()}
-          onSave={vi.fn(async () => {})}
-          onSaveAppSettings={onSaveAppSettings}
-          onSetPrivateKey={vi.fn(async () => {})}
-          onReboot={vi.fn(async () => {})}
-          onDisconnect={vi.fn(async () => {})}
-          onReconnect={vi.fn(async () => {})}
-          onAdvertise={vi.fn(async () => {})}
-          meshDiscovery={null}
-          meshDiscoveryLoadingTarget={null}
-          onDiscoverMesh={vi.fn(async () => {})}
-          onHealthRefresh={vi.fn(async () => {})}
-          onRefreshAppSettings={vi.fn(async () => {})}
-        />
-      );
-      await Promise.resolve();
+    renderModal({
+      externalSidebarNav: true,
+      desktopSection: 'database',
+      onSaveAppSettings,
     });
 
-    expect(api.getFanoutConfigs).toHaveBeenCalled();
-    expect(screen.getByRole('button', { name: 'Add Integration' })).toBeInTheDocument();
-    expect(screen.queryByText('Save failed')).not.toBeInTheDocument();
+    // Two distinct checkboxes in quick succession.
+    const blockClients = screen.getByRole('checkbox', { name: /Block clients/i });
+    const blockRepeaters = screen.getByRole('checkbox', { name: /Block repeaters/i });
+
+    fireEvent.click(blockClients);
+    fireEvent.click(blockRepeaters);
+
+    // Wait for the first PATCH to be registered. Only the first should be
+    // in-flight — the second must be queued behind it.
+    await waitFor(() => {
+      expect(deferred.length).toBe(1);
+    });
+    expect(callOrder).toEqual([0]);
+
+    // Resolve the first PATCH. The chain should now dispatch the second.
+    deferred[0].resolve();
+    await waitFor(() => {
+      expect(deferred.length).toBe(2);
+    });
+    expect(callOrder).toEqual([0, 1]);
+
+    // Resolve the second so the test tears down cleanly.
+    deferred[1].resolve();
+    await waitFor(() => {
+      expect(onSaveAppSettings).toHaveBeenCalledTimes(2);
+    });
   });
 
   it('does not call onClose after save/reboot flows in page mode', async () => {
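The serialization the last test above exercises can be sketched as a small promise chain. This is a hedged illustration, not the component's actual code: `createSaveQueue` and `demo` are hypothetical names, and the component may structure its chaining differently.

```typescript
// Sketch of promise-chained saves: each save starts only after the previous
// one settles, so overlapping triggers reach the server in click order.
type Save = () => Promise<void>;

function createSaveQueue(): (save: Save) => Promise<void> {
  let tail: Promise<void> = Promise.resolve();
  return (save: Save) => {
    const next = tail.then(save, save); // run even if an earlier save rejected
    tail = next.catch(() => {}); // keep the chain alive after a failure
    return next;
  };
}

// Demo: the second save cannot start before the slow first one settles.
async function demo(): Promise<number[]> {
  const order: number[] = [];
  const enqueue = createSaveQueue();
  const first = enqueue(async () => {
    await new Promise((resolve) => setTimeout(resolve, 20));
    order.push(1);
  });
  const second = enqueue(async () => {
    order.push(2);
  });
  await Promise.all([first, second]);
  return order; // [1, 2]
}
```

Because the chain stores a rejection-swallowed tail, one failed PATCH does not block later saves, which matches the revert-on-error test above.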
@@ -8,9 +8,12 @@ class ResizeObserver {
 
 globalThis.ResizeObserver = ResizeObserver;
 
-// Several components call matchMedia at import time for responsive detection
+// Several components call matchMedia at import time for responsive detection.
+// Use a configurable descriptor so individual tests can override the stub.
 if (typeof globalThis.matchMedia === 'undefined') {
   Object.defineProperty(globalThis, 'matchMedia', {
+    configurable: true,
+    writable: true,
     value: (query: string) => ({
       matches: false,
       media: query,
@@ -1,5 +1,5 @@
 import { fireEvent, render, screen } from '@testing-library/react';
-import { describe, expect, it, vi } from 'vitest';
+import { afterEach, describe, expect, it, vi } from 'vitest';
 
 import { StatusBar } from '../components/StatusBar';
 import type { HealthStatus } from '../types';
@@ -77,4 +77,57 @@ describe('StatusBar', () => {
     expect(localStorage.getItem('remoteterm-theme')).toBe('original');
     expect(document.documentElement.dataset.theme).toBeUndefined();
   });
+
+  describe('with Follow OS theme saved', () => {
+    const originalMatchMedia = globalThis.matchMedia;
+
+    afterEach(() => {
+      globalThis.matchMedia = originalMatchMedia;
+    });
+
+    // Stub matchMedia so prefers-color-scheme: light returns the desired value.
+    const setPrefersLight = (isLight: boolean) => {
+      Object.defineProperty(globalThis, 'matchMedia', {
+        configurable: true,
+        value: (query: string) => ({
+          matches: query.includes('light') ? isLight : !isLight,
+          media: query,
+          onchange: null,
+          addListener: () => {},
+          removeListener: () => {},
+          addEventListener: () => {},
+          removeEventListener: () => {},
+          dispatchEvent: () => false,
+        }),
+      });
+    };
+
+    it('clicking toggle while OS prefers dark overrides follow-os into explicit light', () => {
+      setPrefersLight(false);
+      localStorage.setItem('remoteterm-theme', 'follow-os');
+
+      render(<StatusBar health={baseHealth} config={null} onSettingsClick={vi.fn()} />);
+
+      // OS is dark → effective is original → toggle offers "Switch to light theme"
+      const toggle = screen.getByRole('button', { name: 'Switch to light theme' });
+      fireEvent.click(toggle);
+
+      expect(localStorage.getItem('remoteterm-theme')).toBe('light');
+      expect(document.documentElement.dataset.theme).toBe('light');
+    });
+
+    it('clicking toggle while OS prefers light overrides follow-os into explicit dark', () => {
+      setPrefersLight(true);
+      localStorage.setItem('remoteterm-theme', 'follow-os');
+
+      render(<StatusBar health={baseHealth} config={null} onSettingsClick={vi.fn()} />);
+
+      // OS is light → effective is light → toggle offers "Switch to classic theme"
+      const toggle = screen.getByRole('button', { name: 'Switch to classic theme' });
+      fireEvent.click(toggle);
+
+      expect(localStorage.getItem('remoteterm-theme')).toBe('original');
+      expect(document.documentElement.dataset.theme).toBeUndefined();
+    });
+  });
 });
@@ -0,0 +1,87 @@
+import { afterEach, beforeEach, describe, expect, it } from 'vitest';
+
+import {
+  FOLLOW_OS_THEME_ID,
+  THEMES,
+  applyTheme,
+  getEffectiveTheme,
+  getSavedTheme,
+} from '../utils/theme';
+
+const originalMatchMedia = globalThis.matchMedia;
+
+function stubPrefersLight(isLight: boolean) {
+  Object.defineProperty(globalThis, 'matchMedia', {
+    configurable: true,
+    value: (query: string) => ({
+      matches: query.includes('light') ? isLight : !isLight,
+      media: query,
+      onchange: null,
+      addListener: () => {},
+      removeListener: () => {},
+      addEventListener: () => {},
+      removeEventListener: () => {},
+      dispatchEvent: () => false,
+    }),
+  });
+}
+
+describe('theme module', () => {
+  beforeEach(() => {
+    localStorage.clear();
+    delete document.documentElement.dataset.theme;
+  });
+
+  afterEach(() => {
+    globalThis.matchMedia = originalMatchMedia;
+  });
+
+  it('exposes an OS-following theme in the selectable list', () => {
+    const followOS = THEMES.find((t) => t.id === FOLLOW_OS_THEME_ID);
+    expect(followOS).toBeDefined();
+    expect(followOS?.name).toBeTruthy();
+  });
+
+  it('applyTheme("follow-os") resolves to light when OS prefers light', () => {
+    stubPrefersLight(true);
+
+    applyTheme(FOLLOW_OS_THEME_ID);
+
+    // Saved value is the follow-os preference, but the DOM reflects the resolved theme.
+    expect(localStorage.getItem('remoteterm-theme')).toBe(FOLLOW_OS_THEME_ID);
+    expect(getSavedTheme()).toBe(FOLLOW_OS_THEME_ID);
+    expect(document.documentElement.dataset.theme).toBe('light');
+    expect(getEffectiveTheme()).toBe('light');
+  });
+
+  it('applyTheme("follow-os") resolves to original (dark) when OS prefers dark', () => {
+    stubPrefersLight(false);
+
+    applyTheme(FOLLOW_OS_THEME_ID);
+
+    expect(localStorage.getItem('remoteterm-theme')).toBe(FOLLOW_OS_THEME_ID);
+    // Original has no data-theme attribute, it's the default.
+    expect(document.documentElement.dataset.theme).toBeUndefined();
+    expect(getEffectiveTheme()).toBe('original');
+  });
+
+  it('applyTheme updates the PWA meta theme-color to match the effective theme', () => {
+    // Seed the meta tag (jsdom base template has none).
+    const meta = document.createElement('meta');
+    meta.setAttribute('name', 'theme-color');
+    meta.setAttribute('content', '#000000');
+    document.head.appendChild(meta);
+
+    stubPrefersLight(true);
+    applyTheme(FOLLOW_OS_THEME_ID);
+    // Light theme's metaThemeColor
+    expect(meta.getAttribute('content')).toBe('#F8F7F4');
+
+    stubPrefersLight(false);
+    applyTheme(FOLLOW_OS_THEME_ID);
+    // Original theme's metaThemeColor
+    expect(meta.getAttribute('content')).toBe('#111419');
+
+    meta.remove();
+  });
+});
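The follow-OS resolution those tests rely on can be sketched in a few lines. This is an illustrative standalone function, not the app's actual `utils/theme` module; `resolveTheme` and the literal theme ids are assumptions for the sketch.

```typescript
// Hypothetical sketch: map a saved theme preference to a concrete theme.
const FOLLOW_OS = 'follow-os';

function resolveTheme(saved: string): string {
  if (saved !== FOLLOW_OS) return saved; // explicit choices pass through
  // prefers-color-scheme picks between light and the original dark default.
  const prefersLight =
    typeof matchMedia === 'function' &&
    matchMedia('(prefers-color-scheme: light)').matches;
  return prefersLight ? 'light' : 'original';
}
```

The key property the tests assert is that localStorage keeps the `follow-os` preference while the DOM only ever sees the resolved concrete theme.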
@@ -0,0 +1,203 @@
+import { act, renderHook, waitFor } from '@testing-library/react';
+import { beforeEach, describe, expect, it, vi } from 'vitest';
+
+import { usePushSubscription } from '../hooks/usePushSubscription';
+
+const mocks = vi.hoisted(() => ({
+  api: {
+    getPushSubscriptions: vi.fn(),
+    getPushConversations: vi.fn(),
+    getVapidPublicKey: vi.fn(),
+    pushSubscribe: vi.fn(),
+    deletePushSubscription: vi.fn(),
+    togglePushConversation: vi.fn(),
+    testPushSubscription: vi.fn(),
+  },
+  toast: {
+    success: vi.fn(),
+    error: vi.fn(),
+  },
+}));
+
+vi.mock('../api', () => ({
+  api: mocks.api,
+}));
+
+vi.mock('../components/ui/sonner', () => ({
+  toast: mocks.toast,
+}));
+
+function bytesToBase64Url(bytes: number[]): string {
+  return btoa(String.fromCharCode(...bytes))
+    .replace(/\+/g, '-')
+    .replace(/\//g, '_')
+    .replace(/=+$/g, '');
+}
+
+describe('usePushSubscription', () => {
+  const vapidOldBytes = [1, 2, 3, 4];
+  const vapidNewBytes = [5, 6, 7, 8];
+  const oldKey = new Uint8Array(vapidOldBytes).buffer;
+  const newKeyBase64 = bytesToBase64Url(vapidNewBytes);
+
+  let activeSubscription: {
+    endpoint: string;
+    options: { applicationServerKey: ArrayBuffer };
+    toJSON: () => { endpoint: string; keys: { p256dh: string; auth: string } };
+    unsubscribe: ReturnType<typeof vi.fn>;
+  } | null;
+  let replacementSubscription: {
+    endpoint: string;
+    options: { applicationServerKey: ArrayBuffer };
+    toJSON: () => { endpoint: string; keys: { p256dh: string; auth: string } };
+    unsubscribe: ReturnType<typeof vi.fn>;
+  };
+  let getSubscriptionMock: ReturnType<typeof vi.fn>;
+  let subscribeMock: ReturnType<typeof vi.fn>;
+
+  beforeEach(() => {
+    vi.clearAllMocks();
+
+    activeSubscription = {
+      endpoint: 'https://push.example.test/sub-old',
+      options: { applicationServerKey: oldKey },
+      toJSON: () => ({
+        endpoint: 'https://push.example.test/sub-old',
+        keys: { p256dh: 'p256dh-old', auth: 'auth-old' },
+      }),
+      unsubscribe: vi.fn(async () => {
+        activeSubscription = null;
+        return true;
+      }),
+    };
+
+    replacementSubscription = {
+      endpoint: 'https://push.example.test/sub-new',
+      options: { applicationServerKey: new Uint8Array(vapidNewBytes).buffer },
+      toJSON: () => ({
+        endpoint: 'https://push.example.test/sub-new',
+        keys: { p256dh: 'p256dh-new', auth: 'auth-new' },
+      }),
+      unsubscribe: vi.fn(async () => true),
+    };
+
+    getSubscriptionMock = vi.fn(async () => activeSubscription);
+    subscribeMock = vi.fn(async () => {
+      activeSubscription = replacementSubscription;
+      return replacementSubscription;
+    });
+
+    Object.defineProperty(window, 'isSecureContext', {
+      configurable: true,
+      value: true,
+    });
+    Object.defineProperty(window, 'PushManager', {
+      configurable: true,
+      value: function PushManager() {},
+    });
+    Object.defineProperty(window, 'Notification', {
+      configurable: true,
+      value: function Notification() {},
+    });
+    Object.defineProperty(navigator, 'serviceWorker', {
+      configurable: true,
+      value: {
+        ready: Promise.resolve({
+          pushManager: {
+            getSubscription: getSubscriptionMock,
+            subscribe: subscribeMock,
+          },
+        }),
+      },
+    });
+
+    mocks.api.getPushConversations.mockResolvedValue([]);
+    mocks.api.getPushSubscriptions.mockResolvedValue([
+      {
+        id: 'sub-1',
+        endpoint: 'https://push.example.test/sub-old',
+        p256dh: 'p256dh-old',
+        auth: 'auth-old',
+        label: 'Chrome on macOS',
+        created_at: 1,
+        last_success_at: null,
+        failure_count: 0,
+      },
+    ]);
+    mocks.api.getVapidPublicKey.mockResolvedValue({ public_key: newKeyBase64 });
+    mocks.api.pushSubscribe.mockResolvedValue({
+      id: 'sub-2',
+      endpoint: 'https://push.example.test/sub-new',
+    });
+  });
+
+  it('clears currentSubscriptionId when refresh no longer finds this browser on the backend', async () => {
+    const { result } = renderHook(() => usePushSubscription());
+
+    await waitFor(() => {
+      expect(result.current.currentSubscriptionId).toBe('sub-1');
+      expect(result.current.isSubscribed).toBe(true);
+    });
+
+    mocks.api.getPushSubscriptions.mockResolvedValueOnce([]);
+
+    await act(async () => {
+      await result.current.refreshSubscriptions();
+    });
+
+    expect(result.current.currentSubscriptionId).toBeNull();
+    expect(result.current.isSubscribed).toBe(false);
+    expect(result.current.allSubscriptions).toEqual([]);
+  });
+
+  it('recreates a stale browser subscription when the server VAPID key changed', async () => {
+    const oldSubscription = activeSubscription;
+    mocks.api.getPushSubscriptions
+      .mockReset()
+      .mockResolvedValueOnce([
+        {
+          id: 'sub-1',
+          endpoint: 'https://push.example.test/sub-old',
+          p256dh: 'p256dh-old',
+          auth: 'auth-old',
+          label: 'Chrome on macOS',
+          created_at: 1,
+          last_success_at: null,
+          failure_count: 0,
+        },
+      ])
+      .mockResolvedValueOnce([
+        {
+          id: 'sub-2',
+          endpoint: 'https://push.example.test/sub-new',
+          p256dh: 'p256dh-new',
+          auth: 'auth-new',
+          label: 'Chrome on macOS',
+          created_at: 2,
+          last_success_at: null,
+          failure_count: 0,
+        },
+      ]);
+
+    const { result } = renderHook(() => usePushSubscription());
+
+    await waitFor(() => {
+      expect(result.current.isSupported).toBe(true);
+    });
+
+    await act(async () => {
+      await result.current.subscribe();
+    });
+
+    expect(oldSubscription?.unsubscribe).toHaveBeenCalledTimes(1);
+    expect(activeSubscription).toBe(replacementSubscription);
+    expect(subscribeMock).toHaveBeenCalledTimes(1);
+    expect(mocks.api.pushSubscribe).toHaveBeenCalledWith({
+      endpoint: 'https://push.example.test/sub-new',
+      p256dh: 'p256dh-new',
+      auth: 'auth-new',
+      label: expect.any(String),
+    });
+    expect(result.current.currentSubscriptionId).toBe('sub-2');
+  });
+});
@@ -355,6 +355,7 @@ export interface AppSettings {
   discovery_blocked_types: number[];
   tracked_telemetry_repeaters: string[];
   auto_resend_channel: boolean;
+  telemetry_interval_hours: number;
 }

 export interface AppSettingsUpdate {
@@ -366,11 +367,22 @@ export interface AppSettingsUpdate {
   blocked_keys?: string[];
   blocked_names?: string[];
   discovery_blocked_types?: number[];
+  telemetry_interval_hours?: number;
 }

+export interface TelemetrySchedule {
+  preferred_hours: number;
+  effective_hours: number;
+  options: number[];
+  tracked_count: number;
+  max_tracked: number;
+  next_run_at: number | null;
+}
+
 export interface TrackedTelemetryResponse {
   tracked_telemetry_repeaters: string[];
   names: Record<string, string>;
+  schedule: TelemetrySchedule;
 }

 /** Contact type constants */
@@ -498,6 +510,17 @@ export interface TelemetryHistoryEntry {
   data: Record<string, number> & { lpp_sensors?: TelemetryLppSensor[] };
 }

+export interface PushSubscriptionInfo {
+  id: string;
+  endpoint: string;
+  p256dh: string;
+  auth: string;
+  label: string;
+  created_at: number;
+  last_success_at: number | null;
+  failure_count: number;
+}
+
 export interface TraceResponse {
   remote_snr: number | null;
   local_snr: number | null;

@@ -209,6 +209,37 @@ export function formatRouteLabel(pathLen: number, capitalize: boolean = false):
   return capitalize ? label.charAt(0).toUpperCase() + label.slice(1) : label;
 }

+/**
+ * Format the learned direct route for display in route-editing dialogs,
+ * e.g. "2 hops (AE -> F1)", "Direct", or "Flood".
+ */
+export function formatLearnedRouteSummary(contact: Contact): string {
+  const directRoute = getDirectContactRoute(contact);
+  if (!directRoute) {
+    return formatRouteLabel(-1, true);
+  }
+  const hops = parsePathHops(directRoute.path, directRoute.path_len);
+  const label = formatRouteLabel(directRoute.path_len, true);
+  return hops.length > 0 ? `${label} (${hops.join(' -> ')})` : label;
+}
+
+/**
+ * Format the forced (override) route for display in route-editing dialogs,
+ * matching the learned-route format. Returns null when no override is set.
+ */
+export function formatForcedRouteSummary(contact: Contact): string | null {
+  if (!hasRoutingOverride(contact)) {
+    return null;
+  }
+  const effectiveRoute = getEffectiveContactRoute(contact);
+  if (effectiveRoute.pathLen === -1) {
+    return formatRouteLabel(-1, true);
+  }
+  const hops = parsePathHops(effectiveRoute.path, effectiveRoute.pathLen);
+  const label = formatRouteLabel(effectiveRoute.pathLen, true);
+  return hops.length > 0 ? `${label} (${hops.join(' -> ')})` : label;
+}
+
 export function formatRoutingOverrideInput(contact: Contact): string {
   const routeOverride = getRouteOverride(contact);
   if (!routeOverride) {

@@ -0,0 +1,61 @@
+export const STATUS_DOT_PULSE_CHANGE_EVENT = 'remoteterm-status-dot-pulse-change';
+export const STATUS_DOT_PULSE_PACKET_EVENT = 'remoteterm-status-dot-pulse-packet';
+
+const STORAGE_KEY = 'remoteterm-status-dot-pulse';
+
+export type StatusDotPulseKind = 'channel' | 'dm' | 'advert' | 'other';
+
+export function getStatusDotPulseEnabled(): boolean {
+  try {
+    return localStorage.getItem(STORAGE_KEY) === 'true';
+  } catch {
+    return false;
+  }
+}
+
+export function setStatusDotPulseEnabled(enabled: boolean): void {
+  try {
+    if (enabled) {
+      localStorage.setItem(STORAGE_KEY, 'true');
+    } else {
+      localStorage.removeItem(STORAGE_KEY);
+    }
+  } catch {
+    // localStorage may be unavailable
+  }
+}
+
+export function payloadTypeToPulseKind(payloadType: string | null | undefined): StatusDotPulseKind {
+  switch (payloadType) {
+    case 'GROUP_TEXT':
+      return 'channel';
+    case 'TEXT_MESSAGE':
+      return 'dm';
+    case 'ADVERT':
+      return 'advert';
+    default:
+      return 'other';
+  }
+}
+
+const PULSE_COLORS: Record<StatusDotPulseKind, string> = {
+  channel: 'hsl(210, 90%, 55%)', // blue
+  dm: 'hsl(270, 75%, 60%)', // purple
+  advert: 'hsl(185, 85%, 55%)', // cyan
+  other: 'hsl(140, 80%, 22%)', // dark green
+};
+
+export function pulseColorFor(kind: StatusDotPulseKind): string {
+  return PULSE_COLORS[kind];
+}
+
+export const STATUS_DOT_PULSE_DURATION_MS = 250;
+
+export function emitStatusDotPulse(payloadType: string | null | undefined): void {
+  const kind = payloadTypeToPulseKind(payloadType);
+  window.dispatchEvent(
+    new CustomEvent<StatusDotPulseKind>(STATUS_DOT_PULSE_PACKET_EVENT, {
+      detail: kind,
+    })
+  );
+}

@@ -9,6 +9,8 @@ export interface Theme {

 export const THEME_CHANGE_EVENT = 'remoteterm-theme-change';

+export const FOLLOW_OS_THEME_ID = 'follow-os';
+
 export const THEMES: Theme[] = [
   {
     id: 'original',
@@ -22,6 +24,13 @@ export const THEMES: Theme[] = [
     swatches: ['#F8F7F4', '#FFFFFF', '#1B7D4E', '#EDEBE7', '#D97706', '#3B82F6'],
     metaThemeColor: '#F8F7F4',
   },
+  {
+    id: FOLLOW_OS_THEME_ID,
+    name: 'OS Light/Dark Mode',
+    // Top row: light theme preview colors; bottom row: original (dark) preview colors
+    swatches: ['#F8F7F4', '#FFFFFF', '#1B7D4E', '#111419', '#181b21', '#27a05c'],
+    metaThemeColor: '#111419',
+  },
   {
     id: 'ios',
     name: 'iPhone',
@@ -94,6 +103,23 @@ export function getSavedTheme(): string {
   }
 }

+/** Resolves "Follow OS" to a concrete theme id by inspecting the OS color-scheme preference. */
+function resolveFollowOS(): 'original' | 'light' {
+  if (typeof window === 'undefined' || typeof window.matchMedia !== 'function') {
+    return 'original';
+  }
+  return window.matchMedia('(prefers-color-scheme: light)').matches ? 'light' : 'original';
+}
+
+/**
+ * Returns the concrete theme id currently applied to the document.
+ * Unlike getSavedTheme, this resolves 'follow-os' to 'original' or 'light'.
+ */
+export function getEffectiveTheme(): string {
+  const saved = getSavedTheme();
+  return saved === FOLLOW_OS_THEME_ID ? resolveFollowOS() : saved;
+}
+
 export function applyTheme(themeId: string): void {
   try {
     localStorage.setItem(THEME_KEY, themeId);
@@ -101,14 +127,16 @@ export function applyTheme(themeId: string): void {
     // localStorage may be unavailable
   }

-  if (themeId === 'original') {
+  const effective = themeId === FOLLOW_OS_THEME_ID ? resolveFollowOS() : themeId;
+
+  if (effective === 'original') {
     delete document.documentElement.dataset.theme;
   } else {
-    document.documentElement.dataset.theme = themeId;
+    document.documentElement.dataset.theme = effective;
   }

-  // Update PWA theme-color meta tag
-  const theme = THEMES.find((t) => t.id === themeId);
+  // Update PWA theme-color meta tag — reflect the effective (rendered) theme.
+  const theme = THEMES.find((t) => t.id === effective);
   if (theme) {
     const meta = document.querySelector('meta[name="theme-color"]');
     if (meta) {
@@ -117,6 +145,33 @@ export function applyTheme(themeId: string): void {
   }

   if (typeof window !== 'undefined') {
+    // Detail is the saved theme id (including 'follow-os'); listeners that need
+    // the rendered appearance should call getEffectiveTheme().
     window.dispatchEvent(new CustomEvent(THEME_CHANGE_EVENT, { detail: themeId }));
   }
 }
+
+let followOSInitialized = false;
+
+/**
+ * Installs a one-time listener on prefers-color-scheme so that when the user is
+ * on "Follow OS", OS appearance changes re-apply the theme. Safe to call once
+ * from app bootstrap.
+ */
+export function initFollowOSListener(): void {
+  if (followOSInitialized) return;
+  if (typeof window === 'undefined' || typeof window.matchMedia !== 'function') return;
+  followOSInitialized = true;
+  const mql = window.matchMedia('(prefers-color-scheme: light)');
+  const handler = () => {
+    if (getSavedTheme() === FOLLOW_OS_THEME_ID) {
+      applyTheme(FOLLOW_OS_THEME_ID);
+    }
+  };
+  if (typeof mql.addEventListener === 'function') {
+    mql.addEventListener('change', handler);
+  } else if (typeof (mql as MediaQueryList).addListener === 'function') {
+    // Safari < 14 fallback
+    (mql as MediaQueryList).addListener(handler);
+  }
+}

+2
-1
@@ -1,6 +1,6 @@
 [project]
 name = "remoteterm-meshcore"
-version = "3.11.2"
+version = "3.11.3"
 description = "RemoteTerm - Web interface for MeshCore radio mesh networks"
 readme = "README.md"
 requires-python = ">=3.11"
@@ -16,6 +16,7 @@ dependencies = [
    "aiomqtt>=2.0",
    "apprise>=1.9.8",
    "boto3>=1.38.0",
+   "pywebpush>=0.14.0",
 ]

 [project.optional-dependencies]

+17
-2
@@ -28,13 +28,28 @@ def cleanup_test_db_dir():
 @pytest.fixture
 async def test_db():
     """Create an in-memory test database with schema + migrations."""
-    from app.repository import channels, contacts, messages, raw_packets, settings
+    from app.repository import (
+        channels,
+        contacts,
+        messages,
+        raw_packets,
+        repeater_telemetry,
+        settings,
+    )
     from app.repository import fanout as fanout_repo

     db = Database(":memory:")
     await db.connect()

-    submodules = [contacts, channels, messages, raw_packets, settings, fanout_repo]
+    submodules = [
+        contacts,
+        channels,
+        messages,
+        raw_packets,
+        settings,
+        fanout_repo,
+        repeater_telemetry,
+    ]
     originals = [(mod, mod.db) for mod in submodules]

     for mod in submodules:

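The `test_db` fixture above swaps a shared in-memory database into each repository submodule's module-level `db` attribute and keeps `(module, original)` pairs so teardown can restore them. A minimal standalone sketch of that save/swap/restore pattern, using `SimpleNamespace` stand-ins for the real repository modules (the helper names here are illustrative, not the fixture's actual API):

```python
from types import SimpleNamespace


def swap_attr(modules, name, value):
    """Set `name` on every module, returning (module, original) pairs for restore."""
    originals = [(mod, getattr(mod, name)) for mod in modules]
    for mod in modules:
        setattr(mod, name, value)
    return originals


def restore_attr(originals, name):
    """Undo a previous swap_attr call."""
    for mod, original in originals:
        setattr(mod, name, original)


# Stand-ins for repository submodules, each holding a module-level db handle.
contacts = SimpleNamespace(db="real-db")
messages = SimpleNamespace(db="real-db")

originals = swap_attr([contacts, messages], "db", "test-db")
assert contacts.db == "test-db" and messages.db == "test-db"

restore_attr(originals, "db")
assert contacts.db == "real-db" and messages.db == "real-db"
```

Capturing the originals before mutating is what lets the fixture stay safe even when the submodule list grows (as it does here with `repeater_telemetry`).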
@@ -105,13 +105,15 @@ class TestCreateContact:
         data = response.json()
         assert data["public_key"] == KEY_A
         assert data["name"] == "NewContact"
-        assert data["last_seen"] is not None
+        # Manually created contacts have no RF observation yet, so last_seen
+        # stays NULL until we actually hear them on the air.
+        assert data["last_seen"] is None

         # Verify in DB
         contact = await ContactRepository.get_by_key(KEY_A)
         assert contact is not None
         assert contact.name == "NewContact"
-        assert data["last_seen"] == contact.last_seen
+        assert contact.last_seen is None
         mock_broadcast.assert_called_once_with("contact", contact.model_dump())

     @pytest.mark.asyncio

@@ -1134,12 +1134,14 @@ class TestOnNewContact:

         await on_new_contact(MockEvent())

-        # Verify contact was created in real DB
+        # Verify contact was created in real DB. NEW_CONTACT is the radio's
+        # stored contact DB, not an RF observation, so last_seen stays NULL
+        # until we actually hear the contact on the air.
         contact = await ContactRepository.get_by_key("cc" * 32)
         assert contact is not None
         assert contact.name == "Charlie"
         assert contact.on_radio is False
-        assert contact.last_seen == 1700000000
+        assert contact.last_seen is None

         mock_broadcast.assert_called_once()
         event_type, contact_data = mock_broadcast.call_args[0]

@@ -69,7 +69,12 @@ def test_valid_dist_serves_static_and_spa_fallback(tmp_path):
     assert manifest["scope"] == "http://testserver/"
     assert manifest["id"] == "http://testserver/"
     assert manifest["display"] == "standalone"
-    assert manifest["icons"][0]["src"] == "http://testserver/web-app-manifest-192x192.png"
+    icon_srcs = {icon["src"] for icon in manifest["icons"]}
+    assert "http://testserver/web-app-manifest-192x192.png" in icon_srcs
+    assert "http://testserver/web-app-manifest-512x512.png" in icon_srcs
+    # SVG icons cause inconsistent PWA icon rendering on iOS; the manifest
+    # must be PNG-only.
+    assert all(icon["type"] == "image/png" for icon in manifest["icons"])

     file_response = client.get("/robots.txt")
     assert file_response.status_code == 200
@@ -152,7 +157,9 @@ def test_webmanifest_includes_forwarded_prefix(tmp_path):
     assert data["start_url"] == expected_base
     assert data["scope"] == expected_base
     assert data["id"] == expected_base
-    assert data["icons"][0]["src"] == f"{expected_base}web-app-manifest-192x192.png"
+    icon_srcs = {icon["src"] for icon in data["icons"]}
+    assert f"{expected_base}web-app-manifest-192x192.png" in icon_srcs
+    assert f"{expected_base}web-app-manifest-512x512.png" in icon_srcs


 def test_first_available_prefers_dist_over_prebuilt(tmp_path):

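The two manifest tests above assert the same invariant against different base URLs: both PNG sizes must be present and no icon may be SVG. Sketched as one reusable check (a sketch only; the repo's tests inline these assertions rather than defining this helper):

```python
def manifest_icons_ok(manifest: dict, base: str) -> bool:
    """Both PNG sizes must be present and every icon must be PNG (no SVG)."""
    icons = manifest.get("icons", [])
    srcs = {icon["src"] for icon in icons}
    return (
        f"{base}web-app-manifest-192x192.png" in srcs
        and f"{base}web-app-manifest-512x512.png" in srcs
        and all(icon.get("type") == "image/png" for icon in icons)
    )


manifest = {
    "icons": [
        {"src": "http://testserver/web-app-manifest-192x192.png", "type": "image/png"},
        {"src": "http://testserver/web-app-manifest-512x512.png", "type": "image/png"},
    ]
}
assert manifest_icons_ok(manifest, "http://testserver/")
```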
@@ -479,7 +479,7 @@ class TestLiveSend:

     @pytest.mark.asyncio
     async def test_live_send_defaults_to_map_url(self):
-        """Empty api_url should default to the map.meshcore.dev endpoint."""
+        """Empty api_url should default to the map.meshcore.io endpoint."""
         mod = _make_module({"dry_run": False, "api_url": ""})
         await mod.start()

@@ -2,4 +2,4 @@
 # run ``run_migrations`` to completion assert ``get_version == LATEST`` and
 # ``applied == LATEST - starting_version`` so only this constant needs to
 # change, not every individual assertion.
-LATEST_SCHEMA_VERSION = 56
+LATEST_SCHEMA_VERSION = 58

@@ -322,7 +322,7 @@ class TestUndecryptedTextPacketStreaming:
             [],
         ]

-        async def fake_execute(*_args, **_kwargs):
+        def fake_execute(*_args, **_kwargs):
             batch = batches.pop(0)

             class FakeCursor:
@@ -332,6 +332,16 @@ class TestUndecryptedTextPacketStreaming:
                 async def close(self):
                     pass

+                async def __aenter__(self):
+                    return self
+
+                async def __aexit__(self, exc_type, exc, tb):
+                    return None
+
+            # aiosqlite's execute() returns a `contextmanager`-decorated
+            # coroutine that is both awaitable and usable as an async-with.
+            # Our repo code now uses `async with conn.execute(...) as cursor:`,
+            # so the mock just needs to return something with __aenter__/__aexit__.
             return FakeCursor()

         with patch.object(test_db.conn, "execute", side_effect=fake_execute):

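The mock change above works because `async with x:` only requires `x` to expose `__aenter__`/`__aexit__`; the fake no longer needs to be awaitable once the repository code switched to the async-with form. A standalone sketch of such a minimal async-context cursor (names are illustrative, not the project's classes):

```python
import asyncio


class FakeCursor:
    """Bare-minimum object usable as `async with conn.execute(...) as cursor:`."""

    def __init__(self, rows):
        self._rows = rows

    async def fetchall(self):
        return self._rows

    async def __aenter__(self):
        # Entering the async context yields the cursor itself.
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Returning None (falsy) means exceptions are not suppressed.
        return None


async def main():
    async with FakeCursor([(1,), (2,)]) as cursor:
        return await cursor.fetchall()


rows = asyncio.run(main())  # rows == [(1,), (2,)]
```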
@@ -0,0 +1,74 @@
+"""Tests for Web Push delivery transport behavior."""
+
+from types import SimpleNamespace
+from unittest.mock import patch
+
+import pytest
+import requests
+
+from app.push.send import (
+    DEFAULT_PUSH_CONNECT_TIMEOUT_SECONDS,
+    DEFAULT_PUSH_READ_TIMEOUT_SECONDS,
+    IPV4_FALLBACK_CONNECT_TIMEOUT_SECONDS,
+    IPv4HTTPAdapter,
+    send_push,
+)
+
+
+@pytest.mark.asyncio
+async def test_send_push_prefers_default_dual_stack_session_before_any_ipv4_fallback():
+    """Successful sends should use the normal requests transport without forcing IPv4."""
+    captured_kwargs: dict = {}
+
+    def fake_webpush(**kwargs):
+        captured_kwargs.update(kwargs)
+        return SimpleNamespace(status_code=201)
+
+    with patch("app.push.send.webpush", side_effect=fake_webpush):
+        status = await send_push(
+            subscription_info={"endpoint": "https://push.example.test", "keys": {}},
+            payload='{"message":"hello"}',
+            vapid_private_key="private-key",
+            vapid_claims={"sub": "mailto:test@example.com"},
+        )
+
+    assert status == 201
+    session = captured_kwargs["requests_session"]
+    assert not isinstance(session.adapters["https://"], IPv4HTTPAdapter)
+    assert captured_kwargs["timeout"] == (
+        DEFAULT_PUSH_CONNECT_TIMEOUT_SECONDS,
+        DEFAULT_PUSH_READ_TIMEOUT_SECONDS,
+    )
+
+
+@pytest.mark.asyncio
+async def test_send_push_retries_with_ipv4_session_after_connect_timeout():
+    """Connect failures should retry through the isolated IPv4-only transport."""
+    calls: list[dict] = []
+
+    def fake_webpush(**kwargs):
+        calls.append(kwargs)
+        if len(calls) == 1:
+            raise requests.exceptions.ConnectTimeout("ipv6 connect timed out")
+        return SimpleNamespace(status_code=201)
+
+    with patch("app.push.send.webpush", side_effect=fake_webpush):
+        status = await send_push(
+            subscription_info={"endpoint": "https://push.example.test", "keys": {}},
+            payload='{"message":"hello"}',
+            vapid_private_key="private-key",
+            vapid_claims={"sub": "mailto:test@example.com"},
+        )
+
+    assert status == 201
+    assert len(calls) == 2
+    assert not isinstance(calls[0]["requests_session"].adapters["https://"], IPv4HTTPAdapter)
+    assert isinstance(calls[1]["requests_session"].adapters["https://"], IPv4HTTPAdapter)
+    assert calls[0]["timeout"] == (
+        DEFAULT_PUSH_CONNECT_TIMEOUT_SECONDS,
+        DEFAULT_PUSH_READ_TIMEOUT_SECONDS,
+    )
+    assert calls[1]["timeout"] == (
+        IPV4_FALLBACK_CONNECT_TIMEOUT_SECONDS,
+        DEFAULT_PUSH_READ_TIMEOUT_SECONDS,
+    )

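The two transport tests above pin a "try dual-stack first, retry once over IPv4 on connect timeout" shape. The retry decision in isolation looks roughly like this (a sketch only: `ConnectTimeout` here is a stdlib stand-in for `requests.exceptions.ConnectTimeout`, and the function names are illustrative, not `app.push.send`'s actual API):

```python
class ConnectTimeout(Exception):
    """Stand-in for requests.exceptions.ConnectTimeout."""


def send_with_ipv4_fallback(transport, payload):
    """Try the default dual-stack transport; on a connect timeout
    (e.g. broken IPv6 at the push service), retry once IPv4-only."""
    try:
        return transport(payload, ipv4_only=False)
    except ConnectTimeout:
        return transport(payload, ipv4_only=True)


calls = []


def flaky_transport(payload, ipv4_only):
    # Simulate an endpoint whose IPv6 path hangs but whose IPv4 path works.
    calls.append(ipv4_only)
    if not ipv4_only:
        raise ConnectTimeout("ipv6 connect timed out")
    return 201


assert send_with_ipv4_fallback(flaky_transport, b"hello") == 201
assert calls == [False, True]  # dual-stack attempt first, IPv4 retry second
```

Keeping the fallback on a separate session (as the real tests assert via `IPv4HTTPAdapter`) avoids polluting the default transport for endpoints where dual-stack works fine.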
+421
-7
@@ -377,14 +377,22 @@ class TestSyncRecentContactsToRadio:
         assert result["loaded"] == 2

     @pytest.mark.asyncio
-    async def test_fills_remaining_slots_with_recently_contacted_then_advertised(self, test_db):
-        """Fill order is favorites, then recent contacts, then recent adverts."""
-        await _insert_contact(KEY_A, "Alice", last_contacted=100)
-        await _insert_contact(KEY_B, "Bob", last_contacted=2000)
-        await _insert_contact("cc" * 32, "Carol", last_contacted=1000)
+    async def test_fills_remaining_slots_with_dm_active_then_advertised(self, test_db):
+        """Fill order is favorites, then DM-active contacts, then recent adverts."""
+        await _insert_contact(KEY_A, "Alice")
+        await _insert_contact(KEY_B, "Bob")
+        await _insert_contact("cc" * 32, "Carol")
         await _insert_contact("dd" * 32, "Dave", last_advert=3000)
         await _insert_contact("ee" * 32, "Eve", last_advert=2500)

+        # Create DM activity for Alice (oldest), Bob (most recent), Carol (middle)
+        for key, ts in [(KEY_A, 100), (KEY_B, 2000), ("cc" * 32, 1000)]:
+            await test_db.conn.execute(
+                "INSERT INTO messages (type, conversation_key, text, received_at) VALUES ('PRIV', ?, 'hi', ?)",
+                (key, ts),
+            )
+        await test_db.conn.commit()
+
         await AppSettingsRepository.update(max_radio_contacts=5)
         await ContactRepository.set_favorite(KEY_A, True)

@@ -401,6 +409,7 @@ class TestSyncRecentContactsToRadio:
         loaded_keys = [
             call.args[0]["public_key"] for call in mock_mc.commands.add_contact.call_args_list
         ]
+        # Alice (favorite), then Bob & Carol (DM-active, most recent first), then Dave (advert)
         assert loaded_keys == [KEY_A, KEY_B, "cc" * 32, "dd" * 32]

     @pytest.mark.asyncio
@@ -509,8 +518,15 @@ class TestSyncAndOffloadAll:
     @pytest.mark.asyncio
     async def test_duplicate_favorite_not_loaded_twice(self, test_db):
         """Duplicate favorite entries still load the contact only once."""
-        await _insert_contact(KEY_A, "Alice", last_contacted=2000)
-        await _insert_contact(KEY_B, "Bob", last_contacted=1000)
+        await _insert_contact(KEY_A, "Alice")
+        await _insert_contact(KEY_B, "Bob")
+
+        # Bob has DM activity so he appears in tier 2
+        await test_db.conn.execute(
+            "INSERT INTO messages (type, conversation_key, text, received_at) VALUES ('PRIV', ?, 'hi', 1000)",
+            (KEY_B,),
+        )
+        await test_db.conn.commit()

         await AppSettingsRepository.update(max_radio_contacts=2)
         await ContactRepository.set_favorite(KEY_A, True)
@@ -1862,3 +1878,401 @@ class TestCollectRepeaterTelemetryLpp:
         await _collect_repeater_telemetry(mc, contact)

         assert "lpp_sensors" not in recorded_data
+
+
+# ---------------------------------------------------------------------------
+# _telemetry_collect_loop — UTC modulo scheduler
+# ---------------------------------------------------------------------------
+
+
+class TestTelemetryCollectSchedulerDecision:
+    """Verify the scheduler's run/skip decision at an hourly wake.
+
+    We test the decision logic by stubbing the sleep + datetime functions
+    and asserting ``_run_telemetry_cycle`` is called exactly on matching
+    hours. Full end-to-end of the loop is covered implicitly by the
+    existing telemetry-collect tests; what we're pinning here is the
+    hour-modulo gate the new scheduler depends on.
+    """
+
+    @pytest.mark.asyncio
+    async def test_skips_when_hour_modulo_mismatch(self):
+        """At 09:00 UTC with interval 8h, the loop must NOT run a cycle."""
+        from unittest.mock import AsyncMock, patch
+
+        from app import radio_sync
+        from app.models import AppSettings
+
+        settings = AppSettings(
+            tracked_telemetry_repeaters=["aa" * 32],
+            telemetry_interval_hours=8,
+        )
+        ran = False
+
+        async def fake_cycle():
+            nonlocal ran
+            ran = True
+
+        def make_fake_datetime(hour: int):
+            class FakeDatetime:
+                @classmethod
+                def now(cls, tz=None):
+                    import datetime as real_datetime
+
+                    return real_datetime.datetime(2026, 4, 16, hour, 0, 0, tzinfo=real_datetime.UTC)
+
+            return FakeDatetime
+
+        sleep_count = 0
+
+        async def fake_sleep(_duration):
+            # The loop does: (1) initial-delay sleep, (2) sleep-to-top-of-hour,
+            # then evaluates the run/skip decision. Allow both sleeps to
+            # pass, then cancel on the 3rd (next iteration's top-of-hour sleep).
+            nonlocal sleep_count
+            sleep_count += 1
+            if sleep_count >= 3:
+                raise asyncio.CancelledError()
+
+        with (
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=settings,
+            ),
+            patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
+            patch("app.radio_sync.asyncio.sleep", new=fake_sleep),
+            patch("app.radio_sync.datetime", new=make_fake_datetime(9)),
+        ):
+            try:
+                await radio_sync._telemetry_collect_loop()
+            except asyncio.CancelledError:
+                pass
+
+        assert ran is False, "09:00 UTC is not a multiple of 8h; cycle must not run"
+
+    @pytest.mark.asyncio
+    async def test_runs_when_hour_modulo_matches(self):
+        """At 16:00 UTC with interval 8h, the loop must run a cycle."""
+        from unittest.mock import AsyncMock, patch
+
+        from app import radio_sync
+        from app.models import AppSettings
+
+        settings = AppSettings(
+            tracked_telemetry_repeaters=["aa" * 32],
+            telemetry_interval_hours=8,
+        )
+        ran = False
+
+        async def fake_cycle():
+            nonlocal ran
+            ran = True
+
+        class FakeDatetime:
+            @classmethod
+            def now(cls, tz=None):
+                import datetime as real_datetime
+
+                return real_datetime.datetime(2026, 4, 16, 16, 0, 0, tzinfo=real_datetime.UTC)
+
+        sleep_count = 0
+
+        async def fake_sleep(_duration):
+            # Let the loop's initial-delay + top-of-hour sleeps pass; cancel
+            # on the third sleep (next iteration's top-of-hour wake).
+            nonlocal sleep_count
+            sleep_count += 1
+            if sleep_count >= 3:
+                raise asyncio.CancelledError()
+
+        with (
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=settings,
+            ),
+            patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
+            patch("app.radio_sync.asyncio.sleep", new=fake_sleep),
+            patch("app.radio_sync.datetime", new=FakeDatetime),
+        ):
+            try:
+                await radio_sync._telemetry_collect_loop()
+            except asyncio.CancelledError:
+                pass
+
+        assert ran is True, "16:00 UTC is a multiple of 8h; cycle must run"
+
+    @pytest.mark.asyncio
+    async def test_skips_when_no_repeaters_tracked(self):
+        """Empty tracked list short-circuits regardless of modulo match."""
+        from unittest.mock import AsyncMock, patch
+
+        from app import radio_sync
+        from app.models import AppSettings
+
+        settings = AppSettings(tracked_telemetry_repeaters=[], telemetry_interval_hours=8)
+        ran = False
+
+        async def fake_cycle():
+            nonlocal ran
+            ran = True
+
+        class FakeDatetime:
+            @classmethod
+            def now(cls, tz=None):
+                import datetime as real_datetime
+
+                return real_datetime.datetime(2026, 4, 16, 16, 0, 0, tzinfo=real_datetime.UTC)
|
||||||
|
|
||||||
|
sleep_count = 0
|
||||||
|
|
||||||
|
async def fake_sleep(_duration):
|
||||||
|
# Let the loop's initial-delay + top-of-hour sleeps pass; cancel
|
||||||
|
# on the third sleep (next iteration's top-of-hour wake).
|
||||||
|
nonlocal sleep_count
|
||||||
|
sleep_count += 1
|
||||||
|
if sleep_count >= 3:
|
||||||
|
raise asyncio.CancelledError()
|
||||||
|
|
||||||
|
with (
|
||||||
|
patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=settings,
|
||||||
|
),
|
||||||
|
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
|
||||||
|
patch("app.radio_sync.asyncio.sleep", new=fake_sleep),
|
||||||
|
patch("app.radio_sync.datetime", new=FakeDatetime),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
await radio_sync._telemetry_collect_loop()
|
||||||
|
except asyncio.CancelledError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
assert ran is False, "No tracked repeaters: no cycle regardless of hour"
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_runs_on_boundary_immediately_after_initial_delay(self):
|
||||||
|
"""Regression test: if the post-boot initial delay finishes inside a
|
||||||
|
matching hour, the cycle must run even if the first
|
||||||
|
sleep-to-next-top-of-hour would otherwise carry us past the boundary.
|
||||||
|
|
||||||
|
Scenario: server starts at 23:59:30 UTC with a 24-hour interval. The
|
||||||
|
60-second boot guard pushes the first check into 00:00:30 — a matching
|
||||||
|
hour that we must NOT skip. Before the fix, the loop went straight to
|
||||||
|
sleeping until 01:00 and then failing the modulo, missing the entire
|
||||||
|
day's only scheduled collection.
|
||||||
|
"""
|
||||||
|
from unittest.mock import AsyncMock, patch
|
||||||
|
|
||||||
|
from app import radio_sync
|
||||||
|
from app.models import AppSettings
|
||||||
|
|
||||||
|
settings = AppSettings(
|
||||||
|
tracked_telemetry_repeaters=["aa" * 32],
|
||||||
|
telemetry_interval_hours=24, # daily cadence; only matching hour is 00
|
||||||
|
)
|
||||||
|
ran = False
|
||||||
|
|
||||||
|
async def fake_cycle():
|
||||||
|
nonlocal ran
|
||||||
|
ran = True
|
||||||
|
|
||||||
|
class FakeDatetime:
|
||||||
|
@classmethod
|
||||||
|
def now(cls, tz=None):
|
||||||
|
import datetime as real_datetime
|
||||||
|
|
||||||
|
# Simulates "initial delay just ended at 00:00:30 UTC on a
|
||||||
|
# restart that began at 23:59:30." Without the post-boot
|
||||||
|
# boundary check, the loop would have skipped this.
|
||||||
|
return real_datetime.datetime(2026, 4, 16, 0, 0, 30, tzinfo=real_datetime.UTC)
|
||||||
|
|
||||||
|
sleep_count = 0
|
||||||
|
|
||||||
|
async def fake_sleep(_duration):
|
||||||
|
# Let the initial delay pass, then cancel before the first
|
||||||
|
# top-of-hour sleep so we isolate the post-boot check as the
|
||||||
|
# only opportunity to run.
|
||||||
|
nonlocal sleep_count
|
||||||
|
sleep_count += 1
|
||||||
|
if sleep_count >= 2:
|
||||||
|
raise asyncio.CancelledError()
|
||||||
|
|
||||||
|
with (
|
||||||
|
patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=settings,
|
||||||
|
),
|
||||||
|
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
|
||||||
|
patch("app.radio_sync.asyncio.sleep", new=fake_sleep),
|
||||||
|
patch("app.radio_sync.datetime", new=FakeDatetime),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
await radio_sync._telemetry_collect_loop()
|
||||||
|
except asyncio.CancelledError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
assert ran is True, (
|
||||||
|
"Post-boot check must fire the due 00:00 cycle; otherwise a "
|
||||||
|
"restart near midnight suppresses the whole day's collection."
|
||||||
|
)
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_clamps_up_when_preferred_illegal_for_current_count(self):
|
||||||
|
"""5 tracked repeaters with saved pref 1h: scheduler should use 6h.
|
||||||
|
|
||||||
|
At 02:00 UTC: 2 % 6 == 2 (not a run), so cycle must not fire.
|
||||||
|
If clamping were skipped, 2 % 1 == 0 and cycle would incorrectly run.
|
||||||
|
"""
|
||||||
|
from unittest.mock import AsyncMock, patch
|
||||||
|
|
||||||
|
from app import radio_sync
|
||||||
|
from app.models import AppSettings
|
||||||
|
|
||||||
|
settings = AppSettings(
|
||||||
|
tracked_telemetry_repeaters=["aa" * 32] * 5,
|
||||||
|
telemetry_interval_hours=1, # illegal at N=5; shortest legal is 6h
|
||||||
|
)
|
||||||
|
ran = False
|
||||||
|
|
||||||
|
async def fake_cycle():
|
||||||
|
nonlocal ran
|
||||||
|
ran = True
|
||||||
|
|
||||||
|
class FakeDatetime:
|
||||||
|
@classmethod
|
||||||
|
def now(cls, tz=None):
|
||||||
|
import datetime as real_datetime
|
||||||
|
|
||||||
|
return real_datetime.datetime(2026, 4, 16, 2, 0, 0, tzinfo=real_datetime.UTC)
|
||||||
|
|
||||||
|
sleep_count = 0
|
||||||
|
|
||||||
|
async def fake_sleep(_duration):
|
||||||
|
# Let the loop's initial-delay + top-of-hour sleeps pass; cancel
|
||||||
|
# on the third sleep (next iteration's top-of-hour wake).
|
||||||
|
nonlocal sleep_count
|
||||||
|
sleep_count += 1
|
||||||
|
if sleep_count >= 3:
|
||||||
|
raise asyncio.CancelledError()
|
||||||
|
|
||||||
|
with (
|
||||||
|
patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=settings,
|
||||||
|
),
|
||||||
|
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
|
||||||
|
patch("app.radio_sync.asyncio.sleep", new=fake_sleep),
|
||||||
|
patch("app.radio_sync.datetime", new=FakeDatetime),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
await radio_sync._telemetry_collect_loop()
|
||||||
|
except asyncio.CancelledError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
assert ran is False, (
|
||||||
|
"Clamping to 6h must prevent the 02:00 run that 1h cadence would've triggered"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# get_contacts_selected_for_radio_sync — DM-active prioritization
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
|
||||||
|
class TestContactSelectionDmActive:
|
||||||
|
"""Verify that tier 2 prioritizes contacts with recent DM activity."""
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_incoming_dm_contact_selected_over_advert_only(self, test_db):
|
||||||
|
"""A contact who sent us a DM should be prioritized over one who only advertised."""
|
||||||
|
from app.radio_sync import get_contacts_selected_for_radio_sync
|
||||||
|
|
||||||
|
# Create two non-repeater contacts
|
||||||
|
dm_sender_key = "aa" * 32
|
||||||
|
advert_only_key = "bb" * 32
|
||||||
|
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO contacts (public_key, name, type, last_seen, last_advert) VALUES (?, ?, 1, 100, 100)",
|
||||||
|
(dm_sender_key, "DM Sender"),
|
||||||
|
)
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO contacts (public_key, name, type, last_seen, last_advert) VALUES (?, ?, 1, 200, 200)",
|
||||||
|
(advert_only_key, "Advert Only"),
|
||||||
|
)
|
||||||
|
|
||||||
|
# DM Sender sent us a message (incoming DM)
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO messages (type, conversation_key, text, received_at) VALUES ('PRIV', ?, 'hello', 300)",
|
||||||
|
(dm_sender_key,),
|
||||||
|
)
|
||||||
|
await test_db.conn.commit()
|
||||||
|
|
||||||
|
with patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=MagicMock(max_radio_contacts=200, tracked_telemetry_repeaters=[]),
|
||||||
|
):
|
||||||
|
selected = await get_contacts_selected_for_radio_sync()
|
||||||
|
|
||||||
|
keys = [c.public_key for c in selected]
|
||||||
|
assert dm_sender_key in keys
|
||||||
|
assert advert_only_key in keys
|
||||||
|
# DM Sender should come before Advert Only (tier 2 before tier 3)
|
||||||
|
assert keys.index(dm_sender_key) < keys.index(advert_only_key)
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_outgoing_dm_contact_also_selected(self, test_db):
|
||||||
|
"""A contact we sent a DM to should also appear via DM-active tier."""
|
||||||
|
from app.radio_sync import get_contacts_selected_for_radio_sync
|
||||||
|
|
||||||
|
contact_key = "cc" * 32
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO contacts (public_key, name, type) VALUES (?, ?, 1)",
|
||||||
|
(contact_key, "Outgoing Target"),
|
||||||
|
)
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES ('PRIV', ?, 'hey', 300, 1)",
|
||||||
|
(contact_key,),
|
||||||
|
)
|
||||||
|
await test_db.conn.commit()
|
||||||
|
|
||||||
|
with patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=MagicMock(max_radio_contacts=200, tracked_telemetry_repeaters=[]),
|
||||||
|
):
|
||||||
|
selected = await get_contacts_selected_for_radio_sync()
|
||||||
|
|
||||||
|
keys = [c.public_key for c in selected]
|
||||||
|
assert contact_key in keys
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_repeaters_excluded_from_dm_active_tier(self, test_db):
|
||||||
|
"""Repeater contacts should not appear in tier 2 even with DM activity."""
|
||||||
|
from app.radio_sync import get_contacts_selected_for_radio_sync
|
||||||
|
|
||||||
|
repeater_key = "dd" * 32
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO contacts (public_key, name, type) VALUES (?, ?, 2)",
|
||||||
|
(repeater_key, "Repeater"),
|
||||||
|
)
|
||||||
|
await test_db.conn.execute(
|
||||||
|
"INSERT INTO messages (type, conversation_key, text, received_at) VALUES ('PRIV', ?, 'cmd', 300)",
|
||||||
|
(repeater_key,),
|
||||||
|
)
|
||||||
|
await test_db.conn.commit()
|
||||||
|
|
||||||
|
with patch(
|
||||||
|
"app.radio_sync.AppSettingsRepository.get",
|
||||||
|
new_callable=AsyncMock,
|
||||||
|
return_value=MagicMock(max_radio_contacts=200, tracked_telemetry_repeaters=[]),
|
||||||
|
):
|
||||||
|
selected = await get_contacts_selected_for_radio_sync()
|
||||||
|
|
||||||
|
keys = [c.public_key for c in selected]
|
||||||
|
assert repeater_key not in keys
|
||||||
|
+214 -24
@@ -1,11 +1,12 @@
 """Tests for repository layer."""
 
-from unittest.mock import AsyncMock, MagicMock, patch
+from unittest.mock import patch
 
 import pytest
 
 from app.models import Contact, ContactUpsert
 from app.repository import (
+    AppSettingsRepository,
     ContactAdvertPathRepository,
     ContactNameHistoryRepository,
     ContactRepository,
@@ -613,37 +614,103 @@ class TestAppSettingsRepository:
     """Test AppSettingsRepository parsing and migration edge cases."""
 
     @pytest.mark.asyncio
-    async def test_get_handles_corrupted_json_and_invalid_sort_order(self):
-        """Corrupted JSON fields are recovered with safe defaults."""
-        mock_conn = AsyncMock()
-        mock_cursor = AsyncMock()
-        mock_cursor.fetchone = AsyncMock(
-            return_value={
-                "max_radio_contacts": 250,
-                "auto_decrypt_dm_on_advert": 1,
-                "last_message_times": "{also-not-json",
-                "advert_interval": None,
-                "last_advert_time": None,
-                "flood_scope": "",
-                "blocked_keys": "[]",
-                "blocked_names": "[]",
-                "discovery_blocked_types": "[]",
-            }
-        )
-        mock_conn.execute = AsyncMock(return_value=mock_cursor)
-        mock_db = MagicMock()
-        mock_db.conn = mock_conn
-
-        with patch("app.repository.settings.db", mock_db):
-            from app.repository import AppSettingsRepository
-
-            settings = await AppSettingsRepository.get()
+    async def test_get_handles_corrupted_json_and_invalid_sort_order(self, test_db):
+        """Corrupted JSON fields are recovered with safe defaults.
+
+        Uses the real DB so it exercises the lock-aware path. We stuff
+        malformed JSON directly into the row, then verify ``get()`` recovers
+        with defaults rather than propagating a parse error.
+        """
+        await test_db.conn.execute(
+            """
+            UPDATE app_settings
+            SET max_radio_contacts = 250,
+                auto_decrypt_dm_on_advert = 1,
+                last_message_times = '{also-not-json',
+                advert_interval = NULL,
+                last_advert_time = NULL,
+                flood_scope = '',
+                blocked_keys = '[]',
+                blocked_names = '[]',
+                discovery_blocked_types = '[]'
+            WHERE id = 1
+            """
+        )
+        await test_db.conn.commit()
+
+        settings = await AppSettingsRepository.get()
 
         assert settings.max_radio_contacts == 250
         assert settings.last_message_times == {}
         assert settings.advert_interval == 0
         assert settings.last_advert_time == 0
+
+    @pytest.mark.asyncio
+    async def test_get_in_conn_tolerates_missing_columns(self):
+        """Defend against partial migrations where columns added by later
+        migrations are absent from the row.
+
+        Real DBs can't produce this state (schema init + migrations always
+        run to the latest version on startup), but hand-rolled snapshots,
+        external DB tools, or interrupted migrations might. The
+        ``KeyError``-catching branches in ``_get_in_conn`` exist specifically
+        to guarantee graceful degradation.
+
+        We test these directly by mocking the connection boundary with a
+        dict-backed row that mimics a pre-migration snapshot missing:
+        - ``tracked_telemetry_repeaters`` (migration 53)
+        - ``auto_resend_channel`` (migration 54)
+        - ``telemetry_interval_hours`` (migration 57)
+        """
+        from unittest.mock import MagicMock
+
+        from app.telemetry_interval import DEFAULT_TELEMETRY_INTERVAL_HOURS
+
+        # sqlite3.Row raises KeyError for missing columns when accessed by
+        # name, which is what we want to simulate. We mimic that here with a
+        # dict-backed object whose __getitem__ raises KeyError for absent
+        # keys (dict.__getitem__ already does this).
+        class PartialRow(dict):
+            def keys(self):  # pragma: no cover - aiosqlite.Row compat
+                return super().keys()
+
+        partial_row = PartialRow(
+            {
+                "max_radio_contacts": 123,
+                "auto_decrypt_dm_on_advert": 1,
+                "last_message_times": "{}",
+                "advert_interval": 0,
+                "last_advert_time": 0,
+                "flood_scope": "",
+                "blocked_keys": "[]",
+                "blocked_names": "[]",
+                "discovery_blocked_types": "[]",
+                # intentionally missing: tracked_telemetry_repeaters,
+                # auto_resend_channel, telemetry_interval_hours
+            }
+        )
+
+        class FakeCursor:
+            async def fetchone(self):
+                return partial_row
+
+            async def __aenter__(self):
+                return self
+
+            async def __aexit__(self, exc_type, exc, tb):
+                return None
+
+        mock_conn = MagicMock()
+        mock_conn.execute = MagicMock(return_value=FakeCursor())
+
+        settings = await AppSettingsRepository._get_in_conn(mock_conn)
+
+        assert settings.max_radio_contacts == 123
+        # Missing-column defaults kick in:
+        assert settings.tracked_telemetry_repeaters == []
+        assert settings.auto_resend_channel is False
+        assert settings.telemetry_interval_hours == DEFAULT_TELEMETRY_INTERVAL_HOURS
 
 
 class TestMessageRepositoryGetById:
     """Test MessageRepository.get_by_id method."""
@@ -697,3 +764,126 @@ class TestContactRepositoryUpsertContracts:
         assert contact.name == "Bob"
         assert contact.type == 2
         assert contact.on_radio is True
+
+
+class TestContactRepositoryLastSeenSemantics:
+    """Guard the 'last_seen = last RF reception' contract.
+
+    Radio-driven contact-DB syncs must not clobber an earlier real RF timestamp,
+    and callers that don't supply last_seen must leave the existing value alone.
+    """
+
+    @pytest.mark.asyncio
+    async def test_upsert_without_last_seen_preserves_existing(self, test_db):
+        real_rf_observation = 1_700_000_000
+        await ContactRepository.upsert(
+            ContactUpsert(
+                public_key="aa" * 32,
+                name="Alice",
+                type=1,
+                last_seen=real_rf_observation,
+                on_radio=False,
+            )
+        )
+
+        # A subsequent radio-sync style upsert (no last_seen supplied) must not
+        # overwrite the real RF timestamp with now().
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, name="Alice", type=1, on_radio=False)
+        )
+
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == real_rf_observation
+
+    @pytest.mark.asyncio
+    async def test_upsert_monotonically_bumps_last_seen(self, test_db):
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_700_000_000, on_radio=False)
+        )
+
+        # Newer RF observation advances last_seen.
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_700_000_500, on_radio=False)
+        )
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+
+        # An older timestamp (out-of-order arrival) must not move it backwards.
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_699_999_000, on_radio=False)
+        )
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+
+    @pytest.mark.asyncio
+    async def test_upsert_inserts_null_last_seen_when_not_supplied(self, test_db):
+        # A radio-sync-only contact (never heard on RF) should have last_seen=NULL.
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, name="Alice", type=1, on_radio=False)
+        )
+
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen is None
+
+    @pytest.mark.asyncio
+    async def test_touch_last_seen_bumps_monotonically(self, test_db):
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_700_000_000, on_radio=False)
+        )
+
+        await ContactRepository.touch_last_seen("aa" * 32, 1_700_000_500)
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+
+        # Older timestamps never move last_seen backwards.
+        await ContactRepository.touch_last_seen("aa" * 32, 1_699_999_000)
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+
+    @pytest.mark.asyncio
+    async def test_update_last_contacted_does_not_touch_last_seen(self, test_db):
+        # last_contacted = we sent TO them. It must not forge RF reception.
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_700_000_000, on_radio=False)
+        )
+
+        await ContactRepository.update_last_contacted("aa" * 32, 1_700_500_000)
+
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_contacted == 1_700_500_000
+        assert contact.last_seen == 1_700_000_000
+
+    @pytest.mark.asyncio
+    async def test_update_direct_path_bumps_last_seen_monotonically(self, test_db):
+        # update_direct_path is driven by RF PATH reception on both callers
+        # (packet processor + firmware PATH_UPDATE, which only fires from
+        # onContactPathRecv during RF reception). It should advance last_seen
+        # forward-only.
+        await ContactRepository.upsert(
+            ContactUpsert(public_key="aa" * 32, last_seen=1_700_000_000, on_radio=False)
+        )
+
+        await ContactRepository.update_direct_path(
+            "aa" * 32, path="ab", path_len=1, path_hash_mode=0, updated_at=1_700_000_500
+        )
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+        assert contact.direct_path == "ab"
+
+        # Out-of-order PATH arrival with an older timestamp must not rewind.
+        await ContactRepository.update_direct_path(
+            "aa" * 32, path="cd", path_len=1, path_hash_mode=0, updated_at=1_699_999_000
+        )
+        contact = await ContactRepository.get_by_key("aa" * 32)
+        assert contact is not None
+        assert contact.last_seen == 1_700_000_500
+        # The path itself still updates — only last_seen is monotonic-guarded.
+        assert contact.direct_path == "cd"
@@ -11,6 +11,7 @@ from app.routers.settings import (
     AppSettingsUpdate,
     FavoriteRequest,
     TrackedTelemetryRequest,
+    get_telemetry_schedule,
     toggle_favorite,
     toggle_tracked_telemetry,
     update_settings,
@@ -244,3 +245,88 @@ class TestToggleTrackedTelemetry:
         result = await toggle_tracked_telemetry(TrackedTelemetryRequest(public_key=keys[0]))
         assert keys[0] not in result.tracked_telemetry_repeaters
         assert len(result.tracked_telemetry_repeaters) == 7
+
+    @pytest.mark.asyncio
+    async def test_toggle_response_includes_schedule(self, test_db):
+        """After toggle, response must carry the schedule derivation so the UI
+        can update the interval dropdown without a follow-up fetch."""
+        key = "aa" * 32
+        await self._create_repeater(key)
+
+        result = await toggle_tracked_telemetry(TrackedTelemetryRequest(public_key=key))
+
+        assert result.schedule.tracked_count == 1
+        # N=1 unlocks the full menu including 1h
+        assert 1 in result.schedule.options
+        assert result.schedule.max_tracked == 8
+
+
+class TestTelemetryIntervalValidation:
+    """PATCH /settings validation for telemetry_interval_hours."""
+
+    @pytest.mark.asyncio
+    async def test_accepts_valid_interval(self, test_db):
+        result = await update_settings(AppSettingsUpdate(telemetry_interval_hours=4))
+        assert result.telemetry_interval_hours == 4
+
+    @pytest.mark.asyncio
+    async def test_invalid_interval_falls_back_to_default(self, test_db):
+        """Non-menu values are defaulted rather than 400-ing to keep stale
+        clients from getting stuck on a save error."""
+        result = await update_settings(AppSettingsUpdate(telemetry_interval_hours=99))
+        assert result.telemetry_interval_hours == 8  # DEFAULT_TELEMETRY_INTERVAL_HOURS
+
+    @pytest.mark.asyncio
+    async def test_preference_is_preserved_even_when_illegal_for_count(self, test_db):
+        """User picks 1h at N=5 tracked: stored pref must stay 1h. Scheduler
+        handles the clamping at run time; storage is verbatim."""
+        # Seed 5 tracked repeaters
+        keys = [f"{i:02x}" * 32 for i in range(5)]
+        for k in keys:
+            await ContactRepository.upsert(
+                ContactUpsert(public_key=k, name=f"R{k[:4]}", type=CONTACT_TYPE_REPEATER)
+            )
+        await AppSettingsRepository.update(tracked_telemetry_repeaters=keys)
+
+        result = await update_settings(AppSettingsUpdate(telemetry_interval_hours=1))
+        assert result.telemetry_interval_hours == 1
+
+        # But the GET schedule endpoint should report the clamped effective value.
+        schedule = await get_telemetry_schedule()
+        assert schedule.preferred_hours == 1
+        assert schedule.effective_hours == 6  # N=5 -> shortest legal = 6h
+
+
+class TestTelemetryScheduleEndpoint:
+    """GET /settings/tracked-telemetry/schedule."""
+
+    @pytest.mark.asyncio
+    async def test_schedule_with_no_tracked_repeaters(self, test_db):
+        """No tracked repeaters means nothing to schedule; next_run_at is None.
+
+        At N=0 the clamp helper returns the default 8h, which is a fine
+        display value for an empty state. Options start at 8h for the same
+        reason — any lower shortest-legal only makes sense once the user
+        has at least one repeater tracked.
+        """
+        schedule = await get_telemetry_schedule()
+
+        assert schedule.tracked_count == 0
+        assert schedule.next_run_at is None
+        # At N=0 shortest-legal defaults to 8h.
+        assert schedule.options == [8, 12, 24]
+
+    @pytest.mark.asyncio
+    async def test_schedule_filters_options_by_tracked_count(self, test_db):
+        keys = [f"{i:02x}" * 32 for i in range(5)]
+        for k in keys:
+            await ContactRepository.upsert(
+                ContactUpsert(public_key=k, name=f"R{k[:4]}", type=CONTACT_TYPE_REPEATER)
+            )
+        await AppSettingsRepository.update(tracked_telemetry_repeaters=keys)
+
+        schedule = await get_telemetry_schedule()
+
+        assert schedule.tracked_count == 5
+        assert schedule.options == [6, 8, 12, 24]
+        assert schedule.next_run_at is not None
+15 -10
@@ -2,7 +2,7 @@
 
 import time
 from types import SimpleNamespace
-from unittest.mock import AsyncMock, patch
+from unittest.mock import patch
 
 import pytest
 
@@ -353,13 +353,21 @@ class TestPathHashWidthStats:
 
     @pytest.mark.asyncio
     async def test_path_hash_width_scan_fetches_all_then_buckets(self, test_db):
-        """Hash-width stats should fetchall() then bucket synchronously."""
-
-        fake_rows = [{"data": b"a"}, {"data": b"b"}, {"data": b"c"}]
-
-        class FakeCursor:
-            async def fetchall(self):
-                return fake_rows
+        """Hash-width stats should fetchall() then bucket synchronously.
+
+        Uses real DB rows + a patched parser so it exercises the lock-aware
+        readonly path. Mocking ``conn.execute`` on the pre-refactor code no
+        longer reflects the actual call pattern (we use ``async with``).
+        """
+        now = int(time.time())
+        # Seed three raw packets in the last 24h with arbitrary distinguishing bytes.
+        for i, data in enumerate((b"a", b"b", b"c")):
+            await test_db.conn.execute(
+                "INSERT INTO raw_packets (timestamp, data) VALUES (?, ?)",
+                (now - (i + 1), data),
+            )
+        await test_db.conn.commit()
 
         def fake_parse(raw_packet: bytes):
             hash_sizes = {
@@ -372,10 +380,7 @@ class TestPathHashWidthStats:
                 return None
             return SimpleNamespace(hash_size=hash_size)
 
-        with (
-            patch.object(test_db.conn, "execute", new=AsyncMock(return_value=FakeCursor())),
-            patch("app.path_utils.parse_packet_envelope", side_effect=fake_parse),
-        ):
+        with patch("app.path_utils.parse_packet_envelope", side_effect=fake_parse):
            breakdown = await StatisticsRepository._path_hash_width_24h()
 
         assert breakdown["total_packets"] == 3
@@ -0,0 +1,116 @@
+"""Tests for the telemetry interval math helpers.
+
+These helpers back both the PATCH validation and the scheduler clamping,
+so regressions here silently corrupt cadence for every operator. Keep this
+suite fast, pure, and focused on the boundary values in the N=1..8 table.
+"""
+
+from datetime import UTC, datetime, timezone
+
+import pytest
+
+from app.telemetry_interval import (
+    DAILY_CHECK_CEILING,
+    DEFAULT_TELEMETRY_INTERVAL_HOURS,
+    TELEMETRY_INTERVAL_OPTIONS_HOURS,
+    clamp_telemetry_interval,
+    legal_interval_options,
+    next_run_timestamp_utc,
+    shortest_legal_interval_hours,
+)
+
+
+@pytest.mark.parametrize(
+    ("n", "expected_hours"),
+    [
+        (1, 1),
+        (2, 2),
+        (3, 3),
+        (4, 4),
+        (5, 6),
+        (6, 6),
+        (7, 8),
+        (8, 8),
+    ],
+)
+def test_shortest_legal_interval_table(n: int, expected_hours: int):
+    """The N=1..8 table must match the user-facing design exactly."""
+    assert shortest_legal_interval_hours(n) == expected_hours
+
+
+def test_shortest_legal_interval_above_ceiling_falls_back_to_24h():
+    # Not reachable today (max 8 tracked), but verify the math terminates
+    # gracefully if the limit is ever raised above DAILY_CHECK_CEILING.
+    assert shortest_legal_interval_hours(DAILY_CHECK_CEILING + 1) == 24
+
+
+def test_shortest_legal_interval_zero_returns_default():
+    # No repeaters tracked: loop skips the cycle regardless, but the math
+    # must terminate with a sane value (otherwise div-by-zero).
+    assert shortest_legal_interval_hours(0) == DEFAULT_TELEMETRY_INTERVAL_HOURS
+
+
+def test_clamp_respects_user_pref_when_legal():
+    # User picks 2h with N=2 tracked -> 2h is the shortest legal, keep it.
+    assert clamp_telemetry_interval(2, 2) == 2
+
+
+def test_clamp_pushes_up_when_pref_illegal():
+    # User picked 1h, then grew to 5 tracked. 5 repeaters' shortest legal is
+    # 6h, so the scheduler should be using 6h while the saved pref is still 1.
+    assert clamp_telemetry_interval(1, 5) == 6
+
+
+def test_clamp_unrecognized_value_falls_back_to_default():
+    # A malformed saved value (e.g. from a hand-edited DB row) should default,
+    # not error. Default 8h still gets clamped up if illegal for N.
+    assert clamp_telemetry_interval(99, 1) == DEFAULT_TELEMETRY_INTERVAL_HOURS
+
+
+def test_clamp_preserves_longer_than_shortest_legal():
+    # 24h is always legal at any N.
+    assert clamp_telemetry_interval(24, 8) == 24
+
+
+def test_legal_options_filters_menu():
+    assert legal_interval_options(5) == [6, 8, 12, 24]
+    assert legal_interval_options(1) == list(TELEMETRY_INTERVAL_OPTIONS_HOURS)
+    assert legal_interval_options(8) == [8, 12, 24]
+
+
+def test_next_run_is_strictly_future_even_on_boundary():
+    # Exactly at a matching top-of-hour (8:00 UTC with interval=8), we want
+    # the *next* one (16:00), never "now". Prevents a double-run in the same
+    # minute if code mishandles equality.
+    now = datetime(2026, 4, 16, 8, 0, 0, tzinfo=UTC)
+    result = next_run_timestamp_utc(8, now=now)
+    expected = datetime(2026, 4, 16, 16, 0, 0, tzinfo=UTC)
+    assert result == int(expected.timestamp())
+
+
+def test_next_run_rounds_up_from_mid_hour():
+    # 14:37 UTC with interval=8 -> next matching hour is 16:00.
+    now = datetime(2026, 4, 16, 14, 37, 0, tzinfo=UTC)
+    result = next_run_timestamp_utc(8, now=now)
+    expected = datetime(2026, 4, 16, 16, 0, 0, tzinfo=UTC)
+    assert result == int(expected.timestamp())
+
+
+def test_next_run_crosses_midnight():
+    # 23:12 UTC with interval=8 -> midnight (00:00 next day) is legal.
+    now = datetime(2026, 4, 16, 23, 12, 0, tzinfo=UTC)
+    result = next_run_timestamp_utc(8, now=now)
+    expected = datetime(2026, 4, 17, 0, 0, 0, tzinfo=UTC)
+    assert result == int(expected.timestamp())
+
+
+def test_next_run_accepts_non_utc_input():
+    # Non-UTC input should be normalized internally.
+    from datetime import timedelta
+
+    pst = timezone(timedelta(hours=-8))
+    # 08:00 PST == 16:00 UTC, a matching boundary for interval=8 -> next is 00:00 UTC.
+    now = datetime(2026, 4, 16, 8, 0, 0, tzinfo=pst)
+    result = next_run_timestamp_utc(8, now=now)
+    expected = datetime(2026, 4, 17, 0, 0, 0, tzinfo=UTC)
+    assert result == int(expected.timestamp())
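The new test file pins down a small interval-math API. The following sketch is consistent with those expectations; the constant values and the checks-per-day formula are inferred from the test table, not taken from the actual `app.telemetry_interval` module:

```python
from datetime import datetime, timedelta, timezone

# Assumed values, reverse-engineered from the N=1..8 table in the tests:
# the shortest legal interval is the smallest option that keeps total
# repeater checks per day at or under the daily ceiling.
TELEMETRY_INTERVAL_OPTIONS_HOURS = (1, 2, 3, 4, 6, 8, 12, 24)
DEFAULT_TELEMETRY_INTERVAL_HOURS = 8
DAILY_CHECK_CEILING = 24  # assumed: max repeater checks per day

def shortest_legal_interval_hours(n):
    if n <= 0:
        # No repeaters tracked: return a sane default instead of dividing.
        return DEFAULT_TELEMETRY_INTERVAL_HOURS
    for hours in TELEMETRY_INTERVAL_OPTIONS_HOURS:
        if n * (24 // hours) <= DAILY_CHECK_CEILING:
            return hours
    return 24  # graceful fallback if n ever exceeds the ceiling

def clamp_telemetry_interval(preferred_hours, n):
    # Unrecognized saved values fall back to the default before clamping.
    if preferred_hours not in TELEMETRY_INTERVAL_OPTIONS_HOURS:
        preferred_hours = DEFAULT_TELEMETRY_INTERVAL_HOURS
    return max(preferred_hours, shortest_legal_interval_hours(n))

def legal_interval_options(n):
    floor = shortest_legal_interval_hours(n)
    return [h for h in TELEMETRY_INTERVAL_OPTIONS_HOURS if h >= floor]

def next_run_timestamp_utc(interval_hours, now=None):
    # Next strictly-future top-of-hour whose UTC hour is a multiple of the
    # interval; non-UTC input is normalized first.
    now = (now or datetime.now(timezone.utc)).astimezone(timezone.utc)
    candidate = now.replace(minute=0, second=0, microsecond=0)
    while candidate <= now or candidate.hour % interval_hours != 0:
        candidate += timedelta(hours=1)
    return int(candidate.timestamp())
```

Clamping with `max` rather than rejecting keeps the saved preference intact, so the scheduler tightens automatically if repeaters are later untracked.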
@@ -2,6 +2,117 @@ version = 1
 revision = 1
 requires-python = ">=3.11"
 
+[[package]]
+name = "aiohappyeyeballs"
+version = "2.6.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265 },
+]
+
+[[package]]
+name = "aiohttp"
+version = "3.13.5"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "aiohappyeyeballs" },
+    { name = "aiosignal" },
+    { name = "attrs" },
+    { name = "frozenlist" },
+    { name = "multidict" },
+    { name = "propcache" },
+    { name = "yarl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/77/9a/152096d4808df8e4268befa55fba462f440f14beab85e8ad9bf990516918/aiohttp-3.13.5.tar.gz", hash = "sha256:9d98cc980ecc96be6eb4c1994ce35d28d8b1f5e5208a23b421187d1209dbb7d1", size = 7858271 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/d6/f5/a20c4ac64aeaef1679e25c9983573618ff765d7aa829fa2b84ae7573169e/aiohttp-3.13.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7ab7229b6f9b5c1ba4910d6c41a9eb11f543eadb3f384df1b4c293f4e73d44d6", size = 757513 },
+    { url = "https://files.pythonhosted.org/packages/75/0a/39fa6c6b179b53fcb3e4b3d2b6d6cad0180854eda17060c7218540102bef/aiohttp-3.13.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8f14c50708bb156b3a3ca7230b3d820199d56a48e3af76fa21c2d6087190fe3d", size = 506748 },
+    { url = "https://files.pythonhosted.org/packages/87/ec/e38ce072e724fd7add6243613f8d1810da084f54175353d25ccf9f9c7e5a/aiohttp-3.13.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e7d2f8616f0ff60bd332022279011776c3ac0faa0f1b463f7bb12326fbc97a1c", size = 501673 },
+    { url = "https://files.pythonhosted.org/packages/ba/ba/3bc7525d7e2beaa11b309a70d48b0d3cfc3c2089ec6a7d0820d59c657053/aiohttp-3.13.5-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a2567b72e1ffc3ab25510db43f355b29eeada56c0a622e58dcdb19530eb0a3cb", size = 1763757 },
+    { url = "https://files.pythonhosted.org/packages/5e/ab/e87744cf18f1bd78263aba24924d4953b41086bd3a31d22452378e9028a0/aiohttp-3.13.5-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fb0540c854ac9c0c5ad495908fdfd3e332d553ec731698c0e29b1877ba0d2ec6", size = 1720152 },
+    { url = "https://files.pythonhosted.org/packages/6b/f3/ed17a6f2d742af17b50bae2d152315ed1b164b07a5fd5cc1754d99e4dfa5/aiohttp-3.13.5-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c9883051c6972f58bfc4ebb2116345ee2aa151178e99c3f2b2bbe2af712abd13", size = 1818010 },
+    { url = "https://files.pythonhosted.org/packages/53/06/ecbc63dc937192e2a5cb46df4d3edb21deb8225535818802f210a6ea5816/aiohttp-3.13.5-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2294172ce08a82fb7c7273485895de1fa1186cc8294cfeb6aef4af42ad261174", size = 1907251 },
+    { url = "https://files.pythonhosted.org/packages/7e/a5/0521aa32c1ddf3aa1e71dcc466be0b7db2771907a13f18cddaa45967d97b/aiohttp-3.13.5-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3a807cabd5115fb55af198b98178997a5e0e57dead43eb74a93d9c07d6d4a7dc", size = 1759969 },
+    { url = "https://files.pythonhosted.org/packages/f6/78/a38f8c9105199dd3b9706745865a8a59d0041b6be0ca0cc4b2ccf1bab374/aiohttp-3.13.5-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:aa6d0d932e0f39c02b80744273cd5c388a2d9bc07760a03164f229c8e02662f6", size = 1616871 },
+    { url = "https://files.pythonhosted.org/packages/6f/41/27392a61ead8ab38072105c71aa44ff891e71653fe53d576a7067da2b4e8/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:60869c7ac4aaabe7110f26499f3e6e5696eae98144735b12a9c3d9eae2b51a49", size = 1739844 },
+    { url = "https://files.pythonhosted.org/packages/6e/55/5564e7ae26d94f3214250009a0b1c65a0c6af4bf88924ccb6fdab901de28/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:26d2f8546f1dfa75efa50c3488215a903c0168d253b75fba4210f57ab77a0fb8", size = 1731969 },
+    { url = "https://files.pythonhosted.org/packages/6d/c5/705a3929149865fc941bcbdd1047b238e4a72bcb215a9b16b9d7a2e8d992/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f1162a1492032c82f14271e831c8f4b49f2b6078f4f5fc74de2c912fa225d51d", size = 1795193 },
+    { url = "https://files.pythonhosted.org/packages/a6/19/edabed62f718d02cff7231ca0db4ef1c72504235bc467f7b67adb1679f48/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:8b14eb3262fad0dc2f89c1a43b13727e709504972186ff6a99a3ecaa77102b6c", size = 1606477 },
+    { url = "https://files.pythonhosted.org/packages/de/fc/76f80ef008675637d88d0b21584596dc27410a990b0918cb1e5776545b5b/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:ca9ac61ac6db4eb6c2a0cd1d0f7e1357647b638ccc92f7e9d8d133e71ed3c6ac", size = 1813198 },
+    { url = "https://files.pythonhosted.org/packages/e5/67/5b3ac26b80adb20ea541c487f73730dc8fa107d632c998f25bbbab98fcda/aiohttp-3.13.5-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7996023b2ed59489ae4762256c8516df9820f751cf2c5da8ed2fb20ee50abab3", size = 1752321 },
+    { url = "https://files.pythonhosted.org/packages/88/06/e4a2e49255ea23fa4feeb5ab092d90240d927c15e47b5b5c48dff5a9ce29/aiohttp-3.13.5-cp311-cp311-win32.whl", hash = "sha256:77dfa48c9f8013271011e51c00f8ada19851f013cde2c48fca1ba5e0caf5bb06", size = 439069 },
+    { url = "https://files.pythonhosted.org/packages/c0/43/8c7163a596dab4f8be12c190cf467a1e07e4734cf90eebb39f7f5d53fc6a/aiohttp-3.13.5-cp311-cp311-win_amd64.whl", hash = "sha256:d3a4834f221061624b8887090637db9ad4f61752001eae37d56c52fddade2dc8", size = 462859 },
+    { url = "https://files.pythonhosted.org/packages/be/6f/353954c29e7dcce7cf00280a02c75f30e133c00793c7a2ed3776d7b2f426/aiohttp-3.13.5-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:023ecba036ddd840b0b19bf195bfae970083fd7024ce1ac22e9bba90464620e9", size = 748876 },
+    { url = "https://files.pythonhosted.org/packages/f5/1b/428a7c64687b3b2e9cd293186695affc0e1e54a445d0361743b231f11066/aiohttp-3.13.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:15c933ad7920b7d9a20de151efcd05a6e38302cbf0e10c9b2acb9a42210a2416", size = 499557 },
+    { url = "https://files.pythonhosted.org/packages/29/47/7be41556bfbb6917069d6a6634bb7dd5e163ba445b783a90d40f5ac7e3a7/aiohttp-3.13.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ab2899f9fa2f9f741896ebb6fa07c4c883bfa5c7f2ddd8cf2aafa86fa981b2d2", size = 500258 },
+    { url = "https://files.pythonhosted.org/packages/67/84/c9ecc5828cb0b3695856c07c0a6817a99d51e2473400f705275a2b3d9239/aiohttp-3.13.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a60eaa2d440cd4707696b52e40ed3e2b0f73f65be07fd0ef23b6b539c9c0b0b4", size = 1749199 },
+    { url = "https://files.pythonhosted.org/packages/f0/d3/3c6d610e66b495657622edb6ae7c7fd31b2e9086b4ec50b47897ad6042a9/aiohttp-3.13.5-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:55b3bdd3292283295774ab585160c4004f4f2f203946997f49aac032c84649e9", size = 1721013 },
+    { url = "https://files.pythonhosted.org/packages/49/a0/24409c12217456df0bae7babe3b014e460b0b38a8e60753d6cb339f6556d/aiohttp-3.13.5-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c2b2355dc094e5f7d45a7bb262fe7207aa0460b37a0d87027dcf21b5d890e7d5", size = 1781501 },
+    { url = "https://files.pythonhosted.org/packages/98/9d/b65ec649adc5bccc008b0957a9a9c691070aeac4e41cea18559fef49958b/aiohttp-3.13.5-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b38765950832f7d728297689ad78f5f2cf79ff82487131c4d26fe6ceecdc5f8e", size = 1878981 },
+    { url = "https://files.pythonhosted.org/packages/57/d8/8d44036d7eb7b6a8ec4c5494ea0c8c8b94fbc0ed3991c1a7adf230df03bf/aiohttp-3.13.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b18f31b80d5a33661e08c89e202edabf1986e9b49c42b4504371daeaa11b47c1", size = 1767934 },
+    { url = "https://files.pythonhosted.org/packages/31/04/d3f8211f273356f158e3464e9e45484d3fb8c4ce5eb2f6fe9405c3273983/aiohttp-3.13.5-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:33add2463dde55c4f2d9635c6ab33ce154e5ecf322bd26d09af95c5f81cfa286", size = 1566671 },
+    { url = "https://files.pythonhosted.org/packages/41/db/073e4ebe00b78e2dfcacff734291651729a62953b48933d765dc513bf798/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:327cc432fdf1356fb4fbc6fe833ad4e9f6aacb71a8acaa5f1855e4b25910e4a9", size = 1705219 },
+    { url = "https://files.pythonhosted.org/packages/48/45/7dfba71a2f9fd97b15c95c06819de7eb38113d2cdb6319669195a7d64270/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:7c35b0bf0b48a70b4cb4fc5d7bed9b932532728e124874355de1a0af8ec4bc88", size = 1743049 },
+    { url = "https://files.pythonhosted.org/packages/18/71/901db0061e0f717d226386a7f471bb59b19566f2cae5f0d93874b017271f/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:df23d57718f24badef8656c49743e11a89fd6f5358fa8a7b96e728fda2abf7d3", size = 1749557 },
+    { url = "https://files.pythonhosted.org/packages/08/d5/41eebd16066e59cd43728fe74bce953d7402f2b4ddfdfef2c0e9f17ca274/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:02e048037a6501a5ec1f6fc9736135aec6eb8a004ce48838cb951c515f32c80b", size = 1558931 },
+    { url = "https://files.pythonhosted.org/packages/30/e6/4a799798bf05740e66c3a1161079bda7a3dd8e22ca392481d7a7f9af82a6/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:31cebae8b26f8a615d2b546fee45d5ffb76852ae6450e2a03f42c9102260d6fe", size = 1774125 },
+    { url = "https://files.pythonhosted.org/packages/84/63/7749337c90f92bc2cb18f9560d67aa6258c7060d1397d21529b8004fcf6f/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:888e78eb5ca55a615d285c3c09a7a91b42e9dd6fc699b166ebd5dee87c9ccf14", size = 1732427 },
+    { url = "https://files.pythonhosted.org/packages/98/de/cf2f44ff98d307e72fb97d5f5bbae3bfcb442f0ea9790c0bf5c5c2331404/aiohttp-3.13.5-cp312-cp312-win32.whl", hash = "sha256:8bd3ec6376e68a41f9f95f5ed170e2fcf22d4eb27a1f8cb361d0508f6e0557f3", size = 433534 },
+    { url = "https://files.pythonhosted.org/packages/aa/ca/eadf6f9c8fa5e31d40993e3db153fb5ed0b11008ad5d9de98a95045bed84/aiohttp-3.13.5-cp312-cp312-win_amd64.whl", hash = "sha256:110e448e02c729bcebb18c60b9214a87ba33bac4a9fa5e9a5f139938b56c6cb1", size = 460446 },
+    { url = "https://files.pythonhosted.org/packages/78/e9/d76bf503005709e390122d34e15256b88f7008e246c4bdbe915cd4f1adce/aiohttp-3.13.5-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5029cc80718bbd545123cd8fe5d15025eccaaaace5d0eeec6bd556ad6163d61", size = 742930 },
+    { url = "https://files.pythonhosted.org/packages/57/00/4b7b70223deaebd9bb85984d01a764b0d7bd6526fcdc73cca83bcbe7243e/aiohttp-3.13.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4bb6bf5811620003614076bdc807ef3b5e38244f9d25ca5fe888eaccea2a9832", size = 496927 },
+    { url = "https://files.pythonhosted.org/packages/9c/f5/0fb20fb49f8efdcdce6cd8127604ad2c503e754a8f139f5e02b01626523f/aiohttp-3.13.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a84792f8631bf5a94e52d9cc881c0b824ab42717165a5579c760b830d9392ac9", size = 497141 },
+    { url = "https://files.pythonhosted.org/packages/3b/86/b7c870053e36a94e8951b803cb5b909bfbc9b90ca941527f5fcafbf6b0fa/aiohttp-3.13.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:57653eac22c6a4c13eb22ecf4d673d64a12f266e72785ab1c8b8e5940d0e8090", size = 1732476 },
+    { url = "https://files.pythonhosted.org/packages/b5/e5/4e161f84f98d80c03a238671b4136e6530453d65262867d989bbe78244d0/aiohttp-3.13.5-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5e5f7debc7a57af53fdf5c5009f9391d9f4c12867049d509bf7bb164a6e295b", size = 1706507 },
+    { url = "https://files.pythonhosted.org/packages/d4/56/ea11a9f01518bd5a2a2fcee869d248c4b8a0cfa0bb13401574fa31adf4d4/aiohttp-3.13.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c719f65bebcdf6716f10e9eff80d27567f7892d8988c06de12bbbd39307c6e3a", size = 1773465 },
+    { url = "https://files.pythonhosted.org/packages/eb/40/333ca27fb74b0383f17c90570c748f7582501507307350a79d9f9f3c6eb1/aiohttp-3.13.5-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d97f93fdae594d886c5a866636397e2bcab146fd7a132fd6bb9ce182224452f8", size = 1873523 },
+    { url = "https://files.pythonhosted.org/packages/f0/d2/e2f77eef1acb7111405433c707dc735e63f67a56e176e72e9e7a2cd3f493/aiohttp-3.13.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3df334e39d4c2f899a914f1dba283c1aadc311790733f705182998c6f7cae665", size = 1754113 },
+    { url = "https://files.pythonhosted.org/packages/fb/56/3f653d7f53c89669301ec9e42c95233e2a0c0a6dd051269e6e678db4fdb0/aiohttp-3.13.5-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fe6970addfea9e5e081401bcbadf865d2b6da045472f58af08427e108d618540", size = 1562351 },
+    { url = "https://files.pythonhosted.org/packages/ec/a6/9b3e91eb8ae791cce4ee736da02211c85c6f835f1bdfac0594a8a3b7018c/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7becdf835feff2f4f335d7477f121af787e3504b48b449ff737afb35869ba7bb", size = 1693205 },
+    { url = "https://files.pythonhosted.org/packages/98/fc/bfb437a99a2fcebd6b6eaec609571954de2ed424f01c352f4b5504371dd3/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:676e5651705ad5d8a70aeb8eb6936c436d8ebbd56e63436cb7dd9bb36d2a9a46", size = 1730618 },
+    { url = "https://files.pythonhosted.org/packages/e4/b6/c8534862126191a034f68153194c389addc285a0f1347d85096d349bbc15/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:9b16c653d38eb1a611cc898c41e76859ca27f119d25b53c12875fd0474ae31a8", size = 1745185 },
+    { url = "https://files.pythonhosted.org/packages/0b/93/4ca8ee2ef5236e2707e0fd5fecb10ce214aee1ff4ab307af9c558bda3b37/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:999802d5fa0389f58decd24b537c54aa63c01c3219ce17d1214cbda3c2b22d2d", size = 1557311 },
+    { url = "https://files.pythonhosted.org/packages/57/ae/76177b15f18c5f5d094f19901d284025db28eccc5ae374d1d254181d33f4/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ec707059ee75732b1ba130ed5f9580fe10ff75180c812bc267ded039db5128c6", size = 1773147 },
+    { url = "https://files.pythonhosted.org/packages/01/a4/62f05a0a98d88af59d93b7fcac564e5f18f513cb7471696ac286db970d6a/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2d6d44a5b48132053c2f6cd5c8cb14bc67e99a63594e336b0f2af81e94d5530c", size = 1730356 },
+    { url = "https://files.pythonhosted.org/packages/e4/85/fc8601f59dfa8c9523808281f2da571f8b4699685f9809a228adcc90838d/aiohttp-3.13.5-cp313-cp313-win32.whl", hash = "sha256:329f292ed14d38a6c4c435e465f48bebb47479fd676a0411936cc371643225cc", size = 432637 },
+    { url = "https://files.pythonhosted.org/packages/c0/1b/ac685a8882896acf0f6b31d689e3792199cfe7aba37969fa91da63a7fa27/aiohttp-3.13.5-cp313-cp313-win_amd64.whl", hash = "sha256:69f571de7500e0557801c0b51f4780482c0ec5fe2ac851af5a92cfce1af1cb83", size = 458896 },
+    { url = "https://files.pythonhosted.org/packages/5d/ce/46572759afc859e867a5bc8ec3487315869013f59281ce61764f76d879de/aiohttp-3.13.5-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:eb4639f32fd4a9904ab8fb45bf3383ba71137f3d9d4ba25b3b3f3109977c5b8c", size = 745721 },
+    { url = "https://files.pythonhosted.org/packages/13/fe/8a2efd7626dbe6049b2ef8ace18ffda8a4dfcbe1bcff3ac30c0c7575c20b/aiohttp-3.13.5-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:7e5dc4311bd5ac493886c63cbf76ab579dbe4641268e7c74e48e774c74b6f2be", size = 497663 },
+    { url = "https://files.pythonhosted.org/packages/9b/91/cc8cc78a111826c54743d88651e1687008133c37e5ee615fee9b57990fac/aiohttp-3.13.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:756c3c304d394977519824449600adaf2be0ccee76d206ee339c5e76b70ded25", size = 499094 },
+    { url = "https://files.pythonhosted.org/packages/0a/33/a8362cb15cf16a3af7e86ed11962d5cd7d59b449202dc576cdc731310bde/aiohttp-3.13.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ecc26751323224cf8186efcf7fbcbc30f4e1d8c7970659daf25ad995e4032a56", size = 1726701 },
+    { url = "https://files.pythonhosted.org/packages/45/0c/c091ac5c3a17114bd76cbf85d674650969ddf93387876cf67f754204bd77/aiohttp-3.13.5-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10a75acfcf794edf9d8db50e5a7ec5fc818b2a8d3f591ce93bc7b1210df016d2", size = 1683360 },
+    { url = "https://files.pythonhosted.org/packages/23/73/bcee1c2b79bc275e964d1446c55c54441a461938e70267c86afaae6fba27/aiohttp-3.13.5-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0f7a18f258d124cd678c5fe072fe4432a4d5232b0657fca7c1847f599233c83a", size = 1773023 },
+    { url = "https://files.pythonhosted.org/packages/c7/ef/720e639df03004fee2d869f771799d8c23046dec47d5b81e396c7cda583a/aiohttp-3.13.5-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:df6104c009713d3a89621096f3e3e88cc323fd269dbd7c20afe18535094320be", size = 1853795 },
+    { url = "https://files.pythonhosted.org/packages/bd/c9/989f4034fb46841208de7aeeac2c6d8300745ab4f28c42f629ba77c2d916/aiohttp-3.13.5-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:241a94f7de7c0c3b616627aaad530fe2cb620084a8b144d3be7b6ecfe95bae3b", size = 1730405 },
+    { url = "https://files.pythonhosted.org/packages/ce/75/ee1fd286ca7dc599d824b5651dad7b3be7ff8d9a7e7b3fe9820d9180f7db/aiohttp-3.13.5-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c974fb66180e58709b6fc402846f13791240d180b74de81d23913abe48e96d94", size = 1558082 },
+    { url = "https://files.pythonhosted.org/packages/c3/20/1e9e6650dfc436340116b7aa89ff8cb2bbdf0abc11dfaceaad8f74273a10/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:6e27ea05d184afac78aabbac667450c75e54e35f62238d44463131bd3f96753d", size = 1692346 },
+    { url = "https://files.pythonhosted.org/packages/d8/40/8ebc6658d48ea630ac7903912fe0dd4e262f0e16825aa4c833c56c9f1f56/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a79a6d399cef33a11b6f004c67bb07741d91f2be01b8d712d52c75711b1e07c7", size = 1698891 },
+    { url = "https://files.pythonhosted.org/packages/d8/78/ea0ae5ec8ba7a5c10bdd6e318f1ba5e76fcde17db8275188772afc7917a4/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c632ce9c0b534fbe25b52c974515ed674937c5b99f549a92127c85f771a78772", size = 1742113 },
+    { url = "https://files.pythonhosted.org/packages/8a/66/9d308ed71e3f2491be1acb8769d96c6f0c47d92099f3bc9119cada27b357/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:fceedde51fbd67ee2bcc8c0b33d0126cc8b51ef3bbde2f86662bd6d5a6f10ec5", size = 1553088 },
+    { url = "https://files.pythonhosted.org/packages/da/a6/6cc25ed8dfc6e00c90f5c6d126a98e2cf28957ad06fa1036bd34b6f24a2c/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f92995dfec9420bb69ae629abf422e516923ba79ba4403bc750d94fb4a6c68c1", size = 1757976 },
+    { url = "https://files.pythonhosted.org/packages/c1/2b/cce5b0ffe0de99c83e5e36d8f828e4161e415660a9f3e58339d07cce3006/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:20ae0ff08b1f2c8788d6fb85afcb798654ae6ba0b747575f8562de738078457b", size = 1712444 },
+    { url = "https://files.pythonhosted.org/packages/6c/cf/9e1795b4160c58d29421eafd1a69c6ce351e2f7c8d3c6b7e4ca44aea1a5b/aiohttp-3.13.5-cp314-cp314-win32.whl", hash = "sha256:b20df693de16f42b2472a9c485e1c948ee55524786a0a34345511afdd22246f3", size = 438128 },
+    { url = "https://files.pythonhosted.org/packages/22/4d/eaedff67fc805aeba4ba746aec891b4b24cebb1a7d078084b6300f79d063/aiohttp-3.13.5-cp314-cp314-win_amd64.whl", hash = "sha256:f85c6f327bf0b8c29da7d93b1cabb6363fb5e4e160a32fa241ed2dce21b73162", size = 464029 },
+    { url = "https://files.pythonhosted.org/packages/79/11/c27d9332ee20d68dd164dc12a6ecdef2e2e35ecc97ed6cf0d2442844624b/aiohttp-3.13.5-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:1efb06900858bb618ff5cee184ae2de5828896c448403d51fb633f09e109be0a", size = 778758 },
+    { url = "https://files.pythonhosted.org/packages/04/fb/377aead2e0a3ba5f09b7624f702a964bdf4f08b5b6728a9799830c80041e/aiohttp-3.13.5-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:fee86b7c4bd29bdaf0d53d14739b08a106fdda809ca5fe032a15f52fae5fe254", size = 512883 },
+    { url = "https://files.pythonhosted.org/packages/bb/a6/aa109a33671f7a5d3bd78b46da9d852797c5e665bfda7d6b373f56bff2ec/aiohttp-3.13.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:20058e23909b9e65f9da62b396b77dfa95965cbe840f8def6e572538b1d32e36", size = 516668 },
+    { url = "https://files.pythonhosted.org/packages/79/b3/ca078f9f2fa9563c36fb8ef89053ea2bb146d6f792c5104574d49d8acb63/aiohttp-3.13.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8cf20a8d6868cb15a73cab329ffc07291ba8c22b1b88176026106ae39aa6df0f", size = 1883461 },
+    { url = "https://files.pythonhosted.org/packages/b7/e3/a7ad633ca1ca497b852233a3cce6906a56c3225fb6d9217b5e5e60b7419d/aiohttp-3.13.5-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:330f5da04c987f1d5bdb8ae189137c77139f36bd1cb23779ca1a354a4b027800", size = 1747661 },
+    { url = "https://files.pythonhosted.org/packages/33/b9/cd6fe579bed34a906d3d783fe60f2fa297ef55b27bb4538438ee49d4dc41/aiohttp-3.13.5-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6f1cbf0c7926d315c3c26c2da41fd2b5d2fe01ac0e157b78caefc51a782196cf", size = 1863800 },
+    { url = "https://files.pythonhosted.org/packages/c0/3f/2c1e2f5144cefa889c8afd5cf431994c32f3b29da9961698ff4e3811b79a/aiohttp-3.13.5-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:53fc049ed6390d05423ba33103ded7281fe897cf97878f369a527070bd95795b", size = 1958382 },
+    { url = "https://files.pythonhosted.org/packages/66/1d/f31ec3f1013723b3babe3609e7f119c2c2fb6ef33da90061a705ef3e1bc8/aiohttp-3.13.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:898703aa2667e3c5ca4c54ca36cd73f58b7a38ef87a5606414799ebce4d3fd3a", size = 1803724 },
+    { url = "https://files.pythonhosted.org/packages/0e/b4/57712dfc6f1542f067daa81eb61da282fab3e6f1966fca25db06c4fc62d5/aiohttp-3.13.5-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0494a01ca9584eea1e5fbd6d748e61ecff218c51b576ee1999c23db7066417d8", size = 1640027 },
+    { url = "https://files.pythonhosted.org/packages/25/3c/734c878fb43ec083d8e31bf029daae1beafeae582d1b35da234739e82ee7/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:6cf81fe010b8c17b09495cbd15c1d35afbc8fb405c0c9cf4738e5ae3af1d65be", size = 1806644 },
+    { url = "https://files.pythonhosted.org/packages/20/a5/f671e5cbec1c21d044ff3078223f949748f3a7f86b14e34a365d74a5d21f/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:c564dd5f09ddc9d8f2c2d0a301cd30a79a2cc1b46dd1a73bef8f0038863d016b", size = 1791630 },
+    { url = "https://files.pythonhosted.org/packages/0b/63/fb8d0ad63a0b8a99be97deac8c04dacf0785721c158bdf23d679a87aa99e/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:2994be9f6e51046c4f864598fd9abeb4fba6e88f0b2152422c9666dcd4aea9c6", size = 1809403 },
+    { url = "https://files.pythonhosted.org/packages/59/0c/bfed7f30662fcf12206481c2aac57dedee43fe1c49275e85b3a1e1742294/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:157826e2fa245d2ef46c83ea8a5faf77ca19355d278d425c29fda0beb3318037", size = 1634924 },
+    { url = "https://files.pythonhosted.org/packages/17/d6/fd518d668a09fd5a3319ae5e984d4d80b9a4b3df4e21c52f02251ef5a32e/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:a8aca50daa9493e9e13c0f566201a9006f080e7c50e5e90d0b06f53146a54500", size = 1836119 },
+    { url = "https://files.pythonhosted.org/packages/78/b7/15fb7a9d52e112a25b621c67b69c167805cb1f2ab8f1708a5c490d1b52fe/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3b13560160d07e047a93f23aaa30718606493036253d5430887514715b67c9d9", size = 1772072 },
+    { url = "https://files.pythonhosted.org/packages/7e/df/57ba7f0c4a553fc2bd8b6321df236870ec6fd64a2a473a8a13d4f733214e/aiohttp-3.13.5-cp314-cp314t-win32.whl", hash = "sha256:9a0f4474b6ea6818b41f82172d799e4b3d29e22c2c520ce4357856fced9af2f8", size = 471819 },
+    { url = "https://files.pythonhosted.org/packages/62/29/2f8418269e46454a26171bfdd6a055d74febf32234e474930f2f60a17145/aiohttp-3.13.5-cp314-cp314t-win_amd64.whl", hash = "sha256:18a2f6c1182c51baa1d28d68fea51513cb2a76612f038853c0ad3c145423d3d9", size = 505441 },
+]
+
 [[package]]
 name = "aiomqtt"
 version = "2.5.0"
@@ -14,6 +125,19 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/84/24/1d5d7d89906db1a94b5029942c2fd909cf8e8551b288f56c03053d5615f8/aiomqtt-2.5.0-py3-none-any.whl", hash = "sha256:65dabeafeeee7b88864361ae9a118d81bd27082093f32f670a5c5fab17de8cf2", size = 15983 },
 ]
 
+[[package]]
+name = "aiosignal"
+version = "1.4.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "frozenlist" },
+    { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = "sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490 },
+]
+
 [[package]]
 name = "aiosqlite"
 version = "0.22.1"
@@ -72,6 +196,15 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/e6/2f/54d068d7e011a8b4e0aae3e93b09a30b33bcf780829fe70c6e8876aeb0e0/apprise-1.9.9-py3-none-any.whl", hash = "sha256:55ceb8827a1c783d683881c9f77fa42eb43b3fc91b854419c452d557101c7068", size = 1519940 },
 ]
 
+[[package]]
+name = "attrs"
+version = "26.1.0"
+source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/9a/8e/82a0fe20a541c03148528be8cac2408564a6c9a0cc7e9171802bc1d26985/attrs-26.1.0.tar.gz", hash = "sha256:d03ceb89cb322a8fd706d4fb91940737b6642aa36998fe130a9bc96c985eff32", size = 952055 }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/64/b4/17d4b0b2a2dc85a6df63d1157e028ed19f90d4cd97c36717afef2bc2f395/attrs-26.1.0-py3-none-any.whl", hash = "sha256:c647aa4a12dfbad9333ca4e71fe62ddc36f4e63b2d260a37a8b83d2f043ac309", size = 67548 },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "bleak"
|
name = "bleak"
|
||||||
version = "2.1.1"
|
version = "2.1.1"
|
||||||
@@ -298,6 +431,65 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
|
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "cryptography"
|
||||||
|
version = "46.0.7"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/47/93/ac8f3d5ff04d54bc814e961a43ae5b0b146154c89c61b47bb07557679b18/cryptography-46.0.7.tar.gz", hash = "sha256:e4cfd68c5f3e0bfdad0d38e023239b96a2fe84146481852dffbcca442c245aa5", size = 750652 }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0b/5d/4a8f770695d73be252331e60e526291e3df0c9b27556a90a6b47bccca4c2/cryptography-46.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:ea42cbe97209df307fdc3b155f1b6fa2577c0defa8f1f7d3be7d31d189108ad4", size = 7179869 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5f/45/6d80dc379b0bbc1f9d1e429f42e4cb9e1d319c7a8201beffd967c516ea01/cryptography-46.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b36a4695e29fe69215d75960b22577197aca3f7a25b9cf9d165dcfe9d80bc325", size = 4275492 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/4a/9a/1765afe9f572e239c3469f2cb429f3ba7b31878c893b246b4b2994ffe2fe/cryptography-46.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5ad9ef796328c5e3c4ceed237a183f5d41d21150f972455a9d926593a1dcb308", size = 4426670 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8f/3e/af9246aaf23cd4ee060699adab1e47ced3f5f7e7a8ffdd339f817b446462/cryptography-46.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:73510b83623e080a2c35c62c15298096e2a5dc8d51c3b4e1740211839d0dea77", size = 4280275 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0f/54/6bbbfc5efe86f9d71041827b793c24811a017c6ac0fd12883e4caa86b8ed/cryptography-46.0.7-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:cbd5fb06b62bd0721e1170273d3f4d5a277044c47ca27ee257025146c34cbdd1", size = 4928402 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/cf/054b9d8220f81509939599c8bdbc0c408dbd2bdd41688616a20731371fe0/cryptography-46.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:420b1e4109cc95f0e5700eed79908cef9268265c773d3a66f7af1eef53d409ef", size = 4459985 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f9/46/4e4e9c6040fb01c7467d47217d2f882daddeb8828f7df800cb806d8a2288/cryptography-46.0.7-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:24402210aa54baae71d99441d15bb5a1919c195398a87b563df84468160a65de", size = 3990652 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/36/5f/313586c3be5a2fbe87e4c9a254207b860155a8e1f3cca99f9910008e7d08/cryptography-46.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8a469028a86f12eb7d2fe97162d0634026d92a21f3ae0ac87ed1c4a447886c83", size = 4279805 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/69/33/60dfc4595f334a2082749673386a4d05e4f0cf4df8248e63b2c3437585f2/cryptography-46.0.7-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9694078c5d44c157ef3162e3bf3946510b857df5a3955458381d1c7cfc143ddb", size = 4892883 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c7/0b/333ddab4270c4f5b972f980adef4faa66951a4aaf646ca067af597f15563/cryptography-46.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:42a1e5f98abb6391717978baf9f90dc28a743b7d9be7f0751a6f56a75d14065b", size = 4459756 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d2/14/633913398b43b75f1234834170947957c6b623d1701ffc7a9600da907e89/cryptography-46.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:91bbcb08347344f810cbe49065914fe048949648f6bd5c2519f34619142bbe85", size = 4410244 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/10/f2/19ceb3b3dc14009373432af0c13f46aa08e3ce334ec6eff13492e1812ccd/cryptography-46.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:5d1c02a14ceb9148cc7816249f64f623fbfee39e8c03b3650d842ad3f34d637e", size = 4674868 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1a/bb/a5c213c19ee94b15dfccc48f363738633a493812687f5567addbcbba9f6f/cryptography-46.0.7-cp311-abi3-win32.whl", hash = "sha256:d23c8ca48e44ee015cd0a54aeccdf9f09004eba9fc96f38c911011d9ff1bd457", size = 3026504 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2b/02/7788f9fefa1d060ca68717c3901ae7fffa21ee087a90b7f23c7a603c32ae/cryptography-46.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:397655da831414d165029da9bc483bed2fe0e75dde6a1523ec2fe63f3c46046b", size = 3488363 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7b/56/15619b210e689c5403bb0540e4cb7dbf11a6bf42e483b7644e471a2812b3/cryptography-46.0.7-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:d151173275e1728cf7839aaa80c34fe550c04ddb27b34f48c232193df8db5842", size = 7119671 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/74/66/e3ce040721b0b5599e175ba91ab08884c75928fbeb74597dd10ef13505d2/cryptography-46.0.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:db0f493b9181c7820c8134437eb8b0b4792085d37dbb24da050476ccb664e59c", size = 4268551 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/03/11/5e395f961d6868269835dee1bafec6a1ac176505a167f68b7d8818431068/cryptography-46.0.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ebd6daf519b9f189f85c479427bbd6e9c9037862cf8fe89ee35503bd209ed902", size = 4408887 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/40/53/8ed1cf4c3b9c8e611e7122fb56f1c32d09e1fff0f1d77e78d9ff7c82653e/cryptography-46.0.7-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:b7b412817be92117ec5ed95f880defe9cf18a832e8cafacf0a22337dc1981b4d", size = 4271354 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/50/46/cf71e26025c2e767c5609162c866a78e8a2915bbcfa408b7ca495c6140c4/cryptography-46.0.7-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:fbfd0e5f273877695cb93baf14b185f4878128b250cc9f8e617ea0c025dfb022", size = 4905845 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/ea/01276740375bac6249d0a971ebdf6b4dc9ead0ee0a34ef3b5a88c1a9b0d4/cryptography-46.0.7-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:ffca7aa1d00cf7d6469b988c581598f2259e46215e0140af408966a24cf086ce", size = 4444641 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3d/4c/7d258f169ae71230f25d9f3d06caabcff8c3baf0978e2b7d65e0acac3827/cryptography-46.0.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:60627cf07e0d9274338521205899337c5d18249db56865f943cbe753aa96f40f", size = 3967749 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b5/2a/2ea0767cad19e71b3530e4cad9605d0b5e338b6a1e72c37c9c1ceb86c333/cryptography-46.0.7-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:80406c3065e2c55d7f49a9550fe0c49b3f12e5bfff5dedb727e319e1afb9bf99", size = 4270942 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/41/3d/fe14df95a83319af25717677e956567a105bb6ab25641acaa093db79975d/cryptography-46.0.7-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:c5b1ccd1239f48b7151a65bc6dd54bcfcc15e028c8ac126d3fada09db0e07ef1", size = 4871079 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9c/59/4a479e0f36f8f378d397f4eab4c850b4ffb79a2f0d58704b8fa0703ddc11/cryptography-46.0.7-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:d5f7520159cd9c2154eb61eb67548ca05c5774d39e9c2c4339fd793fe7d097b2", size = 4443999 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/28/17/b59a741645822ec6d04732b43c5d35e4ef58be7bfa84a81e5ae6f05a1d33/cryptography-46.0.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:fcd8eac50d9138c1d7fc53a653ba60a2bee81a505f9f8850b6b2888555a45d0e", size = 4399191 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/59/6a/bb2e166d6d0e0955f1e9ff70f10ec4b2824c9cfcdb4da772c7dd69cc7d80/cryptography-46.0.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:65814c60f8cc400c63131584e3e1fad01235edba2614b61fbfbfa954082db0ee", size = 4655782 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/b6/3da51d48415bcb63b00dc17c2eff3a651b7c4fed484308d0f19b30e8cb2c/cryptography-46.0.7-cp314-cp314t-win32.whl", hash = "sha256:fdd1736fed309b4300346f88f74cd120c27c56852c3838cab416e7a166f67298", size = 3002227 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/32/a8/9f0e4ed57ec9cebe506e58db11ae472972ecb0c659e4d52bbaee80ca340a/cryptography-46.0.7-cp314-cp314t-win_amd64.whl", hash = "sha256:e06acf3c99be55aa3b516397fe42f5855597f430add9c17fa46bf2e0fb34c9bb", size = 3475332 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a7/7f/cd42fc3614386bc0c12f0cb3c4ae1fc2bbca5c9662dfed031514911d513d/cryptography-46.0.7-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:462ad5cb1c148a22b2e3bcc5ad52504dff325d17daf5df8d88c17dda1f75f2a4", size = 7165618 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a5/d0/36a49f0262d2319139d2829f773f1b97ef8aef7f97e6e5bd21455e5a8fb5/cryptography-46.0.7-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:84d4cced91f0f159a7ddacad249cc077e63195c36aac40b4150e7a57e84fffe7", size = 4270628 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8a/6c/1a42450f464dda6ffbe578a911f773e54dd48c10f9895a23a7e88b3e7db5/cryptography-46.0.7-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:128c5edfe5e5938b86b03941e94fac9ee793a94452ad1365c9fc3f4f62216832", size = 4415405 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9a/92/4ed714dbe93a066dc1f4b4581a464d2d7dbec9046f7c8b7016f5286329e2/cryptography-46.0.7-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:5e51be372b26ef4ba3de3c167cd3d1022934bc838ae9eaad7e644986d2a3d163", size = 4272715 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b7/e6/a26b84096eddd51494bba19111f8fffe976f6a09f132706f8f1bf03f51f7/cryptography-46.0.7-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:cdf1a610ef82abb396451862739e3fc93b071c844399e15b90726ef7470eeaf2", size = 4918400 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c7/08/ffd537b605568a148543ac3c2b239708ae0bd635064bab41359252ef88ed/cryptography-46.0.7-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1d25aee46d0c6f1a501adcddb2d2fee4b979381346a78558ed13e50aa8a59067", size = 4450634 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/16/01/0cd51dd86ab5b9befe0d031e276510491976c3a80e9f6e31810cce46c4ad/cryptography-46.0.7-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:cdfbe22376065ffcf8be74dc9a909f032df19bc58a699456a21712d6e5eabfd0", size = 3985233 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/92/49/819d6ed3a7d9349c2939f81b500a738cb733ab62fbecdbc1e38e83d45e12/cryptography-46.0.7-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:abad9dac36cbf55de6eb49badd4016806b3165d396f64925bf2999bcb67837ba", size = 4271955 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/80/07/ad9b3c56ebb95ed2473d46df0847357e01583f4c52a85754d1a55e29e4d0/cryptography-46.0.7-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:935ce7e3cfdb53e3536119a542b839bb94ec1ad081013e9ab9b7cfd478b05006", size = 4879888 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b8/c7/201d3d58f30c4c2bdbe9b03844c291feb77c20511cc3586daf7edc12a47b/cryptography-46.0.7-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:35719dc79d4730d30f1c2b6474bd6acda36ae2dfae1e3c16f2051f215df33ce0", size = 4449961 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a5/ef/649750cbf96f3033c3c976e112265c33906f8e462291a33d77f90356548c/cryptography-46.0.7-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:7bbc6ccf49d05ac8f7d7b5e2e2c33830d4fe2061def88210a126d130d7f71a85", size = 4401696 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/41/52/a8908dcb1a389a459a29008c29966c1d552588d4ae6d43f3a1a4512e0ebe/cryptography-46.0.7-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a1529d614f44b863a7b480c6d000fe93b59acee9c82ffa027cfadc77521a9f5e", size = 4664256 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/4b/fa/f0ab06238e899cc3fb332623f337a7364f36f4bb3f2534c2bb95a35b132c/cryptography-46.0.7-cp38-abi3-win32.whl", hash = "sha256:f247c8c1a1fb45e12586afbb436ef21ff1e80670b2861a90353d9b025583d246", size = 3013001 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d2/f1/00ce3bde3ca542d1acd8f8cfa38e446840945aa6363f9b74746394b14127/cryptography-46.0.7-cp38-abi3-win_amd64.whl", hash = "sha256:506c4ff91eff4f82bdac7633318a526b1d1309fc07ca76a3ad182cb5b686d6d3", size = 3472985 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/63/0c/dca8abb64e7ca4f6b2978769f6fea5ad06686a190cec381f0a796fdcaaba/cryptography-46.0.7-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:fc9ab8856ae6cf7c9358430e49b368f3108f050031442eaeb6b9d87e4dcf4e4f", size = 3476879 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3a/ea/075aac6a84b7c271578d81a2f9968acb6e273002408729f2ddff517fed4a/cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d3b99c535a9de0adced13d159c5a9cf65c325601aa30f4be08afd680643e9c15", size = 4219700 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6c/7b/1c55db7242b5e5612b29fc7a630e91ee7a6e3c8e7bf5406d22e206875fbd/cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d02c738dacda7dc2a74d1b2b3177042009d5cab7c7079db74afc19e56ca1b455", size = 4385982 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cb/da/9870eec4b69c63ef5925bf7d8342b7e13bc2ee3d47791461c4e49ca212f4/cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:04959522f938493042d595a736e7dbdff6eb6cc2339c11465b3ff89343b65f65", size = 4219115 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f4/72/05aa5832b82dd341969e9a734d1812a6aadb088d9eb6f0430fc337cc5a8f/cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:3986ac1dee6def53797289999eabe84798ad7817f3e97779b5061a95b0ee4968", size = 4385479 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/20/2a/1b016902351a523aa2bd446b50a5bc1175d7a7d1cf90fe2ef904f9b84ebc/cryptography-46.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:258514877e15963bd43b558917bc9f54cf7cf866c38aa576ebf47a77ddbc43a4", size = 3412829 },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "dbus-fast"
|
name = "dbus-fast"
|
||||||
version = "3.1.2"
|
version = "3.1.2"
|
||||||
@@ -351,6 +543,111 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/5c/05/5cbb59154b093548acd0f4c7c474a118eda06da25aa75c616b72d8fcd92a/fastapi-0.128.0-py3-none-any.whl", hash = "sha256:aebd93f9716ee3b4f4fcfe13ffb7cf308d99c9f3ab5622d8877441072561582d", size = 103094 },
|
{ url = "https://files.pythonhosted.org/packages/5c/05/5cbb59154b093548acd0f4c7c474a118eda06da25aa75c616b72d8fcd92a/fastapi-0.128.0-py3-none-any.whl", hash = "sha256:aebd93f9716ee3b4f4fcfe13ffb7cf308d99c9f3ab5622d8877441072561582d", size = 103094 },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "frozenlist"
|
||||||
|
version = "1.8.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/2d/f5/c831fac6cc817d26fd54c7eaccd04ef7e0288806943f7cc5bbf69f3ac1f0/frozenlist-1.8.0.tar.gz", hash = "sha256:3ede829ed8d842f6cd48fc7081d7a41001a56f1f38603f9d49bf3020d59a31ad", size = 45875 }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/bc/03/077f869d540370db12165c0aa51640a873fb661d8b315d1d4d67b284d7ac/frozenlist-1.8.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:09474e9831bc2b2199fad6da3c14c7b0fbdd377cce9d3d77131be28906cb7d84", size = 86912 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/df/b5/7610b6bd13e4ae77b96ba85abea1c8cb249683217ef09ac9e0ae93f25a91/frozenlist-1.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:17c883ab0ab67200b5f964d2b9ed6b00971917d5d8a92df149dc2c9779208ee9", size = 50046 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6e/ef/0e8f1fe32f8a53dd26bdd1f9347efe0778b0fddf62789ea683f4cc7d787d/frozenlist-1.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:fa47e444b8ba08fffd1c18e8cdb9a75db1b6a27f17507522834ad13ed5922b93", size = 50119 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/11/b1/71a477adc7c36e5fb628245dfbdea2166feae310757dea848d02bd0689fd/frozenlist-1.8.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2552f44204b744fba866e573be4c1f9048d6a324dfe14475103fd51613eb1d1f", size = 231067 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/45/7e/afe40eca3a2dc19b9904c0f5d7edfe82b5304cb831391edec0ac04af94c2/frozenlist-1.8.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:957e7c38f250991e48a9a73e6423db1bb9dd14e722a10f6b8bb8e16a0f55f695", size = 233160 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a6/aa/7416eac95603ce428679d273255ffc7c998d4132cfae200103f164b108aa/frozenlist-1.8.0-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:8585e3bb2cdea02fc88ffa245069c36555557ad3609e83be0ec71f54fd4abb52", size = 228544 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8b/3d/2a2d1f683d55ac7e3875e4263d28410063e738384d3adc294f5ff3d7105e/frozenlist-1.8.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:edee74874ce20a373d62dc28b0b18b93f645633c2943fd90ee9d898550770581", size = 243797 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/78/1e/2d5565b589e580c296d3bb54da08d206e797d941a83a6fdea42af23be79c/frozenlist-1.8.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c9a63152fe95756b85f31186bddf42e4c02c6321207fd6601a1c89ebac4fe567", size = 247923 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/aa/c3/65872fcf1d326a7f101ad4d86285c403c87be7d832b7470b77f6d2ed5ddc/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b6db2185db9be0a04fecf2f241c70b63b1a242e2805be291855078f2b404dd6b", size = 230886 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a0/76/ac9ced601d62f6956f03cc794f9e04c81719509f85255abf96e2510f4265/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f4be2e3d8bc8aabd566f8d5b8ba7ecc09249d74ba3c9ed52e54dc23a293f0b92", size = 245731 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b9/49/ecccb5f2598daf0b4a1415497eba4c33c1e8ce07495eb07d2860c731b8d5/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:c8d1634419f39ea6f5c427ea2f90ca85126b54b50837f31497f3bf38266e853d", size = 241544 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/53/4b/ddf24113323c0bbcc54cb38c8b8916f1da7165e07b8e24a717b4a12cbf10/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:1a7fa382a4a223773ed64242dbe1c9c326ec09457e6b8428efb4118c685c3dfd", size = 241806 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a7/fb/9b9a084d73c67175484ba2789a59f8eebebd0827d186a8102005ce41e1ba/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:11847b53d722050808926e785df837353bd4d75f1d494377e59b23594d834967", size = 229382 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/a3/c8fb25aac55bf5e12dae5c5aa6a98f85d436c1dc658f21c3ac73f9fa95e5/frozenlist-1.8.0-cp311-cp311-win32.whl", hash = "sha256:27c6e8077956cf73eadd514be8fb04d77fc946a7fe9f7fe167648b0b9085cc25", size = 39647 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0a/f5/603d0d6a02cfd4c8f2a095a54672b3cf967ad688a60fb9faf04fc4887f65/frozenlist-1.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:ac913f8403b36a2c8610bbfd25b8013488533e71e62b4b4adce9c86c8cea905b", size = 44064 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5d/16/c2c9ab44e181f043a86f9a8f84d5124b62dbcb3a02c0977ec72b9ac1d3e0/frozenlist-1.8.0-cp311-cp311-win_arm64.whl", hash = "sha256:d4d3214a0f8394edfa3e303136d0575eece0745ff2b47bd2cb2e66dd92d4351a", size = 39937 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/69/29/948b9aa87e75820a38650af445d2ef2b6b8a6fab1a23b6bb9e4ef0be2d59/frozenlist-1.8.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:78f7b9e5d6f2fdb88cdde9440dc147259b62b9d3b019924def9f6478be254ac1", size = 87782 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/64/80/4f6e318ee2a7c0750ed724fa33a4bdf1eacdc5a39a7a24e818a773cd91af/frozenlist-1.8.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:229bf37d2e4acdaf808fd3f06e854a4a7a3661e871b10dc1f8f1896a3b05f18b", size = 50594 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2b/94/5c8a2b50a496b11dd519f4a24cb5496cf125681dd99e94c604ccdea9419a/frozenlist-1.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f833670942247a14eafbb675458b4e61c82e002a148f49e68257b79296e865c4", size = 50448 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6a/bd/d91c5e39f490a49df14320f4e8c80161cfcce09f1e2cde1edd16a551abb3/frozenlist-1.8.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:494a5952b1c597ba44e0e78113a7266e656b9794eec897b19ead706bd7074383", size = 242411 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8f/83/f61505a05109ef3293dfb1ff594d13d64a2324ac3482be2cedc2be818256/frozenlist-1.8.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96f423a119f4777a4a056b66ce11527366a8bb92f54e541ade21f2374433f6d4", size = 243014 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d8/cb/cb6c7b0f7d4023ddda30cf56b8b17494eb3a79e3fda666bf735f63118b35/frozenlist-1.8.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3462dd9475af2025c31cc61be6652dfa25cbfb56cbbf52f4ccfe029f38decaf8", size = 234909 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/31/c5/cd7a1f3b8b34af009fb17d4123c5a778b44ae2804e3ad6b86204255f9ec5/frozenlist-1.8.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c4c800524c9cd9bac5166cd6f55285957fcfc907db323e193f2afcd4d9abd69b", size = 250049 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/01/2f95d3b416c584a1e7f0e1d6d31998c4a795f7544069ee2e0962a4b60740/frozenlist-1.8.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d6a5df73acd3399d893dafc71663ad22534b5aa4f94e8a2fabfe856c3c1b6a52", size = 256485 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ce/03/024bf7720b3abaebcff6d0793d73c154237b85bdf67b7ed55e5e9596dc9a/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:405e8fe955c2280ce66428b3ca55e12b3c4e9c336fb2103a4937e891c69a4a29", size = 237619 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/69/fa/f8abdfe7d76b731f5d8bd217827cf6764d4f1d9763407e42717b4bed50a0/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:908bd3f6439f2fef9e85031b59fd4f1297af54415fb60e4254a95f75b3cab3f3", size = 250320 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/3c/b051329f718b463b22613e269ad72138cc256c540f78a6de89452803a47d/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:294e487f9ec720bd8ffcebc99d575f7eff3568a08a253d1ee1a0378754b74143", size = 246820 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0f/ae/58282e8f98e444b3f4dd42448ff36fa38bef29e40d40f330b22e7108f565/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:74c51543498289c0c43656701be6b077f4b265868fa7f8a8859c197006efb608", size = 250518 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8f/96/007e5944694d66123183845a106547a15944fbbb7154788cbf7272789536/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:776f352e8329135506a1d6bf16ac3f87bc25b28e765949282dcc627af36123aa", size = 239096 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/66/bb/852b9d6db2fa40be96f29c0d1205c306288f0684df8fd26ca1951d461a56/frozenlist-1.8.0-cp312-cp312-win32.whl", hash = "sha256:433403ae80709741ce34038da08511d4a77062aa924baf411ef73d1146e74faf", size = 39985 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b8/af/38e51a553dd66eb064cdf193841f16f077585d4d28394c2fa6235cb41765/frozenlist-1.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:34187385b08f866104f0c0617404c8eb08165ab1272e884abc89c112e9c00746", size = 44591 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a7/06/1dc65480ab147339fecc70797e9c2f69d9cea9cf38934ce08df070fdb9cb/frozenlist-1.8.0-cp312-cp312-win_arm64.whl", hash = "sha256:fe3c58d2f5db5fbd18c2987cba06d51b0529f52bc3a6cdc33d3f4eab725104bd", size = 40102 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/40/0832c31a37d60f60ed79e9dfb5a92e1e2af4f40a16a29abcc7992af9edff/frozenlist-1.8.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8d92f1a84bb12d9e56f818b3a746f3efba93c1b63c8387a73dde655e1e42282a", size = 85717 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/30/ba/b0b3de23f40bc55a7057bd38434e25c34fa48e17f20ee273bbde5e0650f3/frozenlist-1.8.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:96153e77a591c8adc2ee805756c61f59fef4cf4073a9275ee86fe8cba41241f7", size = 49651 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0c/ab/6e5080ee374f875296c4243c381bbdef97a9ac39c6e3ce1d5f7d42cb78d6/frozenlist-1.8.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f21f00a91358803399890ab167098c131ec2ddd5f8f5fd5fe9c9f2c6fcd91e40", size = 49417 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d5/4e/e4691508f9477ce67da2015d8c00acd751e6287739123113a9fca6f1604e/frozenlist-1.8.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fb30f9626572a76dfe4293c7194a09fb1fe93ba94c7d4f720dfae3b646b45027", size = 234391 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/40/76/c202df58e3acdf12969a7895fd6f3bc016c642e6726aa63bd3025e0fc71c/frozenlist-1.8.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaa352d7047a31d87dafcacbabe89df0aa506abb5b1b85a2fb91bc3faa02d822", size = 233048 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f9/c0/8746afb90f17b73ca5979c7a3958116e105ff796e718575175319b5bb4ce/frozenlist-1.8.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:03ae967b4e297f58f8c774c7eabcce57fe3c2434817d4385c50661845a058121", size = 226549 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7e/eb/4c7eefc718ff72f9b6c4893291abaae5fbc0c82226a32dcd8ef4f7a5dbef/frozenlist-1.8.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f6292f1de555ffcc675941d65fffffb0a5bcd992905015f85d0592201793e0e5", size = 239833 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c2/4e/e5c02187cf704224f8b21bee886f3d713ca379535f16893233b9d672ea71/frozenlist-1.8.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29548f9b5b5e3460ce7378144c3010363d8035cea44bc0bf02d57f5a685e084e", size = 245363 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1f/96/cb85ec608464472e82ad37a17f844889c36100eed57bea094518bf270692/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ec3cc8c5d4084591b4237c0a272cc4f50a5b03396a47d9caaf76f5d7b38a4f11", size = 229314 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5d/6f/4ae69c550e4cee66b57887daeebe006fe985917c01d0fff9caab9883f6d0/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:517279f58009d0b1f2e7c1b130b377a349405da3f7621ed6bfae50b10adf20c1", size = 243365 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7a/58/afd56de246cf11780a40a2c28dc7cbabbf06337cc8ddb1c780a2d97e88d8/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:db1e72ede2d0d7ccb213f218df6a078a9c09a7de257c2fe8fcef16d5925230b1", size = 237763 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cb/36/cdfaf6ed42e2644740d4a10452d8e97fa1c062e2a8006e4b09f1b5fd7d63/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b4dec9482a65c54a5044486847b8a66bf10c9cb4926d42927ec4e8fd5db7fed8", size = 240110 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/03/a8/9ea226fbefad669f11b52e864c55f0bd57d3c8d7eb07e9f2e9a0b39502e1/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:21900c48ae04d13d416f0e1e0c4d81f7931f73a9dfa0b7a8746fb2fe7dd970ed", size = 233717 },
{ url = "https://files.pythonhosted.org/packages/1e/0b/1b5531611e83ba7d13ccc9988967ea1b51186af64c42b7a7af465dcc9568/frozenlist-1.8.0-cp313-cp313-win32.whl", hash = "sha256:8b7b94a067d1c504ee0b16def57ad5738701e4ba10cec90529f13fa03c833496", size = 39628 },
{ url = "https://files.pythonhosted.org/packages/d8/cf/174c91dbc9cc49bc7b7aab74d8b734e974d1faa8f191c74af9b7e80848e6/frozenlist-1.8.0-cp313-cp313-win_amd64.whl", hash = "sha256:878be833caa6a3821caf85eb39c5ba92d28e85df26d57afb06b35b2efd937231", size = 43882 },
{ url = "https://files.pythonhosted.org/packages/c1/17/502cd212cbfa96eb1388614fe39a3fc9ab87dbbe042b66f97acb57474834/frozenlist-1.8.0-cp313-cp313-win_arm64.whl", hash = "sha256:44389d135b3ff43ba8cc89ff7f51f5a0bb6b63d829c8300f79a2fe4fe61bcc62", size = 39676 },
{ url = "https://files.pythonhosted.org/packages/d2/5c/3bbfaa920dfab09e76946a5d2833a7cbdf7b9b4a91c714666ac4855b88b4/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:e25ac20a2ef37e91c1b39938b591457666a0fa835c7783c3a8f33ea42870db94", size = 89235 },
{ url = "https://files.pythonhosted.org/packages/d2/d6/f03961ef72166cec1687e84e8925838442b615bd0b8854b54923ce5b7b8a/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07cdca25a91a4386d2e76ad992916a85038a9b97561bf7a3fd12d5d9ce31870c", size = 50742 },
{ url = "https://files.pythonhosted.org/packages/1e/bb/a6d12b7ba4c3337667d0e421f7181c82dda448ce4e7ad7ecd249a16fa806/frozenlist-1.8.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4e0c11f2cc6717e0a741f84a527c52616140741cd812a50422f83dc31749fb52", size = 51725 },
{ url = "https://files.pythonhosted.org/packages/bc/71/d1fed0ffe2c2ccd70b43714c6cab0f4188f09f8a67a7914a6b46ee30f274/frozenlist-1.8.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b3210649ee28062ea6099cfda39e147fa1bc039583c8ee4481cb7811e2448c51", size = 284533 },
{ url = "https://files.pythonhosted.org/packages/c9/1f/fb1685a7b009d89f9bf78a42d94461bc06581f6e718c39344754a5d9bada/frozenlist-1.8.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:581ef5194c48035a7de2aefc72ac6539823bb71508189e5de01d60c9dcd5fa65", size = 292506 },
{ url = "https://files.pythonhosted.org/packages/e6/3b/b991fe1612703f7e0d05c0cf734c1b77aaf7c7d321df4572e8d36e7048c8/frozenlist-1.8.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3ef2d026f16a2b1866e1d86fc4e1291e1ed8a387b2c333809419a2f8b3a77b82", size = 274161 },
{ url = "https://files.pythonhosted.org/packages/ca/ec/c5c618767bcdf66e88945ec0157d7f6c4a1322f1473392319b7a2501ded7/frozenlist-1.8.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5500ef82073f599ac84d888e3a8c1f77ac831183244bfd7f11eaa0289fb30714", size = 294676 },
{ url = "https://files.pythonhosted.org/packages/7c/ce/3934758637d8f8a88d11f0585d6495ef54b2044ed6ec84492a91fa3b27aa/frozenlist-1.8.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:50066c3997d0091c411a66e710f4e11752251e6d2d73d70d8d5d4c76442a199d", size = 300638 },
{ url = "https://files.pythonhosted.org/packages/fc/4f/a7e4d0d467298f42de4b41cbc7ddaf19d3cfeabaf9ff97c20c6c7ee409f9/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:5c1c8e78426e59b3f8005e9b19f6ff46e5845895adbde20ece9218319eca6506", size = 283067 },
{ url = "https://files.pythonhosted.org/packages/dc/48/c7b163063d55a83772b268e6d1affb960771b0e203b632cfe09522d67ea5/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:eefdba20de0d938cec6a89bd4d70f346a03108a19b9df4248d3cf0d88f1b0f51", size = 292101 },
{ url = "https://files.pythonhosted.org/packages/9f/d0/2366d3c4ecdc2fd391e0afa6e11500bfba0ea772764d631bbf82f0136c9d/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cf253e0e1c3ceb4aaff6df637ce033ff6535fb8c70a764a8f46aafd3d6ab798e", size = 289901 },
{ url = "https://files.pythonhosted.org/packages/b8/94/daff920e82c1b70e3618a2ac39fbc01ae3e2ff6124e80739ce5d71c9b920/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:032efa2674356903cd0261c4317a561a6850f3ac864a63fc1583147fb05a79b0", size = 289395 },
{ url = "https://files.pythonhosted.org/packages/e3/20/bba307ab4235a09fdcd3cc5508dbabd17c4634a1af4b96e0f69bfe551ebd/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6da155091429aeba16851ecb10a9104a108bcd32f6c1642867eadaee401c1c41", size = 283659 },
{ url = "https://files.pythonhosted.org/packages/fd/00/04ca1c3a7a124b6de4f8a9a17cc2fcad138b4608e7a3fc5877804b8715d7/frozenlist-1.8.0-cp313-cp313t-win32.whl", hash = "sha256:0f96534f8bfebc1a394209427d0f8a63d343c9779cda6fc25e8e121b5fd8555b", size = 43492 },
{ url = "https://files.pythonhosted.org/packages/59/5e/c69f733a86a94ab10f68e496dc6b7e8bc078ebb415281d5698313e3af3a1/frozenlist-1.8.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5d63a068f978fc69421fb0e6eb91a9603187527c86b7cd3f534a5b77a592b888", size = 48034 },
{ url = "https://files.pythonhosted.org/packages/16/6c/be9d79775d8abe79b05fa6d23da99ad6e7763a1d080fbae7290b286093fd/frozenlist-1.8.0-cp313-cp313t-win_arm64.whl", hash = "sha256:bf0a7e10b077bf5fb9380ad3ae8ce20ef919a6ad93b4552896419ac7e1d8e042", size = 41749 },
{ url = "https://files.pythonhosted.org/packages/f1/c8/85da824b7e7b9b6e7f7705b2ecaf9591ba6f79c1177f324c2735e41d36a2/frozenlist-1.8.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cee686f1f4cadeb2136007ddedd0aaf928ab95216e7691c63e50a8ec066336d0", size = 86127 },
{ url = "https://files.pythonhosted.org/packages/8e/e8/a1185e236ec66c20afd72399522f142c3724c785789255202d27ae992818/frozenlist-1.8.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:119fb2a1bd47307e899c2fac7f28e85b9a543864df47aa7ec9d3c1b4545f096f", size = 49698 },
{ url = "https://files.pythonhosted.org/packages/a1/93/72b1736d68f03fda5fdf0f2180fb6caaae3894f1b854d006ac61ecc727ee/frozenlist-1.8.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4970ece02dbc8c3a92fcc5228e36a3e933a01a999f7094ff7c23fbd2beeaa67c", size = 49749 },
{ url = "https://files.pythonhosted.org/packages/a7/b2/fabede9fafd976b991e9f1b9c8c873ed86f202889b864756f240ce6dd855/frozenlist-1.8.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:cba69cb73723c3f329622e34bdbf5ce1f80c21c290ff04256cff1cd3c2036ed2", size = 231298 },
{ url = "https://files.pythonhosted.org/packages/3a/3b/d9b1e0b0eed36e70477ffb8360c49c85c8ca8ef9700a4e6711f39a6e8b45/frozenlist-1.8.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:778a11b15673f6f1df23d9586f83c4846c471a8af693a22e066508b77d201ec8", size = 232015 },
{ url = "https://files.pythonhosted.org/packages/dc/94/be719d2766c1138148564a3960fc2c06eb688da592bdc25adcf856101be7/frozenlist-1.8.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0325024fe97f94c41c08872db482cf8ac4800d80e79222c6b0b7b162d5b13686", size = 225038 },
{ url = "https://files.pythonhosted.org/packages/e4/09/6712b6c5465f083f52f50cf74167b92d4ea2f50e46a9eea0523d658454ae/frozenlist-1.8.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:97260ff46b207a82a7567b581ab4190bd4dfa09f4db8a8b49d1a958f6aa4940e", size = 240130 },
{ url = "https://files.pythonhosted.org/packages/f8/d4/cd065cdcf21550b54f3ce6a22e143ac9e4836ca42a0de1022da8498eac89/frozenlist-1.8.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:54b2077180eb7f83dd52c40b2750d0a9f175e06a42e3213ce047219de902717a", size = 242845 },
{ url = "https://files.pythonhosted.org/packages/62/c3/f57a5c8c70cd1ead3d5d5f776f89d33110b1addae0ab010ad774d9a44fb9/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2f05983daecab868a31e1da44462873306d3cbfd76d1f0b5b69c473d21dbb128", size = 229131 },
{ url = "https://files.pythonhosted.org/packages/6c/52/232476fe9cb64f0742f3fde2b7d26c1dac18b6d62071c74d4ded55e0ef94/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:33f48f51a446114bc5d251fb2954ab0164d5be02ad3382abcbfe07e2531d650f", size = 240542 },
{ url = "https://files.pythonhosted.org/packages/5f/85/07bf3f5d0fb5414aee5f47d33c6f5c77bfe49aac680bfece33d4fdf6a246/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:154e55ec0655291b5dd1b8731c637ecdb50975a2ae70c606d100750a540082f7", size = 237308 },
{ url = "https://files.pythonhosted.org/packages/11/99/ae3a33d5befd41ac0ca2cc7fd3aa707c9c324de2e89db0e0f45db9a64c26/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:4314debad13beb564b708b4a496020e5306c7333fa9a3ab90374169a20ffab30", size = 238210 },
{ url = "https://files.pythonhosted.org/packages/b2/60/b1d2da22f4970e7a155f0adde9b1435712ece01b3cd45ba63702aea33938/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:073f8bf8becba60aa931eb3bc420b217bb7d5b8f4750e6f8b3be7f3da85d38b7", size = 231972 },
{ url = "https://files.pythonhosted.org/packages/3f/ab/945b2f32de889993b9c9133216c068b7fcf257d8595a0ac420ac8677cab0/frozenlist-1.8.0-cp314-cp314-win32.whl", hash = "sha256:bac9c42ba2ac65ddc115d930c78d24ab8d4f465fd3fc473cdedfccadb9429806", size = 40536 },
{ url = "https://files.pythonhosted.org/packages/59/ad/9caa9b9c836d9ad6f067157a531ac48b7d36499f5036d4141ce78c230b1b/frozenlist-1.8.0-cp314-cp314-win_amd64.whl", hash = "sha256:3e0761f4d1a44f1d1a47996511752cf3dcec5bbdd9cc2b4fe595caf97754b7a0", size = 44330 },
{ url = "https://files.pythonhosted.org/packages/82/13/e6950121764f2676f43534c555249f57030150260aee9dcf7d64efda11dd/frozenlist-1.8.0-cp314-cp314-win_arm64.whl", hash = "sha256:d1eaff1d00c7751b7c6662e9c5ba6eb2c17a2306ba5e2a37f24ddf3cc953402b", size = 40627 },
{ url = "https://files.pythonhosted.org/packages/c0/c7/43200656ecc4e02d3f8bc248df68256cd9572b3f0017f0a0c4e93440ae23/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d3bb933317c52d7ea5004a1c442eef86f426886fba134ef8cf4226ea6ee1821d", size = 89238 },
{ url = "https://files.pythonhosted.org/packages/d1/29/55c5f0689b9c0fb765055629f472c0de484dcaf0acee2f7707266ae3583c/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8009897cdef112072f93a0efdce29cd819e717fd2f649ee3016efd3cd885a7ed", size = 50738 },
{ url = "https://files.pythonhosted.org/packages/ba/7d/b7282a445956506fa11da8c2db7d276adcbf2b17d8bb8407a47685263f90/frozenlist-1.8.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2c5dcbbc55383e5883246d11fd179782a9d07a986c40f49abe89ddf865913930", size = 51739 },
{ url = "https://files.pythonhosted.org/packages/62/1c/3d8622e60d0b767a5510d1d3cf21065b9db874696a51ea6d7a43180a259c/frozenlist-1.8.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:39ecbc32f1390387d2aa4f5a995e465e9e2f79ba3adcac92d68e3e0afae6657c", size = 284186 },
{ url = "https://files.pythonhosted.org/packages/2d/14/aa36d5f85a89679a85a1d44cd7a6657e0b1c75f61e7cad987b203d2daca8/frozenlist-1.8.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92db2bf818d5cc8d9c1f1fc56b897662e24ea5adb36ad1f1d82875bd64e03c24", size = 292196 },
{ url = "https://files.pythonhosted.org/packages/05/23/6bde59eb55abd407d34f77d39a5126fb7b4f109a3f611d3929f14b700c66/frozenlist-1.8.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2dc43a022e555de94c3b68a4ef0b11c4f747d12c024a520c7101709a2144fb37", size = 273830 },
{ url = "https://files.pythonhosted.org/packages/d2/3f/22cff331bfad7a8afa616289000ba793347fcd7bc275f3b28ecea2a27909/frozenlist-1.8.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb89a7f2de3602cfed448095bab3f178399646ab7c61454315089787df07733a", size = 294289 },
{ url = "https://files.pythonhosted.org/packages/a4/89/5b057c799de4838b6c69aa82b79705f2027615e01be996d2486a69ca99c4/frozenlist-1.8.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:33139dc858c580ea50e7e60a1b0ea003efa1fd42e6ec7fdbad78fff65fad2fd2", size = 300318 },
{ url = "https://files.pythonhosted.org/packages/30/de/2c22ab3eb2a8af6d69dc799e48455813bab3690c760de58e1bf43b36da3e/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:168c0969a329b416119507ba30b9ea13688fafffac1b7822802537569a1cb0ef", size = 282814 },
{ url = "https://files.pythonhosted.org/packages/59/f7/970141a6a8dbd7f556d94977858cfb36fa9b66e0892c6dd780d2219d8cd8/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:28bd570e8e189d7f7b001966435f9dac6718324b5be2990ac496cf1ea9ddb7fe", size = 291762 },
{ url = "https://files.pythonhosted.org/packages/c1/15/ca1adae83a719f82df9116d66f5bb28bb95557b3951903d39135620ef157/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b2a095d45c5d46e5e79ba1e5b9cb787f541a8dee0433836cea4b96a2c439dcd8", size = 289470 },
{ url = "https://files.pythonhosted.org/packages/ac/83/dca6dc53bf657d371fbc88ddeb21b79891e747189c5de990b9dfff2ccba1/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:eab8145831a0d56ec9c4139b6c3e594c7a83c2c8be25d5bcf2d86136a532287a", size = 289042 },
{ url = "https://files.pythonhosted.org/packages/96/52/abddd34ca99be142f354398700536c5bd315880ed0a213812bc491cff5e4/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:974b28cf63cc99dfb2188d8d222bc6843656188164848c4f679e63dae4b0708e", size = 283148 },
{ url = "https://files.pythonhosted.org/packages/af/d3/76bd4ed4317e7119c2b7f57c3f6934aba26d277acc6309f873341640e21f/frozenlist-1.8.0-cp314-cp314t-win32.whl", hash = "sha256:342c97bf697ac5480c0a7ec73cd700ecfa5a8a40ac923bd035484616efecc2df", size = 44676 },
{ url = "https://files.pythonhosted.org/packages/89/76/c615883b7b521ead2944bb3480398cbb07e12b7b4e4d073d3752eb721558/frozenlist-1.8.0-cp314-cp314t-win_amd64.whl", hash = "sha256:06be8f67f39c8b1dc671f5d83aaefd3358ae5cdcf8314552c57e7ed3e6475bdd", size = 49451 },
{ url = "https://files.pythonhosted.org/packages/e0/a3/5982da14e113d07b325230f95060e2169f5311b1017ea8af2a29b374c289/frozenlist-1.8.0-cp314-cp314t-win_arm64.whl", hash = "sha256:102e6314ca4da683dca92e3b1355490fed5f313b768500084fbe6371fddfdb79", size = 42507 },
{ url = "https://files.pythonhosted.org/packages/9a/9a/e35b4a917281c0b8419d4207f4334c8e8c5dbf4f3f5f9ada73958d937dcc/frozenlist-1.8.0-py3-none-any.whl", hash = "sha256:0c18a16eab41e82c295618a77502e17b195883241c563b00f0aa5106fc4eaa0d", size = 13409 },
]
[[package]]
name = "h11"
version = "0.16.0"
@@ -360,6 +657,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515 },
]
[[package]]
name = "http-ece"
version = "1.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
]
sdist = { url = "https://files.pythonhosted.org/packages/7c/af/249d1576653b69c20b9ac30e284b63bd94af6a175d72d87813235caf2482/http_ece-1.2.1.tar.gz", hash = "sha256:8c6ab23116bbf6affda894acfd5f2ca0fb8facbcbb72121c11c75c33e7ce8cff", size = 8830 }
[[package]]
name = "httpcore"
version = "1.0.9"
@@ -475,6 +781,123 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/db/e4/9aafcd70315e48ca1bbae2f4ad1e00a13d5ef00019c486f964b31c34c488/meshcore-2.3.2-py3-none-any.whl", hash = "sha256:7b98e6d71f2c1e1ee146dd2fe96da40eb5bf33077e34ca840557ee53b192e322", size = 53325 },
]
[[package]]
name = "multidict"
version = "6.7.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1a/c2/c2d94cbe6ac1753f3fc980da97b3d930efe1da3af3c9f5125354436c073d/multidict-6.7.1.tar.gz", hash = "sha256:ec6652a1bee61c53a3e5776b6049172c53b6aaba34f18c9ad04f82712bac623d", size = 102010 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ce/f1/a90635c4f88fb913fbf4ce660b83b7445b7a02615bda034b2f8eb38fd597/multidict-6.7.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7ff981b266af91d7b4b3793ca3382e53229088d193a85dfad6f5f4c27fc73e5d", size = 76626 },
{ url = "https://files.pythonhosted.org/packages/a6/9b/267e64eaf6fc637a15b35f5de31a566634a2740f97d8d094a69d34f524a4/multidict-6.7.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:844c5bca0b5444adb44a623fb0a1310c2f4cd41f402126bb269cd44c9b3f3e1e", size = 44706 },
{ url = "https://files.pythonhosted.org/packages/dd/a4/d45caf2b97b035c57267791ecfaafbd59c68212004b3842830954bb4b02e/multidict-6.7.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f2a0a924d4c2e9afcd7ec64f9de35fcd96915149b2216e1cb2c10a56df483855", size = 44356 },
{ url = "https://files.pythonhosted.org/packages/fd/d2/0a36c8473f0cbaeadd5db6c8b72d15bbceeec275807772bfcd059bef487d/multidict-6.7.1-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:8be1802715a8e892c784c0197c2ace276ea52702a0ede98b6310c8f255a5afb3", size = 244355 },
{ url = "https://files.pythonhosted.org/packages/5d/16/8c65be997fd7dd311b7d39c7b6e71a0cb449bad093761481eccbbe4b42a2/multidict-6.7.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2e2d2ed645ea29f31c4c7ea1552fcfd7cb7ba656e1eafd4134a6620c9f5fdd9e", size = 246433 },
{ url = "https://files.pythonhosted.org/packages/01/fb/4dbd7e848d2799c6a026ec88ad39cf2b8416aa167fcc903baa55ecaa045c/multidict-6.7.1-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:95922cee9a778659e91db6497596435777bd25ed116701a4c034f8e46544955a", size = 225376 },
{ url = "https://files.pythonhosted.org/packages/b6/8a/4a3a6341eac3830f6053062f8fbc9a9e54407c80755b3f05bc427295c2d0/multidict-6.7.1-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6b83cabdc375ffaaa15edd97eb7c0c672ad788e2687004990074d7d6c9b140c8", size = 257365 },
{ url = "https://files.pythonhosted.org/packages/f7/a2/dd575a69c1aa206e12d27d0770cdf9b92434b48a9ef0cd0d1afdecaa93c4/multidict-6.7.1-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:38fb49540705369bab8484db0689d86c0a33a0a9f2c1b197f506b71b4b6c19b0", size = 254747 },
{ url = "https://files.pythonhosted.org/packages/5a/56/21b27c560c13822ed93133f08aa6372c53a8e067f11fbed37b4adcdac922/multidict-6.7.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:439cbebd499f92e9aa6793016a8acaa161dfa749ae86d20960189f5398a19144", size = 246293 },
{ url = "https://files.pythonhosted.org/packages/5a/a4/23466059dc3854763423d0ad6c0f3683a379d97673b1b89ec33826e46728/multidict-6.7.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6d3bc717b6fe763b8be3f2bee2701d3c8eb1b2a8ae9f60910f1b2860c82b6c49", size = 242962 },
{ url = "https://files.pythonhosted.org/packages/1f/67/51dd754a3524d685958001e8fa20a0f5f90a6a856e0a9dcabff69be3dbb7/multidict-6.7.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:619e5a1ac57986dbfec9f0b301d865dddf763696435e2962f6d9cf2fdff2bb71", size = 237360 },
{ url = "https://files.pythonhosted.org/packages/64/3f/036dfc8c174934d4b55d86ff4f978e558b0e585cef70cfc1ad01adc6bf18/multidict-6.7.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:0b38ebffd9be37c1170d33bc0f36f4f262e0a09bc1aac1c34c7aa51a7293f0b3", size = 245940 },
{ url = "https://files.pythonhosted.org/packages/3d/20/6214d3c105928ebc353a1c644a6ef1408bc5794fcb4f170bb524a3c16311/multidict-6.7.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:10ae39c9cfe6adedcdb764f5e8411d4a92b055e35573a2eaa88d3323289ef93c", size = 253502 },
{ url = "https://files.pythonhosted.org/packages/b1/e2/c653bc4ae1be70a0f836b82172d643fcf1dade042ba2676ab08ec08bff0f/multidict-6.7.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:25167cc263257660290fba06b9318d2026e3c910be240a146e1f66dd114af2b0", size = 247065 },
{ url = "https://files.pythonhosted.org/packages/c8/11/a854b4154cd3bd8b1fd375e8a8ca9d73be37610c361543d56f764109509b/multidict-6.7.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:128441d052254f42989ef98b7b6a6ecb1e6f708aa962c7984235316db59f50fa", size = 241870 },
{ url = "https://files.pythonhosted.org/packages/13/bf/9676c0392309b5fdae322333d22a829715b570edb9baa8016a517b55b558/multidict-6.7.1-cp311-cp311-win32.whl", hash = "sha256:d62b7f64ffde3b99d06b707a280db04fb3855b55f5a06df387236051d0668f4a", size = 41302 },
{ url = "https://files.pythonhosted.org/packages/c9/68/f16a3a8ba6f7b6dc92a1f19669c0810bd2c43fc5a02da13b1cbf8e253845/multidict-6.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:bdbf9f3b332abd0cdb306e7c2113818ab1e922dc84b8f8fd06ec89ed2a19ab8b", size = 45981 },
{ url = "https://files.pythonhosted.org/packages/ac/ad/9dd5305253fa00cd3c7555dbef69d5bf4133debc53b87ab8d6a44d411665/multidict-6.7.1-cp311-cp311-win_arm64.whl", hash = "sha256:b8c990b037d2fff2f4e33d3f21b9b531c5745b33a49a7d6dbe7a177266af44f6", size = 43159 },
{ url = "https://files.pythonhosted.org/packages/8d/9c/f20e0e2cf80e4b2e4b1c365bf5fe104ee633c751a724246262db8f1a0b13/multidict-6.7.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:a90f75c956e32891a4eda3639ce6dd86e87105271f43d43442a3aedf3cddf172", size = 76893 },
{ url = "https://files.pythonhosted.org/packages/fe/cf/18ef143a81610136d3da8193da9d80bfe1cb548a1e2d1c775f26b23d024a/multidict-6.7.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fccb473e87eaa1382689053e4a4618e7ba7b9b9b8d6adf2027ee474597128cd", size = 45456 },
{ url = "https://files.pythonhosted.org/packages/a9/65/1caac9d4cd32e8433908683446eebc953e82d22b03d10d41a5f0fefe991b/multidict-6.7.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b0fa96985700739c4c7853a43c0b3e169360d6855780021bfc6d0f1ce7c123e7", size = 43872 },
{ url = "https://files.pythonhosted.org/packages/cf/3b/d6bd75dc4f3ff7c73766e04e705b00ed6dbbaccf670d9e05a12b006f5a21/multidict-6.7.1-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:cb2a55f408c3043e42b40cc8eecd575afa27b7e0b956dfb190de0f8499a57a53", size = 251018 },
{ url = "https://files.pythonhosted.org/packages/fd/80/c959c5933adedb9ac15152e4067c702a808ea183a8b64cf8f31af8ad3155/multidict-6.7.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eb0ce7b2a32d09892b3dd6cc44877a0d02a33241fafca5f25c8b6b62374f8b75", size = 258883 },
{ url = "https://files.pythonhosted.org/packages/86/85/7ed40adafea3d4f1c8b916e3b5cc3a8e07dfcdcb9cd72800f4ed3ca1b387/multidict-6.7.1-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c3a32d23520ee37bf327d1e1a656fec76a2edd5c038bf43eddfa0572ec49c60b", size = 242413 },
{ url = "https://files.pythonhosted.org/packages/d2/57/b8565ff533e48595503c785f8361ff9a4fde4d67de25c207cd0ba3befd03/multidict-6.7.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9c90fed18bffc0189ba814749fdcc102b536e83a9f738a9003e569acd540a733", size = 268404 },
{ url = "https://files.pythonhosted.org/packages/e0/50/9810c5c29350f7258180dfdcb2e52783a0632862eb334c4896ac717cebcb/multidict-6.7.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:da62917e6076f512daccfbbde27f46fed1c98fee202f0559adec8ee0de67f71a", size = 269456 },
{ url = "https://files.pythonhosted.org/packages/f3/8d/5e5be3ced1d12966fefb5c4ea3b2a5b480afcea36406559442c6e31d4a48/multidict-6.7.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bfde23ef6ed9db7eaee6c37dcec08524cb43903c60b285b172b6c094711b3961", size = 256322 },
{ url = "https://files.pythonhosted.org/packages/31/6e/d8a26d81ac166a5592782d208dd90dfdc0a7a218adaa52b45a672b46c122/multidict-6.7.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3758692429e4e32f1ba0df23219cd0b4fc0a52f476726fff9337d1a57676a582", size = 253955 },
{ url = "https://files.pythonhosted.org/packages/59/4c/7c672c8aad41534ba619bcd4ade7a0dc87ed6b8b5c06149b85d3dd03f0cd/multidict-6.7.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:398c1478926eca669f2fd6a5856b6de9c0acf23a2cb59a14c0ba5844fa38077e", size = 251254 },
{ url = "https://files.pythonhosted.org/packages/7b/bd/84c24de512cbafbdbc39439f74e967f19570ce7924e3007174a29c348916/multidict-6.7.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:c102791b1c4f3ab36ce4101154549105a53dc828f016356b3e3bcae2e3a039d3", size = 252059 },
{ url = "https://files.pythonhosted.org/packages/fa/ba/f5449385510825b73d01c2d4087bf6d2fccc20a2d42ac34df93191d3dd03/multidict-6.7.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:a088b62bd733e2ad12c50dad01b7d0166c30287c166e137433d3b410add807a6", size = 263588 },
{ url = "https://files.pythonhosted.org/packages/d7/11/afc7c677f68f75c84a69fe37184f0f82fce13ce4b92f49f3db280b7e92b3/multidict-6.7.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:3d51ff4785d58d3f6c91bdbffcb5e1f7ddfda557727043aa20d20ec4f65e324a", size = 259642 },
{ url = "https://files.pythonhosted.org/packages/2b/17/ebb9644da78c4ab36403739e0e6e0e30ebb135b9caf3440825001a0bddcb/multidict-6.7.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fc5907494fccf3e7d3f94f95c91d6336b092b5fc83811720fae5e2765890dfba", size = 251377 },
{ url = "https://files.pythonhosted.org/packages/ca/a4/840f5b97339e27846c46307f2530a2805d9d537d8b8bd416af031cad7fa0/multidict-6.7.1-cp312-cp312-win32.whl", hash = "sha256:28ca5ce2fd9716631133d0e9a9b9a745ad7f60bac2bccafb56aa380fc0b6c511", size = 41887 },
{ url = "https://files.pythonhosted.org/packages/80/31/0b2517913687895f5904325c2069d6a3b78f66cc641a86a2baf75a05dcbb/multidict-6.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcee94dfbd638784645b066074b338bc9cc155d4b4bffa4adce1615c5a426c19", size = 46053 },
{ url = "https://files.pythonhosted.org/packages/0c/5b/aba28e4ee4006ae4c7df8d327d31025d760ffa992ea23812a601d226e682/multidict-6.7.1-cp312-cp312-win_arm64.whl", hash = "sha256:ba0a9fb644d0c1a2194cf7ffb043bd852cea63a57f66fbd33959f7dae18517bf", size = 43307 },
{ url = "https://files.pythonhosted.org/packages/f2/22/929c141d6c0dba87d3e1d38fbdf1ba8baba86b7776469f2bc2d3227a1e67/multidict-6.7.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:2b41f5fed0ed563624f1c17630cb9941cf2309d4df00e494b551b5f3e3d67a23", size = 76174 },
{ url = "https://files.pythonhosted.org/packages/c7/75/bc704ae15fee974f8fccd871305e254754167dce5f9e42d88a2def741a1d/multidict-6.7.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:84e61e3af5463c19b67ced91f6c634effb89ef8bfc5ca0267f954451ed4bb6a2", size = 45116 },
{ url = "https://files.pythonhosted.org/packages/79/76/55cd7186f498ed080a18440c9013011eb548f77ae1b297206d030eb1180a/multidict-6.7.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:935434b9853c7c112eee7ac891bc4cb86455aa631269ae35442cb316790c1445", size = 43524 },
{ url = "https://files.pythonhosted.org/packages/e9/3c/414842ef8d5a1628d68edee29ba0e5bcf235dbfb3ccd3ea303a7fe8c72ff/multidict-6.7.1-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:432feb25a1cb67fe82a9680b4d65fb542e4635cb3166cd9c01560651ad60f177", size = 249368 },
{ url = "https://files.pythonhosted.org/packages/f6/32/befed7f74c458b4a525e60519fe8d87eef72bb1e99924fa2b0f9d97a221e/multidict-6.7.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e82d14e3c948952a1a85503817e038cba5905a3352de76b9a465075d072fba23", size = 256952 },
{ url = "https://files.pythonhosted.org/packages/03/d6/c878a44ba877f366630c860fdf74bfb203c33778f12b6ac274936853c451/multidict-6.7.1-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4cfb48c6ea66c83bcaaf7e4dfa7ec1b6bbcf751b7db85a328902796dfde4c060", size = 240317 },
{ url = "https://files.pythonhosted.org/packages/68/49/57421b4d7ad2e9e60e25922b08ceb37e077b90444bde6ead629095327a6f/multidict-6.7.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1d540e51b7e8e170174555edecddbd5538105443754539193e3e1061864d444d", size = 267132 },
{ url = "https://files.pythonhosted.org/packages/b7/fe/ec0edd52ddbcea2a2e89e174f0206444a61440b40f39704e64dc807a70bd/multidict-6.7.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:273d23f4b40f3dce4d6c8a821c741a86dec62cded82e1175ba3d99be128147ed", size = 268140 },
{ url = "https://files.pythonhosted.org/packages/b0/73/6e1b01cbeb458807aa0831742232dbdd1fa92bfa33f52a3f176b4ff3dc11/multidict-6.7.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d624335fd4fa1c08a53f8b4be7676ebde19cd092b3895c421045ca87895b429", size = 254277 },
{ url = "https://files.pythonhosted.org/packages/6a/b2/5fb8c124d7561a4974c342bc8c778b471ebbeb3cc17df696f034a7e9afe7/multidict-6.7.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:12fad252f8b267cc75b66e8fc51b3079604e8d43a75428ffe193cd9e2195dfd6", size = 252291 },
{ url = "https://files.pythonhosted.org/packages/5a/96/51d4e4e06bcce92577fcd488e22600bd38e4fd59c20cb49434d054903bd2/multidict-6.7.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:03ede2a6ffbe8ef936b92cb4529f27f42be7f56afcdab5ab739cd5f27fb1cbf9", size = 250156 },
{ url = "https://files.pythonhosted.org/packages/db/6b/420e173eec5fba721a50e2a9f89eda89d9c98fded1124f8d5c675f7a0c0f/multidict-6.7.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:90efbcf47dbe33dcf643a1e400d67d59abeac5db07dc3f27d6bdeae497a2198c", size = 249742 },
{ url = "https://files.pythonhosted.org/packages/44/a3/ec5b5bd98f306bc2aa297b8c6f11a46714a56b1e6ef5ebda50a4f5d7c5fb/multidict-6.7.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:5c4b9bfc148f5a91be9244d6264c53035c8a0dcd2f51f1c3c6e30e30ebaa1c84", size = 262221 },
{ url = "https://files.pythonhosted.org/packages/cd/f7/e8c0d0da0cd1e28d10e624604e1a36bcc3353aaebdfdc3a43c72bc683a12/multidict-6.7.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:401c5a650f3add2472d1d288c26deebc540f99e2fb83e9525007a74cd2116f1d", size = 258664 },
{ url = "https://files.pythonhosted.org/packages/52/da/151a44e8016dd33feed44f730bd856a66257c1ee7aed4f44b649fb7edeb3/multidict-6.7.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:97891f3b1b3ffbded884e2916cacf3c6fc87b66bb0dde46f7357404750559f33", size = 249490 },
{ url = "https://files.pythonhosted.org/packages/87/af/a3b86bf9630b732897f6fc3f4c4714b90aa4361983ccbdcd6c0339b21b0c/multidict-6.7.1-cp313-cp313-win32.whl", hash = "sha256:e1c5988359516095535c4301af38d8a8838534158f649c05dd1050222321bcb3", size = 41695 },
{ url = "https://files.pythonhosted.org/packages/b2/35/e994121b0e90e46134673422dd564623f93304614f5d11886b1b3e06f503/multidict-6.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:960c83bf01a95b12b08fd54324a4eb1d5b52c88932b5cba5d6e712bb3ed12eb5", size = 45884 },
{ url = "https://files.pythonhosted.org/packages/ca/61/42d3e5dbf661242a69c97ea363f2d7b46c567da8eadef8890022be6e2ab0/multidict-6.7.1-cp313-cp313-win_arm64.whl", hash = "sha256:563fe25c678aaba333d5399408f5ec3c383ca5b663e7f774dd179a520b8144df", size = 43122 },
{ url = "https://files.pythonhosted.org/packages/6d/b3/e6b21c6c4f314bb956016b0b3ef2162590a529b84cb831c257519e7fde44/multidict-6.7.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:c76c4bec1538375dad9d452d246ca5368ad6e1c9039dadcf007ae59c70619ea1", size = 83175 },
{ url = "https://files.pythonhosted.org/packages/fb/76/23ecd2abfe0957b234f6c960f4ade497f55f2c16aeb684d4ecdbf1c95791/multidict-6.7.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:57b46b24b5d5ebcc978da4ec23a819a9402b4228b8a90d9c656422b4bdd8a963", size = 48460 },
{ url = "https://files.pythonhosted.org/packages/c4/57/a0ed92b23f3a042c36bc4227b72b97eca803f5f1801c1ab77c8a212d455e/multidict-6.7.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e954b24433c768ce78ab7929e84ccf3422e46deb45a4dc9f93438f8217fa2d34", size = 46930 },
{ url = "https://files.pythonhosted.org/packages/b5/66/02ec7ace29162e447f6382c495dc95826bf931d3818799bbef11e8f7df1a/multidict-6.7.1-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:3bd231490fa7217cc832528e1cd8752a96f0125ddd2b5749390f7c3ec8721b65", size = 242582 },
{ url = "https://files.pythonhosted.org/packages/58/18/64f5a795e7677670e872673aca234162514696274597b3708b2c0d276cce/multidict-6.7.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:253282d70d67885a15c8a7716f3a73edf2d635793ceda8173b9ecc21f2fb8292", size = 250031 },
{ url = "https://files.pythonhosted.org/packages/c8/ed/e192291dbbe51a8290c5686f482084d31bcd9d09af24f63358c3d42fd284/multidict-6.7.1-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0b4c48648d7649c9335cf1927a8b87fa692de3dcb15faa676c6a6f1f1aabda43", size = 228596 },
{ url = "https://files.pythonhosted.org/packages/1e/7e/3562a15a60cf747397e7f2180b0a11dc0c38d9175a650e75fa1b4d325e15/multidict-6.7.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:98bc624954ec4d2c7cb074b8eefc2b5d0ce7d482e410df446414355d158fe4ca", size = 257492 },
{ url = "https://files.pythonhosted.org/packages/24/02/7d0f9eae92b5249bb50ac1595b295f10e263dd0078ebb55115c31e0eaccd/multidict-6.7.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:1b99af4d9eec0b49927b4402bcbb58dea89d3e0db8806a4086117019939ad3dd", size = 255899 },
{ url = "https://files.pythonhosted.org/packages/00/e3/9b60ed9e23e64c73a5cde95269ef1330678e9c6e34dd4eb6b431b85b5a10/multidict-6.7.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6aac4f16b472d5b7dc6f66a0d49dd57b0e0902090be16594dc9ebfd3d17c47e7", size = 247970 },
{ url = "https://files.pythonhosted.org/packages/3e/06/538e58a63ed5cfb0bd4517e346b91da32fde409d839720f664e9a4ae4f9d/multidict-6.7.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:21f830fe223215dffd51f538e78c172ed7c7f60c9b96a2bf05c4848ad49921c3", size = 245060 },
{ url = "https://files.pythonhosted.org/packages/b2/2f/d743a3045a97c895d401e9bd29aaa09b94f5cbdf1bd561609e5a6c431c70/multidict-6.7.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:f5dd81c45b05518b9aa4da4aa74e1c93d715efa234fd3e8a179df611cc85e5f4", size = 235888 },
{ url = "https://files.pythonhosted.org/packages/38/83/5a325cac191ab28b63c52f14f1131f3b0a55ba3b9aa65a6d0bf2a9b921a0/multidict-6.7.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:eb304767bca2bb92fb9c5bd33cedc95baee5bb5f6c88e63706533a1c06ad08c8", size = 243554 },
{ url = "https://files.pythonhosted.org/packages/20/1f/9d2327086bd15da2725ef6aae624208e2ef828ed99892b17f60c344e57ed/multidict-6.7.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:c9035dde0f916702850ef66460bc4239d89d08df4d02023a5926e7446724212c", size = 252341 },
{ url = "https://files.pythonhosted.org/packages/e8/2c/2a1aa0280cf579d0f6eed8ee5211c4f1730bd7e06c636ba2ee6aafda302e/multidict-6.7.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:af959b9beeb66c822380f222f0e0a1889331597e81f1ded7f374f3ecb0fd6c52", size = 246391 },
{ url = "https://files.pythonhosted.org/packages/e5/03/7ca022ffc36c5a3f6e03b179a5ceb829be9da5783e6fe395f347c0794680/multidict-6.7.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:41f2952231456154ee479651491e94118229844dd7226541788be783be2b5108", size = 243422 },
{ url = "https://files.pythonhosted.org/packages/dc/1d/b31650eab6c5778aceed46ba735bd97f7c7d2f54b319fa916c0f96e7805b/multidict-6.7.1-cp313-cp313t-win32.whl", hash = "sha256:df9f19c28adcb40b6aae30bbaa1478c389efd50c28d541d76760199fc1037c32", size = 47770 },
{ url = "https://files.pythonhosted.org/packages/ac/5b/2d2d1d522e51285bd61b1e20df8f47ae1a9d80839db0b24ea783b3832832/multidict-6.7.1-cp313-cp313t-win_amd64.whl", hash = "sha256:d54ecf9f301853f2c5e802da559604b3e95bb7a3b01a9c295c6ee591b9882de8", size = 53109 },
{ url = "https://files.pythonhosted.org/packages/3d/a3/cc409ba012c83ca024a308516703cf339bdc4b696195644a7215a5164a24/multidict-6.7.1-cp313-cp313t-win_arm64.whl", hash = "sha256:5a37ca18e360377cfda1d62f5f382ff41f2b8c4ccb329ed974cc2e1643440118", size = 45573 },
{ url = "https://files.pythonhosted.org/packages/91/cc/db74228a8be41884a567e88a62fd589a913708fcf180d029898c17a9a371/multidict-6.7.1-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:8f333ec9c5eb1b7105e3b84b53141e66ca05a19a605368c55450b6ba208cb9ee", size = 75190 },
{ url = "https://files.pythonhosted.org/packages/d5/22/492f2246bb5b534abd44804292e81eeaf835388901f0c574bac4eeec73c5/multidict-6.7.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:a407f13c188f804c759fc6a9f88286a565c242a76b27626594c133b82883b5c2", size = 44486 },
{ url = "https://files.pythonhosted.org/packages/f1/4f/733c48f270565d78b4544f2baddc2fb2a245e5a8640254b12c36ac7ac68e/multidict-6.7.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0e161ddf326db5577c3a4cc2d8648f81456e8a20d40415541587a71620d7a7d1", size = 43219 },
{ url = "https://files.pythonhosted.org/packages/24/bb/2c0c2287963f4259c85e8bcbba9182ced8d7fca65c780c38e99e61629d11/multidict-6.7.1-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:1e3a8bb24342a8201d178c3b4984c26ba81a577c80d4d525727427460a50c22d", size = 245132 },
{ url = "https://files.pythonhosted.org/packages/a7/f9/44d4b3064c65079d2467888794dea218d1601898ac50222ab8a9a8094460/multidict-6.7.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97231140a50f5d447d3164f994b86a0bed7cd016e2682f8650d6a9158e14fd31", size = 252420 },
{ url = "https://files.pythonhosted.org/packages/8b/13/78f7275e73fa17b24c9a51b0bd9d73ba64bb32d0ed51b02a746eb876abe7/multidict-6.7.1-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6b10359683bd8806a200fd2909e7c8ca3a7b24ec1d8132e483d58e791d881048", size = 233510 },
{ url = "https://files.pythonhosted.org/packages/4b/25/8167187f62ae3cbd52da7893f58cb036b47ea3fb67138787c76800158982/multidict-6.7.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:283ddac99f7ac25a4acadbf004cb5ae34480bbeb063520f70ce397b281859362", size = 264094 },
{ url = "https://files.pythonhosted.org/packages/a1/e7/69a3a83b7b030cf283fb06ce074a05a02322359783424d7edf0f15fe5022/multidict-6.7.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:538cec1e18c067d0e6103aa9a74f9e832904c957adc260e61cd9d8cf0c3b3d37", size = 260786 },
{ url = "https://files.pythonhosted.org/packages/fe/3b/8ec5074bcfc450fe84273713b4b0a0dd47c0249358f5d82eb8104ffe2520/multidict-6.7.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eee46ccb30ff48a1e35bb818cc90846c6be2b68240e42a78599166722cea709", size = 248483 },
{ url = "https://files.pythonhosted.org/packages/48/5a/d5a99e3acbca0e29c5d9cba8f92ceb15dce78bab963b308ae692981e3a5d/multidict-6.7.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:fa263a02f4f2dd2d11a7b1bb4362aa7cb1049f84a9235d31adf63f30143469a0", size = 248403 },
{ url = "https://files.pythonhosted.org/packages/35/48/e58cd31f6c7d5102f2a4bf89f96b9cf7e00b6c6f3d04ecc44417c00a5a3c/multidict-6.7.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:2e1425e2f99ec5bd36c15a01b690a1a2456209c5deed58f95469ffb46039ccbb", size = 240315 },
{ url = "https://files.pythonhosted.org/packages/94/33/1cd210229559cb90b6786c30676bb0c58249ff42f942765f88793b41fdce/multidict-6.7.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:497394b3239fc6f0e13a78a3e1b61296e72bf1c5f94b4c4eb80b265c37a131cd", size = 245528 },
{ url = "https://files.pythonhosted.org/packages/64/f2/6e1107d226278c876c783056b7db43d800bb64c6131cec9c8dfb6903698e/multidict-6.7.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:233b398c29d3f1b9676b4b6f75c518a06fcb2ea0b925119fb2c1bc35c05e1601", size = 258784 },
{ url = "https://files.pythonhosted.org/packages/4d/c1/11f664f14d525e4a1b5327a82d4de61a1db604ab34c6603bb3c2cc63ad34/multidict-6.7.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:93b1818e4a6e0930454f0f2af7dfce69307ca03cdcfb3739bf4d91241967b6c1", size = 251980 },
{ url = "https://files.pythonhosted.org/packages/e1/9f/75a9ac888121d0c5bbd4ecf4eead45668b1766f6baabfb3b7f66a410e231/multidict-6.7.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:f33dc2a3abe9249ea5d8360f969ec7f4142e7ac45ee7014d8f8d5acddf178b7b", size = 243602 },
{ url = "https://files.pythonhosted.org/packages/9a/e7/50bf7b004cc8525d80dbbbedfdc7aed3e4c323810890be4413e589074032/multidict-6.7.1-cp314-cp314-win32.whl", hash = "sha256:3ab8b9d8b75aef9df299595d5388b14530839f6422333357af1339443cff777d", size = 40930 },
{ url = "https://files.pythonhosted.org/packages/e0/bf/52f25716bbe93745595800f36fb17b73711f14da59ed0bb2eba141bc9f0f/multidict-6.7.1-cp314-cp314-win_amd64.whl", hash = "sha256:5e01429a929600e7dab7b166062d9bb54a5eed752384c7384c968c2afab8f50f", size = 45074 },
{ url = "https://files.pythonhosted.org/packages/97/ab/22803b03285fa3a525f48217963da3a65ae40f6a1b6f6cf2768879e208f9/multidict-6.7.1-cp314-cp314-win_arm64.whl", hash = "sha256:4885cb0e817aef5d00a2e8451d4665c1808378dc27c2705f1bf4ef8505c0d2e5", size = 42471 },
{ url = "https://files.pythonhosted.org/packages/e0/6d/f9293baa6146ba9507e360ea0292b6422b016907c393e2f63fc40ab7b7b5/multidict-6.7.1-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:0458c978acd8e6ea53c81eefaddbbee9c6c5e591f41b3f5e8e194780fe026581", size = 82401 },
{ url = "https://files.pythonhosted.org/packages/7a/68/53b5494738d83558d87c3c71a486504d8373421c3e0dbb6d0db48ad42ee0/multidict-6.7.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:c0abd12629b0af3cf590982c0b413b1e7395cd4ec026f30986818ab95bfaa94a", size = 48143 },
{ url = "https://files.pythonhosted.org/packages/37/e8/5284c53310dcdc99ce5d66563f6e5773531a9b9fe9ec7a615e9bc306b05f/multidict-6.7.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:14525a5f61d7d0c94b368a42cff4c9a4e7ba2d52e2672a7b23d84dc86fb02b0c", size = 46507 },
{ url = "https://files.pythonhosted.org/packages/e4/fc/6800d0e5b3875568b4083ecf5f310dcf91d86d52573160834fb4bfcf5e4f/multidict-6.7.1-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:17307b22c217b4cf05033dabefe68255a534d637c6c9b0cc8382718f87be4262", size = 239358 },
{ url = "https://files.pythonhosted.org/packages/41/75/4ad0973179361cdf3a113905e6e088173198349131be2b390f9fa4da5fc6/multidict-6.7.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7a7e590ff876a3eaf1c02a4dfe0724b6e69a9e9de6d8f556816f29c496046e59", size = 246884 },
{ url = "https://files.pythonhosted.org/packages/c3/9c/095bb28b5da139bd41fb9a5d5caff412584f377914bd8787c2aa98717130/multidict-6.7.1-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:5fa6a95dfee63893d80a34758cd0e0c118a30b8dcb46372bf75106c591b77889", size = 225878 },
{ url = "https://files.pythonhosted.org/packages/07/d0/c0a72000243756e8f5a277b6b514fa005f2c73d481b7d9e47cd4568aa2e4/multidict-6.7.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a0543217a6a017692aa6ae5cc39adb75e587af0f3a82288b1492eb73dd6cc2a4", size = 253542 },
{ url = "https://files.pythonhosted.org/packages/c0/6b/f69da15289e384ecf2a68837ec8b5ad8c33e973aa18b266f50fe55f24b8c/multidict-6.7.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f99fe611c312b3c1c0ace793f92464d8cd263cc3b26b5721950d977b006b6c4d", size = 252403 },
{ url = "https://files.pythonhosted.org/packages/a2/76/b9669547afa5a1a25cd93eaca91c0da1c095b06b6d2d8ec25b713588d3a1/multidict-6.7.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9004d8386d133b7e6135679424c91b0b854d2d164af6ea3f289f8f2761064609", size = 244889 },
{ url = "https://files.pythonhosted.org/packages/7e/a9/a50d2669e506dad33cfc45b5d574a205587b7b8a5f426f2fbb2e90882588/multidict-6.7.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e628ef0e6859ffd8273c69412a2465c4be4a9517d07261b33334b5ec6f3c7489", size = 241982 },
{ url = "https://files.pythonhosted.org/packages/c5/bb/1609558ad8b456b4827d3c5a5b775c93b87878fd3117ed3db3423dfbce1b/multidict-6.7.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:841189848ba629c3552035a6a7f5bf3b02eb304e9fea7492ca220a8eda6b0e5c", size = 232415 },
{ url = "https://files.pythonhosted.org/packages/d8/59/6f61039d2aa9261871e03ab9dc058a550d240f25859b05b67fd70f80d4b3/multidict-6.7.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:ce1bbd7d780bb5a0da032e095c951f7014d6b0a205f8318308140f1a6aba159e", size = 240337 },
{ url = "https://files.pythonhosted.org/packages/a1/29/fdc6a43c203890dc2ae9249971ecd0c41deaedfe00d25cb6564b2edd99eb/multidict-6.7.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b26684587228afed0d50cf804cc71062cc9c1cdf55051c4c6345d372947b268c", size = 248788 },
{ url = "https://files.pythonhosted.org/packages/a9/14/a153a06101323e4cf086ecee3faadba52ff71633d471f9685c42e3736163/multidict-6.7.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:9f9af11306994335398293f9958071019e3ab95e9a707dc1383a35613f6abcb9", size = 242842 },
{ url = "https://files.pythonhosted.org/packages/41/5f/604ae839e64a4a6efc80db94465348d3b328ee955e37acb24badbcd24d83/multidict-6.7.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b4938326284c4f1224178a560987b6cf8b4d38458b113d9b8c1db1a836e640a2", size = 240237 },
{ url = "https://files.pythonhosted.org/packages/5f/60/c3a5187bf66f6fb546ff4ab8fb5a077cbdd832d7b1908d4365c7f74a1917/multidict-6.7.1-cp314-cp314t-win32.whl", hash = "sha256:98655c737850c064a65e006a3df7c997cd3b220be4ec8fe26215760b9697d4d7", size = 48008 },
{ url = "https://files.pythonhosted.org/packages/0c/f7/addf1087b860ac60e6f382240f64fb99f8bfb532bb06f7c542b83c29ca61/multidict-6.7.1-cp314-cp314t-win_amd64.whl", hash = "sha256:497bde6223c212ba11d462853cfa4f0ae6ef97465033e7dc9940cdb3ab5b48e5", size = 53542 },
{ url = "https://files.pythonhosted.org/packages/4c/81/4629d0aa32302ef7b2ec65c75a728cc5ff4fa410c50096174c1632e70b3e/multidict-6.7.1-cp314-cp314t-win_arm64.whl", hash = "sha256:2bbd113e0d4af5db41d5ebfe9ccaff89de2120578164f86a5d17d5a576d1e5b2", size = 44719 },
{ url = "https://files.pythonhosted.org/packages/81/08/7036c080d7117f28a4af526d794aab6a84463126db031b007717c1a6676e/multidict-6.7.1-py3-none-any.whl", hash = "sha256:55d97cc6dae627efa6a6e548885712d4864b81110ac76fa4e534c03819fa4a56", size = 12319 },
]
[[package]]
name = "nodeenv"
version = "1.10.0"
@@ -544,6 +967,117 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ee/8c/83087ebc47ab0396ce092363001fa37c17153119ee282700c0713a195853/prettytable-3.17.0-py3-none-any.whl", hash = "sha256:aad69b294ddbe3e1f95ef8886a060ed1666a0b83018bbf56295f6f226c43d287", size = 34433 },
]
[[package]]
name = "propcache"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/9e/da/e9fc233cf63743258bff22b3dfa7ea5baef7b5bc324af47a0ad89b8ffc6f/propcache-0.4.1.tar.gz", hash = "sha256:f48107a8c637e80362555f37ecf49abe20370e557cc4ab374f04ec4423c97c3d", size = 46442 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8c/d4/4e2c9aaf7ac2242b9358f98dccd8f90f2605402f5afeff6c578682c2c491/propcache-0.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:60a8fda9644b7dfd5dece8c61d8a85e271cb958075bfc4e01083c148b61a7caf", size = 80208 },
{ url = "https://files.pythonhosted.org/packages/c2/21/d7b68e911f9c8e18e4ae43bdbc1e1e9bbd971f8866eb81608947b6f585ff/propcache-0.4.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c30b53e7e6bda1d547cabb47c825f3843a0a1a42b0496087bb58d8fedf9f41b5", size = 45777 },
{ url = "https://files.pythonhosted.org/packages/d3/1d/11605e99ac8ea9435651ee71ab4cb4bf03f0949586246476a25aadfec54a/propcache-0.4.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6918ecbd897443087a3b7cd978d56546a812517dcaaca51b49526720571fa93e", size = 47647 },
{ url = "https://files.pythonhosted.org/packages/58/1a/3c62c127a8466c9c843bccb503d40a273e5cc69838805f322e2826509e0d/propcache-0.4.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3d902a36df4e5989763425a8ab9e98cd8ad5c52c823b34ee7ef307fd50582566", size = 214929 },
{ url = "https://files.pythonhosted.org/packages/56/b9/8fa98f850960b367c4b8fe0592e7fc341daa7a9462e925228f10a60cf74f/propcache-0.4.1-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a9695397f85973bb40427dedddf70d8dc4a44b22f1650dd4af9eedf443d45165", size = 221778 },
{ url = "https://files.pythonhosted.org/packages/46/a6/0ab4f660eb59649d14b3d3d65c439421cf2f87fe5dd68591cbe3c1e78a89/propcache-0.4.1-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2bb07ffd7eaad486576430c89f9b215f9e4be68c4866a96e97db9e97fead85dc", size = 228144 },
{ url = "https://files.pythonhosted.org/packages/52/6a/57f43e054fb3d3a56ac9fc532bc684fc6169a26c75c353e65425b3e56eef/propcache-0.4.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fd6f30fdcf9ae2a70abd34da54f18da086160e4d7d9251f81f3da0ff84fc5a48", size = 210030 },
{ url = "https://files.pythonhosted.org/packages/40/e2/27e6feebb5f6b8408fa29f5efbb765cd54c153ac77314d27e457a3e993b7/propcache-0.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fc38cba02d1acba4e2869eef1a57a43dfbd3d49a59bf90dda7444ec2be6a5570", size = 208252 },
{ url = "https://files.pythonhosted.org/packages/9e/f8/91c27b22ccda1dbc7967f921c42825564fa5336a01ecd72eb78a9f4f53c2/propcache-0.4.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:67fad6162281e80e882fb3ec355398cf72864a54069d060321f6cd0ade95fe85", size = 202064 },
{ url = "https://files.pythonhosted.org/packages/f2/26/7f00bd6bd1adba5aafe5f4a66390f243acab58eab24ff1a08bebb2ef9d40/propcache-0.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f10207adf04d08bec185bae14d9606a1444715bc99180f9331c9c02093e1959e", size = 212429 },
{ url = "https://files.pythonhosted.org/packages/84/89/fd108ba7815c1117ddca79c228f3f8a15fc82a73bca8b142eb5de13b2785/propcache-0.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:e9b0d8d0845bbc4cfcdcbcdbf5086886bc8157aa963c31c777ceff7846c77757", size = 216727 },
{ url = "https://files.pythonhosted.org/packages/79/37/3ec3f7e3173e73f1d600495d8b545b53802cbf35506e5732dd8578db3724/propcache-0.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:981333cb2f4c1896a12f4ab92a9cc8f09ea664e9b7dbdc4eff74627af3a11c0f", size = 205097 },
{ url = "https://files.pythonhosted.org/packages/61/b0/b2631c19793f869d35f47d5a3a56fb19e9160d3c119f15ac7344fc3ccae7/propcache-0.4.1-cp311-cp311-win32.whl", hash = "sha256:f1d2f90aeec838a52f1c1a32fe9a619fefd5e411721a9117fbf82aea638fe8a1", size = 38084 },
{ url = "https://files.pythonhosted.org/packages/f4/78/6cce448e2098e9f3bfc91bb877f06aa24b6ccace872e39c53b2f707c4648/propcache-0.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:364426a62660f3f699949ac8c621aad6977be7126c5807ce48c0aeb8e7333ea6", size = 41637 },
{ url = "https://files.pythonhosted.org/packages/9c/e9/754f180cccd7f51a39913782c74717c581b9cc8177ad0e949f4d51812383/propcache-0.4.1-cp311-cp311-win_arm64.whl", hash = "sha256:e53f3a38d3510c11953f3e6a33f205c6d1b001129f972805ca9b42fc308bc239", size = 38064 },
{ url = "https://files.pythonhosted.org/packages/a2/0f/f17b1b2b221d5ca28b4b876e8bb046ac40466513960646bda8e1853cdfa2/propcache-0.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e153e9cd40cc8945138822807139367f256f89c6810c2634a4f6902b52d3b4e2", size = 80061 },
{ url = "https://files.pythonhosted.org/packages/76/47/8ccf75935f51448ba9a16a71b783eb7ef6b9ee60f5d14c7f8a8a79fbeed7/propcache-0.4.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cd547953428f7abb73c5ad82cbb32109566204260d98e41e5dfdc682eb7f8403", size = 46037 },
{ url = "https://files.pythonhosted.org/packages/0a/b6/5c9a0e42df4d00bfb4a3cbbe5cf9f54260300c88a0e9af1f47ca5ce17ac0/propcache-0.4.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f048da1b4f243fc44f205dfd320933a951b8d89e0afd4c7cacc762a8b9165207", size = 47324 },
{ url = "https://files.pythonhosted.org/packages/9e/d3/6c7ee328b39a81ee877c962469f1e795f9db87f925251efeb0545e0020d0/propcache-0.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ec17c65562a827bba85e3872ead335f95405ea1674860d96483a02f5c698fa72", size = 225505 },
{ url = "https://files.pythonhosted.org/packages/01/5d/1c53f4563490b1d06a684742cc6076ef944bc6457df6051b7d1a877c057b/propcache-0.4.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:405aac25c6394ef275dee4c709be43745d36674b223ba4eb7144bf4d691b7367", size = 230242 },
{ url = "https://files.pythonhosted.org/packages/20/e1/ce4620633b0e2422207c3cb774a0ee61cac13abc6217763a7b9e2e3f4a12/propcache-0.4.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0013cb6f8dde4b2a2f66903b8ba740bdfe378c943c4377a200551ceb27f379e4", size = 238474 },
{ url = "https://files.pythonhosted.org/packages/46/4b/3aae6835b8e5f44ea6a68348ad90f78134047b503765087be2f9912140ea/propcache-0.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:15932ab57837c3368b024473a525e25d316d8353016e7cc0e5ba9eb343fbb1cf", size = 221575 },
{ url = "https://files.pythonhosted.org/packages/6e/a5/8a5e8678bcc9d3a1a15b9a29165640d64762d424a16af543f00629c87338/propcache-0.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:031dce78b9dc099f4c29785d9cf5577a3faf9ebf74ecbd3c856a7b92768c3df3", size = 216736 },
{ url = "https://files.pythonhosted.org/packages/f1/63/b7b215eddeac83ca1c6b934f89d09a625aa9ee4ba158338854c87210cc36/propcache-0.4.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:ab08df6c9a035bee56e31af99be621526bd237bea9f32def431c656b29e41778", size = 213019 },
{ url = "https://files.pythonhosted.org/packages/57/74/f580099a58c8af587cac7ba19ee7cb418506342fbbe2d4a4401661cca886/propcache-0.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4d7af63f9f93fe593afbf104c21b3b15868efb2c21d07d8732c0c4287e66b6a6", size = 220376 },
{ url = "https://files.pythonhosted.org/packages/c4/ee/542f1313aff7eaf19c2bb758c5d0560d2683dac001a1c96d0774af799843/propcache-0.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cfc27c945f422e8b5071b6e93169679e4eb5bf73bbcbf1ba3ae3a83d2f78ebd9", size = 226988 },
{ url = "https://files.pythonhosted.org/packages/8f/18/9c6b015dd9c6930f6ce2229e1f02fb35298b847f2087ea2b436a5bfa7287/propcache-0.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:35c3277624a080cc6ec6f847cbbbb5b49affa3598c4535a0a4682a697aaa5c75", size = 215615 },
{ url = "https://files.pythonhosted.org/packages/80/9e/e7b85720b98c45a45e1fca6a177024934dc9bc5f4d5dd04207f216fc33ed/propcache-0.4.1-cp312-cp312-win32.whl", hash = "sha256:671538c2262dadb5ba6395e26c1731e1d52534bfe9ae56d0b5573ce539266aa8", size = 38066 },
{ url = "https://files.pythonhosted.org/packages/54/09/d19cff2a5aaac632ec8fc03737b223597b1e347416934c1b3a7df079784c/propcache-0.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:cb2d222e72399fcf5890d1d5cc1060857b9b236adff2792ff48ca2dfd46c81db", size = 41655 },
{ url = "https://files.pythonhosted.org/packages/68/ab/6b5c191bb5de08036a8c697b265d4ca76148efb10fa162f14af14fb5f076/propcache-0.4.1-cp312-cp312-win_arm64.whl", hash = "sha256:204483131fb222bdaaeeea9f9e6c6ed0cac32731f75dfc1d4a567fc1926477c1", size = 37789 },
{ url = "https://files.pythonhosted.org/packages/bf/df/6d9c1b6ac12b003837dde8a10231a7344512186e87b36e855bef32241942/propcache-0.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:43eedf29202c08550aac1d14e0ee619b0430aaef78f85864c1a892294fbc28cf", size = 77750 },
{ url = "https://files.pythonhosted.org/packages/8b/e8/677a0025e8a2acf07d3418a2e7ba529c9c33caf09d3c1f25513023c1db56/propcache-0.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d62cdfcfd89ccb8de04e0eda998535c406bf5e060ffd56be6c586cbcc05b3311", size = 44780 },
{ url = "https://files.pythonhosted.org/packages/89/a4/92380f7ca60f99ebae761936bc48a72a639e8a47b29050615eef757cb2a7/propcache-0.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cae65ad55793da34db5f54e4029b89d3b9b9490d8abe1b4c7ab5d4b8ec7ebf74", size = 46308 },
{ url = "https://files.pythonhosted.org/packages/2d/48/c5ac64dee5262044348d1d78a5f85dd1a57464a60d30daee946699963eb3/propcache-0.4.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:333ddb9031d2704a301ee3e506dc46b1fe5f294ec198ed6435ad5b6a085facfe", size = 208182 },
{ url = "https://files.pythonhosted.org/packages/c6/0c/cd762dd011a9287389a6a3eb43aa30207bde253610cca06824aeabfe9653/propcache-0.4.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:fd0858c20f078a32cf55f7e81473d96dcf3b93fd2ccdb3d40fdf54b8573df3af", size = 211215 },
{ url = "https://files.pythonhosted.org/packages/30/3e/49861e90233ba36890ae0ca4c660e95df565b2cd15d4a68556ab5865974e/propcache-0.4.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:678ae89ebc632c5c204c794f8dab2837c5f159aeb59e6ed0539500400577298c", size = 218112 },
{ url = "https://files.pythonhosted.org/packages/f1/8b/544bc867e24e1bd48f3118cecd3b05c694e160a168478fa28770f22fd094/propcache-0.4.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d472aeb4fbf9865e0c6d622d7f4d54a4e101a89715d8904282bb5f9a2f476c3f", size = 204442 },
{ url = "https://files.pythonhosted.org/packages/50/a6/4282772fd016a76d3e5c0df58380a5ea64900afd836cec2c2f662d1b9bb3/propcache-0.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4d3df5fa7e36b3225954fba85589da77a0fe6a53e3976de39caf04a0db4c36f1", size = 199398 },
{ url = "https://files.pythonhosted.org/packages/3e/ec/d8a7cd406ee1ddb705db2139f8a10a8a427100347bd698e7014351c7af09/propcache-0.4.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ee17f18d2498f2673e432faaa71698032b0127ebf23ae5974eeaf806c279df24", size = 196920 },
{ url = "https://files.pythonhosted.org/packages/f6/6c/f38ab64af3764f431e359f8baf9e0a21013e24329e8b85d2da32e8ed07ca/propcache-0.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:580e97762b950f993ae618e167e7be9256b8353c2dcd8b99ec100eb50f5286aa", size = 203748 },
{ url = "https://files.pythonhosted.org/packages/d6/e3/fa846bd70f6534d647886621388f0a265254d30e3ce47e5c8e6e27dbf153/propcache-0.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:501d20b891688eb8e7aa903021f0b72d5a55db40ffaab27edefd1027caaafa61", size = 205877 },
{ url = "https://files.pythonhosted.org/packages/e2/39/8163fc6f3133fea7b5f2827e8eba2029a0277ab2c5beee6c1db7b10fc23d/propcache-0.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a0bd56e5b100aef69bd8562b74b46254e7c8812918d3baa700c8a8009b0af66", size = 199437 },
{ url = "https://files.pythonhosted.org/packages/93/89/caa9089970ca49c7c01662bd0eeedfe85494e863e8043565aeb6472ce8fe/propcache-0.4.1-cp313-cp313-win32.whl", hash = "sha256:bcc9aaa5d80322bc2fb24bb7accb4a30f81e90ab8d6ba187aec0744bc302ad81", size = 37586 },
{ url = "https://files.pythonhosted.org/packages/f5/ab/f76ec3c3627c883215b5c8080debb4394ef5a7a29be811f786415fc1e6fd/propcache-0.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:381914df18634f5494334d201e98245c0596067504b9372d8cf93f4bb23e025e", size = 40790 },
{ url = "https://files.pythonhosted.org/packages/59/1b/e71ae98235f8e2ba5004d8cb19765a74877abf189bc53fc0c80d799e56c3/propcache-0.4.1-cp313-cp313-win_arm64.whl", hash = "sha256:8873eb4460fd55333ea49b7d189749ecf6e55bf85080f11b1c4530ed3034cba1", size = 37158 },
{ url = "https://files.pythonhosted.org/packages/83/ce/a31bbdfc24ee0dcbba458c8175ed26089cf109a55bbe7b7640ed2470cfe9/propcache-0.4.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:92d1935ee1f8d7442da9c0c4fa7ac20d07e94064184811b685f5c4fada64553b", size = 81451 },
{ url = "https://files.pythonhosted.org/packages/25/9c/442a45a470a68456e710d96cacd3573ef26a1d0a60067e6a7d5e655621ed/propcache-0.4.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:473c61b39e1460d386479b9b2f337da492042447c9b685f28be4f74d3529e566", size = 46374 },
{ url = "https://files.pythonhosted.org/packages/f4/bf/b1d5e21dbc3b2e889ea4327044fb16312a736d97640fb8b6aa3f9c7b3b65/propcache-0.4.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c0ef0aaafc66fbd87842a3fe3902fd889825646bc21149eafe47be6072725835", size = 48396 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f4/04/5b4c54a103d480e978d3c8a76073502b18db0c4bc17ab91b3cb5092ad949/propcache-0.4.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f95393b4d66bfae908c3ca8d169d5f79cd65636ae15b5e7a4f6e67af675adb0e", size = 275950 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b4/c1/86f846827fb969c4b78b0af79bba1d1ea2156492e1b83dea8b8a6ae27395/propcache-0.4.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c07fda85708bc48578467e85099645167a955ba093be0a2dcba962195676e859", size = 273856 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/36/1d/fc272a63c8d3bbad6878c336c7a7dea15e8f2d23a544bda43205dfa83ada/propcache-0.4.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:af223b406d6d000830c6f65f1e6431783fc3f713ba3e6cc8c024d5ee96170a4b", size = 280420 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/07/0c/01f2219d39f7e53d52e5173bcb09c976609ba30209912a0680adfb8c593a/propcache-0.4.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a78372c932c90ee474559c5ddfffd718238e8673c340dc21fe45c5b8b54559a0", size = 263254 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/18/cd28081658ce597898f0c4d174d4d0f3c5b6d4dc27ffafeef835c95eb359/propcache-0.4.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:564d9f0d4d9509e1a870c920a89b2fec951b44bf5ba7d537a9e7c1ccec2c18af", size = 261205 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7a/71/1f9e22eb8b8316701c2a19fa1f388c8a3185082607da8e406a803c9b954e/propcache-0.4.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:17612831fda0138059cc5546f4d12a2aacfb9e47068c06af35c400ba58ba7393", size = 247873 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/4a/65/3d4b61f36af2b4eddba9def857959f1016a51066b4f1ce348e0cf7881f58/propcache-0.4.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:41a89040cb10bd345b3c1a873b2bf36413d48da1def52f268a055f7398514874", size = 262739 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2a/42/26746ab087faa77c1c68079b228810436ccd9a5ce9ac85e2b7307195fd06/propcache-0.4.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e35b88984e7fa64aacecea39236cee32dd9bd8c55f57ba8a75cf2399553f9bd7", size = 263514 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/94/13/630690fe201f5502d2403dd3cfd451ed8858fe3c738ee88d095ad2ff407b/propcache-0.4.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f8b465489f927b0df505cbe26ffbeed4d6d8a2bbc61ce90eb074ff129ef0ab1", size = 257781 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/92/f7/1d4ec5841505f423469efbfc381d64b7b467438cd5a4bbcbb063f3b73d27/propcache-0.4.1-cp313-cp313t-win32.whl", hash = "sha256:2ad890caa1d928c7c2965b48f3a3815c853180831d0e5503d35cf00c472f4717", size = 41396 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/48/f0/615c30622316496d2cbbc29f5985f7777d3ada70f23370608c1d3e081c1f/propcache-0.4.1-cp313-cp313t-win_amd64.whl", hash = "sha256:f7ee0e597f495cf415bcbd3da3caa3bd7e816b74d0d52b8145954c5e6fd3ff37", size = 44897 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/fd/ca/6002e46eccbe0e33dcd4069ef32f7f1c9e243736e07adca37ae8c4830ec3/propcache-0.4.1-cp313-cp313t-win_arm64.whl", hash = "sha256:929d7cbe1f01bb7baffb33dc14eb5691c95831450a26354cd210a8155170c93a", size = 39789 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8e/5c/bca52d654a896f831b8256683457ceddd490ec18d9ec50e97dfd8fc726a8/propcache-0.4.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3f7124c9d820ba5548d431afb4632301acf965db49e666aa21c305cbe8c6de12", size = 78152 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/65/9b/03b04e7d82a5f54fb16113d839f5ea1ede58a61e90edf515f6577c66fa8f/propcache-0.4.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c0d4b719b7da33599dfe3b22d3db1ef789210a0597bc650b7cee9c77c2be8c5c", size = 44869 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b2/fa/89a8ef0468d5833a23fff277b143d0573897cf75bd56670a6d28126c7d68/propcache-0.4.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9f302f4783709a78240ebc311b793f123328716a60911d667e0c036bc5dcbded", size = 46596 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/86/bd/47816020d337f4a746edc42fe8d53669965138f39ee117414c7d7a340cfe/propcache-0.4.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c80ee5802e3fb9ea37938e7eecc307fb984837091d5fd262bb37238b1ae97641", size = 206981 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/df/f6/c5fa1357cc9748510ee55f37173eb31bfde6d94e98ccd9e6f033f2fc06e1/propcache-0.4.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ed5a841e8bb29a55fb8159ed526b26adc5bdd7e8bd7bf793ce647cb08656cdf4", size = 211490 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/80/1e/e5889652a7c4a3846683401a48f0f2e5083ce0ec1a8a5221d8058fbd1adf/propcache-0.4.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:55c72fd6ea2da4c318e74ffdf93c4fe4e926051133657459131a95c846d16d44", size = 215371 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b2/f2/889ad4b2408f72fe1a4f6a19491177b30ea7bf1a0fd5f17050ca08cfc882/propcache-0.4.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8326e144341460402713f91df60ade3c999d601e7eb5ff8f6f7862d54de0610d", size = 201424 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/73/033d63069b57b0812c8bd19f311faebeceb6ba31b8f32b73432d12a0b826/propcache-0.4.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:060b16ae65bc098da7f6d25bf359f1f31f688384858204fe5d652979e0015e5b", size = 197566 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/dc/89/ce24f3dc182630b4e07aa6d15f0ff4b14ed4b9955fae95a0b54c58d66c05/propcache-0.4.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:89eb3fa9524f7bec9de6e83cf3faed9d79bffa560672c118a96a171a6f55831e", size = 193130 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a9/24/ef0d5fd1a811fb5c609278d0209c9f10c35f20581fcc16f818da959fc5b4/propcache-0.4.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:dee69d7015dc235f526fe80a9c90d65eb0039103fe565776250881731f06349f", size = 202625 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/02/98ec20ff5546f68d673df2f7a69e8c0d076b5abd05ca882dc7ee3a83653d/propcache-0.4.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5558992a00dfd54ccbc64a32726a3357ec93825a418a401f5cc67df0ac5d9e49", size = 204209 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a0/87/492694f76759b15f0467a2a93ab68d32859672b646aa8a04ce4864e7932d/propcache-0.4.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c9b822a577f560fbd9554812526831712c1436d2c046cedee4c3796d3543b144", size = 197797 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ee/36/66367de3575db1d2d3f3d177432bd14ee577a39d3f5d1b3d5df8afe3b6e2/propcache-0.4.1-cp314-cp314-win32.whl", hash = "sha256:ab4c29b49d560fe48b696cdcb127dd36e0bc2472548f3bf56cc5cb3da2b2984f", size = 38140 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0c/2a/a758b47de253636e1b8aef181c0b4f4f204bf0dd964914fb2af90a95b49b/propcache-0.4.1-cp314-cp314-win_amd64.whl", hash = "sha256:5a103c3eb905fcea0ab98be99c3a9a5ab2de60228aa5aceedc614c0281cf6153", size = 41257 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/34/5e/63bd5896c3fec12edcbd6f12508d4890d23c265df28c74b175e1ef9f4f3b/propcache-0.4.1-cp314-cp314-win_arm64.whl", hash = "sha256:74c1fb26515153e482e00177a1ad654721bf9207da8a494a0c05e797ad27b992", size = 38097 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/99/85/9ff785d787ccf9bbb3f3106f79884a130951436f58392000231b4c737c80/propcache-0.4.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:824e908bce90fb2743bd6b59db36eb4f45cd350a39637c9f73b1c1ea66f5b75f", size = 81455 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/90/85/2431c10c8e7ddb1445c1f7c4b54d886e8ad20e3c6307e7218f05922cad67/propcache-0.4.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c2b5e7db5328427c57c8e8831abda175421b709672f6cfc3d630c3b7e2146393", size = 46372 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/01/20/b0972d902472da9bcb683fa595099911f4d2e86e5683bcc45de60dd05dc3/propcache-0.4.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6f6ff873ed40292cd4969ef5310179afd5db59fdf055897e282485043fc80ad0", size = 48411 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e2/e3/7dc89f4f21e8f99bad3d5ddb3a3389afcf9da4ac69e3deb2dcdc96e74169/propcache-0.4.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:49a2dc67c154db2c1463013594c458881a069fcf98940e61a0569016a583020a", size = 275712 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/20/67/89800c8352489b21a8047c773067644e3897f02ecbbd610f4d46b7f08612/propcache-0.4.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:005f08e6a0529984491e37d8dbc3dd86f84bd78a8ceb5fa9a021f4c48d4984be", size = 273557 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e2/a1/b52b055c766a54ce6d9c16d9aca0cad8059acd9637cdf8aa0222f4a026ef/propcache-0.4.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5c3310452e0d31390da9035c348633b43d7e7feb2e37be252be6da45abd1abcc", size = 280015 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/48/c8/33cee30bd890672c63743049f3c9e4be087e6780906bfc3ec58528be59c1/propcache-0.4.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4c3c70630930447f9ef1caac7728c8ad1c56bc5015338b20fed0d08ea2480b3a", size = 262880 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0c/b1/8f08a143b204b418285c88b83d00edbd61afbc2c6415ffafc8905da7038b/propcache-0.4.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e57061305815dfc910a3634dcf584f08168a8836e6999983569f51a8544cd89", size = 260938 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cf/12/96e4664c82ca2f31e1c8dff86afb867348979eb78d3cb8546a680287a1e9/propcache-0.4.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:521a463429ef54143092c11a77e04056dd00636f72e8c45b70aaa3140d639726", size = 247641 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/18/ed/e7a9cfca28133386ba52278136d42209d3125db08d0a6395f0cba0c0285c/propcache-0.4.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:120c964da3fdc75e3731aa392527136d4ad35868cc556fd09bb6d09172d9a367", size = 262510 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/76/16d8bf65e8845dd62b4e2b57444ab81f07f40caa5652b8969b87ddcf2ef6/propcache-0.4.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:d8f353eb14ee3441ee844ade4277d560cdd68288838673273b978e3d6d2c8f36", size = 263161 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e7/70/c99e9edb5d91d5ad8a49fa3c1e8285ba64f1476782fed10ab251ff413ba1/propcache-0.4.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ab2943be7c652f09638800905ee1bab2c544e537edb57d527997a24c13dc1455", size = 257393 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/08/02/87b25304249a35c0915d236575bc3574a323f60b47939a2262b77632a3ee/propcache-0.4.1-cp314-cp314t-win32.whl", hash = "sha256:05674a162469f31358c30bcaa8883cb7829fa3110bf9c0991fe27d7896c42d85", size = 42546 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cb/ef/3c6ecf8b317aa982f309835e8f96987466123c6e596646d4e6a1dfcd080f/propcache-0.4.1-cp314-cp314t-win_amd64.whl", hash = "sha256:990f6b3e2a27d683cb7602ed6c86f15ee6b43b1194736f9baaeb93d0016633b1", size = 46259 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c4/2d/346e946d4951f37eca1e4f55be0f0174c52cd70720f84029b02f296f4a38/propcache-0.4.1-cp314-cp314t-win_arm64.whl", hash = "sha256:ecef2343af4cc68e05131e45024ba34f6095821988a9d0a02aa7c73fcc448aa9", size = 40428 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305 },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "py-vapid"
|
||||||
|
version = "1.9.4"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "cryptography" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/a3/ed/c648c8018fab319951764f4babe68ddcbbff7f2bbcd7ff7e531eac1788c8/py_vapid-1.9.4.tar.gz", hash = "sha256:a004023560cbc54e34fc06380a0580f04ffcc788e84fb6d19e9339eeb6551a28", size = 74750 }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7f/15/f9d0171e1ad863ca49e826d5afb6b50566f20dc9b4f76965096d3555ce9e/py_vapid-1.9.4-py2.py3-none-any.whl", hash = "sha256:f165a5bf90dcf966b226114f01f178f137579a09784c7f0628fa2f0a299741b6", size = 23912 },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "pycayennelpp"
|
name = "pycayennelpp"
|
||||||
version = "2.4.0"
|
version = "2.4.0"
|
||||||
@@ -926,6 +1460,22 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/14/1b/a298b06749107c305e1fe0f814c6c74aea7b2f1e10989cb30f544a1b3253/python_dotenv-1.2.1-py3-none-any.whl", hash = "sha256:b81ee9561e9ca4004139c6cbba3a238c32b03e4894671e181b671e8cb8425d61", size = 21230 },
]

[[package]]
name = "pywebpush"
version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "aiohttp" },
    { name = "cryptography" },
    { name = "http-ece" },
    { name = "py-vapid" },
    { name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/87/d9/e497a24bc9f659bfc0e570382a41e6b2d6726fbcfa4d85aaa23fe9c81ba2/pywebpush-2.3.0.tar.gz", hash = "sha256:d1e27db8de9e6757c1875f67292554bd54c41874c36f4b5c4ebb5442dce204f2", size = 28489 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/3d/d8/ac21241cf8007cb93255eabf318da4f425ec0f75d28c366992253aa8c1b2/pywebpush-2.3.0-py3-none-any.whl", hash = "sha256:3d97469fb14d4323c362319d438183737249a4115b50e146ce233e7f01e3cf98", size = 22851 },
]

[[package]]
name = "pyyaml"
version = "6.0.3"
@@ -983,7 +1533,7 @@ wheels = [

[[package]]
name = "remoteterm-meshcore"
version = "3.11.2"
version = "3.11.3"
source = { virtual = "." }
dependencies = [
    { name = "aiomqtt" },
@@ -996,6 +1546,7 @@ dependencies = [
    { name = "pycryptodome" },
    { name = "pydantic-settings" },
    { name = "pynacl" },
    { name = "pywebpush" },
    { name = "uvicorn", extra = ["standard"] },
]

@@ -1034,6 +1585,7 @@ requires-dist = [
    { name = "pytest", marker = "extra == 'test'", specifier = ">=8.0.0" },
    { name = "pytest-asyncio", marker = "extra == 'test'", specifier = ">=0.24.0" },
    { name = "pytest-xdist", marker = "extra == 'test'", specifier = ">=3.0" },
    { name = "pywebpush", specifier = ">=0.14.0" },
    { name = "uvicorn", extras = ["standard"], specifier = ">=0.32.0" },
]
provides-extras = ["test"]
@@ -1582,3 +2134,125 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/f4/3d/aae50b1d0e37b5a61055759aedd42c6c99d7c17ab8c3e568ab33c0288938/winrt_windows_storage_streams-3.2.1-cp314-cp314-win_amd64.whl", hash = "sha256:3c5bf41d725369b9986e6d64bad7079372b95c329897d684f955d7028c7f27a0", size = 135566 },
|
{ url = "https://files.pythonhosted.org/packages/f4/3d/aae50b1d0e37b5a61055759aedd42c6c99d7c17ab8c3e568ab33c0288938/winrt_windows_storage_streams-3.2.1-cp314-cp314-win_amd64.whl", hash = "sha256:3c5bf41d725369b9986e6d64bad7079372b95c329897d684f955d7028c7f27a0", size = 135566 },
|
||||||
{ url = "https://files.pythonhosted.org/packages/bb/c3/6d3ce7a58e6c828e0795c9db8790d0593dd7fdf296e513c999150deb98d4/winrt_windows_storage_streams-3.2.1-cp314-cp314-win_arm64.whl", hash = "sha256:293e09825559d0929bbe5de01e1e115f7a6283d8996ab55652e5af365f032987", size = 134393 },
|
{ url = "https://files.pythonhosted.org/packages/bb/c3/6d3ce7a58e6c828e0795c9db8790d0593dd7fdf296e513c999150deb98d4/winrt_windows_storage_streams-3.2.1-cp314-cp314-win_arm64.whl", hash = "sha256:293e09825559d0929bbe5de01e1e115f7a6283d8996ab55652e5af365f032987", size = 134393 },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "yarl"
|
||||||
|
version = "1.23.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "idna" },
|
||||||
|
{ name = "multidict" },
|
||||||
|
{ name = "propcache" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/23/6e/beb1beec874a72f23815c1434518bfc4ed2175065173fb138c3705f658d4/yarl-1.23.0.tar.gz", hash = "sha256:53b1ea6ca88ebd4420379c330aea57e258408dd0df9af0992e5de2078dc9f5d5", size = 194676 }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a2/aa/60da938b8f0997ba3a911263c40d82b6f645a67902a490b46f3355e10fae/yarl-1.23.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b35d13d549077713e4414f927cdc388d62e543987c572baee613bf82f11a4b99", size = 123641 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/24/84/e237607faf4e099dbb8a4f511cfd5efcb5f75918baad200ff7380635631b/yarl-1.23.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cbb0fef01f0c6b38cb0f39b1f78fc90b807e0e3c86a7ff3ce74ad77ce5c7880c", size = 86248 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b2/0d/71ceabc14c146ba8ee3804ca7b3d42b1664c8440439de5214d366fec7d3a/yarl-1.23.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dc52310451fc7c629e13c4e061cbe2dd01684d91f2f8ee2821b083c58bd72432", size = 85988 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8c/6c/4a90d59c572e46b270ca132aca66954f1175abd691f74c1ef4c6711828e2/yarl-1.23.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b2c6b50c7b0464165472b56b42d4c76a7b864597007d9c085e8b63e185cf4a7a", size = 100566 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/49/fb/c438fb5108047e629f6282a371e6e91cf3f97ee087c4fb748a1f32ceef55/yarl-1.23.0-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:aafe5dcfda86c8af00386d7781d4c2181b5011b7be3f2add5e99899ea925df05", size = 92079 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d9/13/d269aa1aed3e4f50a5a103f96327210cc5fa5dd2d50882778f13c7a14606/yarl-1.23.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9ee33b875f0b390564c1fb7bc528abf18c8ee6073b201c6ae8524aca778e2d83", size = 108741 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/85/fb/115b16f22c37ea4437d323e472945bea97301c8ec6089868fa560abab590/yarl-1.23.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4c41e021bc6d7affb3364dc1e1e5fa9582b470f283748784bd6ea0558f87f42c", size = 108099 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9a/64/c53487d9f4968045b8afa51aed7ca44f58b2589e772f32745f3744476c82/yarl-1.23.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:99c8a9ed30f4164bc4c14b37a90208836cbf50d4ce2a57c71d0f52c7fb4f7598", size = 102678 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/85/59/cd98e556fbb2bf8fab29c1a722f67ad45c5f3447cac798ab85620d1e70af/yarl-1.23.0-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f2af5c81a1f124609d5f33507082fc3f739959d4719b56877ab1ee7e7b3d602b", size = 100803 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9e/c0/b39770b56d4a9f0bb5f77e2f1763cd2d75cc2f6c0131e3b4c360348fcd65/yarl-1.23.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6b41389c19b07c760c7e427a3462e8ab83c4bb087d127f0e854c706ce1b9215c", size = 100163 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e7/64/6980f99ab00e1f0ff67cb84766c93d595b067eed07439cfccfc8fb28c1a6/yarl-1.23.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:1dc702e42d0684f42d6519c8d581e49c96cefaaab16691f03566d30658ee8788", size = 93859 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/38/69/912e6c5e146793e5d4b5fe39ff5b00f4d22463dfd5a162bec565ac757673/yarl-1.23.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:0e40111274f340d32ebcc0a5668d54d2b552a6cca84c9475859d364b380e3222", size = 108202 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/59/97/35ca6767524687ad64e5f5c31ad54bc76d585585a9fcb40f649e7e82ffed/yarl-1.23.0-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:4764a6a7588561a9aef92f65bda2c4fb58fe7c675c0883862e6df97559de0bfb", size = 99866 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d3/1c/1a3387ee6d73589f6f2a220ae06f2984f6c20b40c734989b0a44f5987308/yarl-1.23.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:03214408cfa590df47728b84c679ae4ef00be2428e11630277be0727eba2d7cc", size = 107852 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a4/b8/35c0750fcd5a3f781058bfd954515dd4b1eab45e218cbb85cf11132215f1/yarl-1.23.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:170e26584b060879e29fac213e4228ef063f39128723807a312e5c7fec28eff2", size = 102919 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e5/1c/9a1979aec4a81896d597bcb2177827f2dbee3f5b7cc48b2d0dadb644b41d/yarl-1.23.0-cp311-cp311-win32.whl", hash = "sha256:51430653db848d258336cfa0244427b17d12db63d42603a55f0d4546f50f25b5", size = 82602 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/93/22/b85eca6fa2ad9491af48c973e4c8cf6b103a73dbb271fe3346949449fca0/yarl-1.23.0-cp311-cp311-win_amd64.whl", hash = "sha256:bf49a3ae946a87083ef3a34c8f677ae4243f5b824bfc4c69672e72b3d6719d46", size = 87461 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/93/95/07e3553fe6f113e6864a20bdc53a78113cda3b9ced8784ee52a52c9f80d8/yarl-1.23.0-cp311-cp311-win_arm64.whl", hash = "sha256:b39cb32a6582750b6cc77bfb3c49c0f8760dc18dc96ec9fb55fbb0f04e08b928", size = 82336 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/88/8a/94615bc31022f711add374097ad4144d569e95ff3c38d39215d07ac153a0/yarl-1.23.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:1932b6b8bba8d0160a9d1078aae5838a66039e8832d41d2992daa9a3a08f7860", size = 124737 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e3/6f/c6554045d59d64052698add01226bc867b52fe4a12373415d7991fdca95d/yarl-1.23.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:411225bae281f114067578891bc75534cfb3d92a3b4dfef7a6ca78ba354e6069", size = 87029 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/19/2a/725ecc166d53438bc88f76822ed4b1e3b10756e790bafd7b523fe97c322d/yarl-1.23.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:13a563739ae600a631c36ce096615fe307f131344588b0bc0daec108cdb47b25", size = 86310 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/99/30/58260ed98e6ff7f90ba84442c1ddd758c9170d70327394a6227b310cd60f/yarl-1.23.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9cbf44c5cb4a7633d078788e1b56387e3d3cf2b8139a3be38040b22d6c3221c8", size = 97587 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/76/0a/8b08aac08b50682e65759f7f8dde98ae8168f72487e7357a5d684c581ef9/yarl-1.23.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53ad387048f6f09a8969631e4de3f1bf70c50e93545d64af4f751b2498755072", size = 92528 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/52/07/0b7179101fe5f8385ec6c6bb5d0cb9f76bd9fb4a769591ab6fb5cdbfc69a/yarl-1.23.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4a59ba56f340334766f3a4442e0efd0af895fae9e2b204741ef885c446b3a1a8", size = 105339 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d3/8a/36d82869ab5ec829ca8574dfcb92b51286fcfb1e9c7a73659616362dc880/yarl-1.23.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:803a3c3ce4acc62eaf01eaca1208dcf0783025ef27572c3336502b9c232005e7", size = 105061 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/66/3e/868e5c3364b6cee19ff3e1a122194fa4ce51def02c61023970442162859e/yarl-1.23.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3d2bff8f37f8d0f96c7ec554d16945050d54462d6e95414babaa18bfafc7f51", size = 100132 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cf/26/9c89acf82f08a52cb52d6d39454f8d18af15f9d386a23795389d1d423823/yarl-1.23.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c75eb09e8d55bceb4367e83496ff8ef2bc7ea6960efb38e978e8073ea59ecb67", size = 99289 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6f/54/5b0db00d2cb056922356104468019c0a132e89c8d3ab67d8ede9f4483d2a/yarl-1.23.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:877b0738624280e34c55680d6054a307aa94f7d52fa0e3034a9cc6e790871da7", size = 96950 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f6/40/10fa93811fd439341fad7e0718a86aca0de9548023bbb403668d6555acab/yarl-1.23.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b5405bb8f0e783a988172993cfc627e4d9d00432d6bbac65a923041edacf997d", size = 93960 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/bc/d2/8ae2e6cd77d0805f4526e30ec43b6f9a3dfc542d401ac4990d178e4bf0cf/yarl-1.23.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1c3a3598a832590c5a3ce56ab5576361b5688c12cb1d39429cf5dba30b510760", size = 104703 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2f/0c/b3ceacf82c3fe21183ce35fa2acf5320af003d52bc1fcf5915077681142e/yarl-1.23.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:8419ebd326430d1cbb7efb5292330a2cf39114e82df5cc3d83c9a0d5ebeaf2f2", size = 98325 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9d/e0/12900edd28bdab91a69bd2554b85ad7b151f64e8b521fe16f9ad2f56477a/yarl-1.23.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:be61f6fff406ca40e3b1d84716fde398fc08bc63dd96d15f3a14230a0973ed86", size = 105067 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/15/61/74bb1182cf79c9bbe4eb6b1f14a57a22d7a0be5e9cedf8e2d5c2086474c3/yarl-1.23.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3ceb13c5c858d01321b5d9bb65e4cf37a92169ea470b70fec6f236b2c9dd7e34", size = 100285 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/69/7f/cd5ef733f2550de6241bd8bd8c3febc78158b9d75f197d9c7baa113436af/yarl-1.23.0-cp312-cp312-win32.whl", hash = "sha256:fffc45637bcd6538de8b85f51e3df3223e4ad89bccbfca0481c08c7fc8b7ed7d", size = 82359 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/be/25216a49daeeb7af2bec0db22d5e7df08ed1d7c9f65d78b14f3b74fd72fc/yarl-1.23.0-cp312-cp312-win_amd64.whl", hash = "sha256:f69f57305656a4852f2a7203efc661d8c042e6cc67f7acd97d8667fb448a426e", size = 87674 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d2/35/aeab955d6c425b227d5b7247eafb24f2653fedc32f95373a001af5dfeb9e/yarl-1.23.0-cp312-cp312-win_arm64.whl", hash = "sha256:6e87a6e8735b44816e7db0b2fbc9686932df473c826b0d9743148432e10bb9b9", size = 81879 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9a/4b/a0a6e5d0ee8a2f3a373ddef8a4097d74ac901ac363eea1440464ccbe0898/yarl-1.23.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:16c6994ac35c3e74fb0ae93323bf8b9c2a9088d55946109489667c510a7d010e", size = 123796 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/67/b6/8925d68af039b835ae876db5838e82e76ec87b9782ecc97e192b809c4831/yarl-1.23.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4a42e651629dafb64fd5b0286a3580613702b5809ad3f24934ea87595804f2c5", size = 86547 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ae/50/06d511cc4b8e0360d3c94af051a768e84b755c5eb031b12adaaab6dec6e5/yarl-1.23.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7c6b9461a2a8b47c65eef63bb1c76a4f1c119618ffa99ea79bc5bb1e46c5821b", size = 85854 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c4/f4/4e30b250927ffdab4db70da08b9b8d2194d7c7b400167b8fbeca1e4701ca/yarl-1.23.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2569b67d616eab450d262ca7cb9f9e19d2f718c70a8b88712859359d0ab17035", size = 98351 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/86/fc/4118c5671ea948208bdb1492d8b76bdf1453d3e73df051f939f563e7dcc5/yarl-1.23.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e9d9a4d06d3481eab79803beb4d9bd6f6a8e781ec078ac70d7ef2dcc29d1bea5", size = 92711 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/56/11/1ed91d42bd9e73c13dc9e7eb0dd92298d75e7ac4dd7f046ad0c472e231cd/yarl-1.23.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f514f6474e04179d3d33175ed3f3e31434d3130d42ec153540d5b157deefd735", size = 106014 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ce/c9/74e44e056a23fbc33aca71779ef450ca648a5bc472bdad7a82339918f818/yarl-1.23.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:fda207c815b253e34f7e1909840fd14299567b1c0eb4908f8c2ce01a41265401", size = 105557 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/66/fe/b1e10b08d287f518994f1e2ff9b6d26f0adeecd8dd7d533b01bab29a3eda/yarl-1.23.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34b6cf500e61c90f305094911f9acc9c86da1a05a7a3f5be9f68817043f486e4", size = 101559 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/72/59/c5b8d94b14e3d3c2a9c20cb100119fd534ab5a14b93673ab4cc4a4141ea5/yarl-1.23.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:d7504f2b476d21653e4d143f44a175f7f751cd41233525312696c76aa3dbb23f", size = 100502 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/77/4f/96976cb54cbfc5c9fd73ed4c51804f92f209481d1fb190981c0f8a07a1d7/yarl-1.23.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:578110dd426f0d209d1509244e6d4a3f1a3e9077655d98c5f22583d63252a08a", size = 98027 },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/63/6e/904c4f476471afdbad6b7e5b70362fb5810e35cd7466529a97322b6f5556/yarl-1.23.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:609d3614d78d74ebe35f54953c5bbd2ac647a7ddb9c30a5d877580f5e86b22f2", size = 95369 },
{ url = "https://files.pythonhosted.org/packages/9d/40/acfcdb3b5f9d68ef499e39e04d25e141fe90661f9d54114556cf83be8353/yarl-1.23.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4966242ec68afc74c122f8459abd597afd7d8a60dc93d695c1334c5fd25f762f", size = 105565 },
{ url = "https://files.pythonhosted.org/packages/5e/c6/31e28f3a6ba2869c43d124f37ea5260cac9c9281df803c354b31f4dd1f3c/yarl-1.23.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:e0fd068364a6759bc794459f0a735ab151d11304346332489c7972bacbe9e72b", size = 99813 },
{ url = "https://files.pythonhosted.org/packages/08/1f/6f65f59e72d54aa467119b63fc0b0b1762eff0232db1f4720cd89e2f4a17/yarl-1.23.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:39004f0ad156da43e86aa71f44e033de68a44e5a31fc53507b36dd253970054a", size = 105632 },
{ url = "https://files.pythonhosted.org/packages/a3/c4/18b178a69935f9e7a338127d5b77d868fdc0f0e49becd286d51b3a18c61d/yarl-1.23.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e5723c01a56c5028c807c701aa66722916d2747ad737a046853f6c46f4875543", size = 101895 },
{ url = "https://files.pythonhosted.org/packages/8f/54/f5b870b5505663911dba950a8e4776a0dbd51c9c54c0ae88e823e4b874a0/yarl-1.23.0-cp313-cp313-win32.whl", hash = "sha256:1b6b572edd95b4fa8df75de10b04bc81acc87c1c7d16bcdd2035b09d30acc957", size = 82356 },
{ url = "https://files.pythonhosted.org/packages/7a/84/266e8da36879c6edcd37b02b547e2d9ecdfea776be49598e75696e3316e1/yarl-1.23.0-cp313-cp313-win_amd64.whl", hash = "sha256:baaf55442359053c7d62f6f8413a62adba3205119bcb6f49594894d8be47e5e3", size = 87515 },
{ url = "https://files.pythonhosted.org/packages/00/fd/7e1c66efad35e1649114fa13f17485f62881ad58edeeb7f49f8c5e748bf9/yarl-1.23.0-cp313-cp313-win_arm64.whl", hash = "sha256:fb4948814a2a98e3912505f09c9e7493b1506226afb1f881825368d6fb776ee3", size = 81785 },
{ url = "https://files.pythonhosted.org/packages/9c/fc/119dd07004f17ea43bb91e3ece6587759edd7519d6b086d16bfbd3319982/yarl-1.23.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:aecfed0b41aa72b7881712c65cf764e39ce2ec352324f5e0837c7048d9e6daaa", size = 130719 },
{ url = "https://files.pythonhosted.org/packages/e6/0d/9f2348502fbb3af409e8f47730282cd6bc80dec6630c1e06374d882d6eb2/yarl-1.23.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a41bcf68efd19073376eb8cf948b8d9be0af26256403e512bb18f3966f1f9120", size = 89690 },
{ url = "https://files.pythonhosted.org/packages/50/93/e88f3c80971b42cfc83f50a51b9d165a1dbf154b97005f2994a79f212a07/yarl-1.23.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cde9a2ecd91668bcb7f077c4966d8ceddb60af01b52e6e3e2680e4cf00ad1a59", size = 89851 },
{ url = "https://files.pythonhosted.org/packages/1c/07/61c9dd8ba8f86473263b4036f70fb594c09e99c0d9737a799dfd8bc85651/yarl-1.23.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5023346c4ee7992febc0068e7593de5fa2bf611848c08404b35ebbb76b1b0512", size = 95874 },
{ url = "https://files.pythonhosted.org/packages/9e/e9/f9ff8ceefba599eac6abddcfb0b3bee9b9e636e96dbf54342a8577252379/yarl-1.23.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d1009abedb49ae95b136a8904a3f71b342f849ffeced2d3747bf29caeda218c4", size = 88710 },
{ url = "https://files.pythonhosted.org/packages/eb/78/0231bfcc5d4c8eec220bc2f9ef82cb4566192ea867a7c5b4148f44f6cbcd/yarl-1.23.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a8d00f29b42f534cc8aa3931cfe773b13b23e561e10d2b26f27a8d309b0e82a1", size = 101033 },
{ url = "https://files.pythonhosted.org/packages/cd/9b/30ea5239a61786f18fd25797151a17fbb3be176977187a48d541b5447dd4/yarl-1.23.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:95451e6ce06c3e104556d73b559f5da6c34a069b6b62946d3ad66afcd51642ea", size = 100817 },
{ url = "https://files.pythonhosted.org/packages/62/e2/a4980481071791bc83bce2b7a1a1f7adcabfa366007518b4b845e92eeee3/yarl-1.23.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:531ef597132086b6cf96faa7c6c1dcd0361dd5f1694e5cc30375907b9b7d3ea9", size = 97482 },
{ url = "https://files.pythonhosted.org/packages/e5/1e/304a00cf5f6100414c4b5a01fc7ff9ee724b62158a08df2f8170dfc72a2d/yarl-1.23.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:88f9fb0116fbfcefcab70f85cf4b74a2b6ce5d199c41345296f49d974ddb4123", size = 95949 },
{ url = "https://files.pythonhosted.org/packages/68/03/093f4055ed4cae649ac53bca3d180bd37102e9e11d048588e9ab0c0108d0/yarl-1.23.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e7b0460976dc75cb87ad9cc1f9899a4b97751e7d4e77ab840fc9b6d377b8fd24", size = 95839 },
{ url = "https://files.pythonhosted.org/packages/b9/28/4c75ebb108f322aa8f917ae10a8ffa4f07cae10a8a627b64e578617df6a0/yarl-1.23.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:115136c4a426f9da976187d238e84139ff6b51a20839aa6e3720cd1026d768de", size = 90696 },
{ url = "https://files.pythonhosted.org/packages/23/9c/42c2e2dd91c1a570402f51bdf066bfdb1241c2240ba001967bad778e77b7/yarl-1.23.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:ead11956716a940c1abc816b7df3fa2b84d06eaed8832ca32f5c5e058c65506b", size = 100865 },
{ url = "https://files.pythonhosted.org/packages/74/05/1bcd60a8a0a914d462c305137246b6f9d167628d73568505fce3f1cb2e65/yarl-1.23.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:fe8f8f5e70e6dbdfca9882cd9deaac058729bcf323cf7a58660901e55c9c94f6", size = 96234 },
{ url = "https://files.pythonhosted.org/packages/90/b2/f52381aac396d6778ce516b7bc149c79e65bfc068b5de2857ab69eeea3b7/yarl-1.23.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:a0e317df055958a0c1e79e5d2aa5a5eaa4a6d05a20d4b0c9c3f48918139c9fc6", size = 100295 },
{ url = "https://files.pythonhosted.org/packages/e5/e8/638bae5bbf1113a659b2435d8895474598afe38b4a837103764f603aba56/yarl-1.23.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f0fd84de0c957b2d280143522c4f91a73aada1923caee763e24a2b3fda9f8a5", size = 97784 },
{ url = "https://files.pythonhosted.org/packages/80/25/a3892b46182c586c202629fc2159aa13975d3741d52ebd7347fd501d48d5/yarl-1.23.0-cp313-cp313t-win32.whl", hash = "sha256:93a784271881035ab4406a172edb0faecb6e7d00f4b53dc2f55919d6c9688595", size = 88313 },
{ url = "https://files.pythonhosted.org/packages/43/68/8c5b36aa5178900b37387937bc2c2fe0e9505537f713495472dcf6f6fccc/yarl-1.23.0-cp313-cp313t-win_amd64.whl", hash = "sha256:dd00607bffbf30250fe108065f07453ec124dbf223420f57f5e749b04295e090", size = 94932 },
{ url = "https://files.pythonhosted.org/packages/c6/cc/d79ba8292f51f81f4dc533a8ccfb9fc6992cabf0998ed3245de7589dc07c/yarl-1.23.0-cp313-cp313t-win_arm64.whl", hash = "sha256:ac09d42f48f80c9ee1635b2fcaa819496a44502737660d3c0f2ade7526d29144", size = 84786 },
{ url = "https://files.pythonhosted.org/packages/90/98/b85a038d65d1b92c3903ab89444f48d3cee490a883477b716d7a24b1a78c/yarl-1.23.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:21d1b7305a71a15b4794b5ff22e8eef96ff4a6d7f9657155e5aa419444b28912", size = 124455 },
{ url = "https://files.pythonhosted.org/packages/39/54/bc2b45559f86543d163b6e294417a107bb87557609007c007ad889afec18/yarl-1.23.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:85610b4f27f69984932a7abbe52703688de3724d9f72bceb1cca667deff27474", size = 86752 },
{ url = "https://files.pythonhosted.org/packages/24/f9/e8242b68362bffe6fb536c8db5076861466fc780f0f1b479fc4ffbebb128/yarl-1.23.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:23f371bd662cf44a7630d4d113101eafc0cfa7518a2760d20760b26021454719", size = 86291 },
{ url = "https://files.pythonhosted.org/packages/ea/d8/d1cb2378c81dd729e98c716582b1ccb08357e8488e4c24714658cc6630e8/yarl-1.23.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4a80f77dc1acaaa61f0934176fccca7096d9b1ff08c8ba9cddf5ae034a24319", size = 99026 },
{ url = "https://files.pythonhosted.org/packages/0a/ff/7196790538f31debe3341283b5b0707e7feb947620fc5e8236ef28d44f72/yarl-1.23.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:bd654fad46d8d9e823afbb4f87c79160b5a374ed1ff5bde24e542e6ba8f41434", size = 92355 },
{ url = "https://files.pythonhosted.org/packages/c1/56/25d58c3eddde825890a5fe6aa1866228377354a3c39262235234ab5f616b/yarl-1.23.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:682bae25f0a0dd23a056739f23a134db9f52a63e2afd6bfb37ddc76292bbd723", size = 106417 },
{ url = "https://files.pythonhosted.org/packages/51/8a/882c0e7bc8277eb895b31bce0138f51a1ba551fc2e1ec6753ffc1e7c1377/yarl-1.23.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a82836cab5f197a0514235aaf7ffccdc886ccdaa2324bc0aafdd4ae898103039", size = 106422 },
{ url = "https://files.pythonhosted.org/packages/42/2b/fef67d616931055bf3d6764885990a3ac647d68734a2d6a9e1d13de437a2/yarl-1.23.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1c57676bdedc94cd3bc37724cf6f8cd2779f02f6aba48de45feca073e714fe52", size = 101915 },
{ url = "https://files.pythonhosted.org/packages/18/6a/530e16aebce27c5937920f3431c628a29a4b6b430fab3fd1c117b26ff3f6/yarl-1.23.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c7f8dc16c498ff06497c015642333219871effba93e4a2e8604a06264aca5c5c", size = 100690 },
{ url = "https://files.pythonhosted.org/packages/88/08/93749219179a45e27b036e03260fda05190b911de8e18225c294ac95bbc9/yarl-1.23.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:5ee586fb17ff8f90c91cf73c6108a434b02d69925f44f5f8e0d7f2f260607eae", size = 98750 },
{ url = "https://files.pythonhosted.org/packages/d9/cf/ea424a004969f5d81a362110a6ac1496d79efdc6d50c2c4b2e3ea0fc2519/yarl-1.23.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:17235362f580149742739cc3828b80e24029d08cbb9c4bda0242c7b5bc610a8e", size = 94685 },
{ url = "https://files.pythonhosted.org/packages/e2/b7/14341481fe568e2b0408bcf1484c652accafe06a0ade9387b5d3fd9df446/yarl-1.23.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:0793e2bd0cf14234983bbb371591e6bea9e876ddf6896cdcc93450996b0b5c85", size = 106009 },
{ url = "https://files.pythonhosted.org/packages/0a/e6/5c744a9b54f4e8007ad35bce96fbc9218338e84812d36f3390cea616881a/yarl-1.23.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:3650dc2480f94f7116c364096bc84b1d602f44224ef7d5c7208425915c0475dd", size = 100033 },
{ url = "https://files.pythonhosted.org/packages/0c/23/e3bfc188d0b400f025bc49d99793d02c9abe15752138dcc27e4eaf0c4a9e/yarl-1.23.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f40e782d49630ad384db66d4d8b73ff4f1b8955dc12e26b09a3e3af064b3b9d6", size = 106483 },
{ url = "https://files.pythonhosted.org/packages/72/42/f0505f949a90b3f8b7a363d6cbdf398f6e6c58946d85c6d3a3bc70595b26/yarl-1.23.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:94f8575fbdf81749008d980c17796097e645574a3b8c28ee313931068dad14fe", size = 102175 },
{ url = "https://files.pythonhosted.org/packages/aa/65/b39290f1d892a9dd671d1c722014ca062a9c35d60885d57e5375db0404b5/yarl-1.23.0-cp314-cp314-win32.whl", hash = "sha256:c8aa34a5c864db1087d911a0b902d60d203ea3607d91f615acd3f3108ac32169", size = 83871 },
{ url = "https://files.pythonhosted.org/packages/a9/5b/9b92f54c784c26e2a422e55a8d2607ab15b7ea3349e28359282f84f01d43/yarl-1.23.0-cp314-cp314-win_amd64.whl", hash = "sha256:63e92247f383c85ab00dd0091e8c3fa331a96e865459f5ee80353c70a4a42d70", size = 89093 },
{ url = "https://files.pythonhosted.org/packages/e0/7d/8a84dc9381fd4412d5e7ff04926f9865f6372b4c2fd91e10092e65d29eb8/yarl-1.23.0-cp314-cp314-win_arm64.whl", hash = "sha256:70efd20be968c76ece7baa8dafe04c5be06abc57f754d6f36f3741f7aa7a208e", size = 83384 },
{ url = "https://files.pythonhosted.org/packages/dd/8d/d2fad34b1c08aa161b74394183daa7d800141aaaee207317e82c790b418d/yarl-1.23.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:9a18d6f9359e45722c064c97464ec883eb0e0366d33eda61cb19a244bf222679", size = 131019 },
{ url = "https://files.pythonhosted.org/packages/19/ff/33009a39d3ccf4b94d7d7880dfe17fb5816c5a4fe0096d9b56abceea9ac7/yarl-1.23.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:2803ed8b21ca47a43da80a6fd1ed3019d30061f7061daa35ac54f63933409412", size = 89894 },
{ url = "https://files.pythonhosted.org/packages/0c/f1/dab7ac5e7306fb79c0190766a3c00b4cb8d09a1f390ded68c85a5934faf5/yarl-1.23.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:394906945aa8b19fc14a61cf69743a868bb8c465efe85eee687109cc540b98f4", size = 89979 },
{ url = "https://files.pythonhosted.org/packages/aa/b1/08e95f3caee1fad6e65017b9f26c1d79877b502622d60e517de01e72f95d/yarl-1.23.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:71d006bee8397a4a89f469b8deb22469fe7508132d3c17fa6ed871e79832691c", size = 95943 },
{ url = "https://files.pythonhosted.org/packages/c0/cc/6409f9018864a6aa186c61175b977131f373f1988e198e031236916e87e4/yarl-1.23.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:62694e275c93d54f7ccedcfef57d42761b2aad5234b6be1f3e3026cae4001cd4", size = 88786 },
{ url = "https://files.pythonhosted.org/packages/76/40/cc22d1d7714b717fde2006fad2ced5efe5580606cb059ae42117542122f3/yarl-1.23.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31de1613658308efdb21ada98cbc86a97c181aa050ba22a808120bb5be3ab94", size = 101307 },
{ url = "https://files.pythonhosted.org/packages/8f/0d/476c38e85ddb4c6ec6b20b815bdd779aa386a013f3d8b85516feee55c8dc/yarl-1.23.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:fb1e8b8d66c278b21d13b0a7ca22c41dd757a7c209c6b12c313e445c31dd3b28", size = 100904 },
{ url = "https://files.pythonhosted.org/packages/72/32/0abe4a76d59adf2081dcb0397168553ece4616ada1c54d1c49d8936c74f8/yarl-1.23.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50f9d8d531dfb767c565f348f33dd5139a6c43f5cbdf3f67da40d54241df93f6", size = 97728 },
{ url = "https://files.pythonhosted.org/packages/b7/35/7b30f4810fba112f60f5a43237545867504e15b1c7647a785fbaf588fac2/yarl-1.23.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:575aa4405a656e61a540f4a80eaa5260f2a38fff7bfdc4b5f611840d76e9e277", size = 95964 },
{ url = "https://files.pythonhosted.org/packages/2d/86/ed7a73ab85ef00e8bb70b0cb5421d8a2a625b81a333941a469a6f4022828/yarl-1.23.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:041b1a4cefacf65840b4e295c6985f334ba83c30607441ae3cf206a0eed1a2e4", size = 95882 },
{ url = "https://files.pythonhosted.org/packages/19/90/d56967f61a29d8498efb7afb651e0b2b422a1e9b47b0ab5f4e40a19b699b/yarl-1.23.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:d38c1e8231722c4ce40d7593f28d92b5fc72f3e9774fe73d7e800ec32299f63a", size = 90797 },
{ url = "https://files.pythonhosted.org/packages/72/00/8b8f76909259f56647adb1011d7ed8b321bcf97e464515c65016a47ecdf0/yarl-1.23.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:d53834e23c015ee83a99377db6e5e37d8484f333edb03bd15b4bc312cc7254fb", size = 101023 },
{ url = "https://files.pythonhosted.org/packages/ac/e2/cab11b126fb7d440281b7df8e9ddbe4851e70a4dde47a202b6642586b8d9/yarl-1.23.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:2e27c8841126e017dd2a054a95771569e6070b9ee1b133366d8b31beb5018a41", size = 96227 },
{ url = "https://files.pythonhosted.org/packages/c2/9b/2c893e16bfc50e6b2edf76c1a9eb6cb0c744346197e74c65e99ad8d634d0/yarl-1.23.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:76855800ac56f878847a09ce6dba727c93ca2d89c9e9d63002d26b916810b0a2", size = 100302 },
{ url = "https://files.pythonhosted.org/packages/28/ec/5498c4e3a6d5f1003beb23405671c2eb9cdbf3067d1c80f15eeafe301010/yarl-1.23.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e09fd068c2e169a7070d83d3bde728a4d48de0549f975290be3c108c02e499b4", size = 98202 },
{ url = "https://files.pythonhosted.org/packages/fe/c3/cd737e2d45e70717907f83e146f6949f20cc23cd4bf7b2688727763aa458/yarl-1.23.0-cp314-cp314t-win32.whl", hash = "sha256:73309162a6a571d4cbd3b6a1dcc703c7311843ae0d1578df6f09be4e98df38d4", size = 90558 },
{ url = "https://files.pythonhosted.org/packages/e1/19/3774d162f6732d1cfb0b47b4140a942a35ca82bb19b6db1f80e9e7bdc8f8/yarl-1.23.0-cp314-cp314t-win_amd64.whl", hash = "sha256:4503053d296bc6e4cbd1fad61cf3b6e33b939886c4f249ba7c78b602214fabe2", size = 97610 },
{ url = "https://files.pythonhosted.org/packages/51/47/3fa2286c3cb162c71cdb34c4224d5745a1ceceb391b2bd9b19b668a8d724/yarl-1.23.0-cp314-cp314t-win_arm64.whl", hash = "sha256:44bb7bef4ea409384e3f8bc36c063d77ea1b8d4a5b2706956c0d6695f07dcc25", size = 86041 },
{ url = "https://files.pythonhosted.org/packages/69/68/c8739671f5699c7dc470580a4f821ef37c32c4cb0b047ce223a7f115757f/yarl-1.23.0-py3-none-any.whl", hash = "sha256:a2df6afe50dea8ae15fa34c9f824a3ee958d785fd5d089063d960bae1daa0a3f", size = 48288 },
]