22 Commits

Author SHA1 Message Date
pablorevilla-meshtastic 52f1a1e788 updated the version number and date 2026-01-24 11:14:06 -08:00
pablorevilla-meshtastic f44a78730a Added the ability to skip packets with specific from_ids and to use a secondary encryption key in mqtt_reader. 2026-01-23 21:49:03 -08:00
pablorevilla-meshtastic a9a5e046ea more container test 2026-01-23 13:02:34 -08:00
pablorevilla-meshtastic 37386f9e28 change to container.yml 2026-01-23 11:58:48 -08:00
pablorevilla-meshtastic b66bfb1ee9 Fix error on container build and update README 2026-01-23 11:42:03 -08:00
pablorevilla-meshtastic caf9cd1596 Updated list of sites running meshview 2026-01-22 07:42:24 -08:00
pablorevilla-meshtastic a4ebd2b23c work on net.html to limit packets to last 12 hours instead of 48 hours. 2026-01-21 20:11:17 -08:00
pablorevilla-meshtastic 5676ade6b7 fix api query so that weekly mesh works. 2026-01-21 17:19:19 -08:00
pablorevilla-meshtastic 319f8eac06 optimization 2026-01-20 14:48:33 -08:00
pablorevilla-meshtastic d85132133a fix bug 2026-01-20 11:27:42 -08:00
pablorevilla-meshtastic b6d8af409c fix bug on backwards compatibility 2026-01-20 10:10:39 -08:00
pablorevilla-meshtastic 896a0980d5 Update scripts for PostgreSQL 2026-01-15 16:24:42 -08:00
pablorevilla-meshtastic 7d395e5e27 Correct documentation error 2026-01-15 14:42:18 -08:00
pablorevilla-meshtastic c3cc01d7e7 Document update 2026-01-15 14:30:04 -08:00
pablorevilla-meshtastic ecbadc6087 Configure "WAL" for SQLite 2026-01-15 14:10:49 -08:00
pablorevilla-meshtastic ff30623bdf Documentation update 2026-01-15 11:55:07 -08:00
pablorevilla-meshtastic a43433ccb4 Update documentation 2026-01-15 11:51:03 -08:00
pablorevilla-meshtastic 4d9db2a52c Update instructions 2026-01-15 11:49:25 -08:00
pablorevilla-meshtastic e30b59851f Update to 2026-01-15 11:39:24 -08:00
pablorevilla-meshtastic 36dd91be63 Merge branch 'db_updates' 2026-01-15 09:04:09 -08:00
Pablo Revilla 4516c84128 Modify cleanup.sh to use import_time_us for queries
Updated cleanup script to use import_time_us for deletions.
2026-01-14 22:11:52 -08:00
Pablo Revilla a882bc22dd Update README with version 3.0.2 details
Added notes about database changes for version 3.0.2.
2026-01-12 10:38:55 -08:00
13 changed files with 213 additions and 46 deletions
+4 -2
@@ -17,13 +17,15 @@ jobs:
# list of Docker images to use as base name for tags
images: |
ghcr.io/${{ github.repository }}
# latest tag is only set for semver/tag-based builds (default behavior)
flavor: |
latest=auto
# generate Docker tags based on the following events/attributes
tags: |
type=ref,event=branch
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
-type=match,pattern=v\d.\d.\d,value=latest
- name: Login to GitHub Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
@@ -49,4 +51,4 @@ jobs:
platforms: linux/amd64,linux/arm64
# optional cache (speeds up rebuilds)
cache-from: type=gha
cache-to: type=gha,mode=max
+1 -2
@@ -35,7 +35,7 @@ RUN uv pip install --no-cache-dir --upgrade pip \
COPY --chown=${APP_USER}:${APP_USER} . .
# Patch config
-RUN patch sample.config.ini < container/config.patch
+RUN patch -p1 < container/config.patch
# Clean
RUN rm -rf /app/.git* && \
@@ -77,4 +77,3 @@ CMD ["--pid_dir", "/tmp", "--py_exec", "/opt/venv/bin/python", "--config", "/etc
EXPOSE 8081
VOLUME [ "/etc/meshview", "/var/lib/meshview", "/var/log/meshview" ]
+1 -1
@@ -132,7 +132,7 @@ password =
# Examples:
# sqlite+aiosqlite:///var/lib/meshview/packets.db
# postgresql+asyncpg://user:pass@host:5432/meshview
-connection_string = sqlite+aiosqlite:///var/lib/meshview/packets.db
+connection_string = sqlite+aiosqlite:////var/lib/meshview/packets.db
```
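The change above is the three-versus-four-slash distinction in SQLAlchemy-style SQLite URLs: three slashes encode a relative path, four an absolute one. A minimal sketch of the rule (the `sqlite_db_path` helper is ours, not part of Meshview):

```python
def sqlite_db_path(url: str) -> str:
    """Extract the filesystem path from an sqlite-style URL.

    After 'scheme://' (empty host), one slash separates host from path:
    three slashes total -> relative path, four -> absolute path."""
    rest = url.split("://", 1)[1]
    return rest[1:]  # drop the host/path separator slash

# Relative: resolved against the process working directory
assert sqlite_db_path("sqlite+aiosqlite:///packets.db") == "packets.db"
# Absolute: the corrected form used in the container docs
assert sqlite_db_path("sqlite+aiosqlite:////var/lib/meshview/packets.db") == "/var/lib/meshview/packets.db"
```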
### Database Backups
+91 -21
@@ -4,6 +4,13 @@
The project serves as a real-time monitoring and diagnostic tool for the Meshtastic mesh network. It provides detailed insights into network activity, including message traffic, node positions, and telemetry data.

### Version 3.0.3 — January 2026
- Added database support for MySQL (not yet tested — we would love for someone to test it!) and PostgreSQL (alongside SQLite) for larger or shared deployments.
- Configuration updated to allow selecting the database backend via connection string.

### Version 3.0.2 — January 2026
- This release changes the database schema in a way that requires free disk space during the update: SQLite has to rebuild the database when dropping a column (and we are dropping some of the old columns), so make sure you have about 1.2x the current database size free in your environment. Depending on how big your database is, this can take a long time.

### Version 3.0.1 — December 2025
#### 🌐 Multi-Language Support (i18n)
@@ -82,23 +89,27 @@ Samples of currently running instances:
- https://meshview.bayme.sh (SF Bay Area)
- https://www.svme.sh (Sacramento Valley)
- https://meshview.nyme.sh (New York)
- https://meshview.socalmesh.org (LA Area)
- https://map.wpamesh.net (Western Pennsylvania)
- https://meshview.chicagolandmesh.org (Chicago)
-- https://meshview.mt.gt (Canadaverse)
+- https://meshview.mt.gt (Canada)
-- https://canadaverse.org (Canadaverse)
+- https://canadaverse.org (Canada)
- https://meshview.meshtastic.es (Spain)
- https://view.mtnme.sh (North Georgia / East Tennessee)
- https://meshview.lsinfra.de (Hessen - Germany)
- https://meshview.pvmesh.org (Pioneer Valley, Massachusetts)
- https://meshview.louisianamesh.org (Louisiana)
-- https://www.swlamesh.com/map (Southwest Louisiana)
+- https://www.swlamesh.com (Southwest Louisiana)
-- https://meshview.meshcolombia.co/ (Colombia)
+- https://meshview.meshcolombia.co (Colombia)
-- https://meshview-salzburg.jmt.gr/ (Salzburg / Austria)
+- https://meshview-salzburg.jmt.gr (Salzburg / Austria)
- https://map.cromesh.eu (Croatia)
- https://view.meshdresden.eu (Dresden / Germany)
---
### Updating from 2.x to 3.x
We are adding the use of Alembic. If using GitHub,
update your codebase by running the pull command
@@ -278,18 +289,6 @@ password = large4cats
# postgresql+asyncpg://user:pass@host:5432/meshview
connection_string = sqlite+aiosqlite:///packets.db
-> **NOTE (PostgreSQL setup)**
-> If you want to use PostgreSQL instead of SQLite:
->
-> 1) Install PostgreSQL for your OS.
-> 2) Create a user and database:
->    - `CREATE USER meshview WITH PASSWORD 'change_me';`
->    - `CREATE DATABASE meshview OWNER meshview;`
-> 3) Update `config.ini`:
->    - `connection_string = postgresql+asyncpg://meshview:change_me@localhost:5432/meshview`
-> 4) Initialize the schema:
->    - `./env/bin/python startdb.py`
# -------------------------
# Database Cleanup Configuration
@@ -321,6 +320,20 @@ db_cleanup_logfile = dbcleanup.log
---
## NOTE (PostgreSQL setup)
If you want to use PostgreSQL instead of SQLite:
Install PostgreSQL for your OS.
Create a user and database:
```
CREATE USER meshview WITH PASSWORD 'change_me';
CREATE DATABASE meshview OWNER meshview;
```
Update `config.ini`, for example:
```
connection_string = postgresql+asyncpg://meshview:change_me@localhost:5432/meshview
```
Then initialize the schema with `./env/bin/python startdb.py`.
## Running Meshview
Start the database manager:
@@ -490,16 +503,15 @@ db_cleanup_logfile = dbcleanup.log
```
Once changes are done, you need to restart the script for the changes to load.
-### Alternatively we can do it via your OS
+### Alternatively we can do it via your OS (this example is for an Ubuntu-like OS)
- Create and save the bash script below. (Modify /path/to/file/ to the correct path.)
- Name it cleanup.sh
- Make it executable.
```bash
#!/bin/bash
DB_FILE="/path/to/file/packets.db"
# Stop DB service
sudo systemctl stop meshview-db.service
sudo systemctl stop meshview-web.service
@@ -533,6 +545,64 @@ sudo systemctl start meshview-web.service
echo "Database cleanup completed on $(date)"
```
- If you are using PostgreSQL, use this version instead (adjust credentials/DB name):
```bash
#!/bin/bash
DB_NAME="meshview"
DB_USER="meshview"
DB_HOST="localhost"
DB_PORT="5432"
# Stop DB service
sudo systemctl stop meshview-db.service
sudo systemctl stop meshview-web.service
sleep 5
echo "Run cleanup..."
# Run cleanup queries
psql "postgresql://${DB_USER}@${DB_HOST}:${DB_PORT}/${DB_NAME}" <<'EOF'
WITH deleted AS (
DELETE FROM packet
WHERE import_time_us IS NOT NULL
AND import_time_us < (EXTRACT(EPOCH FROM (NOW() - INTERVAL '14 days')) * 1000000)
RETURNING 1
)
SELECT 'packet deleted: ' || COUNT(*) FROM deleted;
WITH deleted AS (
DELETE FROM packet_seen
WHERE import_time_us IS NOT NULL
AND import_time_us < (EXTRACT(EPOCH FROM (NOW() - INTERVAL '14 days')) * 1000000)
RETURNING 1
)
SELECT 'packet_seen deleted: ' || COUNT(*) FROM deleted;
WITH deleted AS (
DELETE FROM traceroute
WHERE import_time_us IS NOT NULL
AND import_time_us < (EXTRACT(EPOCH FROM (NOW() - INTERVAL '14 days')) * 1000000)
RETURNING 1
)
SELECT 'traceroute deleted: ' || COUNT(*) FROM deleted;
WITH deleted AS (
DELETE FROM node
WHERE last_seen_us IS NULL
OR last_seen_us < (EXTRACT(EPOCH FROM (NOW() - INTERVAL '14 days')) * 1000000)
RETURNING 1
)
SELECT 'node deleted: ' || COUNT(*) FROM deleted;
VACUUM;
EOF
# Start DB service
sudo systemctl start meshview-db.service
sudo systemctl start meshview-web.service
echo "Database cleanup completed on $(date)"
```
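The `import_time_us` cutoff used in the queries above can be sanity-checked outside the database. A sketch in Python (`cutoff_us` is a name chosen here, not part of Meshview):

```python
import time

def cutoff_us(days: int = 14) -> int:
    # Microseconds since the Unix epoch, `days` ago. This mirrors the SQL
    # expression EXTRACT(EPOCH FROM (NOW() - INTERVAL '14 days')) * 1000000,
    # modulo any clock difference between this host and the DB server.
    return int((time.time() - days * 86400) * 1_000_000)

# The 14-day cutoff is always earlier than "now" in microseconds:
assert cutoff_us(14) < int(time.time() * 1_000_000)
```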
- Schedule running the script on a regular basis.
- In this example it runs every night at 2:00am.
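One way to schedule it is with cron. A hedged sketch — the script path reuses the `/path/to/file/` placeholder from above, and the log location is an assumption:

```shell
# Run cleanup.sh every night at 2:00 AM, appending output to a log file.
# Install by running `crontab -e` and adding the line stored in $entry:
entry='0 2 * * * /path/to/file/cleanup.sh >> /path/to/file/cleanup.log 2>&1'
echo "$entry"
```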
+5 -4
@@ -200,7 +200,7 @@ Response Example
### GET `/api/edges`
Returns network edges (connections between nodes) based on traceroutes and neighbor info.
-Traceroute edges are collected over the last 48 hours. Neighbor edges are based on
+Traceroute edges are collected over the last 12 hours. Neighbor edges are based on
port 71 packets.
Query Parameters
@@ -366,7 +366,7 @@ Response Example
{
"status": "healthy",
"timestamp": "2025-07-22T12:45:00+00:00",
-"version": "3.0.0",
+"version": "3.0.3",
"git_revision": "abc1234",
"database": "connected",
"database_size": "12.34 MB",
@@ -384,8 +384,9 @@ Returns version metadata.
Response Example
```json
{
-"version": "3.0.0",
+"version": "3.0.3",
"release_date": "2026-1-15",
"git_revision": "abc1234",
-"build_time": "2025-11-01T12:00:00+00:00"
+"git_revision_short": "abc1234"
}
```
+2 -2
@@ -3,8 +3,8 @@
import subprocess
from pathlib import Path

-__version__ = "3.0.2"
+__version__ = "3.0.4"
-__release_date__ = "2026-1-9"
+__release_date__ = "2026-1-24"

def get_git_revision():
+5 -1
@@ -99,4 +99,8 @@ class Traceroute(Base):
    route_return: Mapped[bytes] = mapped_column(nullable=True)
    import_time_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
-    __table_args__ = (Index("idx_traceroute_import_time_us", "import_time_us"),)
+    __table_args__ = (
+        Index("idx_traceroute_packet_id", "packet_id"),
+        Index("idx_traceroute_import_time_us", "import_time_us"),
+    )
+17 -1
@@ -1,16 +1,32 @@
from sqlalchemy.engine.url import make_url
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from sqlalchemy import event
from meshview import models

def init_database(database_connection_string):
    global engine, async_session
    url = make_url(database_connection_string)
    kwargs = {"echo": False}
    if url.drivername.startswith("sqlite"):
-        kwargs["connect_args"] = {"timeout": 900}
+        kwargs["connect_args"] = {"timeout": 900}  # seconds
    engine = create_async_engine(url, **kwargs)

    # Enforce SQLite pragmas on every new DB connection
    if url.drivername.startswith("sqlite"):
        @event.listens_for(engine.sync_engine, "connect")
        def _set_sqlite_pragmas(dbapi_conn, _):
            cursor = dbapi_conn.cursor()
            cursor.execute("PRAGMA journal_mode=WAL;")
            cursor.execute("PRAGMA busy_timeout=900000;")  # ms
            cursor.execute("PRAGMA synchronous=NORMAL;")
            cursor.close()

    async_session = async_sessionmaker(engine, expire_on_commit=False)
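The effect of the pragmas above can be checked with the standard-library sqlite3 module — a standalone sketch, independent of SQLAlchemy and of Meshview's engine setup:

```python
import os
import sqlite3
import tempfile

# Open a scratch database and apply the same pragmas the listener sets.
path = os.path.join(tempfile.mkdtemp(), "scratch.db")
conn = sqlite3.connect(path)
# SQLite reports back the journal mode it actually switched to.
mode = conn.execute("PRAGMA journal_mode=WAL;").fetchone()[0]
conn.execute("PRAGMA busy_timeout=900000;")  # wait up to 900 s on locks
conn.execute("PRAGMA synchronous=NORMAL;")   # fewer fsyncs; safe with WAL
conn.close()
assert mode == "wal"
```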
+68 -9
@@ -9,8 +9,9 @@ from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from google.protobuf.message import DecodeError
from meshtastic.protobuf.mqtt_pb2 import ServiceEnvelope
from meshview.config import CONFIG

-KEY = base64.b64decode("1PG7OiApB1nwvP+rz05pAQ==")
+PRIMARY_KEY = base64.b64decode("1PG7OiApB1nwvP+rz05pAQ==")

logging.basicConfig(
    level=logging.INFO,
@@ -21,20 +22,79 @@ logging.basicConfig(
logger = logging.getLogger(__name__)

-def decrypt(packet):
-    if packet.HasField("decoded"):
-        return

def _parse_skip_node_ids():
    mqtt_config = CONFIG.get("mqtt", {})
    raw_value = mqtt_config.get("skip_node_ids", "")
    if not raw_value:
        return set()
    if isinstance(raw_value, str):
        raw_value = raw_value.strip()
        if not raw_value:
            return set()
        values = [v.strip() for v in raw_value.split(",") if v.strip()]
    else:
        values = [raw_value]
    skip_ids = set()
    for value in values:
        try:
            skip_ids.add(int(value, 0))
        except (TypeError, ValueError):
            logger.warning("Invalid node id in mqtt.skip_node_ids: %s", value)
    return skip_ids

def _parse_secondary_keys():
    mqtt_config = CONFIG.get("mqtt", {})
    raw_value = mqtt_config.get("secondary_keys", "")
    if not raw_value:
        return []
    if isinstance(raw_value, str):
        raw_value = raw_value.strip()
        if not raw_value:
            return []
        values = [v.strip() for v in raw_value.split(",") if v.strip()]
    else:
        values = [raw_value]
    keys = []
    for value in values:
        try:
            keys.append(base64.b64decode(value))
        except (TypeError, ValueError):
            logger.warning("Invalid base64 key in mqtt.secondary_keys: %s", value)
    return keys

SKIP_NODE_IDS = _parse_skip_node_ids()
SECONDARY_KEYS = _parse_secondary_keys()

def _try_decrypt(packet, key):
    packet_id = packet.id.to_bytes(8, "little")
    from_node_id = getattr(packet, "from").to_bytes(8, "little")
    nonce = packet_id + from_node_id
-    cipher = Cipher(algorithms.AES(KEY), modes.CTR(nonce))
+    cipher = Cipher(algorithms.AES(key), modes.CTR(nonce))
    decryptor = cipher.decryptor()
    raw_proto = decryptor.update(packet.encrypted) + decryptor.finalize()
    try:
        packet.decoded.ParseFromString(raw_proto)
    except DecodeError:
-        pass
+        return False
    return True

def decrypt(packet):
    if packet.HasField("decoded"):
        return
    if _try_decrypt(packet, PRIMARY_KEY):
        return
    for key in SECONDARY_KEYS:
        if _try_decrypt(packet, key):
            return

async def get_topic_envelopes(mqtt_server, mqtt_port, topics, mqtt_user, mqtt_passwd):
@@ -70,9 +130,8 @@ async def get_topic_envelopes(mqtt_server, mqtt_port, topics, mqtt_user, mqtt_pa
        if not envelope.packet.decoded:
            continue
-        # Skip packets from specific node
-        # FIXME: make this configurable as a list of node IDs to skip
-        if getattr(envelope.packet, "from", None) == 2144342101:
+        # Skip packets from configured node IDs
+        if getattr(envelope.packet, "from", None) in SKIP_NODE_IDS:
            continue
        msg_count += 1
+1 -1
@@ -178,7 +178,7 @@ document.addEventListener("DOMContentLoaded", async () => {
  const sinceUs = Math.floor(sixDaysAgoMs * 1000);
  const url =
-    `/api/packets?portnum=1&contains=${encodeURIComponent(tag)}&since=${sinceUs}`;
+    `/api/packets?portnum=1&contains=${encodeURIComponent(tag)}&since=${sinceUs}&limit=1000`;
  const resp = await fetch(url);
  const data = await resp.json();
+6
@@ -203,6 +203,12 @@ async def index(request):
    raise web.HTTPFound(location=starting_url)

# redirect for backwards compatibility
@routes.get("/packet_list/{packet_id}")
async def redirect_packet_list(request):
    packet_id = request.match_info["packet_id"]
    raise web.HTTPFound(location=f"/node/{packet_id}")

# Generic static HTML route
@routes.get("/{page}")
async def serve_page(request):
+6 -2
@@ -180,13 +180,17 @@ async def api_packets(request):
        logger.warning(f"Invalid node_id: {node_id_str}")

    # --- Fetch packets using explicit filters ---
    contains_for_query = contains
    if portnum == PortNum.TEXT_MESSAGE_APP and contains:
        contains_for_query = None
    packets = await store.get_packets(
        from_node_id=from_node_id,
        to_node_id=to_node_id,
        node_id=node_id,
        portnum=portnum,
        after=since,
-        contains=contains,
+        contains=contains_for_query,
        limit=limit,
    )
@@ -414,7 +418,7 @@ async def api_stats_count(request):
@routes.get("/api/edges")
async def api_edges(request):
-    since = datetime.datetime.now() - datetime.timedelta(hours=48)
+    since = datetime.datetime.now() - datetime.timedelta(hours=12)
    filter_type = request.query.get("type")
    # NEW → optional single-node filter
+6
@@ -76,6 +76,12 @@ port = 1883
username = meshdev
password = large4cats

# Optional list of node IDs to ignore. Comma-separated.
skip_node_ids =

# Optional list of secondary AES keys (base64), comma-separated.
secondary_keys =

# -------------------------
# Database Configuration
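For reference, these comma-separated values are parsed as shown in `_parse_skip_node_ids` above. A minimal sketch of the same parsing rule (`parse_ids` is our name, not Meshview's); `int(v, 0)` accepts both decimal and 0x-prefixed hex IDs:

```python
def parse_ids(raw: str) -> set:
    # Split on commas, ignore blank entries, parse each with base 0
    # so both decimal and 0x-hex forms are accepted.
    return {int(v.strip(), 0) for v in raw.split(",") if v.strip()}

assert parse_ids("2144342101") == {2144342101}
assert parse_ids("1, 0x10, 2") == {1, 16, 2}
assert parse_ids("") == set()
```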