v2.3.0: Import jobs, folder deletion, series conversion, server log

Features:
- Import jobs: persisted in the DB, restored on startup
- Delete folder: button in the browser view with a modal dialog
- Convert series: send all episodes of a series to the queue
- Clean up series: delete old codec versions after conversion
- Server log: live view in the admin area with auto-scroll
- Toast notifications instead of browser alerts
- Better error handling and feedback

API:
- POST /api/library/delete-folder
- POST /api/library/series/{id}/convert
- GET /api/library/series/{id}/convert-status
- POST /api/library/series/{id}/cleanup
- GET /api/logs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
This commit is contained in:
Eduard Wisch 2026-02-24 14:48:30 +01:00
parent 08dcf34f5d
commit d65ca027e0
13 changed files with 1563 additions and 44 deletions


@@ -2,6 +2,102 @@
All relevant changes to the VideoKonverter project.
## [2.3.0] - 2026-02-24
### Import system improvements
**Load existing import jobs**
- New `GET /api/library/import` API returns all import jobs
- The import modal now lists open jobs at the top (buttons with status)
- Clicking a job loads it and shows a preview for resuming
- Prevents duplicate imports of the same source
**Byte-level import progress**
- New DB fields: `current_file_name`, `current_file_bytes`, `current_file_total`
- Copying in 64 MB chunks with progress updates every 50 MB
- UI shows the current file and its byte progress
**Targeted rescan after import**
- After an import only the target library path is scanned
- `imported_series` list in the job status for the affected folders
- `loadSectionData(targetPathId)` instead of `reloadAllSections()`
### Folder management
**Delete-folder button**
- New trash-can button (SVG icon) at the top right of folders
- Only visible on hover, turns red on mouse-over
- Proper confirmation dialog instead of the browser's confirm()
- Toast notification instead of alert()
**Delete-folder API**
- `POST /api/library/delete-folder` with a safety check
- Verifies that the path lies under a library path
- Deletes the folder plus all DB entries (library_videos)
- Returns the counts of deleted files/folders/DB entries
### Convert series
**Batch conversion for series**
- New "Convert series" button in the series modal
- Modal with codec selection (AV1/HEVC/H.264)
- Option: re-convert everything (including already matching files)
- Option: delete source files after conversion
- `POST /api/library/series/{id}/convert` API
- `GET /api/library/series/{id}/convert-status` for codec statistics
**Cleanup function for series**
- "Delete old files" button in the series modal
- Deletes everything except: registered videos, .metadata, .nfo, images
- `POST /api/library/series/{id}/cleanup` API
### Server log system
**Notification bell**
- Bell icon at the bottom right on all pages
- Badge shows the number of unread errors (red)
- Log panel with all server messages
- Errors/warnings highlighted in color
**Log API**
- `GET /api/logs?since=ID` returns new log entries
- In-memory buffer (max 200 entries)
- Polling every 2 seconds
### UI improvements
**Toast notifications**
- Improved styling with a slide animation
- Colored left border (success/error/info)
- Shown for 4 seconds
**TVDB search**
- "Search English titles" checkbox in the TVDB modal
- Allows searching by English original titles
### Technical changes
**New/changed files**
- `app/routes/library_api.py` - 6 new endpoints (+200 lines)
- `app/services/importer.py` - get_all_jobs(), progress tracking (+100 lines)
- `app/services/queue.py` - delete_source option on add_paths()
- `app/static/js/library.js` - dialog system, toasts, import jobs (+150 lines)
- `app/static/css/style.css` - toast, delete-button, dialog styles (+50 lines)
- `app/templates/library.html` - confirm modal, convert modal (+50 lines)
- `app/templates/base.html` - notification bell + log panel (+100 lines)
- `app/routes/api.py` - /api/logs endpoint, WebLogHandler (+40 lines)
- `app/models/job.py` - delete_source flag
**New API endpoints**
- `GET /api/library/import` - list all import jobs
- `POST /api/library/delete-folder` - delete a folder
- `POST /api/library/series/{id}/convert` - convert a series
- `GET /api/library/series/{id}/convert-status` - codec status
- `POST /api/library/series/{id}/cleanup` - delete old files
- `GET /api/logs` - fetch server logs
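As a minimal sketch of how a client could drive the new series-conversion endpoint (assumptions: the server runs at the README's default `http://localhost:8080`, and the helper name `convert_series` is hypothetical; the body fields are the ones documented above):

```python
import json
from urllib import request

BASE = "http://localhost:8080"  # default Web-UI address from the README

def convert_series(series_id: int, target_codec: str = "av1",
                   force_all: bool = False,
                   delete_old: bool = False) -> request.Request:
    """Build the POST request that queues all episodes of a series."""
    body = json.dumps({
        "target_codec": target_codec,  # compared against each episode's codec
        "force_all": force_all,        # True = re-convert matching episodes too
        "delete_old": delete_old,      # True = delete sources after conversion
    }).encode()
    return request.Request(
        f"{BASE}/api/library/series/{series_id}/convert",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = convert_series(42, target_codec="hevc")
print(req.full_url)  # http://localhost:8080/api/library/series/42/convert
# Actually sending it requires a running server:
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```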
---
## [2.2.0] - 2026-02-21
### Bugfixes


@@ -224,8 +224,12 @@ Web-UI: http://localhost:8080
| GET | `/api/library/series/{id}` | Series with episodes |
| GET | `/api/library/series/{id}/missing` | Missing episodes |
| POST | `/api/library/series/{id}/tvdb-match` | Assign a TVDB ID |
| POST | `/api/library/series/{id}/convert` | Convert all episodes |
| GET | `/api/library/series/{id}/convert-status` | Codec status of the series |
| POST | `/api/library/series/{id}/cleanup` | Delete old files |
| GET | `/api/library/duplicates` | Find duplicates |
| POST | `/api/library/videos/{id}/convert` | Convert directly |
| POST | `/api/library/delete-folder` | Delete a folder completely |
| GET | `/api/library/stats` | Library statistics |
| GET | `/api/library/movies` | List movies |
| POST | `/api/library/movies/{id}/tvdb-match` | Movie TVDB mapping |
@@ -236,6 +240,21 @@ Web-UI: http://localhost:8080
| GET | `/api/tvdb/language` | Read the TVDB language |
| PUT | `/api/tvdb/language` | Change the TVDB language |
### Import
| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/library/import` | List all import jobs |
| POST | `/api/library/import` | Create a new import job |
| GET | `/api/library/import/{id}` | Import job status with items |
| POST | `/api/library/import/{id}/analyze` | Analyze an import |
| POST | `/api/library/import/{id}/execute` | Execute an import |
### System
| Method | Path | Description |
|--------|------|-------------|
| GET | `/api/logs` | Fetch server logs |
| GET | `/api/system` | System info (GPU, jobs) |
### Video filters (`/api/library/videos`)
```
?video_codec=hevc       # h264, hevc, av1, mpeg4


@@ -33,6 +33,9 @@ class ConversionJob:
    target_filename: str = ""
    target_container: str = "webm"

    # Options
    delete_source: bool = False  # delete the source file after conversion

    # ffmpeg process
    ffmpeg_cmd: list[str] = field(default_factory=list)
    process: Optional[asyncio.subprocess.Process] = field(default=None, repr=False)
@@ -198,4 +201,5 @@ class ConversionJob:
            "preset_name": self.preset_name,
            "status": self.status.value,
            "created_at": self.created_at,
            "delete_source": self.delete_source,
        }


@@ -335,7 +335,42 @@ def setup_api_routes(app: web.Application, config: Config,
        "jobs": [{"id": j.id, "file": j.media.source_filename} for j in jobs],
    })

    # --- Logs ---

    # In-memory log buffer
    _log_buffer = []
    _log_id = 0
    _MAX_LOGS = 200

    class WebLogHandler(logging.Handler):
        """Handler that forwards log records to the buffer"""
        def emit(self, record):
            nonlocal _log_id
            _log_id += 1
            entry = {
                "id": _log_id,
                "level": record.levelname,
                "message": record.getMessage(),
                "time": record.created,
            }
            _log_buffer.append(entry)
            # Cap the buffer size
            while len(_log_buffer) > _MAX_LOGS:
                _log_buffer.pop(0)

    # Register the handler
    web_handler = WebLogHandler()
    web_handler.setLevel(logging.INFO)
    logging.getLogger().addHandler(web_handler)

    async def get_logs(request: web.Request) -> web.Response:
        """GET /api/logs?since=123 - log entries newer than the given ID"""
        since = int(request.query.get("since", 0))
        logs = [l for l in _log_buffer if l["id"] > since]
        return web.json_response({"logs": logs})

    # --- Register routes ---
    app.router.add_get("/api/logs", get_logs)
    app.router.add_get("/api/browse", get_browse)
    app.router.add_post("/api/upload", post_upload)
    app.router.add_post("/api/convert", post_convert)
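The notification bell polls this endpoint every 2 seconds, passing the highest ID it has seen. A standalone client could work the same way (a sketch; the helper names `poll_url`, `newest_id`, and `watch` are hypothetical, and the server is assumed at the README's `http://localhost:8080`):

```python
import json
import time
from urllib import request

BASE = "http://localhost:8080"  # default Web-UI address from the README

def poll_url(since: int) -> str:
    """URL for all log entries with an ID greater than `since`."""
    return f"{BASE}/api/logs?since={since}"

def newest_id(logs: list, since: int) -> int:
    """Next `since` value: the highest ID seen so far."""
    return max([entry["id"] for entry in logs] + [since])

def watch() -> None:  # requires a running server
    since = 0
    while True:
        with request.urlopen(poll_url(since)) as resp:
            logs = json.load(resp)["logs"]
        for entry in logs:
            print(f'[{entry["level"]}] {entry["message"]}')
        since = newest_id(logs, since)
        time.sleep(2)  # matches the UI's 2-second polling interval
```

Tracking `since` client-side keeps each response small: the server only has to filter its in-memory buffer, never replay it in full.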


@@ -234,8 +234,12 @@ def setup_library_routes(app: web.Application, config: Config,
        return web.json_response(result)

    async def get_tvdb_search(request: web.Request) -> web.Response:
        """GET /api/tvdb/search?q=Breaking+Bad&lang=eng

        lang: language for the results (deu, eng, etc.)
        Default: the configured language
        """
        query = request.query.get("q", "").strip()
        lang = request.query.get("lang", "").strip() or None
        if not query:
            return web.json_response(
                {"error": "Suchbegriff erforderlich"}, status=400
@@ -245,7 +249,7 @@ def setup_library_routes(app: web.Application, config: Config,
                {"error": "TVDB nicht konfiguriert (API Key fehlt)"},
                status=400,
            )
        results = await tvdb_service.search_series(query, language=lang)
        return web.json_response({"results": results})

    # === TVDB metadata ===
@@ -638,6 +642,326 @@ def setup_library_routes(app: web.Application, config: Config,
            {"error": "Job konnte nicht erstellt werden"}, status=500
        )
    # === Batch conversion for a series ===
    async def post_convert_series(request: web.Request) -> web.Response:
        """POST /api/library/series/{series_id}/convert

        Converts every episode of a series that is not in the target format.
        Body: {preset, target_codec, force_all, delete_old}
        - preset: encoding preset (optional, falls back to the default)
        - target_codec: target codec to compare against (e.g. 'av1', 'hevc')
        - force_all: true = convert everything, false = only non-target formats
        - delete_old: true = delete the old source files after conversion
        """
        import os
        series_id = int(request.match_info["series_id"])
        try:
            data = await request.json()
        except Exception:
            data = {}
        preset = data.get("preset")
        target_codec = data.get("target_codec", "av1").lower()
        force_all = data.get("force_all", False)
        delete_old = data.get("delete_old", False)

        pool = await library_service._get_pool()
        if not pool:
            return web.json_response(
                {"error": "Keine DB-Verbindung"}, status=500
            )
        try:
            async with pool.acquire() as conn:
                async with conn.cursor() as cur:
                    # Load all videos of the series
                    await cur.execute(
                        "SELECT id, file_path, video_codec "
                        "FROM library_videos WHERE series_id = %s",
                        (series_id,)
                    )
                    videos = await cur.fetchall()
                    # Series folder for the cleanup
                    await cur.execute(
                        "SELECT folder_path FROM library_series WHERE id = %s",
                        (series_id,)
                    )
                    series_row = await cur.fetchone()
                    series_folder = series_row[0] if series_row else None
        except Exception as e:
            return web.json_response({"error": str(e)}, status=500)

        if not videos:
            return web.json_response(
                {"error": "Keine Videos gefunden"}, status=404
            )

        # Codec aliases for the comparison
        codec_aliases = {
            "av1": ["av1", "libaom-av1", "libsvtav1", "av1_vaapi"],
            "hevc": ["hevc", "h265", "libx265", "hevc_vaapi"],
            "h264": ["h264", "avc", "libx264", "h264_vaapi"],
        }
        target_codecs = codec_aliases.get(target_codec, [target_codec])

        to_convert = []
        already_done = 0
        for vid_id, file_path, current_codec in videos:
            current = (current_codec or "").lower()
            is_target = any(tc in current for tc in target_codecs)
            if force_all or not is_target:
                to_convert.append(file_path)
            else:
                already_done += 1

        if not to_convert:
            return web.json_response({
                "message": "Alle Episoden sind bereits im Zielformat",
                "already_done": already_done,
                "queued": 0,
            })

        # Create the jobs with the delete_source option
        jobs = await queue_service.add_paths(
            to_convert, preset, delete_source=delete_old
        )
        return web.json_response({
            "message": f"{len(jobs)} Episoden zur Konvertierung hinzugefuegt",
            "queued": len(jobs),
            "already_done": already_done,
            "skipped": len(videos) - len(jobs) - already_done,
            "delete_old": delete_old,
        })
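The alias comparison in this handler is a substring check, so encoder names like `libsvtav1` or `hevc_vaapi` also count as the target codec. Isolated as a sketch (the helper name `needs_conversion` is hypothetical, not part of the codebase):

```python
# Same alias table as post_convert_series above
CODEC_ALIASES = {
    "av1": ["av1", "libaom-av1", "libsvtav1", "av1_vaapi"],
    "hevc": ["hevc", "h265", "libx265", "hevc_vaapi"],
    "h264": ["h264", "avc", "libx264", "h264_vaapi"],
}

def needs_conversion(current_codec: str, target: str,
                     force_all: bool = False) -> bool:
    """True when the episode should be queued (the rule used above)."""
    if force_all:
        return True
    aliases = CODEC_ALIASES.get(target, [target])
    current = (current_codec or "").lower()
    # Substring match: 'libsvtav1' contains 'av1', 'hevc_vaapi' contains 'hevc'
    return not any(alias in current for alias in aliases)

print(needs_conversion("h264", "av1"))       # True  - wrong codec, convert
print(needs_conversion("libsvtav1", "av1"))  # False - already AV1
```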
    async def post_cleanup_series_folder(request: web.Request) -> web.Response:
        """POST /api/library/series/{series_id}/cleanup

        Deletes every file in the series folder EXCEPT:
        - videos that are registered in the library
        - the .metadata directory and its contents
        - .nfo files
        """
        import os
        series_id = int(request.match_info["series_id"])
        pool = await library_service._get_pool()
        if not pool:
            return web.json_response(
                {"error": "Keine DB-Verbindung"}, status=500
            )
        try:
            async with pool.acquire() as conn:
                async with conn.cursor() as cur:
                    # Series folder
                    await cur.execute(
                        "SELECT folder_path FROM library_series WHERE id = %s",
                        (series_id,)
                    )
                    row = await cur.fetchone()
                    if not row:
                        return web.json_response(
                            {"error": "Serie nicht gefunden"}, status=404
                        )
                    series_folder = row[0]
                    # All videos of the series (these are kept)
                    await cur.execute(
                        "SELECT file_path FROM library_videos WHERE series_id = %s",
                        (series_id,)
                    )
                    keep_files = {r[0] for r in await cur.fetchall()}
        except Exception as e:
            return web.json_response({"error": str(e)}, status=500)

        if not series_folder or not os.path.isdir(series_folder):
            return web.json_response(
                {"error": "Serien-Ordner nicht gefunden"}, status=404
            )

        # Protected paths/files
        protected_dirs = {".metadata", "@eaDir", ".AppleDouble"}
        protected_extensions = {".nfo", ".jpg", ".jpeg", ".png", ".xml"}
        deleted = 0
        errors = []
        for root, dirs, files in os.walk(series_folder, topdown=True):
            # Skip protected directories
            dirs[:] = [d for d in dirs if d not in protected_dirs]
            for f in files:
                file_path = os.path.join(root, f)
                ext = os.path.splitext(f)[1].lower()
                # Keep when:
                # - registered in the library
                # - protected extension
                # - hidden file
                if file_path in keep_files:
                    continue
                if ext in protected_extensions:
                    continue
                if f.startswith("."):
                    continue
                # Delete
                try:
                    os.remove(file_path)
                    deleted += 1
                    logging.info(f"Cleanup geloescht: {file_path}")
                except Exception as e:
                    errors.append(f"{f}: {e}")

        return web.json_response({
            "deleted": deleted,
            "errors": len(errors),
            "error_details": errors[:10],  # show at most 10 errors
        })
    async def post_delete_folder(request: web.Request) -> web.Response:
        """POST /api/library/delete-folder

        Deletes an entire folder (season folder etc.) including its DB entries.
        Body: {folder_path: "/mnt/.../Season 01"}
        WARNING: irreversible!
        """
        import os
        import shutil
        try:
            data = await request.json()
        except Exception:
            return web.json_response(
                {"error": "Ungueltiges JSON"}, status=400
            )
        folder_path = data.get("folder_path", "").strip()
        if not folder_path:
            return web.json_response(
                {"error": "folder_path erforderlich"}, status=400
            )

        # Safety check: must lie under a library path
        pool = await library_service._get_pool()
        if not pool:
            return web.json_response(
                {"error": "Keine DB-Verbindung"}, status=500
            )
        allowed = False
        try:
            async with pool.acquire() as conn:
                async with conn.cursor() as cur:
                    await cur.execute(
                        "SELECT path FROM library_paths WHERE enabled = 1"
                    )
                    paths = await cur.fetchall()
                    for (lib_path,) in paths:
                        if folder_path.startswith(lib_path):
                            allowed = True
                            break
        except Exception as e:
            return web.json_response({"error": str(e)}, status=500)

        if not allowed:
            return web.json_response(
                {"error": "Ordner liegt nicht in einem Bibliothekspfad"},
                status=403
            )
        if not os.path.isdir(folder_path):
            return web.json_response(
                {"error": "Ordner nicht gefunden"}, status=404
            )

        # Count what is going to be deleted
        deleted_files = 0
        deleted_dirs = 0
        errors = []
        for root, dirs, files in os.walk(folder_path):
            deleted_files += len(files)
            deleted_dirs += len(dirs)

        # Delete the DB entries (videos inside this folder)
        db_removed = 0
        try:
            async with pool.acquire() as conn:
                async with conn.cursor() as cur:
                    # Delete videos whose file_path starts with folder_path
                    await cur.execute(
                        "DELETE FROM library_videos "
                        "WHERE file_path LIKE %s",
                        (folder_path + "%",)
                    )
                    db_removed = cur.rowcount
        except Exception as e:
            errors.append(f"DB-Fehler: {e}")

        # Delete the folder
        try:
            shutil.rmtree(folder_path)
            logging.info(f"Ordner geloescht: {folder_path}")
        except Exception as e:
            logging.error(f"Ordner loeschen fehlgeschlagen: {e}")
            return web.json_response(
                {"error": f"Loeschen fehlgeschlagen: {e}"}, status=500
            )

        return web.json_response({
            "deleted_files": deleted_files,
            "deleted_dirs": deleted_dirs,
            "db_removed": db_removed,
            "errors": errors,
        })
    async def get_series_convert_status(request: web.Request) -> web.Response:
        """GET /api/library/series/{series_id}/convert-status

        Returns the codec status of all episodes of a series."""
        series_id = int(request.match_info["series_id"])
        pool = await library_service._get_pool()
        if not pool:
            return web.json_response(
                {"error": "Keine DB-Verbindung"}, status=500
            )
        try:
            async with pool.acquire() as conn:
                async with conn.cursor() as cur:
                    await cur.execute(
                        "SELECT id, file_name, video_codec, season_number, "
                        "episode_number FROM library_videos "
                        "WHERE series_id = %s ORDER BY season_number, episode_number",
                        (series_id,)
                    )
                    videos = await cur.fetchall()
        except Exception as e:
            return web.json_response({"error": str(e)}, status=500)

        # Codec statistics
        codec_counts = {}
        episodes = []
        for vid_id, name, codec, season, episode in videos:
            codec_lower = (codec or "unknown").lower()
            codec_counts[codec_lower] = codec_counts.get(codec_lower, 0) + 1
            episodes.append({
                "id": vid_id,
                "name": name,
                "codec": codec,
                "season": season,
                "episode": episode,
            })

        return web.json_response({
            "total": len(videos),
            "codec_counts": codec_counts,
            "episodes": episodes,
        })
    # === Statistics ===
    async def get_library_stats(request: web.Request) -> web.Response:
@@ -823,6 +1147,15 @@ def setup_library_routes(app: web.Application, config: Config,
        result = await importer_service.analyze_job(job_id)
        return web.json_response(result)

    async def get_import_jobs(request: web.Request) -> web.Response:
        """GET /api/library/import - list of all import jobs"""
        if not importer_service:
            return web.json_response(
                {"error": "Import-Service nicht verfuegbar"}, status=500
            )
        jobs = await importer_service.get_all_jobs()
        return web.json_response({"jobs": jobs})

    async def get_import_status(request: web.Request) -> web.Response:
        """GET /api/library/import/{job_id}"""
        if not importer_service:
@@ -952,6 +1285,18 @@ def setup_library_routes(app: web.Application, config: Config,
    app.router.add_post(
        "/api/library/videos/{video_id}/convert", post_convert_video
    )
    app.router.add_post(
        "/api/library/series/{series_id}/convert", post_convert_series
    )
    app.router.add_get(
        "/api/library/series/{series_id}/convert-status",
        get_series_convert_status
    )
    app.router.add_post(
        "/api/library/series/{series_id}/cleanup",
        post_cleanup_series_folder
    )
    app.router.add_post("/api/library/delete-folder", post_delete_folder)
    # Statistics
    app.router.add_get("/api/library/stats", get_library_stats)
    # Clean
@@ -963,6 +1308,7 @@ def setup_library_routes(app: web.Application, config: Config,
    # Filesystem browser
    app.router.add_get("/api/library/browse-fs", get_browse_fs)
    # Import
    app.router.add_get("/api/library/import", get_import_jobs)
    app.router.add_post("/api/library/import", post_create_import)
    app.router.add_post(
        "/api/library/import/{job_id}/analyze", post_analyze_import
    )


@@ -503,7 +503,7 @@ class ImporterService:
        return ""

    async def execute_import(self, job_id: int) -> dict:
        """Executes the import (copy/move + TVDB link)"""
        if not self._db_pool:
            return {"error": "Keine DB-Verbindung"}
@@ -537,10 +537,16 @@ class ImporterService:
            errors = 0
            mode = job.get("mode", "copy")

            # Collect TVDB IDs for linking afterwards
            tvdb_links = {}  # series_name -> tvdb_id

            for item in items:
                ok = await self._process_item(item, mode, job_id)
                if ok:
                    done += 1
                    # Remember the TVDB link
                    if item.get("tvdb_series_id") and item.get("tvdb_series_name"):
                        tvdb_links[item["tvdb_series_name"]] = item["tvdb_series_id"]
                else:
                    errors += 1
@@ -561,14 +567,62 @@ class ImporterService:
                "WHERE id = %s", (status, job_id)
            )

            # Apply the TVDB matches to library_series
            linked_series = 0
            if tvdb_links:
                linked_series = await self._link_tvdb_to_series(tvdb_links)

            return {
                "done": done,
                "errors": errors,
                "tvdb_linked": linked_series,
            }
        except Exception as e:
            logging.error(f"Import ausfuehren fehlgeschlagen: {e}")
            return {"error": str(e)}
    async def _link_tvdb_to_series(self, tvdb_links: dict) -> int:
        """Links imported series to TVDB in library_series"""
        if not self._db_pool or not self.tvdb:
            return 0
        linked = 0
        for series_name, tvdb_id in tvdb_links.items():
            try:
                async with self._db_pool.acquire() as conn:
                    async with conn.cursor() as cur:
                        # Find the series in library_series (by name)
                        await cur.execute(
                            "SELECT id, tvdb_id FROM library_series "
                            "WHERE (folder_name = %s OR title = %s) "
                            "AND tvdb_id IS NULL "
                            "LIMIT 1",
                            (series_name, series_name)
                        )
                        row = await cur.fetchone()
                        if row:
                            series_id = row[0]
                            # Load the TVDB data and link the series
                            result = await self.tvdb.match_and_update_series(
                                series_id, int(tvdb_id), self.library
                            )
                            if not result.get("error"):
                                linked += 1
                                logging.info(
                                    f"Import: TVDB verknuepft - "
                                    f"{series_name} -> {tvdb_id}"
                                )
            except Exception as e:
                logging.warning(
                    f"TVDB-Link fehlgeschlagen fuer {series_name}: {e}"
                )
        return linked

    async def _process_item(self, item: dict, mode: str,
                            job_id: int = 0) -> bool:
        """Imports a single item (copy/move + metadata)"""
        src = item["source_file"]
        target_dir = item["target_path"]
        target_file = item["target_filename"]
@@ -578,19 +632,39 @@ class ImporterService:
            return False

        target = os.path.join(target_dir, target_file)
        src_size = item.get("source_size", 0) or os.path.getsize(src)
        try:
            # Create the target folder
            os.makedirs(target_dir, exist_ok=True)

            # Initialize the progress tracking in the DB
            if job_id and self._db_pool:
                await self._update_file_progress(
                    job_id, target_file, 0, src_size
                )

            if mode == "move":
                shutil.move(src, target)
                # A move finishes immediately
                if job_id and self._db_pool:
                    await self._update_file_progress(
                        job_id, target_file, src_size, src_size
                    )
            else:
                # Copy with progress updates
                await self._copy_with_progress(
                    src, target, job_id, target_file, src_size
                )

            logging.info(
                f"Import: {os.path.basename(src)} -> {target}"
            )

            # Embed metadata into the file (if TVDB info is available)
            if item.get("tvdb_series_name") or item.get("detected_series"):
                await self._embed_metadata(target, item)

            await self._update_item_status(item["id"], "done")
            return True
@@ -599,6 +673,79 @@ class ImporterService:
            await self._update_item_status(item["id"], "error")
            return False

    async def _embed_metadata(self, file_path: str, item: dict) -> bool:
        """Embeds metadata into the file with ffmpeg"""
        import asyncio

        series_name = item.get("tvdb_series_name") or item.get("detected_series") or ""
        season = item.get("detected_season") or 0
        episode = item.get("detected_episode") or 0
        episode_title = item.get("tvdb_episode_title") or ""
        if not series_name:
            return False

        # Temporary output file
        base, ext = os.path.splitext(file_path)
        temp_file = f"{base}_temp{ext}"

        # ffmpeg metadata command; fall back to SxxExx when no episode title is known
        title = episode_title or f"S{season:02d}E{episode:02d}"
        cmd = [
            "ffmpeg", "-y", "-i", file_path,
            "-map", "0",
            "-c", "copy",
            "-metadata", f"title={title}",
            "-metadata", f"show={series_name}",
            "-metadata", f"season_number={season}",
            "-metadata", f"episode_sort={episode}",
            "-metadata", f"episode_id=S{season:02d}E{episode:02d}",
        ]
        # Extra tags for MKV
        if file_path.lower().endswith(".mkv"):
            cmd.extend([
                "-metadata:s:v:0", f"title={series_name} - S{season:02d}E{episode:02d}",
            ])
        cmd.append(temp_file)
        try:
            process = await asyncio.create_subprocess_exec(
                *cmd,
                stdout=asyncio.subprocess.PIPE,
                stderr=asyncio.subprocess.PIPE,
            )
            _, stderr = await asyncio.wait_for(
                process.communicate(), timeout=600  # 10 min for large files
            )
            if process.returncode == 0:
                # Move the temporary file over the original
                os.replace(temp_file, file_path)
                logging.info(f"Metadaten eingebettet: {os.path.basename(file_path)}")
                return True
            else:
                logging.warning(
                    f"Metadaten einbetten fehlgeschlagen: "
                    f"{stderr.decode()[:200]}"
                )
                # Remove the temp file if present
                if os.path.exists(temp_file):
                    os.remove(temp_file)
                return False
        except asyncio.TimeoutError:
            logging.warning(f"Metadaten einbetten Timeout: {file_path}")
            if os.path.exists(temp_file):
                os.remove(temp_file)
            return False
        except Exception as e:
            logging.warning(f"Metadaten einbetten Fehler: {e}")
            if os.path.exists(temp_file):
                os.remove(temp_file)
            return False
    async def _update_item_status(self, item_id: int,
                                  status: str) -> None:
        if not self._db_pool:
@@ -613,6 +760,79 @@ class ImporterService:
        except Exception:
            pass

    async def _update_file_progress(self, job_id: int, filename: str,
                                    bytes_done: int, bytes_total: int) -> None:
        """Updates the byte progress for the current file"""
        if not self._db_pool:
            return
        try:
            async with self._db_pool.acquire() as conn:
                async with conn.cursor() as cur:
                    await cur.execute(
                        "UPDATE import_jobs SET "
                        "current_file_name = %s, "
                        "current_file_bytes = %s, "
                        "current_file_total = %s "
                        "WHERE id = %s",
                        (filename, bytes_done, bytes_total, job_id)
                    )
        except Exception:
            pass

    async def _copy_with_progress(self, src: str, dst: str,
                                  job_id: int, filename: str,
                                  total_size: int) -> None:
        """Copies a file while writing progress updates to the DB"""
        import asyncio

        chunk_size = 64 * 1024 * 1024  # 64 MB chunks
        bytes_copied = 0
        last_update = 0
        loop = asyncio.get_event_loop()

        # Copy chunk by chunk; reads/writes run in the default executor
        # so the event loop stays responsive for progress updates.
        with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
            while True:
                chunk = await loop.run_in_executor(
                    None, fsrc.read, chunk_size
                )
                if not chunk:
                    break
                await loop.run_in_executor(None, fdst.write, chunk)
                bytes_copied += len(chunk)
                # Update progress only every 50 MB (less DB load)
                if bytes_copied - last_update >= 50 * 1024 * 1024:
                    await self._update_file_progress(
                        job_id, filename, bytes_copied, total_size
                    )
                    last_update = bytes_copied

        # Final update
        await self._update_file_progress(
            job_id, filename, total_size, total_size
        )
        # Copy file metadata (timestamps etc.)
        shutil.copystat(src, dst)
    async def resolve_conflict(self, item_id: int,
                               action: str) -> bool:
        """Resolves a conflict: overwrite, skip, rename"""
@@ -685,6 +905,24 @@ class ImporterService:
            logging.error(f"Import-Item aktualisieren fehlgeschlagen: {e}")
            return False

    async def get_all_jobs(self) -> list:
        """List of all import jobs (newest first)"""
        if not self._db_pool:
            return []
        try:
            async with self._db_pool.acquire() as conn:
                async with conn.cursor(aiomysql.DictCursor) as cur:
                    await cur.execute(
                        "SELECT id, source_path, status, total_files, "
                        "processed_files, created_at FROM import_jobs "
                        "ORDER BY id DESC LIMIT 20"
                    )
                    jobs = await cur.fetchall()
                    return [self._serialize(j) for j in jobs]
        except Exception as e:
            logging.error(f"Import-Jobs laden fehlgeschlagen: {e}")
            return []

    async def get_job_status(self, job_id: int) -> dict:
        """Status of an import job including all of its items"""
        if not self._db_pool:
@@ -706,9 +944,19 @@ class ImporterService:
                    )
                    items = await cur.fetchall()

                    # For finished jobs: collect the imported series folders
                    imported_series = []
                    if job.get("status") in ("done", "error"):
                        series_folders = set()
                        for item in items:
                            if item.get("status") == "done" and item.get("target_path"):
                                series_folders.add(item["target_path"])
                        imported_series = list(series_folders)

                    return {
                        "job": self._serialize(job),
                        "items": [self._serialize(i) for i in items],
                        "imported_series": imported_series,
                    }
        except Exception as e:
            return {"error": str(e)}


@@ -66,7 +66,8 @@ class QueueService:
logging.info("Queue gestoppt")
async def add_job(self, media: MediaFile,
preset_name: Optional[str] = None,
delete_source: bool = False) -> Optional[ConversionJob]:
"""Fuegt neuen Job zur Queue hinzu"""
if self._is_duplicate(media.source_path):
logging.info(f"Duplikat uebersprungen: {media.source_filename}")
@@ -76,6 +77,7 @@ class QueueService:
preset_name = self.config.default_preset_name
job = ConversionJob(media=media, preset_name=preset_name)
job.delete_source = delete_source
job.build_target_path(self.config)
self.jobs[job.id] = job
self._save_queue()
@@ -83,6 +85,7 @@ class QueueService:
logging.info(
f"Job hinzugefuegt: {media.source_filename} "
f"-> {job.target_filename} (Preset: {preset_name})"
f"{' [delete_source]' if delete_source else ''}"
)
await self.ws_manager.broadcast_queue_update()
@@ -90,7 +93,8 @@ class QueueService:
async def add_paths(self, paths: list[str],
preset_name: Optional[str] = None,
recursive: Optional[bool] = None,
delete_source: bool = False) -> list[ConversionJob]:
"""Fuegt mehrere Pfade hinzu (Dateien und Ordner)"""
jobs = []
all_files = []
@@ -107,7 +111,7 @@ class QueueService:
for file_path in all_files:
media = await ProbeService.analyze(file_path)
if media:
job = await self.add_job(media, preset_name, delete_source)
if job:
jobs.append(job)
@@ -281,7 +285,10 @@ class QueueService:
"""Cleanup nach erfolgreicher Konvertierung"""
files_cfg = self.config.files_config
# Quelldatei loeschen: Global per Config ODER per Job-Option
should_delete = files_cfg.get("delete_source", False) or job.delete_source
if should_delete:
target_exists = os.path.exists(job.target_path)
target_size = os.path.getsize(job.target_path) if target_exists else 0
if target_exists and target_size > 0:
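The `_post_conversion_cleanup` change above ORs the global config flag with the new per-job option. A minimal sketch of that precedence (function name hypothetical, extracted for illustration):

```python
def should_delete_source(files_cfg: dict, job_delete_source: bool) -> bool:
    """Quelle loeschen, wenn global konfiguriert ODER am Job gesetzt."""
    return bool(files_cfg.get("delete_source", False)) or job_delete_source

# Die Job-Option kann Loeschen erzwingen, aber nie verhindern:
assert should_delete_source({}, False) is False
assert should_delete_source({"delete_source": True}, False) is True
assert should_delete_source({}, True) is True
```

Note that the per-job flag can only enable deletion, never suppress a globally configured one.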
@@ -179,23 +179,51 @@ class TVDBService:
return name, overview
async def search_series(self, query: str,
language: Optional[str] = None) -> list[dict]:
"""Sucht Serien auf TVDB.
Args:
query: Suchbegriff
language: Sprache fuer Ergebnisse (z.B. 'deu', 'eng').
None = konfigurierte Sprache verwenden.
"""
client = self._get_client()
if not client:
return []
# Sprache fuer Lokalisierung
display_lang = language or self._language
try:
results = client.search(query, type="series")
if not results:
return []
series_list = []
for item in results[:20]:  # 20 statt 10 Ergebnisse
# Lokalisierung mit gewaehlter Sprache
name = item.get("name", "")
overview = item.get("overview", "")
trans = item.get("translations") or {}
if isinstance(trans, dict):
# Gewaehlte Sprache oder Original
name = trans.get(display_lang) or name
overviews = item.get("overviews") or {}
if isinstance(overviews, dict):
overview = (overviews.get(display_lang)
or overviews.get("eng")
or overview)
# Original-Name fuer Anzeige wenn anders
original_name = item.get("name", "")
series_list.append({
"tvdb_id": item.get("tvdb_id") or item.get("objectID"),
"name": name,
"original_name": original_name if original_name != name else "",
"overview": overview,
"first_air_date": item.get("first_air_time")
or item.get("firstAirDate", ""),
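The search-result localization above picks the requested language from the item's translation maps and falls back to English, then to the original. A self-contained sketch of that fallback chain (standalone helper, assumed field names taken from the diff above):

```python
def localize(item: dict, display_lang: str) -> tuple[str, str]:
    """Name/Overview in gewuenschter Sprache, mit Fallback auf eng/Original."""
    name = item.get("name", "")
    overview = item.get("overview", "")
    trans = item.get("translations") or {}
    if isinstance(trans, dict):
        # Gewaehlte Sprache, sonst Originalname behalten
        name = trans.get(display_lang) or name
    overviews = item.get("overviews") or {}
    if isinstance(overviews, dict):
        overview = overviews.get(display_lang) or overviews.get("eng") or overview
    return name, overview

item = {"name": "The Expanse",
        "translations": {"deu": "The Expanse"},
        "overviews": {"eng": "Sci-fi series."}}
# Keine deutsche Overview vorhanden -> englischer Fallback greift
assert localize(item, "deu") == ("The Expanse", "Sci-fi series.")
```

The `or`-chain treats empty strings like missing translations, which is the desired behavior for sparse TVDB records.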
@@ -369,14 +369,22 @@ legend {
}
.toast {
padding: 0.7rem 1.2rem;
border-radius: 8px;
font-size: 0.85rem;
margin-bottom: 0.5rem;
opacity: 0;
transform: translateX(20px);
transition: opacity 0.3s ease, transform 0.3s ease;
box-shadow: 0 4px 12px rgba(0,0,0,0.3);
}
.toast.show {
opacity: 1;
transform: translateX(0);
}
.toast-success { background: #1b5e20; color: #a5d6a7; border-left: 3px solid #4caf50; }
.toast-error { background: #b71c1c; color: #ef9a9a; border-left: 3px solid #f44336; }
.toast-info { background: #1565c0; color: #90caf9; border-left: 3px solid #2196f3; }
@keyframes fadeIn { from { opacity: 0; transform: translateY(-10px); } to { opacity: 1; } }
@keyframes fadeOut { from { opacity: 1; } to { opacity: 0; } }
@@ -1107,6 +1115,40 @@ legend {
font-size: 0.75rem;
color: #888;
}
.folder-main {
display: flex;
align-items: center;
gap: 0.6rem;
flex: 1;
min-width: 0;
cursor: pointer;
}
.btn-folder-delete {
position: absolute;
top: 0.4rem;
right: 0.4rem;
background: rgba(0,0,0,0.5);
border: none;
color: #888;
padding: 0.35rem;
border-radius: 4px;
cursor: pointer;
opacity: 0;
transition: opacity 0.15s, color 0.15s, background 0.15s;
display: flex;
align-items: center;
justify-content: center;
}
.browser-folder {
position: relative;
}
.browser-folder:hover .btn-folder-delete {
opacity: 1;
}
.btn-folder-delete:hover {
color: #e74c3c;
background: rgba(231, 76, 60, 0.2);
}
.browser-videos {
margin-top: 0.5rem;
}
@@ -1534,6 +1576,142 @@ legend {
margin-top: 0.3rem;
}
/* === Codec-Stats (Konvertierung) === */
.codec-stats {
display: flex;
flex-wrap: wrap;
gap: 0.3rem;
margin: 0.5rem 0;
}
.codec-stats .tag {
font-size: 0.75rem;
padding: 0.2rem 0.5rem;
}
/* === Benachrichtigungs-Glocke === */
.notification-bell {
position: fixed;
bottom: 20px;
left: 20px;
width: 48px;
height: 48px;
background: #2a2a2a;
border: 1px solid #444;
border-radius: 50%;
display: flex;
align-items: center;
justify-content: center;
cursor: pointer;
color: #888;
transition: all 0.2s ease;
z-index: 1000;
box-shadow: 0 2px 8px rgba(0,0,0,0.3);
}
.notification-bell:hover {
background: #333;
color: #fff;
transform: scale(1.05);
}
.notification-bell.has-error {
color: #ff6b6b;
animation: bell-shake 0.5s ease;
}
@keyframes bell-shake {
0%, 100% { transform: rotate(0); }
25% { transform: rotate(-10deg); }
75% { transform: rotate(10deg); }
}
.notification-badge {
position: absolute;
top: -4px;
right: -4px;
background: #e74c3c;
color: #fff;
font-size: 0.65rem;
font-weight: bold;
min-width: 18px;
height: 18px;
border-radius: 9px;
display: flex;
align-items: center;
justify-content: center;
padding: 0 4px;
}
/* === Log-Panel === */
.notification-panel {
position: fixed;
bottom: 80px;
left: 20px;
width: 400px;
max-height: 50vh;
background: #1e1e1e;
border: 1px solid #444;
border-radius: 8px;
display: flex;
flex-direction: column;
z-index: 1001;
box-shadow: 0 4px 20px rgba(0,0,0,0.4);
}
.notification-header {
display: flex;
justify-content: space-between;
align-items: center;
padding: 0.6rem 0.8rem;
border-bottom: 1px solid #333;
font-weight: 500;
color: #ddd;
}
.notification-header > div {
display: flex;
gap: 0.3rem;
align-items: center;
}
.notification-list {
flex: 1;
overflow-y: auto;
max-height: 45vh;
}
.notification-item {
padding: 0.5rem 0.8rem;
border-bottom: 1px solid #2a2a2a;
font-size: 0.8rem;
display: flex;
gap: 0.5rem;
align-items: flex-start;
}
.notification-item:hover {
background: #252525;
}
.notification-item.error {
background: rgba(231, 76, 60, 0.1);
border-left: 3px solid #e74c3c;
}
.notification-item.warning {
background: rgba(241, 196, 15, 0.1);
border-left: 3px solid #f1c40f;
}
.notification-time {
color: #666;
font-size: 0.7rem;
white-space: nowrap;
min-width: 55px;
}
.notification-msg {
color: #ccc;
word-break: break-word;
flex: 1;
}
.notification-item.error .notification-msg {
color: #ff8a8a;
}
.notification-empty {
padding: 2rem;
text-align: center;
color: #666;
font-size: 0.85rem;
}
/* === Responsive === */
@media (max-width: 768px) {
header { flex-direction: column; gap: 0.5rem; }
@@ -412,12 +412,23 @@ function renderBrowser(folders, videos, pathId) {
html += '<div class="browser-folders">';
for (const f of folders) {
const size = formatSize(f.total_size || 0);
const pathEsc = f.path.replace(/'/g, "\\'");
html += `<div class="browser-folder">
<div class="folder-main" onclick="loadSectionBrowser(${pathId}, '${pathEsc}')">
<span class="folder-icon">&#128193;</span>
<div class="folder-info">
<span class="folder-name">${escapeHtml(f.name)}</span>
<span class="folder-meta">${f.video_count} Videos, ${size}</span>
</div>
</div>
<button class="btn-folder-delete" onclick="event.stopPropagation(); showDeleteFolderDialog('${pathEsc}', ${pathId}, ${f.video_count})" title="Ordner loeschen">
<svg width="16" height="16" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<polyline points="3 6 5 6 21 6"/>
<path d="M19 6v14a2 2 0 0 1-2 2H7a2 2 0 0 1-2-2V6m3 0V4a2 2 0 0 1 2-2h4a2 2 0 0 1 2 2v2"/>
<line x1="10" y1="11" x2="10" y2="17"/>
<line x1="14" y1="11" x2="14" y2="17"/>
</svg>
</button>
</div>`;
}
html += '</div>';
@@ -704,6 +715,77 @@ function deleteSeries(withFiles) {
.catch(e => alert("Fehler: " + e));
}
// === Bestaetigungs-Dialog ===
let pendingConfirmAction = null;
function showDeleteFolderDialog(folderPath, pathId, videoCount) {
const folderName = folderPath.split('/').pop();
document.getElementById("confirm-title").textContent = "Ordner loeschen";
document.getElementById("confirm-icon").innerHTML = `
<svg width="48" height="48" viewBox="0 0 24 24" fill="none" stroke="#e74c3c" stroke-width="1.5">
<polyline points="3 6 5 6 21 6"/>
<path d="M19 6v14a2 2 0 0 1-2 2H7a2 2 0 0 1-2-2V6m3 0V4a2 2 0 0 1 2-2h4a2 2 0 0 1 2 2v2"/>
<line x1="10" y1="11" x2="10" y2="17"/>
<line x1="14" y1="11" x2="14" y2="17"/>
</svg>`;
document.getElementById("confirm-message").innerHTML = `
<strong>${escapeHtml(folderName)}</strong><br>
wirklich loeschen?`;
document.getElementById("confirm-detail").innerHTML = `
${videoCount} Video${videoCount !== 1 ? 's' : ''} werden unwiderruflich geloescht.<br>
<span style="color:#e74c3c">Dieser Vorgang kann nicht rueckgaengig gemacht werden!</span>`;
document.getElementById("confirm-btn-ok").textContent = "Endgueltig loeschen";
document.getElementById("confirm-modal").style.display = "flex";
pendingConfirmAction = () => executeDeleteFolder(folderPath, pathId);
}
function closeConfirmModal() {
document.getElementById("confirm-modal").style.display = "none";
pendingConfirmAction = null;
}
function confirmAction() {
if (pendingConfirmAction) {
pendingConfirmAction();
}
closeConfirmModal();
}
function executeDeleteFolder(folderPath, pathId) {
fetch("/api/library/delete-folder", {
method: "POST",
headers: {"Content-Type": "application/json"},
body: JSON.stringify({folder_path: folderPath})
})
.then(r => r.json())
.then(data => {
if (data.error) {
showToast("Fehler: " + data.error, "error");
return;
}
const msg = `${data.deleted_files || 0} Dateien geloescht`;
showToast(msg, "success");
if (pathId) loadSectionData(pathId);
loadStats();
})
.catch(e => showToast("Fehler: " + e, "error"));
}
function showToast(message, type = "info") {
const container = document.getElementById("toast-container");
if (!container) return;
const toast = document.createElement("div");
toast.className = `toast toast-${type}`;
toast.textContent = message;
container.appendChild(toast);
setTimeout(() => toast.classList.add("show"), 10);
setTimeout(() => {
toast.classList.remove("show");
setTimeout(() => toast.remove(), 300);
}, 4000);
}
// === Film-Detail ===
function openMovieDetail(movieId) {
@@ -1217,6 +1299,9 @@ function openTvdbModal(seriesId, folderName) {
document.getElementById("tvdb-series-id").value = seriesId;
document.getElementById("tvdb-search-input").value = cleanSearchTitle(folderName);
document.getElementById("tvdb-results").innerHTML = "";
// Checkbox zuruecksetzen
const engCheckbox = document.getElementById("tvdb-search-english");
if (engCheckbox) engCheckbox.checked = false;
searchTvdb();
}
@@ -1231,21 +1316,30 @@ function searchTvdb() {
const results = document.getElementById("tvdb-results");
results.innerHTML = '<div class="loading-msg">Suche...</div>';
// Sprache: eng wenn Checkbox aktiv, sonst Standard (deu)
const useEnglish = document.getElementById("tvdb-search-english")?.checked;
const langParam = useEnglish ? "&lang=eng" : "";
fetch(`/api/tvdb/search?q=${encodeURIComponent(query)}${langParam}`)
.then(r => r.json())
.then(data => {
if (data.error) { results.innerHTML = `<div class="loading-msg">${escapeHtml(data.error)}</div>`; return; }
if (!data.results || !data.results.length) { results.innerHTML = '<div class="loading-msg">Keine Ergebnisse</div>'; return; }
results.innerHTML = data.results.map(r => {
// Zeige Original-Namen wenn vorhanden und unterschiedlich
const origName = r.original_name && r.original_name !== r.name
? `<span class="text-muted" style="font-size:0.85em">(${escapeHtml(r.original_name)})</span>`
: "";
return `
<div class="tvdb-result" onclick="matchTvdb(${r.tvdb_id})">
${r.poster ? `<img src="${r.poster}" alt="" class="tvdb-thumb">` : ""}
<div>
<strong>${escapeHtml(r.name)}</strong> ${origName}
<span class="text-muted">${r.year || ""}</span>
<p class="tvdb-overview">${escapeHtml((r.overview || "").substring(0, 150))}</p>
</div>
</div>
`}).join("");
})
.catch(e => { results.innerHTML = `<div class="loading-msg">Fehler: ${e}</div>`; });
}
@@ -1293,6 +1387,123 @@ function convertVideo(videoId) {
.catch(e => alert("Fehler: " + e));
}
// === Serie komplett konvertieren ===
function openConvertSeriesModal() {
if (!currentSeriesId) return;
document.getElementById("convert-series-modal").style.display = "flex";
document.getElementById("convert-series-status").innerHTML =
'<div class="loading-msg">Lade Codec-Status...</div>';
// Codec-Status laden
fetch(`/api/library/series/${currentSeriesId}/convert-status`)
.then(r => r.json())
.then(data => {
if (data.error) {
document.getElementById("convert-series-status").innerHTML =
`<div class="loading-msg">${escapeHtml(data.error)}</div>`;
return;
}
let html = `<div style="margin-bottom:0.5rem"><strong>${data.total} Episoden</strong></div>`;
html += '<div class="codec-stats">';
for (const [codec, count] of Object.entries(data.codec_counts || {})) {
const isTarget = codec.includes("av1") || codec.includes("hevc");
const cls = isTarget ? "tag ok" : "tag";
html += `<span class="${cls}">${codec}: ${count}</span> `;
}
html += '</div>';
document.getElementById("convert-series-status").innerHTML = html;
})
.catch(e => {
document.getElementById("convert-series-status").innerHTML =
`<div class="loading-msg">Fehler: ${e}</div>`;
});
}
function closeConvertSeriesModal() {
document.getElementById("convert-series-modal").style.display = "none";
}
function executeConvertSeries() {
if (!currentSeriesId) return;
const targetCodec = document.getElementById("convert-target-codec").value;
const forceAll = document.getElementById("convert-force-all").checked;
const deleteOld = document.getElementById("convert-delete-old").checked;
const btn = document.querySelector("#convert-series-modal .btn-primary");
btn.textContent = "Wird gestartet...";
btn.disabled = true;
fetch(`/api/library/series/${currentSeriesId}/convert`, {
method: "POST",
headers: {"Content-Type": "application/json"},
body: JSON.stringify({
target_codec: targetCodec,
force_all: forceAll,
delete_old: deleteOld,
}),
})
.then(r => r.json())
.then(data => {
btn.textContent = "Konvertierung starten";
btn.disabled = false;
if (data.error) {
alert("Fehler: " + data.error);
return;
}
let msg = data.message || "Konvertierung gestartet";
if (data.already_done > 0) {
msg += `\n${data.already_done} Episoden sind bereits im Zielformat.`;
}
alert(msg);
closeConvertSeriesModal();
})
.catch(e => {
btn.textContent = "Konvertierung starten";
btn.disabled = false;
alert("Fehler: " + e);
});
}
// === Serien-Ordner aufraeumen ===
function cleanupSeriesFolder() {
if (!currentSeriesId) return;
if (!confirm("Alle Dateien im Serien-Ordner loeschen, die NICHT in der Bibliothek sind?\n\n" +
"Behalten werden:\n" +
"- Alle Videos in der Bibliothek\n" +
"- .metadata Ordner\n" +
"- .nfo, .jpg, .png Dateien\n\n" +
"ACHTUNG: Dies kann nicht rueckgaengig gemacht werden!")) {
return;
}
fetch(`/api/library/series/${currentSeriesId}/cleanup`, {
method: "POST",
headers: {"Content-Type": "application/json"},
body: JSON.stringify({}),
})
.then(r => r.json())
.then(data => {
if (data.error) {
alert("Fehler: " + data.error);
return;
}
let msg = `${data.deleted} Dateien geloescht.`;
if (data.errors > 0) {
msg += `\n${data.errors} Fehler.`;
}
alert(msg);
})
.catch(e => alert("Fehler: " + e));
}
// === Duplikate ===
function showDuplicates() {
@@ -1556,10 +1767,78 @@ function openImportModal() {
})
.catch(() => {});
// Bestehende Import-Jobs laden
loadExistingImportJobs();
// Standard-Pfad im Filebrowser oeffnen
importBrowse("/mnt");
}
function loadExistingImportJobs() {
fetch("/api/library/import")
.then(r => r.json())
.then(data => {
const jobs = (data.jobs || []).filter(j => j.status !== 'done');
const container = document.getElementById("import-existing");
const list = document.getElementById("import-jobs-list");
if (!jobs.length) {
container.style.display = "none";
return;
}
container.style.display = "";
list.innerHTML = jobs.map(j => {
const statusClass = j.status === 'ready' ? 'tag-success' :
j.status === 'error' ? 'tag-error' :
j.status === 'importing' ? 'tag-warning' : '';
const statusText = j.status === 'ready' ? 'Bereit' :
j.status === 'analyzing' ? 'Analyse...' :
j.status === 'importing' ? 'Laeuft' :
j.status === 'error' ? 'Fehler' : j.status;
const sourceName = j.source_path.split('/').pop();
return `<button class="btn-small ${j.status === 'ready' ? 'btn-primary' : 'btn-secondary'}"
onclick="loadImportJob(${j.id})"
title="${escapeHtml(j.source_path)}">
${escapeHtml(sourceName)} (${j.processed_files}/${j.total_files})
<span class="tag ${statusClass}" style="margin-left:0.3rem;font-size:0.7rem">${statusText}</span>
</button>`;
}).join("");
})
.catch(() => {
document.getElementById("import-existing").style.display = "none";
});
}
function loadImportJob(jobId) {
currentImportJobId = jobId;
document.getElementById("import-setup").style.display = "none";
document.getElementById("import-existing").style.display = "none";
document.getElementById("import-preview").style.display = "";
fetch(`/api/library/import/${jobId}`)
.then(r => r.json())
.then(data => {
if (data.error) {
alert("Fehler: " + data.error);
resetImport();
return;
}
renderImportItems(data);
// Falls Job bereits laeuft, Polling starten
if (data.job && data.job.status === 'importing') {
document.getElementById("import-preview").style.display = "none";
document.getElementById("import-progress").style.display = "";
startImportPolling();
}
})
.catch(e => {
alert("Fehler beim Laden: " + e);
resetImport();
});
}
function closeImportModal() {
document.getElementById("import-modal").style.display = "none";
}
@@ -1828,6 +2107,8 @@ function refreshImportPreview() {
.catch(() => {});
}
let importPollingId = null;
function executeImport() {
if (!currentImportJobId || !confirm("Import jetzt starten?")) return;
@@ -1836,22 +2117,105 @@ function executeImport() {
document.getElementById("import-status-text").textContent = "Importiere...";
document.getElementById("import-bar").style.width = "0%";
// Starte Import (non-blocking - Server antwortet sofort)
fetch(`/api/library/import/${currentImportJobId}/execute`, {method: "POST"});
// Polling fuer Fortschritt starten
startImportPolling();
}
function startImportPolling() {
if (importPollingId) clearInterval(importPollingId);
importPollingId = setInterval(async () => {
try {
const r = await fetch(`/api/library/import/${currentImportJobId}`);
const data = await r.json();
if (data.error) {
stopImportPolling();
document.getElementById("import-status-text").textContent = "Fehler: " + data.error;
return;
}
const job = data.job;
if (!job) return;
const total = job.total_files || 1;
const done = job.processed_files || 0;
// Byte-Fortschritt der aktuellen Datei
const curFile = job.current_file_name || "";
const curBytes = job.current_file_bytes || 0;
const curTotal = job.current_file_total || 0;
// Prozent: fertige Dateien + anteilig aktuelle Datei
let pct = (done / total) * 100;
if (curTotal > 0 && done < total) {
pct += (curBytes / curTotal) * (100 / total);
}
pct = Math.min(Math.round(pct), 100);
document.getElementById("import-bar").style.width = pct + "%";
// Status-Text mit Byte-Fortschritt
let statusText = `Importiere: ${done} / ${total} Dateien`;
if (curFile && curTotal > 0 && done < total) {
const curPct = Math.round((curBytes / curTotal) * 100);
statusText += ` - ${curFile.substring(0, 40)}... (${formatSize(curBytes)} / ${formatSize(curTotal)}, ${curPct}%)`;
} else {
statusText += ` (${pct}%)`;
}
document.getElementById("import-status-text").textContent = statusText;
// Fertig?
if (job.status === "done" || job.status === "error") {
stopImportPolling();
document.getElementById("import-bar").style.width = "100%";
// Zaehle Ergebnisse
const items = data.items || [];
const imported = items.filter(i => i.status === "done").length;
const errors = items.filter(i => i.status === "error").length;
const skipped = items.filter(i => i.status === "skipped").length;
document.getElementById("import-status-text").textContent =
`Fertig: ${imported} importiert, ${skipped} uebersprungen, ${errors} Fehler`;
// Nur Ziel-Pfad scannen und neu laden (statt alles)
const targetPathId = job.target_library_id;
if (targetPathId && imported > 0) {
// Gezielten Scan starten
fetch(`/api/library/scan/${targetPathId}`, {method: "POST"})
.then(() => {
// Warte kurz, dann nur diese Sektion neu laden
setTimeout(() => {
loadSectionData(targetPathId);
loadStats();
}, 2000);
})
.catch(() => {
// Fallback: Alles neu laden
reloadAllSections();
loadStats();
});
} else {
// Kein Import oder unbekannter Pfad: Alles neu laden
reloadAllSections();
loadStats();
}
}
} catch (e) {
console.error("Import-Polling Fehler:", e);
}
}, 500);
}
function stopImportPolling() {
if (importPollingId) {
clearInterval(importPollingId);
importPollingId = null;
}
}
// === Hilfsfunktionen ===
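The polling code above blends finished files with the byte progress of the file currently being copied into one percentage. The same formula, sketched as a standalone helper (function name hypothetical):

```python
def import_progress(done: int, total: int, cur_bytes: int, cur_total: int) -> int:
    """Prozent: fertige Dateien plus anteiliger Fortschritt der aktuellen Datei."""
    total = max(total, 1)  # Division durch 0 vermeiden
    pct = (done / total) * 100
    if cur_total > 0 and done < total:
        # Aktuelle Datei traegt maximal 100/total Prozentpunkte bei
        pct += (cur_bytes / cur_total) * (100 / total)
    return min(round(pct), 100)

assert import_progress(0, 4, 0, 0) == 0
assert import_progress(2, 4, 0, 0) == 50
# 2 von 4 fertig, dritte Datei zur Haelfte kopiert: 50 + 12.5 -> 62 (round half to even)
assert import_progress(2, 4, 50, 100) == 62
```

Each in-flight file can contribute at most `100 / total` points, so the bar never jumps past the next whole-file boundary.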
@@ -27,6 +27,128 @@
<div id="toast-container"></div>
<!-- Benachrichtigungs-Glocke -->
<div id="notification-bell" class="notification-bell" onclick="toggleNotificationPanel()">
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<path d="M18 8A6 6 0 0 0 6 8c0 7-3 9-3 9h18s-3-2-3-9"/>
<path d="M13.73 21a2 2 0 0 1-3.46 0"/>
</svg>
<span id="notification-badge" class="notification-badge" style="display:none">0</span>
</div>
<!-- Log-Panel -->
<div id="notification-panel" class="notification-panel" style="display:none">
<div class="notification-header">
<span>Server-Log</span>
<div>
<button class="btn-small btn-secondary" onclick="clearNotifications()">Alle loeschen</button>
<button class="btn-close" onclick="toggleNotificationPanel()">&times;</button>
</div>
</div>
<div id="notification-list" class="notification-list">
<div class="notification-empty">Keine Nachrichten</div>
</div>
</div>
<script>
// === Benachrichtigungs-System ===
const notifications = [];
let unreadErrors = 0;
let lastLogId = 0;
function toggleNotificationPanel() {
const panel = document.getElementById("notification-panel");
const isOpen = panel.style.display !== "none";
panel.style.display = isOpen ? "none" : "flex";
if (!isOpen) {
// Panel geoeffnet - Fehler als gelesen markieren
unreadErrors = 0;
updateBadge();
}
}
function updateBadge() {
const badge = document.getElementById("notification-badge");
const bell = document.getElementById("notification-bell");
if (unreadErrors > 0) {
badge.textContent = unreadErrors > 99 ? "99+" : unreadErrors;
badge.style.display = "";
bell.classList.add("has-error");
} else {
badge.style.display = "none";
bell.classList.remove("has-error");
}
}
function addNotification(msg, level = "info") {
const time = new Date().toLocaleTimeString("de-DE", {hour: "2-digit", minute: "2-digit", second: "2-digit"});
notifications.unshift({msg, level, time});
if (notifications.length > 100) notifications.pop();
if (level === "error" || level === "ERROR") {
unreadErrors++;
updateBadge();
}
renderNotifications();
}
function renderNotifications() {
const list = document.getElementById("notification-list");
if (!notifications.length) {
list.innerHTML = '<div class="notification-empty">Keine Nachrichten</div>';
return;
}
list.innerHTML = notifications.map(n => {
const cls = n.level.toLowerCase() === "error" ? "notification-item error" :
n.level.toLowerCase() === "warning" ? "notification-item warning" :
"notification-item";
return `<div class="${cls}">
<span class="notification-time">${n.time}</span>
<span class="notification-msg">${escapeHtmlSimple(n.msg)}</span>
</div>`;
}).join("");
}
function clearNotifications() {
notifications.length = 0;
unreadErrors = 0;
updateBadge();
renderNotifications();
}
function escapeHtmlSimple(str) {
return String(str)
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;");
}
// Log-Polling vom Server
async function pollLogs() {
try {
const r = await fetch(`/api/logs?since=${lastLogId}`);
const data = await r.json();
if (data.logs && data.logs.length) {
for (const log of data.logs) {
addNotification(log.message, log.level);
if (log.id > lastLogId) lastLogId = log.id;
}
}
} catch (e) {
// Ignorieren falls Endpoint nicht existiert
}
}
// Polling starten
setInterval(pollLogs, 2000);
</script>
{% block scripts %}{% endblock %}
</body>
</html>
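The `pollLogs()` loop above uses an incrementing `since` cursor so each request only returns log entries the client has not yet seen. A minimal server-side sketch of that cursor filter (helper name hypothetical; the `id`/`message` fields follow the diff above):

```python
def new_logs(logs: list[dict], since_id: int) -> tuple[list[dict], int]:
    """Nur Eintraege nach dem Cursor liefern und den Cursor fortschreiben."""
    fresh = [entry for entry in logs if entry["id"] > since_id]
    last = max((entry["id"] for entry in fresh), default=since_id)
    return fresh, last

logs = [{"id": 1, "message": "a"}, {"id": 2, "message": "b"}]
fresh, cursor = new_logs(logs, 1)
assert [entry["id"] for entry in fresh] == [2]
assert cursor == 2
```

Returning the unchanged cursor when nothing is new keeps repeated polls idempotent.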
@@ -192,6 +192,12 @@
<input type="text" id="tvdb-search-input" placeholder="Serienname..."
oninput="debounceTvdbSearch()">
</div>
<div class="form-group" style="margin-top:0.5rem">
<label style="display:inline-flex; align-items:center; gap:0.5rem; cursor:pointer">
<input type="checkbox" id="tvdb-search-english" onchange="searchTvdb()">
Englische Titel durchsuchen
</label>
</div>
<div id="tvdb-results" class="tvdb-results"></div>
</div>
</div>
@@ -221,9 +227,11 @@
<span id="series-modal-genres" class="series-genres-line"></span>
</div>
<div class="modal-header-actions">
<button class="btn-small btn-primary" id="btn-convert-series" onclick="openConvertSeriesModal()">Serie konvertieren</button>
<button class="btn-small btn-secondary" id="btn-tvdb-refresh" onclick="tvdbRefresh()" style="display:none">TVDB aktualisieren</button>
<button class="btn-small btn-secondary" id="btn-tvdb-unlink" onclick="tvdbUnlink()" style="display:none">TVDB loesen</button>
<button class="btn-small btn-secondary" id="btn-metadata-dl" onclick="downloadMetadata()" style="display:none">Metadaten laden</button>
<button class="btn-small btn-secondary" id="btn-cleanup-series" onclick="cleanupSeriesFolder()">Alte Dateien loeschen</button>
<button class="btn-small btn-secondary" id="btn-series-delete-db" onclick="deleteSeries(false)">Aus DB loeschen</button>
<button class="btn-small btn-danger" id="btn-series-delete-all" onclick="deleteSeries(true)">Komplett loeschen</button>
<button class="btn-close" onclick="closeSeriesModal()">&times;</button>
@@ -320,6 +328,11 @@
<button class="btn-close" onclick="closeImportModal()">&times;</button>
</div>
<div class="modal-body" style="padding:0">
<!-- Bestehende Import-Jobs -->
<div id="import-existing" style="display:none; padding:0.8rem; border-bottom:1px solid #2a2a2a; background:#1a1a1a;">
<div style="margin-bottom:0.5rem; font-size:0.85rem; color:#888;">Offene Import-Jobs:</div>
<div id="import-jobs-list" style="display:flex; flex-wrap:wrap; gap:0.5rem;"></div>
</div>
<!-- Schritt 1: Ordner waehlen -->
<div id="import-setup">
<!-- Filebrowser -->
@@ -385,6 +398,65 @@
</div>
</div>
</div>
<!-- Serie konvertieren Modal -->
<div id="convert-series-modal" class="modal-overlay" style="display:none">
<div class="modal modal-small">
<div class="modal-header">
<h2>Serie konvertieren</h2>
<button class="btn-close" onclick="closeConvertSeriesModal()">&times;</button>
</div>
<div class="modal-body" style="padding:1rem">
<div id="convert-series-status" style="margin-bottom:1rem"></div>
<div class="form-group">
<label>Ziel-Codec</label>
<select id="convert-target-codec">
<option value="av1">AV1 (empfohlen)</option>
<option value="hevc">HEVC / H.265</option>
<option value="h264">H.264</option>
</select>
</div>
<div class="form-group">
<label>
<input type="checkbox" id="convert-force-all">
Alle Episoden neu konvertieren (auch bereits passende)
</label>
</div>
<div class="form-group">
<label>
<input type="checkbox" id="convert-delete-old">
Quelldateien nach Konvertierung loeschen
</label>
</div>
<div class="form-actions" style="margin-top:1rem">
<button class="btn-primary" onclick="executeConvertSeries()">Konvertierung starten</button>
<button class="btn-secondary" onclick="closeConvertSeriesModal()">Abbrechen</button>
</div>
</div>
</div>
</div>
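The `Konvertierung starten` button above routes through `executeConvertSeries()`, which the commit message pairs with the new `POST /api/library/series/{id}/convert` endpoint. A minimal sketch of what that handler could look like — the payload field names (`target_codec`, `force_all`, `delete_old`), the response shape, and the helper names are assumptions, not taken from this diff:

```javascript
// Collect the modal's form state into a request payload.
// Field names are assumed; the real scripts block may differ.
function buildConvertPayload(targetCodec, forceAll, deleteOld) {
  return {
    target_codec: targetCodec,      // "av1" | "hevc" | "h264", as in the <select>
    force_all: Boolean(forceAll),   // re-convert episodes that already match
    delete_old: Boolean(deleteOld), // remove source files after conversion
  };
}

// POST the payload to the series-convert endpoint from the changelog.
// fetchImpl is injectable so the sketch can be exercised without a server.
async function executeConvertSeriesSketch(seriesId, payload, fetchImpl = fetch) {
  const res = await fetchImpl(`/api/library/series/${seriesId}/convert`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Convert failed: ${res.status}`);
  return res.json();
}
```

The matching `GET /api/library/series/{id}/convert-status` endpoint would then be polled to fill `#convert-series-status` while episodes sit in the queue.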
<!-- Bestaetigungs-Dialog -->
<div id="confirm-modal" class="modal-overlay" style="display:none">
<div class="modal modal-small">
<div class="modal-header">
<h2 id="confirm-title">Bestaetigung</h2>
<button class="btn-close" onclick="closeConfirmModal()">&times;</button>
</div>
<div class="modal-body" style="padding:1.2rem">
<div id="confirm-icon" style="text-align:center; font-size:3rem; margin-bottom:0.8rem">&#9888;</div>
<div id="confirm-message" style="text-align:center; margin-bottom:1rem"></div>
<div id="confirm-detail" style="text-align:center; font-size:0.85rem; color:#888; margin-bottom:1.2rem"></div>
<div class="form-actions" style="justify-content:center">
<button class="btn-danger" id="confirm-btn-ok" onclick="confirmAction()">Loeschen</button>
<button class="btn-secondary" onclick="closeConfirmModal()">Abbrechen</button>
</div>
</div>
</div>
</div>
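The confirm modal above is generic: `#confirm-title`, `#confirm-message`, and `#confirm-detail` are filled by the caller, and the OK button always goes through a single `confirmAction()` entry point. One way to wire that — a sketch in which everything except the element ids and handler names is an assumption — is to store a pending callback:

```javascript
// Callback-based confirm dialog matching the markup's single confirmAction()
// entry point. The real implementation would also fill #confirm-title /
// #confirm-message / #confirm-detail and toggle the overlay's display;
// that DOM wiring is omitted here so the core logic stands alone.
let pendingConfirm = null;

function openConfirmModal(onConfirm) {
  pendingConfirm = onConfirm; // remember what the "Loeschen" button should do
}

function confirmAction() {
  const fn = pendingConfirm;
  pendingConfirm = null;      // clear first so a double click cannot fire twice
  if (fn) fn();
}

function closeConfirmModal() {
  pendingConfirm = null;      // Abbrechen / &times; simply drops the callback
}
```

This lets `deleteSeries()`, the folder-delete button, and the cleanup action share one styled dialog instead of each calling `window.confirm()`, in line with the release's move away from browser alerts.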
{% endblock %}
{% block scripts %}
BIN
video-konverter-cpu.tar.gz Normal file
Binary file not shown.