Compare commits


No commits in common. "main" and "v1.4.51" have entirely different histories.

69 changed files with 1707 additions and 13670 deletions

.github/workflows/release.yml vendored Normal file

@@ -0,0 +1,52 @@
name: Build and Release

permissions:
  contents: write

on:
  push:
    tags:
      - "v*"

jobs:
  build:
    runs-on: windows-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "22"
          cache: "npm"
      - name: Install dependencies
        run: npm ci
      - name: Apply tag version
        shell: pwsh
        run: |
          $version = "${{ github.ref_name }}".TrimStart('v')
          node scripts/set_version_node.mjs $version
      - name: Build app
        run: npm run build
      - name: Build Windows artifacts
        run: npm run release:win
      - name: Pack portable zip
        shell: pwsh
        run: |
          Compress-Archive -Path "release\win-unpacked\*" -DestinationPath "Real-Debrid-Downloader-win64.zip" -Force
      - name: Publish GitHub Release
        uses: softprops/action-gh-release@v2
        with:
          files: |
            Real-Debrid-Downloader-win64.zip
            release/*.exe
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
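The `Apply tag version` step above shells out to `scripts/set_version_node.mjs`, whose contents are not shown in this diff. As a rough sketch (a hypothetical implementation, written here in TypeScript for consistency with the rest of the repo), it only needs to validate the stripped tag and rewrite the `version` field of `package.json`; the real script may also touch the lockfile or installer config:

```typescript
// Hypothetical sketch of scripts/set_version_node.mjs: apply the tag-derived
// version (already stripped of its leading "v" by the workflow) to package.json.
import { readFileSync, writeFileSync } from "node:fs";

function applyVersion(pkgJsonText: string, version: string): string {
  // Accept plain semver such as "1.4.51"; reject anything still carrying "v".
  if (!/^\d+\.\d+\.\d+$/.test(version)) {
    throw new Error(`invalid version: ${version}`);
  }
  const pkg = JSON.parse(pkgJsonText);
  pkg.version = version;
  return JSON.stringify(pkg, null, 2) + "\n";
}

// CLI entry, guarded so importing this file has no side effects:
// node scripts/set_version_node.mjs 1.4.51
if (process.argv[1]?.endsWith("set_version_node.mjs")) {
  const version = process.argv[2] ?? "";
  writeFileSync(
    "package.json",
    applyVersion(readFileSync("package.json", "utf8"), version),
  );
}
```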

.gitignore vendored

@@ -17,23 +17,9 @@ rd_download_manifest.json
_update_staging/
apply_update.cmd
.claude/
.github/
docs/plans/
CHANGELOG.md
node_modules/
.vite/
coverage/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Forgejo deployment runtime files
deploy/forgejo/.env
deploy/forgejo/forgejo/
deploy/forgejo/postgres/
deploy/forgejo/caddy/data/
deploy/forgejo/caddy/config/
deploy/forgejo/caddy/logs/
deploy/forgejo/backups/

CHANGELOG.md Normal file

@@ -0,0 +1,148 @@
# Changelog

All notable changes are documented in this file.

## 1.4.33 - 2026-03-02

Hotfix release for two real production issues: a wrong "total" statistic with an empty queue, and silent DLC import failures on drag-and-drop.

### Fixes

- **Stats display fixed ("Gesamt" with an empty queue):**
  - When no packages/items remain, persisted run bytes and run timestamps are now cleanly reset to 0.
  - This removes ghost readings such as `Gesamt: 19.99 GB` shown alongside `Pakete: 0 / Dateien: 0`.
  - The reset applies in all relevant paths (`getStats`, `clearAll`, package removal, startup normalization).
- **DLC drag-and-drop import hardened:**
  - Local DLC errors such as `Ungültiges DLC-Padding` no longer block the fallback to dcrypt.
  - Oversize/invalid-size DLCs are still handled defensively, but valid files in the same batch are no longer silently swallowed.
  - If all DLC imports fail, a clear error with the cause is now thrown instead of silently reporting `0 Paket(e), 0 Link(s)`.
- **UI feedback improved:**
  - A DLC import with `0` hits now shows a clear message (`Keine gültigen Links in den DLC-Dateien gefunden`) instead of a misleading success toast.

### Tests

- New/extended tests for:
  - resetting `totalDownloadedBytes`/stats with an empty queue,
  - the DLC fallback path on local decrypt exceptions,
  - the error output when a DLC import fails completely.
- Validation:
  - `npx tsc --noEmit` passed
  - `npm test` passed (`283/283`)
  - `npm run self-check` passed
## 1.4.32 - 2026-03-01

This version substantially extends the auto-renamer for real-world scene/TV release structures (nested and flat) and introduces an intensive renamer regression suite with additional edge-case and stress checks.

### Renamer (download manager)

- Extended pattern detection for nested and flat season folders with group suffixes (e.g. `-TMSF`, `-TVS`, `-TvR`, `-ZZGtv`, `-SunDry`).
- The episode token can now also be derived from compact codes in the source name (e.g. `301` -> `S03E01`, `211` -> `S02E11`, `101` -> `S01E01`), provided season hints are present.
- `Teil1/Teil2` and `Part1/Part2` are mapped to `SxxExx`, including season derivation from the folder structure.
- Repack handling unified across file name and folder structure (`rp`/`repack` -> consistent `REPACK` token in the target name).
- Flat season folders (files directly in the season folder) now get clean episode inlining instead of unspecific season file names.
- Path-length protection hardened on Windows: first the regular target name, then a deterministic package fallback (e.g. `Show.S08E20`), then a safe skip with a warning log instead of a faulty rename.

### Covered real-world patterns (examples)

- Arrow / Gotham / Britannia / Legion / Lethal.Weapon / Agent.X / Last.Impact
- Nested subfolders with episode titles, and flat season folders with many episode files
- Inconsistent source names such as `tvs-...-301`, `...-211`, `...teil1...`, `...rp...`

### Intensive bug testing

- Renamer unit tests significantly expanded (`tests/auto-rename.test.ts`) with additional real-world pattern and compact-code cases.
- Additional intensive scenario and stress checks run against temporary test files (nested/flat, repack, Teil/Part, compact codes, path length, collision protection).
- TypeScript typecheck passed.
- Full Vitest run passed (`279/279`).
- End-to-end self-check passed.
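The compact-code mapping described in this entry (e.g. `301` -> `S03E01`) can be sketched as follows. This is an illustrative reconstruction, not the app's actual renamer code, and `seasonHint` stands in for whatever season evidence the real matcher extracts from the folder path:

```typescript
// Map a compact episode code like "301" to an SxxExx token, but only when
// the derived season agrees with a season hint from the surrounding folders.
function compactCodeToToken(code: string, seasonHint: number): string | null {
  // Accept 3- or 4-digit codes: first 1-2 digits season, last 2 digits episode.
  const m = /^(\d{1,2})(\d{2})$/.exec(code);
  if (!m) return null;
  const season = Number(m[1]);
  const episode = Number(m[2]);
  // Without agreement between code and hint, the split is too ambiguous.
  if (season !== seasonHint || episode === 0) return null;
  const pad = (n: number) => String(n).padStart(2, "0");
  return `S${pad(season)}E${pad(episode)}`;
}
```

With a matching hint, `compactCodeToToken("301", 3)` yields `"S03E01"`; with a mismatching hint it returns `null` so the caller can fall back to other patterns.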
## 1.4.31 - 2026-03-01

This version completes the full bug audit round (156 items) and focuses on stability, data integrity, clean abort behavior, and reproducible release behavior.

### Audit completion

- Full processing of the audit list `Bug-Audit-Komplett-156-Bugs.txt` across main process, renderer, storage, update, integrity, and logger.
- Unified error handling for network, abort, retry, and timeout paths.
- Hard regression coverage via typecheck, unit tests, and release build.

### Download manager (queue, retry, stop/start, post-processing)

- Retry state is now bound to the item instead of being call-local (no retry reset on requeue, no more endless retry loops).
- Stop-to-start resume within the same session repaired (stopped items are cleanly requeued).
- HTTP 416 paths hardened (body consumption, correct error shape on the last attempt, contribution reset on file restart).
- Target-path reservation improved against race windows (no premature release during the retry delay).
- Scheduler behavior on reconnect/abort cleaned up, including status and speed resets in abort paths.
- Post-processing/extraction abort and package lifecycle synchronized (including cleanup and run-finish consistency).
- `prepareForShutdown()` now fully clears persist and state-emitter timers.
- Read-only queue checks decoupled from mutating side effects.

### Extractor

- Cleanup mode `trash` reworked (no more permanent deletion on the trash path).
- Conflict-mode propagation made consistent in ZIP and external fallback paths.
- Progress pulse robust against callback exceptions (no unhandled crash from `onProgress`).
- ZIP/volume detection and cleanup targets extended for multi-part archives.
- Protection against dangerous ZIP entries and problem archives further hardened.

### Debrid / RealDebrid

- Abort signals are consistently respected in filename resolution and provider fallback.
- Provider fallback aborts immediately on abort instead of trying further providers.
- Rapidgator filename resolution hardened for content type, retry classes, and body handling.
- AllDebrid/BestDebrid URL validation improved (only valid HTTP(S) direct URLs).
- User-agent version drift eliminated (now centralized via `APP_VERSION`).
- RealDebrid retry backoff is abort-friendly (no unnecessary waiting after stop/abort).

### Storage / session / settings

- Temp file paths for session save hardened against races/collisions.
- Session normalization and PackageOrder deduplication stabilized.
- Settings normalization tightened (no uncontrolled property leaking).
- Import and update paths robust against invalid input shapes.

### Main / app controller / IPC

- IPC validation extended (payload types, string lengths, import size limits).
- Auto-resume start order corrected so the renderer reliably receives initial states.
- Window lifecycle handlers unified for newly created windows (including macOS activate-recreate).
- Clipboard normalization is Unicode-safe (no surrogate split on truncation).
- Container path filter corrected so legitimate file names containing `..` are no longer wrongly rejected.
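The surrogate-safe truncation mentioned above boils down to one check: never cut a string between the two UTF-16 code units of a surrogate pair. A minimal sketch (illustrative, not the app's exact clipboard code):

```typescript
// Truncate to at most maxUnits UTF-16 code units without splitting a
// surrogate pair, which would produce a lone (invalid) high surrogate.
function truncateSafe(text: string, maxUnits: number): string {
  if (text.length <= maxUnits) return text;
  let end = maxUnits;
  // If the last kept unit is a high surrogate, its low half would be cut
  // off; back off by one unit so the pair is dropped as a whole.
  const last = text.charCodeAt(end - 1);
  if (last >= 0xd800 && last <= 0xdbff) end -= 1;
  return text.slice(0, end);
}
```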
### Update system

- File-name hygiene for setup assets hardened (`basename` + sanitize against traversal/RCE paths).
- Target-path collisions eliminated (timestamp + PID + UUID).
- `spawn` error handling added (no unhandled EventEmitter crash on installer start).
- Download pipeline prepared for shutdown abort; active update downloads can be cancelled cleanly.
- Stream/timeout/retry handling consolidated for download and release fetch.

### Integrity

- CRC32 computation optimized (lookup table + event-loop yield), noticeably less UI/loop blocking on large files.
- Hash-manifest reads cached (reduced disk I/O during multi-file validation).
- Manifest key matching unified for relative paths and basenames.
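A table-driven CRC32 with a periodic event-loop yield, as described in the Integrity notes, can be sketched like this (the chunk size and the exact wiring into the app's hashing loop are assumptions):

```typescript
// Standard reflected CRC-32 (polynomial 0xEDB88320) with a precomputed
// 256-entry lookup table, one table lookup per input byte.
const CRC_TABLE = new Uint32Array(256).map((_, n) => {
  let c = n;
  for (let k = 0; k < 8; k++) c = c & 1 ? 0xedb88320 ^ (c >>> 1) : c >>> 1;
  return c >>> 0;
});

async function crc32(data: Uint8Array, chunk = 1 << 20): Promise<number> {
  let crc = 0xffffffff;
  for (let i = 0; i < data.length; i++) {
    crc = CRC_TABLE[(crc ^ data[i]) & 0xff] ^ (crc >>> 8);
    // Yield to the event loop between chunks so hashing a large file
    // does not block the UI or other timers for seconds at a time.
    if ((i + 1) % chunk === 0) await new Promise((r) => setTimeout(r, 0));
  }
  return (crc ^ 0xffffffff) >>> 0;
}
```

The well-known check value applies: hashing the ASCII string `"123456789"` yields `0xCBF43926`.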
### Logger

- Rotation completed in both the async and the fallback path.
- Rotation checks separated per file instead of shared globally.
- Async flush robust against log loss on write errors (pending lines are removed only after a successful write).
### Renderer (App.tsx)

- Theme toggle, sort optimism, and picker busy lifecycle hardened against race conditions.
- Mounted guards added for early unmount paths.
- Drag-and-drop uses the active tab reference robustly across async boundaries.
- Confirm dialog text renders line breaks correctly.
- PackageCard memo comparison extended (including the file name) for correct re-renders.
- Human-readable size display hardened against negative/NaN inputs.

### QA / build / release

- TypeScript typecheck passed.
- Full Vitest run passed (`262/262`).
- Windows release build passed (NSIS + Portable).


@@ -1,23 +0,0 @@
## Release + update source (important)
- Primary platform is `https://git.24-music.de`
- Default repo: `Administrator/real-debrid-downloader`
- No longer release primarily via Codeberg/GitHub
## Releasing
1. Set the token:
- PowerShell: `$env:GITEA_TOKEN="<token>"`
2. Run the release:
- `npm run release:gitea -- <version> [notes]`
The script:
- bumps `package.json`
- builds the Windows artifacts
- pushes `main` + the tag
- creates the release on `git.24-music.de`
- uploads the assets
## Auto-update
- The updater currently uses `git.24-music.de` as its default source


@@ -1,6 +1,6 @@
# Multi Debrid Downloader
Desktop downloader with fast queue management, automatic extraction, and robust error handling.
Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** with fast queue management, automatic extraction, and robust error handling.
![Platform](https://img.shields.io/badge/platform-Windows%2010%2F11-0078D6)
![Electron](https://img.shields.io/badge/Electron-31.x-47848F)
@@ -20,10 +20,8 @@ Desktop downloader with fast queue management, automatic extraction, and robust
- Package-based queue with file status, progress, ETA, speed, and retry counters.
- Start, pause, stop, and cancel for both single items and full packages.
- Multi-select via Ctrl+Click for batch operations on packages and items.
- Duplicate handling when adding links: keep, skip, or overwrite.
- Session recovery after restart, including optional auto-resume.
- Circuit breaker with escalating backoff cooldowns to handle provider outages gracefully.
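A circuit breaker with escalating backoff cooldowns, as listed above, can be sketched like this (the cooldown values are made up for illustration; the app's actual thresholds are not documented here):

```typescript
// Per-provider breaker: consecutive failures escalate the cooldown,
// a success resets it, and the breaker "opens" once the last tier is hit.
class ProviderBreaker {
  private failures = 0;
  // Escalating cooldowns in seconds for failure #1, #2, #3 and beyond.
  private readonly cooldownsSec = [30, 120, 600];

  // Returns how long to cool down after this failure.
  recordFailure(): number {
    this.failures += 1;
    const idx = Math.min(this.failures, this.cooldownsSec.length) - 1;
    return this.cooldownsSec[idx];
  }

  recordSuccess(): void {
    this.failures = 0;
  }

  get isOpen(): boolean {
    return this.failures >= this.cooldownsSec.length;
  }
}
```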
### Debrid and link handling
@@ -34,30 +32,10 @@ Desktop downloader with fast queue management, automatic extraction, and robust
### Extraction, cleanup, and quality
- JVM-based extraction backend using SevenZipJBinding + Zip4j (supports RAR, 7z, ZIP, and more).
- Automatic fallback to legacy UnRAR/7z CLI tools when JVM is unavailable.
- Auto-extract with separate target directory and conflict strategies.
- Hybrid extraction: simultaneous downloading and extracting with smart I/O priority throttling.
- Nested extraction: archives within archives are automatically extracted (one level deep).
- Pre-extraction disk space validation to prevent incomplete extracts.
- Right-click "Extract now" on any package with at least one completed item.
- Hybrid extraction, optional removal of link artifacts and sample files.
- Post-download integrity checks (`CRC32`, `MD5`, `SHA1`) with auto-retry on failures.
- Completed-item cleanup policy: `never`, `immediate`, `on_start`, `package_done`.
- Optional removal of link artifacts and sample files after extraction.
### Auto-rename
- Automatic renaming of extracted files based on series/episode patterns.
- Multi-episode token parsing for batch renames.
### UI and progress
- Visual progress bars with percentage overlay for packages and individual items.
- Real-time bandwidth chart showing current download speeds.
- Persistent download counters: all-time totals and per-session statistics.
- Download history for completed packages.
- Vertical sidebar with organized settings tabs.
- Hoster display showing both the original source and the debrid provider used.
### Convenience and automation
@@ -65,18 +43,17 @@ Desktop downloader with fast queue management, automatic extraction, and robust
- Minimize-to-tray with tray menu controls.
- Speed limits globally or per download.
- Bandwidth schedules for time-based speed profiles.
- Built-in auto-updater via `git.24-music.de` Releases.
- Long path support (>260 characters) on Windows.
- Built-in update checks via Codeberg Releases.
## Installation
### Option A: prebuilt releases (recommended)
1. Download a release from the `git.24-music.de` Releases page.
1. Download a release from the Codeberg Releases page.
2. Run the installer or portable build.
3. Add your debrid tokens in Settings.
Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
Releases: `https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases`
### Option B: build from source
@@ -85,8 +62,7 @@ Requirements:
- Node.js `20+` (recommended `22+`)
- npm
- Windows `10/11` (for packaging and regular desktop use)
- Java Runtime `8+` (for SevenZipJBinding sidecar backend)
- Optional fallback: 7-Zip/UnRAR if you force legacy extraction mode
- Optional: 7-Zip/UnRAR for specific archive formats
```bash
npm install
@@ -103,34 +79,21 @@ npm run dev
| `npm test` | Runs Vitest unit tests |
| `npm run self-check` | Runs integrated end-to-end self-checks |
| `npm run release:win` | Creates Windows installer and portable build |
| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
| `npm run release:codeberg -- <version> [notes]` | One-command version bump + build + tag + Codeberg release upload |
### One-command git.24-music release
### One-command Codeberg release
```bash
npm run release:gitea -- 1.6.31 "- Maintenance update"
npm run release:codeberg -- 1.4.42 "- Maintenance update"
```
This command will:
1. Bump `package.json` version.
2. Build setup/portable artifacts (`npm run release:win`).
3. Commit and push `main` to your `git.24-music.de` remote.
3. Commit and push `main` to your Codeberg remote.
4. Create and push tag `v<version>`.
5. Create/update the Gitea release and upload required assets.
Required once before release:
```bash
git remote add gitea https://git.24-music.de/<user>/<repo>.git
```
PowerShell token setup:
```powershell
$env:GITEA_TOKEN="<dein-token>"
```
5. Create/update the Codeberg release and upload required assets.
## Typical workflow
@@ -147,7 +110,6 @@ $env:GITEA_TOKEN="<dein-token>"
- `src/renderer` - React UI
- `src/shared` - shared types and IPC contracts
- `tests` - unit tests and self-check tests
- `resources/extractor-jvm` - SevenZipJBinding + Zip4j sidecar JAR and native libraries
## Data and logs
@@ -160,43 +122,13 @@ The app stores runtime files in Electron's `userData` directory, including:
## Troubleshooting
- Download does not start: verify token and selected provider in Settings.
- Extraction fails: check archive passwords and native extractor installation (7-Zip/WinRAR). Optional JVM extractor can be forced with `RD_EXTRACT_BACKEND=jvm`.
- Extraction fails: check archive passwords and extraction tool availability.
- Very slow downloads: check active speed limit and bandwidth schedules.
- Unexpected interruptions: enable reconnect and fallback providers.
- Stalled downloads: the app auto-detects stalls within 10 seconds and retries automatically.
## Changelog
Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
### v1.6.61 (2026-03-05)
- Fixed leftover empty package folders in `Downloader Unfertig` after successful extraction.
- Resume marker files (`.rd_extract_progress*.json`) are now treated as ignorable for empty-folder cleanup.
- Deferred post-processing now clears resume markers before running empty-directory removal.
### v1.6.60 (2026-03-05)
- Added package-scoped password cache for extraction: once the first archive in a package is solved, following archives in the same package reuse that password first.
- Kept fallback behavior intact (`""` and other candidates are still tested), but moved empty-password probing behind the learned password to reduce per-archive delays.
- Added cache invalidation on real `wrong_password` failures so stale passwords are automatically discarded.
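The package-scoped password cache described in this entry can be sketched as follows (illustrative names, not the extractor's real API): the first solved password is tried before the usual candidates for later archives in the same package, and a confirmed wrong-password failure evicts it.

```typescript
// Per-package password cache: remember the first working password, try it
// first for the package's remaining archives, and invalidate it on a real
// wrong_password failure so stale entries are discarded.
class PackagePasswordCache {
  private learned = new Map<string, string>();

  candidatesFor(packageId: string, fallbacks: string[]): string[] {
    const known = this.learned.get(packageId);
    // Learned password first, then the usual fallbacks (including "").
    return known !== undefined
      ? [known, ...fallbacks.filter((p) => p !== known)]
      : fallbacks;
  }

  remember(packageId: string, password: string): void {
    this.learned.set(packageId, password);
  }

  invalidate(packageId: string, password: string): void {
    if (this.learned.get(packageId) === password) this.learned.delete(packageId);
  }
}
```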
### v1.6.59 (2026-03-05)
- Switched default extraction backend to native tools (`legacy`) for more stable archive-to-archive flow.
- Prioritized 7-Zip as primary native extractor, with WinRAR/UnRAR as fallback.
- JVM extractor remains available as opt-in via `RD_EXTRACT_BACKEND=jvm`.
### v1.6.58 (2026-03-05)
- Fixed extraction progress oscillation (`1% -> 100% -> 1%` loops) during password retries.
- Kept strict archive completion logic, but normalized in-progress archive percent to avoid false visual done states before real completion.
### v1.6.57 (2026-03-05)
- Fixed extraction flow so archives are marked done only on real completion, not on temporary `100%` progress spikes.
- Improved password handling: after the first successful archive, the discovered password is prioritized for subsequent archives.
- Fixed progress parsing for password retries (reset/restart handling), reducing visible and real gaps between archive extractions.
Release history is available in `CHANGELOG.md` and on Codeberg Releases.
## License

Binary file not shown. (Before: 279 KiB, After: 121 KiB)


@@ -25,11 +25,11 @@ AppPublisher=Sucukdeluxe
DefaultDirName={autopf}\{#MyAppName}
DefaultGroupName={#MyAppName}
OutputDir={#MyOutputDir}
OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
Compression=lzma
SolidCompression=yes
WizardStyle=modern
PrivilegesRequired=lowest
PrivilegesRequired=admin
ArchitecturesInstallIn64BitMode=x64compatible
UninstallDisplayIcon={app}\{#MyAppExeName}
SetupIconFile={#MyIconFile}
@@ -39,8 +39,8 @@ Name: "german"; MessagesFile: "compiler:Languages\German.isl"
Name: "english"; MessagesFile: "compiler:Default.isl"
[Files]
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"; Flags: ignoreversion
[Icons]
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"

package-lock.json generated

@@ -1,12 +1,12 @@
{
"name": "real-debrid-downloader",
"version": "1.5.66",
"version": "1.4.33",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "real-debrid-downloader",
"version": "1.5.66",
"version": "1.4.33",
"license": "MIT",
"dependencies": {
"adm-zip": "^0.5.16",
@@ -25,7 +25,6 @@
"cross-env": "^7.0.3",
"electron": "^31.7.7",
"electron-builder": "^25.1.8",
"rcedit": "^5.0.2",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
@@ -65,6 +64,7 @@
"integrity": "sha512-CGOfOJqWjg2qW/Mb6zNsDm+u5vFQ8DxXfbM09z69p5Z6+mE1ikP2jUXw+j42Pf1XTYED2Rni5f95npYeuwMDQA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@babel/code-frame": "^7.29.0",
"@babel/generator": "^7.29.0",
@@ -2043,6 +2043,7 @@
"integrity": "sha512-z9VXpC7MWrhfWipitjNdgCauoMLRdIILQsAEV+ZesIzBq/oUlxk0m3ApZuMFCXdnS4U7KrI+l3WRUEGQ8K1QKw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@types/prop-types": "*",
"csstype": "^3.2.2"
@@ -2304,6 +2305,7 @@
"integrity": "sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"fast-deep-equal": "^3.1.1",
"fast-json-stable-stringify": "^2.0.0",
@@ -2477,7 +2479,6 @@
"integrity": "sha512-+25nxyyznAXF7Nef3y0EbBeqmGZgeN/BxHX29Rs39djAfaFalmQ89SE6CWyDCHzGL0yt/ycBtNOmGTW0FyGWNw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"archiver-utils": "^2.1.0",
"async": "^3.2.4",
@@ -2497,7 +2498,6 @@
"integrity": "sha512-bEL/yUb/fNNiNTuUz979Z0Yg5L+LzLxGJz8x79lYmR54fmTIb6ob/hNQgkQnIUDWIFjZVQwl9Xs356I6BAMHfw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"glob": "^7.1.4",
"graceful-fs": "^4.2.0",
@@ -2520,7 +2520,6 @@
"integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"core-util-is": "~1.0.0",
"inherits": "~2.0.3",
@@ -2536,8 +2535,7 @@
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/archiver-utils/node_modules/string_decoder": {
"version": "1.1.1",
@@ -2545,7 +2543,6 @@
"integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "~5.1.0"
}
@@ -2765,6 +2762,7 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"baseline-browser-mapping": "^2.9.0",
"caniuse-lite": "^1.0.30001759",
@@ -3346,7 +3344,6 @@
"integrity": "sha512-D3uMHtGc/fcO1Gt1/L7i1e33VOvD4A9hfQLP+6ewd+BvG/gQ84Yh4oftEhAdjSMgBgwGL+jsppT7JYNpo6MHHg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"buffer-crc32": "^0.2.13",
"crc32-stream": "^4.0.2",
@@ -3520,7 +3517,6 @@
"integrity": "sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==",
"dev": true,
"license": "Apache-2.0",
"peer": true,
"bin": {
"crc32": "bin/crc32.njs"
},
@@ -3534,7 +3530,6 @@
"integrity": "sha512-NT7w2JVU7DFroFdYkeq8cywxrgjPHWkdX1wjpRQXPX5Asews3tA+Ght6lddQO5Mkumffp3X7GEqku3epj2toIw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"crc-32": "^1.2.0",
"readable-stream": "^3.4.0"
@@ -3577,54 +3572,6 @@
"node": ">= 8"
}
},
"node_modules/cross-spawn-windows-exe": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/cross-spawn-windows-exe/-/cross-spawn-windows-exe-1.2.0.tgz",
"integrity": "sha512-mkLtJJcYbDCxEG7Js6eUnUNndWjyUZwJ3H7bErmmtOYU/Zb99DyUkpamuIZE0b3bhmJyZ7D90uS6f+CGxRRjOw==",
"dev": true,
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/malept"
},
{
"type": "tidelift",
"url": "https://tidelift.com/subscription/pkg/npm-cross-spawn-windows-exe?utm_medium=referral&utm_source=npm_fund"
}
],
"license": "Apache-2.0",
"dependencies": {
"@malept/cross-spawn-promise": "^1.1.0",
"is-wsl": "^2.2.0",
"which": "^2.0.2"
},
"engines": {
"node": ">= 10"
}
},
"node_modules/cross-spawn-windows-exe/node_modules/@malept/cross-spawn-promise": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@malept/cross-spawn-promise/-/cross-spawn-promise-1.1.1.tgz",
"integrity": "sha512-RTBGWL5FWQcg9orDOCcp4LvItNzUPcyEU9bwaeJX0rJ1IQxzucC48Y0/sQLp/g6t99IQgAlGIaesJS+gTn7tVQ==",
"dev": true,
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/malept"
},
{
"type": "tidelift",
"url": "https://tidelift.com/subscription/pkg/npm-.malept-cross-spawn-promise?utm_medium=referral&utm_source=npm_fund"
}
],
"license": "Apache-2.0",
"dependencies": {
"cross-spawn": "^7.0.1"
},
"engines": {
"node": ">= 10"
}
},
"node_modules/csstype": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz",
@@ -3833,6 +3780,7 @@
"integrity": "sha512-NoXo6Liy2heSklTI5OIZbCgXC1RzrDQsZkeEwXhdOro3FT1VBOvbubvscdPnjVuQ4AMwwv61oaH96AbiYg9EnQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"app-builder-lib": "25.1.8",
"builder-util": "25.1.7",
@@ -4028,7 +3976,6 @@
"integrity": "sha512-2ntkJ+9+0GFP6nAISiMabKt6eqBB0kX1QqHNWFWAXgi0VULKGisM46luRFpIBiU3u/TDmhZMM8tzvo2Abn3ayg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"app-builder-lib": "25.1.8",
"archiver": "^5.3.1",
@@ -4042,7 +3989,6 @@
"integrity": "sha512-oRXApq54ETRj4eMiFzGnHWGy+zo5raudjuxN0b8H7s/RU2oW0Wvsx9O0ACRN/kRq9E8Vu/ReskGB5o3ji+FzHQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
@@ -4058,7 +4004,6 @@
"integrity": "sha512-FGuPw30AdOIUTRMC2OMRtQV+jkVj2cfPqSeWXv1NEAJ1qZ5zb1X6z1mFhbfOB/iy3ssJCD+3KuZ8r8C3uVFlAg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"universalify": "^2.0.0"
},
@@ -4072,7 +4017,6 @@
"integrity": "sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 10.0.0"
}
@@ -4309,6 +4253,7 @@
"dev": true,
"hasInstallScript": true,
"license": "MIT",
"peer": true,
"bin": {
"esbuild": "bin/esbuild"
},
@@ -4594,8 +4539,7 @@
"resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz",
"integrity": "sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/fs-extra": {
"version": "8.1.0",
@@ -5202,22 +5146,6 @@
"is-ci": "bin.js"
}
},
"node_modules/is-docker": {
"version": "2.2.1",
"resolved": "https://registry.npmjs.org/is-docker/-/is-docker-2.2.1.tgz",
"integrity": "sha512-F+i2BKsFrH66iaUFc0woD8sLy8getkwTwtOBjvs56Cx4CgJDeKQeqfz8wAYiSb8JOprWhHH5p77PbmYCvvUuXQ==",
"dev": true,
"license": "MIT",
"bin": {
"is-docker": "cli.js"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-fullwidth-code-point": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
@@ -5258,26 +5186,12 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-wsl": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/is-wsl/-/is-wsl-2.2.0.tgz",
"integrity": "sha512-fKzAra0rGJUUBwGBgNkHZuToZcn+TtXHpeCgmkMJMMYx1sQDYaCSyjJBSCa2nH1DGm7s3n1oBnohoVTBaN7Lww==",
"dev": true,
"license": "MIT",
"dependencies": {
"is-docker": "^2.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/isarray": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
"integrity": "sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/isbinaryfile": {
"version": "5.0.7",
@@ -5462,7 +5376,6 @@
"integrity": "sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"readable-stream": "^2.0.5"
},
@@ -5476,7 +5389,6 @@
"integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"core-util-is": "~1.0.0",
"inherits": "~2.0.3",
@@ -5492,8 +5404,7 @@
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/lazystream/node_modules/string_decoder": {
"version": "1.1.1",
@@ -5501,7 +5412,6 @@
"integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "~5.1.0"
}
@@ -5548,40 +5458,35 @@
"resolved": "https://registry.npmjs.org/lodash.defaults/-/lodash.defaults-4.2.0.tgz",
"integrity": "sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/lodash.difference": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/lodash.difference/-/lodash.difference-4.5.0.tgz",
"integrity": "sha512-dS2j+W26TQ7taQBGN8Lbbq04ssV3emRw4NY58WErlTO29pIqS0HmoT5aJ9+TUQ1N3G+JOZSji4eugsWwGp9yPA==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/lodash.flatten": {
"version": "4.4.0",
"resolved": "https://registry.npmjs.org/lodash.flatten/-/lodash.flatten-4.4.0.tgz",
"integrity": "sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/lodash.isplainobject": {
"version": "4.0.6",
"resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/lodash.union": {
"version": "4.6.0",
"resolved": "https://registry.npmjs.org/lodash.union/-/lodash.union-4.6.0.tgz",
"integrity": "sha512-c4pB2CdGrGdjMKYLA+XiRDO7Y0PRQbm/Gzg8qMj+QH+pFVAoTp5sBpO0odL3FjoPCGjK96p6qsP+yQoiLoOBcw==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/log-symbols": {
"version": "4.1.0",
@@ -6145,7 +6050,6 @@
"integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=0.10.0"
}
@@ -6406,6 +6310,7 @@
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=12"
},
@@ -6470,6 +6375,7 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"nanoid": "^3.3.11",
"picocolors": "^1.1.1",
@@ -6527,8 +6433,7 @@
"resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz",
"integrity": "sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/progress": {
"version": "2.0.3",
@@ -6602,24 +6507,12 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/rcedit": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/rcedit/-/rcedit-5.0.2.tgz",
"integrity": "sha512-dgysxaeXZ4snLpPjn8aVtHvZDCx+aRcvZbaWBgl1poU6OPustMvOkj9a9ZqASQ6i5Y5szJ13LSvglEOwrmgUxA==",
"dev": true,
"license": "MIT",
"dependencies": {
"cross-spawn-windows-exe": "^1.1.0"
},
"engines": {
"node": ">= 22.12.0"
}
},
"node_modules/react": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz",
"integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"loose-envify": "^1.1.0"
},
@@ -6684,7 +6577,6 @@
"integrity": "sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA==",
"dev": true,
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"minimatch": "^5.1.0"
}
@ -6694,8 +6586,7 @@
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
"dev": true,
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/readdir-glob/node_modules/brace-expansion": {
"version": "2.0.2",
@ -6703,7 +6594,6 @@
"integrity": "sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"balanced-match": "^1.0.0"
}
@ -6714,7 +6604,6 @@
"integrity": "sha512-7o1wEA2RyMP7Iu7GNba9vc0RWWGACJOCZBJX2GJWip0ikV+wcOsgVuY9uE8CPiyQhkGFSlhuSkZPavN7u1c2Fw==",
"dev": true,
"license": "ISC",
"peer": true,
"dependencies": {
"brace-expansion": "^2.0.1"
},
@ -7395,7 +7284,6 @@
"integrity": "sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"bl": "^4.0.3",
"end-of-stream": "^1.4.1",
@ -7680,6 +7568,7 @@
"integrity": "sha512-5C1sg4USs1lfG0GFb2RLXsdpXqBSEhAaA/0kPL01wxzpMqLILNxIxIOKiILz+cdg/pLnOUxFYOR5yhHU666wbw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "~0.27.0",
"get-tsconfig": "^4.7.5"
@ -7862,6 +7751,7 @@
"integrity": "sha512-+Oxm7q9hDoLMyJOYfUYBuHQo+dkAloi33apOPP56pzj+vsdJDzr+j1NISE5pyaAuKL4A3UD34qd0lx5+kfKp2g==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "^0.25.0",
"fdir": "^6.4.4",
@ -9471,6 +9361,7 @@
"integrity": "sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "^0.21.3",
"postcss": "^8.4.43",
@ -9728,7 +9619,6 @@
"integrity": "sha512-9qv4rlDiopXg4E69k+vMHjNN63YFMe9sZMrdlvKnCjlCRWeCBswPPMPUfx+ipsAWq1LXHe70RcbaHdJJpS6hyQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"archiver-utils": "^3.0.4",
"compress-commons": "^4.1.2",
@ -9744,7 +9634,6 @@
"integrity": "sha512-KVgf4XQVrTjhyWmx6cte4RxonPLR9onExufI1jhvw/MQ4BB6IsZD5gT8Lq+u/+pRkWna/6JoHpiQioaqFP5Rzw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"glob": "^7.2.3",
"graceful-fs": "^4.2.0",


@ -1,7 +1,7 @@
{
"name": "real-debrid-downloader",
"version": "1.6.66",
"description": "Desktop downloader",
"version": "1.4.51",
"description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
"main": "build/main/main/main.js",
"author": "Sucukdeluxe",
"license": "MIT",
@ -17,8 +17,7 @@
"test": "vitest run",
"self-check": "tsx tests/self-check.ts",
"release:win": "npm run build && electron-builder --publish never --win nsis portable",
"release:gitea": "node scripts/release_gitea.mjs",
"release:forgejo": "node scripts/release_gitea.mjs"
"release:codeberg": "node scripts/release_codeberg.mjs"
},
"dependencies": {
"adm-zip": "^0.5.16",
@ -37,7 +36,6 @@
"cross-env": "^7.0.3",
"electron": "^31.7.7",
"electron-builder": "^25.1.8",
"rcedit": "^5.0.2",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
@ -55,12 +53,8 @@
"files": [
"build/main/**/*",
"build/renderer/**/*",
"resources/extractor-jvm/**/*",
"package.json"
],
"asarUnpack": [
"resources/extractor-jvm/**/*"
],
"win": {
"target": [
"nsis",
@ -74,7 +68,6 @@
"perMachine": false,
"allowToChangeInstallationDirectory": true,
"createDesktopShortcut": true
},
"afterPack": "scripts/afterPack.cjs"
}
}
}


@ -1,22 +0,0 @@
# JVM extractor runtime
This directory contains the Java sidecar runtime used by `src/main/extractor.ts`.
## Included backends
- `sevenzipjbinding` for the primary extraction path (RAR/7z/ZIP and others)
- `zip4j` for ZIP multipart handling (JD-style split ZIP behavior)
## Layout
- `classes/` compiled `JBindExtractorMain` classes
- `lib/` runtime jars required by the sidecar
- `src/` Java source for the sidecar
## Rebuild notes
The checked-in classes are Java 8 compatible and built from:
`resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java`
If you need to rebuild, compile against the jars in `lib/` with a Java 8-compatible compiler.


@ -1,12 +0,0 @@
Bundled JVM extractor dependencies:
1) sevenzipjbinding (16.02-2.01)
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding-all-platforms
- Upstream: https://sevenzipjbind.sourceforge.net/
2) zip4j (2.11.5)
- Maven artifact: net.lingala.zip4j:zip4j
- Upstream: https://github.com/srikanth-lingala/zip4j
Please review upstream licenses and notices before redistribution.


@ -1,18 +0,0 @@
const path = require("path");
const { rcedit } = require("rcedit");
module.exports = async function afterPack(context) {
const productFilename = context.packager?.appInfo?.productFilename;
if (!productFilename) {
console.warn(" • rcedit: skipped — productFilename not available");
return;
}
const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
console.log(` • rcedit: patching icon → ${exePath}`);
try {
await rcedit(exePath, { icon: iconPath });
} catch (error) {
console.warn(` • rcedit: failed — ${String(error)}`);
}
};


@ -31,7 +31,6 @@ async function main(): Promise<void> {
login: settings.megaLogin,
password: settings.megaPassword
}));
try {
const service = new DebridService(settings, {
megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
});
@ -43,9 +42,7 @@ async function main(): Promise<void> {
console.log(`[FAIL] ${String(error)}`);
}
}
} finally {
megaWeb.dispose();
}
}
main().catch(e => { console.error(e); process.exit(1); });
void main();


@ -16,8 +16,8 @@ function sleep(ms) {
}
function cookieFrom(headers) {
const cookies = headers.getSetCookie();
return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
const raw = headers.get("set-cookie") || "";
return raw.split(",").map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
function parseDebridCodes(html) {
@ -47,9 +47,6 @@ async function resolveCode(cookie, code) {
});
const text = (await res.text()).trim();
if (text === "reload") {
if (attempt % 5 === 0) {
console.log(` [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
}
await sleep(800);
continue;
}
@ -101,13 +98,7 @@ async function main() {
redirect: "manual"
});
if (loginRes.status >= 400) {
throw new Error(`Login failed with HTTP ${loginRes.status}`);
}
const cookie = cookieFrom(loginRes.headers);
if (!cookie) {
throw new Error("Login returned no session cookie");
}
console.log("login", loginRes.status, loginRes.headers.get("location") || "");
const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
@ -145,4 +136,4 @@ async function main() {
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
await main();


@ -66,8 +66,6 @@ async function callRealDebrid(link) {
};
}
// megaCookie is intentionally cached at module scope so that multiple
// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
if (!megaCookie) {
const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
@ -79,15 +77,13 @@ async function callMegaDebrid(link) {
body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
redirect: "manual"
});
if (loginRes.status >= 400) {
return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
}
megaCookie = loginRes.headers.getSetCookie()
megaCookie = (loginRes.headers.get("set-cookie") || "")
.split(",")
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
if (!megaCookie) {
return { ok: false, error: "Mega-Web login returned no session cookie" };
return { ok: false, error: "Mega-Web login failed" };
}
}
@ -294,4 +290,4 @@ async function main() {
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
await main();


@ -2,15 +2,7 @@ import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const NPM_RELEASE_WIN = process.platform === "win32"
? {
command: process.env.ComSpec || "cmd.exe",
args: ["/d", "/s", "/c", "npm run release:win"]
}
: {
command: "npm",
args: ["run", "release:win"]
};
const NPM_EXECUTABLE = process.platform === "win32" ? "npm.cmd" : "npm";
function run(command, args, options = {}) {
const result = spawnSync(command, args, {
@ -45,8 +37,7 @@ function runWithInput(command, args, input) {
cwd: process.cwd(),
encoding: "utf8",
input,
stdio: ["pipe", "pipe", "pipe"],
timeout: 10000
stdio: ["pipe", "pipe", "pipe"]
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
@ -68,74 +59,37 @@ function parseArgs(argv) {
return { help: false, dryRun, version, notes };
}
function parseRemoteUrl(url) {
function parseCodebergRemote(url) {
const raw = String(url || "").trim();
const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
const httpsMatch = raw.match(/^https?:\/\/(?:www\.)?codeberg\.org\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (httpsMatch) {
return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
return { owner: httpsMatch[1], repo: httpsMatch[2] };
}
const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
const sshMatch = raw.match(/^git@codeberg\.org:([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshMatch) {
return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
return { owner: sshMatch[1], repo: sshMatch[2] };
}
const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshAltMatch) {
return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
}
throw new Error(`Cannot parse remote URL: ${raw}`);
throw new Error(`Cannot parse Codeberg remote URL: ${raw}`);
}
function normalizeBaseUrl(url) {
const raw = String(url || "").trim().replace(/\/+$/, "");
if (!raw) {
return "";
}
if (!/^https?:\/\//i.test(raw)) {
throw new Error("GITEA_BASE_URL must start with http:// or https://");
}
return raw;
}
function getGiteaRepo() {
const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
const remotes = forcedRemote
? [forcedRemote]
: ["gitea", "forgejo", "origin", "github-new", "codeberg"];
const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
function getCodebergRepo() {
const remotes = ["codeberg", "origin"];
for (const remote of remotes) {
try {
const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
const parsed = parseRemoteUrl(remoteUrl);
const remoteBase = `https://${parsed.host}`.toLowerCase();
if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
continue;
if (/codeberg\.org/i.test(remoteUrl)) {
const parsed = parseCodebergRemote(remoteUrl);
return { remote, ...parsed };
}
return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
} catch {
// try next remote
}
}
if (preferredBase) {
throw new Error(
`No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
);
}
throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
throw new Error("No Codeberg remote found. Add one with: git remote add codeberg https://codeberg.org/<owner>/<repo>.git");
}
function getAuthHeader(host) {
const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
if (explicitToken) {
return `token ${explicitToken}`;
}
const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
function getCodebergAuthHeader() {
const credentialText = runWithInput("git", ["credential", "fill"], "protocol=https\nhost=codeberg.org\n\n");
const map = new Map();
for (const line of credentialText.split(/\r?\n/)) {
if (!line.includes("=")) {
@ -147,9 +101,7 @@ function getAuthHeader(host) {
const username = map.get("username") || "";
const password = map.get("password") || "";
if (!username || !password) {
throw new Error(
`Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
);
throw new Error("Missing Codeberg credentials in git credential helper");
}
const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
return `Basic ${token}`;
@ -190,25 +142,12 @@ function updatePackageVersion(rootDir, version) {
const packagePath = path.join(rootDir, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
if (String(packageJson.version || "") === version) {
process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
return;
throw new Error(`package.json is already at version ${version}`);
}
packageJson.version = version;
fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
}
function patchLatestYml(releaseDir, version) {
const ymlPath = path.join(releaseDir, "latest.yml");
let content = fs.readFileSync(ymlPath, "utf8");
const setupName = `Real-Debrid-Downloader Setup ${version}.exe`;
const dashedName = `Real-Debrid-Downloader-Setup-${version}.exe`;
if (content.includes(dashedName)) {
content = content.split(dashedName).join(setupName);
fs.writeFileSync(ymlPath, content, "utf8");
process.stdout.write(`Patched latest.yml: replaced "${dashedName}" with "${setupName}"\n`);
}
}
function ensureAssetsExist(rootDir, version) {
const releaseDir = path.join(rootDir, "release");
const files = [
@ -223,7 +162,6 @@ function ensureAssetsExist(rootDir, version) {
throw new Error(`Missing release artifact: ${fullPath}`);
}
}
patchLatestYml(releaseDir, version);
return { releaseDir, files };
}
@ -246,7 +184,8 @@ function ensureTagMissing(tag) {
}
}
async function createOrGetRelease(baseApi, tag, authHeader, notes) {
async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
if (byTag.ok) {
return byTag.body;
@ -266,34 +205,13 @@ async function createOrGetRelease(baseApi, tag, authHeader, notes) {
return created.body;
}
async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDir, files) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
for (const fileName of files) {
const filePath = path.join(releaseDir, fileName);
const fileSize = fs.statSync(filePath).size;
const fileData = fs.readFileSync(filePath);
const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
// Stream large files instead of loading them entirely into memory
const fileStream = fs.createReadStream(filePath);
const response = await fetch(uploadUrl, {
method: "POST",
headers: {
Accept: "application/json",
Authorization: authHeader,
"Content-Type": "application/octet-stream",
"Content-Length": String(fileSize)
},
body: fileStream,
duplex: "half"
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
const response = await apiRequest("POST", uploadUrl, authHeader, fileData, "application/octet-stream");
if (response.ok) {
process.stdout.write(`Uploaded: ${fileName}\n`);
continue;
@ -302,7 +220,7 @@ async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, f
process.stdout.write(`Skipped existing asset: ${fileName}\n`);
continue;
}
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(response.body)}`);
}
}
@ -310,44 +228,46 @@ async function main() {
const rootDir = process.cwd();
const args = parseArgs(process.argv);
if (args.help) {
process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
process.stdout.write("Usage: npm run release:codeberg -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Example: npm run release:codeberg -- 1.4.42 \"- Small fixes\"\n");
return;
}
const version = ensureVersionString(args.version);
const tag = `v${version}`;
const releaseNotes = args.notes || `- Release ${tag}`;
const repo = getGiteaRepo();
const { remote, owner, repo } = getCodebergRepo();
ensureNoTrackedChanges();
ensureTagMissing(tag);
if (args.dryRun) {
process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
return;
}
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_RELEASE_WIN.command, NPM_RELEASE_WIN.args);
run(NPM_EXECUTABLE, ["run", "release:win"]);
const assets = ensureAssetsExist(rootDir, version);
if (args.dryRun) {
process.stdout.write(`Dry run complete. Assets exist for ${tag}.\n`);
return;
}
run("git", ["add", "package.json"]);
run("git", ["commit", "-m", `Release ${tag}`]);
run("git", ["push", repo.remote, "main"]);
run("git", ["push", remote, "main"]);
run("git", ["tag", tag]);
run("git", ["push", repo.remote, tag]);
run("git", ["push", remote, tag]);
const authHeader = getAuthHeader(repo.host);
const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);
const authHeader = getCodebergAuthHeader();
const baseRepoApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
const patchReleaseEnabled = await apiRequest("PATCH", baseRepoApi, authHeader, JSON.stringify({ has_releases: true }));
if (!patchReleaseEnabled.ok) {
throw new Error(`Failed to enable releases (${patchReleaseEnabled.status}): ${JSON.stringify(patchReleaseEnabled.body)}`);
}
process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
const release = await createOrGetRelease(owner, repo, tag, authHeader, releaseNotes);
await uploadReleaseAssets(owner, repo, release.id, authHeader, assets.releaseDir, assets.files);
process.stdout.write(`Release published: ${release.html_url}\n`);
}
main().catch((error) => {


@ -0,0 +1,24 @@
import fs from "node:fs";
import path from "node:path";
const version = process.argv[2];
if (!version) {
console.error("Usage: node scripts/set_version_node.mjs <version>");
process.exit(1);
}
const root = process.cwd();
const packageJsonPath = path.join(root, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, "utf8"));
packageJson.version = version;
fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
const constantsPath = path.join(root, "src", "main", "constants.ts");
const constants = fs.readFileSync(constantsPath, "utf8").replace(
/APP_VERSION = "[^"]+"/,
`APP_VERSION = "${version}"`
);
fs.writeFileSync(constantsPath, constants, "utf8");
console.log(`Set version to ${version}`);
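The version stamping above relies on a single regex replace against `src/main/constants.ts`. A minimal sketch of that replacement in isolation (the sample file content below is an assumption, not the real constants file):

```javascript
// Demonstrates the APP_VERSION regex replacement used by set_version_node.mjs.
// `sample` is a hypothetical stand-in for the constants.ts source text.
const sample = 'export const APP_VERSION = "1.4.50";\nexport const OTHER = 1;';

const next = sample.replace(
  /APP_VERSION = "[^"]+"/,
  'APP_VERSION = "1.4.51"'
);

console.log(next.includes('APP_VERSION = "1.4.51"')); // true
```

Note that the pattern is unanchored, so it rewrites only the first `APP_VERSION = "..."` occurrence in the file.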


@ -4,10 +4,7 @@ import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
ParsedPackageInput,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
@ -20,11 +17,9 @@ import { APP_VERSION } from "./constants";
import { DownloadManager } from "./download-manager";
import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { createStoragePaths, loadSession, loadSettings, normalizeSettings, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";
function sanitizeSettingsPatch(partial: Partial<AppSettings>): Partial<AppSettings> {
const entries = Object.entries(partial || {}).filter(([, value]) => value !== undefined);
@ -54,7 +49,6 @@ export class AppController {
public constructor() {
configureLogger(this.storagePaths.baseDir);
initSessionLog(this.storagePaths.baseDir);
this.settings = loadSettings(this.storagePaths);
const session = loadSession(this.storagePaths);
this.megaWebFallback = new MegaWebFallback(() => ({
@ -62,40 +56,24 @@ export class AppController {
password: this.settings.megaPassword
}));
this.manager = new DownloadManager(this.settings, session, this.storagePaths, {
megaWebUnrestrict: (link: string, signal?: AbortSignal) => this.megaWebFallback.unrestrict(link, signal),
invalidateMegaSession: () => this.megaWebFallback.invalidateSession(),
onHistoryEntry: (entry: HistoryEntry) => {
addHistoryEntry(this.storagePaths, entry);
}
megaWebUnrestrict: (link: string) => this.megaWebFallback.unrestrict(link)
});
this.manager.on("state", (snapshot: UiSnapshot) => {
this.onStateHandler?.(snapshot);
});
logger.info(`App gestartet v${APP_VERSION}`);
logger.info(`Log-Datei: ${getLogFilePath()}`);
startDebugServer(this.manager, this.storagePaths.baseDir);
if (this.settings.autoResumeOnStart) {
const snapshot = this.manager.getSnapshot();
const hasPending = Object.values(snapshot.session.items).some((item) => item.status === "queued" || item.status === "reconnect_wait");
if (hasPending) {
void this.manager.getStartConflicts().then((conflicts) => {
const hasConflicts = conflicts.length > 0;
if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
// If the onState handler is already set (renderer connected), start immediately.
// Otherwise mark as pending so the onState setter triggers the start.
if (this.onStateHandler) {
logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
} else {
const hasConflicts = this.manager.getStartConflicts().length > 0;
if (hasPending && this.hasAnyProviderToken(this.settings) && !hasConflicts) {
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
}
} else if (hasConflicts) {
} else if (hasPending && hasConflicts) {
logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
}
}).catch((err) => logger.warn(`getStartConflicts Fehler (constructor): ${String(err)}`));
}
}
}
@ -105,8 +83,6 @@ export class AppController {
|| (settings.megaLogin.trim() && settings.megaPassword.trim())
|| settings.bestToken.trim()
|| settings.allDebridToken.trim()
|| (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
|| settings.oneFichierApiKey.trim()
);
}
@ -120,11 +96,8 @@ export class AppController {
handler(this.manager.getSnapshot());
if (this.autoResumePending) {
this.autoResumePending = false;
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
this.manager.start();
logger.info("Auto-Resume beim Start aktiviert");
} else {
// Trigger pending extractions without starting the session
this.manager.triggerIdleExtractions();
}
}
}
@ -152,9 +125,6 @@ export class AppController {
return this.settings;
}
// Preserve the live totalDownloadedAllTime from the download manager
const liveSettings = this.manager.getSettings();
nextSettings.totalDownloadedAllTime = Math.max(nextSettings.totalDownloadedAllTime || 0, liveSettings.totalDownloadedAllTime || 0);
this.settings = nextSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
@ -171,12 +141,6 @@ export class AppController {
}
public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
// Stop active downloads before installing. Extractions may continue briefly
// until prepareForShutdown() is called during app quit.
if (this.manager.isSessionRunning()) {
this.manager.stop();
}
const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
? this.lastUpdateCheck
@ -209,7 +173,7 @@ export class AppController {
return result;
}
public async getStartConflicts(): Promise<StartConflictEntry[]> {
public getStartConflicts(): StartConflictEntry[] {
return this.manager.getStartConflicts();
}
@ -221,16 +185,8 @@ export class AppController {
this.manager.clearAll();
}
public async start(): Promise<void> {
await this.manager.start();
}
public async startPackages(packageIds: string[]): Promise<void> {
await this.manager.startPackages(packageIds);
}
public async startItems(itemIds: string[]): Promise<void> {
await this.manager.startItems(itemIds);
public start(): void {
this.manager.start();
}
public stop(): void {
@ -241,18 +197,6 @@ export class AppController {
return this.manager.togglePause();
}
public retryExtraction(packageId: string): void {
this.manager.retryExtraction(packageId);
}
public extractNow(packageId: string): void {
this.manager.extractNow(packageId);
}
public resetPackage(packageId: string): void {
this.manager.resetPackage(packageId);
}
public cancelPackage(packageId: string): void {
this.manager.cancelPackage(packageId);
}
@ -281,104 +225,10 @@ export class AppController {
return this.manager.importQueue(json);
}
public getSessionStats(): SessionStats {
return this.manager.getSessionStats();
}
public exportBackup(): string {
const settings = { ...this.settings };
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = settings[key];
if (typeof val === "string" && val.length > 0) {
(settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
}
}
const session = this.manager.getSession();
return JSON.stringify({ version: 1, settings, session }, null, 2);
}
public importBackup(json: string): { restored: boolean; message: string } {
let parsed: Record<string, unknown>;
try {
parsed = JSON.parse(json) as Record<string, unknown>;
} catch {
return { restored: false, message: "Ungültiges JSON" };
}
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
}
const importedSettings = parsed.settings as AppSettings;
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = (importedSettings as Record<string, unknown>)[key];
if (typeof val === "string" && val.startsWith("***")) {
(importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
}
}
const restoredSettings = normalizeSettings(importedSettings);
this.settings = restoredSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
// Full stop including extraction abort — the old session is being replaced,
// so no extraction tasks from it should keep running.
this.manager.stop();
this.manager.abortAllPostProcessing();
// Cancel any deferred persist timer and queued async writes so the old
// in-memory session does not overwrite the restored session file on disk.
this.manager.clearPersistTimer();
cancelPendingAsyncSaves();
const restoredSession = normalizeLoadedSessionTransientFields(
normalizeLoadedSession(parsed.session)
);
saveSession(this.storagePaths, restoredSession);
// Prevent prepareForShutdown from overwriting the restored session file
// with the old in-memory session when the app quits after backup restore.
this.manager.skipShutdownPersist = true;
// Block all persistence (including persistSoon from any IPC operations
// the user might trigger before restarting) to protect the restored backup.
this.manager.blockAllPersistence = true;
return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
}
public getSessionLogPath(): string | null {
return getSessionLogPath();
}
public shutdown(): void {
stopDebugServer();
abortActiveUpdateDownload();
this.manager.prepareForShutdown();
this.megaWebFallback.dispose();
shutdownSessionLog();
logger.info("App beendet");
}
public getHistory(): HistoryEntry[] {
return loadHistory(this.storagePaths);
}
public clearHistory(): void {
clearHistory(this.storagePaths);
}
public setPackagePriority(packageId: string, priority: PackagePriority): void {
this.manager.setPackagePriority(packageId, priority);
}
public skipItems(itemIds: string[]): void {
this.manager.skipItems(itemIds);
}
public resetItems(itemIds: string[]): void {
this.manager.resetItems(itemIds);
}
public removeHistoryEntry(entryId: string): void {
removeHistoryEntry(this.storagePaths, entryId);
}
public addToHistory(entry: HistoryEntry): void {
addHistoryEntry(this.storagePaths, entry);
}
}


@ -1,66 +0,0 @@
import crypto from "node:crypto";
export const SENSITIVE_KEYS = [
"token",
"megaLogin",
"megaPassword",
"bestToken",
"allDebridToken",
"archivePasswordList"
] as const;
export type SensitiveKey = (typeof SENSITIVE_KEYS)[number];
export interface EncryptedCredentials {
salt: string;
iv: string;
tag: string;
data: string;
}
const PBKDF2_ITERATIONS = 100_000;
const KEY_LENGTH = 32; // 256 bit
const IV_LENGTH = 12; // 96 bit for GCM
const SALT_LENGTH = 16;
function deriveKey(username: string, salt: Buffer): Buffer {
return crypto.pbkdf2Sync(username, salt, PBKDF2_ITERATIONS, KEY_LENGTH, "sha256");
}
export function encryptCredentials(
fields: Record<string, string>,
username: string
): EncryptedCredentials {
const salt = crypto.randomBytes(SALT_LENGTH);
const iv = crypto.randomBytes(IV_LENGTH);
const key = deriveKey(username, salt);
const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
const plaintext = JSON.stringify(fields);
const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();
return {
salt: salt.toString("hex"),
iv: iv.toString("hex"),
tag: tag.toString("hex"),
data: encrypted.toString("hex")
};
}
export function decryptCredentials(
encrypted: EncryptedCredentials,
username: string
): Record<string, string> {
const salt = Buffer.from(encrypted.salt, "hex");
const iv = Buffer.from(encrypted.iv, "hex");
const tag = Buffer.from(encrypted.tag, "hex");
const data = Buffer.from(encrypted.data, "hex");
const key = deriveKey(username, salt);
const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(data), decipher.final()]);
return JSON.parse(decrypted.toString("utf8")) as Record<string, string>;
}


@ -88,10 +88,8 @@ export async function cleanupCancelledPackageArtifactsAsync(packageDir: string):
return removed;
}
export async function removeDownloadLinkArtifacts(extractDir: string): Promise<number> {
try {
await fs.promises.access(extractDir);
} catch {
export function removeDownloadLinkArtifacts(extractDir: string): number {
if (!fs.existsSync(extractDir)) {
return 0;
}
let removed = 0;
@@ -99,7 +97,7 @@ export async function removeDownloadLinkArtifacts(extractDir: string): Promise<n
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
@@ -116,9 +114,9 @@ export async function removeDownloadLinkArtifacts(extractDir: string): Promise<n
if (!shouldDelete && [".txt", ".html", ".htm", ".nfo"].includes(ext)) {
if (/[._\- ](links?|downloads?|urls?|dlc)([._\- ]|$)/i.test(name)) {
try {
const stat = await fs.promises.stat(full);
const stat = fs.statSync(full);
if (stat.size <= MAX_LINK_ARTIFACT_BYTES) {
const text = await fs.promises.readFile(full, "utf8");
const text = fs.readFileSync(full, "utf8");
shouldDelete = /https?:\/\//i.test(text);
}
} catch {
@@ -129,7 +127,7 @@ export async function removeDownloadLinkArtifacts(extractDir: string): Promise<n
if (shouldDelete) {
try {
await fs.promises.rm(full, { force: true });
fs.rmSync(full, { force: true });
removed += 1;
} catch {
// ignore
@@ -140,10 +138,8 @@ export async function removeDownloadLinkArtifacts(extractDir: string): Promise<n
return removed;
}
export async function removeSampleArtifacts(extractDir: string): Promise<{ files: number; dirs: number }> {
try {
await fs.promises.access(extractDir);
} catch {
export function removeSampleArtifacts(extractDir: string): { files: number; dirs: number } {
if (!fs.existsSync(extractDir)) {
return { files: 0, dirs: 0 };
}
@@ -152,14 +148,14 @@ export async function removeSampleArtifacts(extractDir: string): Promise<{ files
const sampleDirs: string[] = [];
const stack = [extractDir];
const countFilesRecursive = async (rootDir: string): Promise<number> => {
const countFilesRecursive = (rootDir: string): number => {
let count = 0;
const dirs = [rootDir];
while (dirs.length > 0) {
const current = dirs.pop() as string;
let entries: fs.Dirent[] = [];
try {
entries = await fs.promises.readdir(current, { withFileTypes: true });
entries = fs.readdirSync(current, { withFileTypes: true });
} catch {
continue;
}
@@ -167,7 +163,7 @@ export async function removeSampleArtifacts(extractDir: string): Promise<{ files
const full = path.join(current, entry.name);
if (entry.isDirectory()) {
try {
const stat = await fs.promises.lstat(full);
const stat = fs.lstatSync(full);
if (stat.isSymbolicLink()) {
continue;
}
@@ -186,7 +182,7 @@ export async function removeSampleArtifacts(extractDir: string): Promise<{ files
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() || entry.isSymbolicLink()) {
@@ -210,7 +206,7 @@ export async function removeSampleArtifacts(extractDir: string): Promise<{ files
if (isSampleVideo) {
try {
await fs.promises.rm(full, { force: true });
fs.rmSync(full, { force: true });
removedFiles += 1;
} catch {
// ignore
@@ -222,14 +218,14 @@ export async function removeSampleArtifacts(extractDir: string): Promise<{ files
sampleDirs.sort((a, b) => b.length - a.length);
for (const dir of sampleDirs) {
try {
const stat = await fs.promises.lstat(dir);
const stat = fs.lstatSync(dir);
if (stat.isSymbolicLink()) {
await fs.promises.rm(dir, { force: true });
fs.rmSync(dir, { force: true });
removedDirs += 1;
continue;
}
const filesInDir = await countFilesRecursive(dir);
await fs.promises.rm(dir, { recursive: true, force: true });
const filesInDir = countFilesRecursive(dir);
fs.rmSync(dir, { recursive: true, force: true });
removedFiles += filesInDir;
removedDirs += 1;
} catch {

View File

@@ -16,26 +16,20 @@ export const DLC_AES_IV = Buffer.from("9bc24cb995cb8db3", "utf8");
export const REQUEST_RETRIES = 3;
export const CHUNK_SIZE = 512 * 1024;
export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long
export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
export const LINK_ARTIFACT_EXTENSIONS = new Set([".url", ".webloc", ".dlc", ".rsdf", ".ccf"]);
export const SAMPLE_TOKEN_RE = /(^|[._\-\s])sample([._\-\s]|$)/i;
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz", ".rev"]);
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz"]);
export const RAR_SPLIT_RE = /\.r\d{2,3}$/i;
export const MAX_MANIFEST_FILE_BYTES = 5 * 1024 * 1024;
export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 1;
export const SPEED_WINDOW_SECONDS = 3;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";
export const DEFAULT_UPDATE_REPO = "Sucukdeluxe/real-debrid-downloader";
export function defaultSettings(): AppSettings {
const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
@@ -45,9 +39,6 @@ export function defaultSettings(): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: "",
archivePasswordList: "",
rememberToken: true,
providerPrimary: "realdebrid",
@@ -73,7 +64,6 @@ export function defaultSettings(): AppSettings {
reconnectWaitSeconds: 45,
completedCleanupPolicy: "never",
maxParallel: 4,
maxParallelExtract: 2,
retryLimit: 0,
speedLimitEnabled: false,
speedLimitKbps: 0,
@@ -83,13 +73,6 @@ export function defaultSettings(): AppSettings {
clipboardWatch: false,
minimizeToTray: false,
theme: "dark" as const,
collapseNewPackages: true,
autoSkipExtracted: false,
confirmDeleteSelection: true,
totalDownloadedAllTime: 0,
bandwidthSchedules: [],
columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
extractCpuPriority: "high",
autoExtractWhenStopped: true
bandwidthSchedules: []
};
}
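The `SPEED_WINDOW_SECONDS` change above adjusts the smoothing window of the speed display: a short window reacts quickly to stalls, a longer one smooths out jitter. A hypothetical sliding-window meter illustrating that trade-off (not the app's actual implementation):

```typescript
// Hypothetical sliding-window speed meter: only bytes reported within the
// last windowSeconds count toward the displayed rate.
interface Sample { t: number; bytes: number }

class SpeedMeter {
  private samples: Sample[] = [];
  public constructor(private windowSeconds: number) {}

  public add(t: number, bytes: number): void {
    this.samples.push({ t, bytes });
    // Drop samples that have fallen out of the window.
    while (this.samples.length > 0 && this.samples[0].t < t - this.windowSeconds) {
      this.samples.shift();
    }
  }

  public bytesPerSecond(now: number): number {
    const total = this.samples
      .filter((s) => s.t >= now - this.windowSeconds)
      .reduce((sum, s) => sum + s.bytes, 0);
    return total / this.windowSeconds;
  }
}

const meter = new SpeedMeter(3);
meter.add(0, 1024);
meter.add(1, 2048);
meter.add(2, 1024);
console.log(meter.bytesPerSecond(2)); // (1024 + 2048 + 1024) / 3 bytes/s
```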

View File

@@ -164,7 +164,7 @@ async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]>
const dlcData = content.slice(0, -88);
const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
const rcResponse = await fetch(rcUrl, { method: "GET" });
if (!rcResponse.ok) {
return [];
}
@@ -217,8 +216,7 @@ async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<s
const response = await fetch(DCRYPT_UPLOAD_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
body: form
});
if (response.status === 413) {
return null;
@@ -236,8 +235,7 @@ async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {
const response = await fetch(DCRYPT_PASTE_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
body: form
});
if (response.status === 413) {
return null;

View File

@@ -11,16 +11,11 @@ const RAPIDGATOR_SCAN_MAX_BYTES = 512 * 1024;
const BEST_DEBRID_API_BASE = "https://bestdebrid.com/api/v1";
const ALL_DEBRID_API_BASE = "https://api.alldebrid.com/v4";
const ONEFICHIER_API_BASE = "https://api.1fichier.com/v1";
const ONEFICHIER_URL_RE = /^https?:\/\/(?:www\.)?(?:1fichier\.com|alterupload\.com|cjoint\.net|desfichiers\.com|dfichiers\.com|megadl\.fr|mesfichiers\.org|piecejointe\.net|pjointe\.com|tenvoi\.com|dl4free\.com)\/\?([a-z0-9]{5,20})$/i;
const PROVIDER_LABELS: Record<DebridProvider, string> = {
realdebrid: "Real-Debrid",
megadebrid: "Mega-Debrid",
bestdebrid: "BestDebrid",
alldebrid: "AllDebrid",
ddownload: "DDownload",
onefichier: "1Fichier"
alldebrid: "AllDebrid"
};
interface ProviderUnrestrictedLink extends UnrestrictedLink {
@@ -28,7 +23,7 @@ interface ProviderUnrestrictedLink extends UnrestrictedLink {
providerLabel: string;
}
export type MegaWebUnrestrictor = (link: string, signal?: AbortSignal) => Promise<UnrestrictedLink | null>;
export type MegaWebUnrestrictor = (link: string) => Promise<UnrestrictedLink | null>;
interface DebridServiceOptions {
megaWebUnrestrict?: MegaWebUnrestrictor;
@@ -231,9 +226,7 @@ function isRapidgatorLink(link: string): boolean {
return hostname === "rapidgator.net"
|| hostname.endsWith(".rapidgator.net")
|| hostname === "rg.to"
|| hostname.endsWith(".rg.to")
|| hostname === "rapidgator.asia"
|| hostname.endsWith(".rapidgator.asia");
|| hostname.endsWith(".rg.to");
} catch {
return false;
}
@@ -322,7 +315,7 @@ async function runWithConcurrency<T>(items: T[], concurrency: number, worker: (i
let index = 0;
let firstError: unknown = null;
const next = (): T | undefined => {
if (firstError || index >= items.length) {
if (index >= items.length) {
return undefined;
}
const item = items[index];
@@ -422,7 +415,6 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES + 2) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
@@ -438,11 +430,9 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
&& !contentType.includes("text/plain")
&& !contentType.includes("text/xml")
&& !contentType.includes("application/xml")) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
if (!contentType && Number.isFinite(contentLength) && contentLength > RAPIDGATOR_SCAN_MAX_BYTES) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
@@ -454,7 +444,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
return "";
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (/aborted/i.test(errorText)) {
throw error;
}
if (attempt >= REQUEST_RETRIES + 2 || !isRetryableErrorText(errorText)) {
@@ -470,138 +460,6 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
return "";
}
export interface RapidgatorCheckResult {
online: boolean;
fileName: string;
fileSize: string | null;
}
const RG_FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;
const RG_FILE_NOT_FOUND_RE = />\s*404\s*File not found/i;
const RG_FILESIZE_RE = /File\s*size:\s*<strong>([^<>"]+)<\/strong>/i;
export async function checkRapidgatorOnline(
link: string,
signal?: AbortSignal
): Promise<RapidgatorCheckResult | null> {
if (!isRapidgatorLink(link)) {
return null;
}
const fileIdMatch = link.match(RG_FILE_ID_RE);
if (!fileIdMatch) {
return null;
}
const fileId = fileIdMatch[1];
const headers = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36",
Accept: "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
"Accept-Language": "en-US,en;q=0.9,de;q=0.8"
};
// Fast path: HEAD request (no body download, much faster)
for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
const response = await fetch(link, {
method: "HEAD",
redirect: "follow",
headers,
signal: withTimeoutSignal(signal, 15000)
});
if (response.status === 404) {
return { online: false, fileName: "", fileSize: null };
}
if (response.ok) {
const finalUrl = response.url || link;
if (!finalUrl.includes(fileId)) {
return { online: false, fileName: "", fileSize: null };
}
// HEAD 200 + URL still contains file ID → online
const fileName = filenameFromRapidgatorUrlPath(link);
return { online: true, fileName, fileSize: null };
}
// Non-OK, non-404: retry or give up
if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
// HEAD inconclusive — fall through to GET
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
break; // fall through to GET
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
// Slow path: GET request (downloads HTML, more thorough)
for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
const response = await fetch(link, {
method: "GET",
redirect: "follow",
headers,
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (response.status === 404) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
return null;
}
const finalUrl = response.url || link;
if (!finalUrl.includes(fileId)) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
const html = await readResponseTextLimited(response, RAPIDGATOR_SCAN_MAX_BYTES, signal);
if (RG_FILE_NOT_FOUND_RE.test(html)) {
return { online: false, fileName: "", fileSize: null };
}
const fileName = extractRapidgatorFilenameFromHtml(html) || filenameFromRapidgatorUrlPath(link);
const sizeMatch = html.match(RG_FILESIZE_RE);
const fileSize = sizeMatch ? sizeMatch[1].trim() : null;
return { online: true, fileName, fileSize };
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
return null;
}
}
if (attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelay(attempt), signal);
}
}
return null;
}
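The `withTimeoutSignal` helper these requests rely on is defined elsewhere in the repo; a plausible sketch of its behavior, under the assumption that the returned signal should abort when either the caller's signal fires or the timeout elapses:

```typescript
// Hypothetical sketch of withTimeoutSignal (not the repo's actual code):
// combines an optional caller AbortSignal with a hard timeout.
function withTimeoutSignal(outer: AbortSignal | undefined, timeoutMs: number): AbortSignal {
  const ctl = new AbortController();
  const timer = setTimeout(() => ctl.abort(new Error("timeout")), timeoutMs);
  if (outer) {
    const onAbort = (): void => {
      clearTimeout(timer); // the timeout is moot once the caller aborts
      ctl.abort(outer.reason);
    };
    if (outer.aborted) {
      onAbort();
    } else {
      outer.addEventListener("abort", onAbort, { once: true });
    }
  }
  return ctl.signal;
}

const caller = new AbortController();
const combined = withTimeoutSignal(caller.signal, 60_000);
caller.abort(new Error("user cancel"));
console.log(combined.aborted); // true
```

Propagating `outer.reason` preserves the distinction the surrounding code makes between user-initiated aborts and timeouts.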
function buildBestDebridRequests(link: string, token: string): BestDebridRequest[] {
const linkParam = encodeURIComponent(link);
const safeToken = String(token || "").trim();
@@ -630,7 +488,7 @@ class MegaDebridClient {
if (signal?.aborted) {
throw new Error("aborted:debrid");
}
const web = await this.megaWebUnrestrict(link, signal).catch((error) => {
const web = await this.megaWebUnrestrict(link).catch((error) => {
lastError = compactErrorText(error);
return null;
});
@@ -645,17 +503,13 @@ class MegaDebridClient {
throw new Error("Mega-Web Antwort ohne Download-Link");
}
if (!lastError) {
lastError = "Mega-Web Antwort leer";
}
// Don't retry permanent hoster errors (dead link, file removed, etc.)
if (/permanent ungültig|hosternotavailable|file.?not.?found|file.?unavailable|link.?is.?dead/i.test(lastError)) {
break;
lastError = web ? "Mega-Web Antwort ohne Download-Link" : "Mega-Web Antwort leer";
}
if (attempt < REQUEST_RETRIES) {
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "Mega-Web Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "Mega-Web Unrestrict fehlgeschlagen");
}
}
@@ -674,11 +528,7 @@ class BestDebridClient {
try {
return await this.tryRequest(request, link, signal);
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
lastError = errorText;
lastError = compactErrorText(error);
}
}
@@ -743,7 +593,7 @@ class BestDebridClient {
throw new Error("BestDebrid Antwort ohne Download-Link");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -837,7 +687,7 @@ class AllDebridClient {
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
@@ -949,7 +799,7 @@ class AllDebridClient {
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -959,257 +809,7 @@ class AllDebridClient {
}
}
throw new Error(String(lastError || "AllDebrid Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
// ── 1Fichier Client ──
class OneFichierClient {
private apiKey: string;
public constructor(apiKey: string) {
this.apiKey = apiKey;
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
if (!ONEFICHIER_URL_RE.test(link)) {
throw new Error("Kein 1Fichier-Link");
}
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
if (signal?.aborted) throw new Error("aborted:debrid");
try {
const res = await fetch(`${ONEFICHIER_API_BASE}/download/get_token.cgi`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${this.apiKey}`
},
body: JSON.stringify({ url: link, pretty: 1 }),
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const json = await res.json() as Record<string, unknown>;
if (json.status === "KO" || json.error) {
const msg = String(json.message || json.error || "Unbekannter 1Fichier-Fehler");
throw new Error(msg);
}
const directUrl = String(json.url || "");
if (!directUrl) {
throw new Error("1Fichier: Keine Download-URL in Antwort");
}
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
throw error;
}
if (attempt < REQUEST_RETRIES) {
await sleep(retryDelay(attempt), signal);
}
}
}
throw new Error(`1Fichier-Unrestrict fehlgeschlagen: ${lastError}`);
}
}
const DDOWNLOAD_URL_RE = /^https?:\/\/(?:www\.)?(?:ddownload\.com|ddl\.to)\/([a-z0-9]+)/i;
const DDOWNLOAD_WEB_BASE = "https://ddownload.com";
const DDOWNLOAD_WEB_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36";
class DdownloadClient {
private login: string;
private password: string;
private cookies: string = "";
public constructor(login: string, password: string) {
this.login = login;
this.password = password;
}
private async webLogin(signal?: AbortSignal): Promise<void> {
// Step 1: GET login page to extract form token
const loginPageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/login.html`, {
headers: { "User-Agent": DDOWNLOAD_WEB_UA },
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const loginPageHtml = await loginPageRes.text();
const tokenMatch = loginPageHtml.match(/name="token" value="([^"]+)"/);
const pageCookies = (loginPageRes.headers.getSetCookie?.() || []).map((c: string) => c.split(";")[0]).join("; ");
// Step 2: POST login
const body = new URLSearchParams({
op: "login",
token: tokenMatch?.[1] || "",
rand: "",
redirect: "",
login: this.login,
password: this.password
});
const loginRes = await fetch(`${DDOWNLOAD_WEB_BASE}/`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
...(pageCookies ? { Cookie: pageCookies } : {})
},
body: body.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Drain body
try { await loginRes.text(); } catch { /* ignore */ }
const setCookies = loginRes.headers.getSetCookie?.() || [];
const xfss = setCookies.find((c: string) => c.startsWith("xfss="));
const loginCookie = setCookies.find((c: string) => c.startsWith("login="));
if (!xfss) {
throw new Error("DDownload Login fehlgeschlagen (kein Session-Cookie)");
}
this.cookies = [loginCookie, xfss].filter(Boolean).map((c: string) => c.split(";")[0]).join("; ");
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
const match = link.match(DDOWNLOAD_URL_RE);
if (!match) {
throw new Error("Kein DDownload-Link");
}
const fileCode = match[1];
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
// Login if no session yet
if (!this.cookies) {
await this.webLogin(signal);
}
// Step 1: GET file page to extract form fields
const filePageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
Cookie: this.cookies
},
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Premium with direct downloads enabled → redirect immediately
if (filePageRes.status >= 300 && filePageRes.status < 400) {
const directUrl = filePageRes.headers.get("location") || "";
try { await filePageRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const html = await filePageRes.text();
// Check for file not found
if (/File Not Found|file was removed|file was banned/i.test(html)) {
throw new Error("DDownload: Datei nicht gefunden");
}
// Extract form fields
const idVal = html.match(/name="id" value="([^"]+)"/)?.[1] || fileCode;
const randVal = html.match(/name="rand" value="([^"]+)"/)?.[1] || "";
const fileNameMatch = html.match(/class="file-info-name"[^>]*>([^<]+)</);
const fileName = fileNameMatch?.[1]?.trim() || filenameFromUrl(link);
// Step 2: POST download2 for premium download
const dlBody = new URLSearchParams({
op: "download2",
id: idVal,
rand: randVal,
referer: "",
method_premium: "1",
adblock_detected: "0"
});
const dlRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
Cookie: this.cookies,
Referer: `${DDOWNLOAD_WEB_BASE}/${fileCode}`
},
body: dlBody.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (dlRes.status >= 300 && dlRes.status < 400) {
const directUrl = dlRes.headers.get("location") || "";
try { await dlRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: fileName || filenameFromUrl(directUrl),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const dlHtml = await dlRes.text();
// Try to find direct URL in response HTML
const directMatch = dlHtml.match(/https?:\/\/[a-z0-9]+\.(?:dstorage\.org|ddownload\.com|ddl\.to|ucdn\.to)[^\s"'<>]+/i);
if (directMatch) {
return {
fileName,
directUrl: directMatch[0],
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
// Check for error messages
const errMatch = dlHtml.match(/class="err"[^>]*>([^<]+)</i);
if (errMatch) {
throw new Error(`DDownload: ${errMatch[1].trim()}`);
}
throw new Error("DDownload: Kein Download-Link erhalten");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
// Re-login on auth errors
if (/login|session|cookie/i.test(lastError)) {
this.cookies = "";
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
break;
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "DDownload Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "AllDebrid Unrestrict fehlgeschlagen");
}
}
@@ -1218,9 +818,6 @@ export class DebridService {
private options: DebridServiceOptions;
private cachedDdownloadClient: DdownloadClient | null = null;
private cachedDdownloadKey = "";
public constructor(settings: AppSettings, options: DebridServiceOptions = {}) {
this.settings = cloneSettings(settings);
this.options = options;
@@ -1230,16 +827,6 @@ export class DebridService {
this.settings = cloneSettings(next);
}
private getDdownloadClient(login: string, password: string): DdownloadClient {
const key = `${login}\0${password}`;
if (this.cachedDdownloadClient && this.cachedDdownloadKey === key) {
return this.cachedDdownloadClient;
}
this.cachedDdownloadClient = new DdownloadClient(login, password);
this.cachedDdownloadKey = key;
return this.cachedDdownloadClient;
}
public async resolveFilenames(
links: string[],
onResolved?: (link: string, fileName: string) => void,
@@ -1274,7 +861,7 @@ export class DebridService {
}
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
// ignore and continue with host page fallback
@@ -1292,46 +879,6 @@ export class DebridService {
public async unrestrictLink(link: string, signal?: AbortSignal, settingsSnapshot?: AppSettings): Promise<ProviderUnrestrictedLink> {
const settings = settingsSnapshot ? cloneSettings(settingsSnapshot) : cloneSettings(this.settings);
// 1Fichier is a direct file hoster. If the link is a 1fichier.com URL
// and the API key is configured, use 1Fichier directly before debrid providers.
if (ONEFICHIER_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "onefichier")) {
try {
const result = await this.unrestrictViaProvider(settings, "onefichier", link, signal);
return {
...result,
provider: "onefichier",
providerLabel: PROVIDER_LABELS["onefichier"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain
}
}
// DDownload is a direct file hoster, not a debrid service.
// If the link is a ddownload.com/ddl.to URL and the account is configured,
// use DDownload directly before trying any debrid providers.
if (DDOWNLOAD_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "ddownload")) {
try {
const result = await this.unrestrictViaProvider(settings, "ddownload", link, signal);
return {
...result,
provider: "ddownload",
providerLabel: PROVIDER_LABELS["ddownload"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain (debrid services may also support ddownload links)
}
}
const order = toProviderOrder(
settings.providerPrimary,
settings.providerSecondary,
@@ -1359,11 +906,7 @@ export class DebridService {
providerLabel: PROVIDER_LABELS[primary]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${errorText}`);
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${compactErrorText(error)}`);
}
}
@@ -1393,7 +936,7 @@ export class DebridService {
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
attempts.push(`${PROVIDER_LABELS[provider]}: ${compactErrorText(error)}`);
@@ -1417,12 +960,6 @@ export class DebridService {
if (provider === "alldebrid") {
return Boolean(settings.allDebridToken.trim());
}
if (provider === "ddownload") {
return Boolean(settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim());
}
if (provider === "onefichier") {
return Boolean(settings.oneFichierApiKey.trim());
}
return Boolean(settings.bestToken.trim());
}
@@ -1436,12 +973,6 @@ export class DebridService {
if (provider === "alldebrid") {
return new AllDebridClient(settings.allDebridToken).unrestrictLink(link, signal);
}
if (provider === "ddownload") {
return this.getDdownloadClient(settings.ddownloadLogin, settings.ddownloadPassword).unrestrictLink(link, signal);
}
if (provider === "onefichier") {
return new OneFichierClient(settings.oneFichierApiKey).unrestrictLink(link, signal);
}
return new BestDebridClient(settings.bestToken).unrestrictLink(link, signal);
}
}

View File

@@ -1,279 +0,0 @@
import http from "node:http";
import fs from "node:fs";
import path from "node:path";
import { logger, getLogFilePath } from "./logger";
import type { DownloadManager } from "./download-manager";
const DEFAULT_PORT = 9868;
const MAX_LOG_LINES = 10000;
let server: http.Server | null = null;
let manager: DownloadManager | null = null;
let authToken = "";
function loadToken(baseDir: string): string {
const tokenPath = path.join(baseDir, "debug_token.txt");
try {
return fs.readFileSync(tokenPath, "utf8").trim();
} catch {
return "";
}
}
function getPort(baseDir: string): number {
const portPath = path.join(baseDir, "debug_port.txt");
try {
const n = Number(fs.readFileSync(portPath, "utf8").trim());
if (Number.isFinite(n) && n >= 1024 && n <= 65535) {
return n;
}
} catch {
// ignore
}
return DEFAULT_PORT;
}
function checkAuth(req: http.IncomingMessage): boolean {
if (!authToken) {
return false;
}
const header = req.headers.authorization || "";
if (header === `Bearer ${authToken}`) {
return true;
}
const url = new URL(req.url || "/", "http://localhost");
return url.searchParams.get("token") === authToken;
}
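`checkAuth` accepts the token in either of two places; a standalone restatement of that contract as a pure predicate (the `isAuthorized` helper is illustrative, not part of the app):

```typescript
// The debug server accepts the token either as an Authorization header
// ("Bearer <token>") or as a ?token= query parameter; an empty configured
// token means every request is refused.
function isAuthorized(authHeader: string, requestUrl: string, token: string): boolean {
  if (!token) {
    return false;
  }
  if (authHeader === `Bearer ${token}`) {
    return true;
  }
  const url = new URL(requestUrl, "http://localhost");
  return url.searchParams.get("token") === token;
}

console.log(isAuthorized("Bearer s3cret", "/health", "s3cret")); // true
console.log(isAuthorized("", "/log?lines=50&token=s3cret", "s3cret")); // true
console.log(isAuthorized("", "/status", "")); // false
```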
function jsonResponse(res: http.ServerResponse, status: number, data: unknown): void {
const body = JSON.stringify(data, null, 2);
res.writeHead(status, {
"Content-Type": "application/json; charset=utf-8",
"Access-Control-Allow-Origin": "*",
"Cache-Control": "no-cache"
});
res.end(body);
}
function readLogTail(lines: number): string[] {
const logPath = getLogFilePath();
try {
const content = fs.readFileSync(logPath, "utf8");
const allLines = content.split("\n").filter((l) => l.trim().length > 0);
return allLines.slice(-Math.min(lines, MAX_LOG_LINES));
} catch {
return ["(Log-Datei nicht lesbar)"];
}
}
function handleRequest(req: http.IncomingMessage, res: http.ServerResponse): void {
if (req.method === "OPTIONS") {
res.writeHead(204, {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Headers": "Authorization"
});
res.end();
return;
}
if (!checkAuth(req)) {
jsonResponse(res, 401, { error: "Unauthorized" });
return;
}
const url = new URL(req.url || "/", "http://localhost");
const pathname = url.pathname;
if (pathname === "/health") {
jsonResponse(res, 200, {
status: "ok",
uptime: Math.floor(process.uptime()),
memoryMB: Math.round(process.memoryUsage().rss / 1024 / 1024)
});
return;
}
if (pathname === "/log") {
const count = Math.min(Number(url.searchParams.get("lines") || "100"), MAX_LOG_LINES);
const grep = url.searchParams.get("grep") || "";
let lines = readLogTail(count);
if (grep) {
const pattern = grep.toLowerCase();
lines = lines.filter((l) => l.toLowerCase().includes(pattern));
}
jsonResponse(res, 200, { lines, count: lines.length });
return;
}
if (pathname === "/status") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const items = Object.values(snapshot.session.items);
const packages = Object.values(snapshot.session.packages);
const byStatus: Record<string, number> = {};
for (const item of items) {
byStatus[item.status] = (byStatus[item.status] || 0) + 1;
}
const activeItems = items
.filter((i) => i.status === "downloading" || i.status === "validating")
.map((i) => ({
id: i.id,
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}));
const failedItems = items
.filter((i) => i.status === "failed")
.map((i) => ({
fileName: i.fileName,
lastError: i.lastError,
retries: i.retries,
provider: i.provider
}));
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
speed: snapshot.speedText,
eta: snapshot.etaText,
itemCounts: byStatus,
totalItems: items.length,
packages: packages.map((p) => ({
name: p.name,
status: p.status,
items: p.itemIds.length
})),
activeItems,
failedItems: failedItems.length > 0 ? failedItems : undefined
});
return;
}
if (pathname === "/items") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const filter = url.searchParams.get("status");
const pkg = url.searchParams.get("package");
let items = Object.values(snapshot.session.items);
if (filter) {
items = items.filter((i) => i.status === filter);
}
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
items = items.filter((i) => ids.has(i.id));
}
}
jsonResponse(res, 200, {
count: items.length,
items: items.map((i) => ({
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}))
});
return;
}
if (pathname === "/session") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const pkg = url.searchParams.get("package");
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
const pkgItems = Object.values(snapshot.session.items)
.filter((i) => ids.has(i.id));
jsonResponse(res, 200, {
package: matchedPkg,
items: pkgItems
});
return;
}
}
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
packageCount: Object.keys(snapshot.session.packages).length,
itemCount: Object.keys(snapshot.session.items).length,
packages: Object.values(snapshot.session.packages).map((p) => ({
id: p.id,
name: p.name,
status: p.status,
items: p.itemIds.length
}))
});
return;
}
jsonResponse(res, 404, {
error: "Not found",
endpoints: [
"GET /health",
"GET /log?lines=100&grep=keyword",
"GET /status",
"GET /items?status=downloading&package=Bloodline",
"GET /session?package=Criminal"
]
});
}
export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
authToken = loadToken(baseDir);
if (!authToken) {
logger.info("Debug-Server: Kein Token in debug_token.txt, Server wird nicht gestartet");
return;
}
manager = mgr;
const port = getPort(baseDir);
server = http.createServer(handleRequest);
server.listen(port, "127.0.0.1", () => {
logger.info(`Debug-Server gestartet auf Port ${port}`);
});
server.on("error", (err) => {
logger.warn(`Debug-Server Fehler: ${String(err)}`);
server = null;
});
}
export function stopDebugServer(): void {
if (server) {
server.close();
server = null;
logger.info("Debug-Server gestoppt");
}
}
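Taken together, the endpoints above form a small read-only HTTP API. A minimal client sketch — the port `3030` and the token value are illustrative assumptions; the real port comes from `getPort(baseDir)` and the token from `debug_token.txt`:

```typescript
// Build a request against the local debug server defined above.
// Port 3030 is an assumed example; the app reads the real port via getPort().
function buildDebugRequest(
  endpoint: string,
  token: string,
  params: Record<string, string> = {}
): { url: string; headers: Record<string, string> } {
  const url = new URL(endpoint, "http://127.0.0.1:3030");
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  // Header auth; a ?token=... query parameter would also pass checkAuth().
  return { url: url.toString(), headers: { Authorization: `Bearer ${token}` } };
}

const req = buildDebugRequest("/log", "my-token", { lines: "50", grep: "error" });
// fetch(req.url, { headers: req.headers }) then res.json() yields { lines, count }.
```

The same shape works for `/status`, `/items` and `/session`, which differ only in their query parameters.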

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -8,19 +8,12 @@ const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
const rotateCheckAtByFile = new Map<string, number>();
type LogListener = (line: string) => void;
let logListener: LogListener | null = null;
let pendingLines: string[] = [];
let pendingChars = 0;
let flushTimer: NodeJS.Timeout | null = null;
let flushInFlight = false;
let exitHookAttached = false;
export function setLogListener(listener: LogListener | null): void {
logListener = listener;
}
export function configureLogger(baseDir: string): void {
logFilePath = path.join(baseDir, "rd_downloader.log");
const cwdLogPath = path.resolve(process.cwd(), "rd_downloader.log");
@@ -125,26 +118,6 @@ function rotateIfNeeded(filePath: string): void {
}
}
async function rotateIfNeededAsync(filePath: string): Promise<void> {
try {
const now = Date.now();
const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
if (now - lastRotateCheckAt < 60_000) {
return;
}
rotateCheckAtByFile.set(filePath, now);
const stat = await fs.promises.stat(filePath);
if (stat.size < LOG_MAX_FILE_BYTES) {
return;
}
const backup = `${filePath}.old`;
await fs.promises.rm(backup, { force: true }).catch(() => {});
await fs.promises.rename(filePath, backup);
} catch {
// ignore - file may not exist yet
}
}
async function flushAsync(): Promise<void> {
if (flushInFlight || pendingLines.length === 0) {
return;
@@ -155,11 +128,11 @@ async function flushAsync(): Promise<void> {
const chunk = linesSnapshot.join("");
try {
await rotateIfNeededAsync(logFilePath);
rotateIfNeeded(logFilePath);
const primary = await appendChunk(logFilePath, chunk);
let wroteAny = primary.ok;
if (fallbackLogFilePath) {
await rotateIfNeededAsync(fallbackLogFilePath);
rotateIfNeeded(fallbackLogFilePath);
const fallback = await appendChunk(fallbackLogFilePath, chunk);
wroteAny = wroteAny || fallback.ok;
if (!primary.ok && !fallback.ok) {
@@ -195,10 +168,6 @@ function write(level: "INFO" | "WARN" | "ERROR", message: string): void {
pendingLines.push(line);
pendingChars += line.length;
if (logListener) {
try { logListener(line); } catch { /* ignore */ }
}
while (pendingChars > LOG_BUFFER_LIMIT_CHARS && pendingLines.length > 1) {
const removed = pendingLines.shift();
if (!removed) {
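The rotation logic in this file checks the log size at most once per minute per file and swaps the log to a `.old` backup once it exceeds the cap. A self-contained sketch of that pattern (names are illustrative, not the file's exports):

```typescript
import fs from "node:fs";

const lastCheckAt = new Map<string, number>();

// Rotate filePath to filePath.old once it exceeds maxBytes.
// stat() is throttled to one check per file per minute, as in rotateIfNeededAsync.
async function rotateIfOversized(filePath: string, maxBytes: number): Promise<boolean> {
  const now = Date.now();
  if (now - (lastCheckAt.get(filePath) ?? 0) < 60_000) {
    return false; // checked recently, skip the stat call
  }
  lastCheckAt.set(filePath, now);
  try {
    const stat = await fs.promises.stat(filePath);
    if (stat.size < maxBytes) {
      return false;
    }
    const backup = `${filePath}.old`;
    await fs.promises.rm(backup, { force: true }).catch(() => {});
    await fs.promises.rename(filePath, backup);
    return true;
  } catch {
    return false; // file may not exist yet
  }
}
```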

View File

@@ -1,13 +1,11 @@
import fs from "node:fs";
import path from "node:path";
import { app, BrowserWindow, clipboard, dialog, ipcMain, IpcMainInvokeEvent, Menu, shell, Tray } from "electron";
import { AddLinksPayload, AppSettings, UpdateInstallProgress } from "../shared/types";
import { AppController } from "./app-controller";
import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";
/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
@@ -51,7 +49,6 @@ process.on("unhandledRejection", (reason) => {
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
@@ -68,7 +65,6 @@ function createWindow(): BrowserWindow {
minHeight: 760,
backgroundColor: "#070b14",
title: `${APP_NAME} - v${controller.getVersion()}`,
icon: path.join(app.getAppPath(), "assets", "app_icon.ico"),
webPreferences: {
contextIsolation: true,
nodeIntegration: false,
@@ -82,16 +78,13 @@ function createWindow(): BrowserWindow {
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu"
]
}
});
});
}
window.setMenuBarVisibility(false);
window.setAutoHideMenuBar(true);
if (isDevMode()) {
void window.loadURL("http://localhost:5173");
} else {
@@ -131,7 +124,7 @@ function createTray(): void {
const contextMenu = Menu.buildFromTemplate([
{ label: "Anzeigen", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray Start Fehler: ${String(err)}`)); } },
{ label: "Start", click: () => { controller.start(); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Beenden", click: () => { app.quit(); } }
@@ -189,12 +182,7 @@ function startClipboardWatcher(): void {
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
const text = normalizeClipboardText(clipboard.readText());
if (text === lastClipboardText || !text.trim()) {
return;
}
@@ -243,9 +231,9 @@ function registerIpcHandlers(): void {
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
updateQuitTimer = setTimeout(() => {
setTimeout(() => {
app.quit();
}, 2500);
}, 800);
}
return result;
});
@@ -295,14 +283,6 @@ function registerIpcHandlers(): void {
});
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
ipcMain.handle(IPC_CHANNELS.CANCEL_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
@@ -329,53 +309,7 @@ function registerIpcHandlers(): void {
validateString(packageId, "packageId");
return controller.togglePackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.RETRY_EXTRACTION, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.retryExtraction(packageId);
});
ipcMain.handle(IPC_CHANNELS.EXTRACT_NOW, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.extractNow(packageId);
});
ipcMain.handle(IPC_CHANNELS.RESET_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.resetPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority muss 'high', 'normal' oder 'low' sein");
}
return controller.setPackagePriority(packageId, priority);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
ipcMain.handle(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, (_event: IpcMainInvokeEvent, entryId: string) => {
validateString(entryId, "entryId");
return controller.removeHistoryEntry(entryId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, async () => {
const options = {
defaultPath: `rd-queue-export.json`,
filters: [{ name: "Queue Export", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportQueue();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, () => controller.exportQueue());
ipcMain.handle(IPC_CHANNELS.IMPORT_QUEUE, (_event: IpcMainInvokeEvent, json: string) => {
validateString(json, "json");
const bytes = Buffer.byteLength(json, "utf8");
@@ -409,64 +343,6 @@ function registerIpcHandlers(): void {
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
return result.canceled ? [] : result.filePaths;
});
ipcMain.handle(IPC_CHANNELS.GET_SESSION_STATS, () => controller.getSessionStats());
ipcMain.handle(IPC_CHANNELS.RESTART, () => {
app.relaunch();
app.quit();
});
ipcMain.handle(IPC_CHANNELS.QUIT, () => {
app.quit();
});
ipcMain.handle(IPC_CHANNELS.EXPORT_BACKUP, async () => {
const options = {
defaultPath: `mdd-backup-${new Date().toISOString().slice(0, 10)}.json`,
filters: [{ name: "Backup", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportBackup();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.OPEN_LOG, async () => {
const logPath = getLogFilePath();
await shell.openPath(logPath);
});
ipcMain.handle(IPC_CHANNELS.OPEN_SESSION_LOG, async () => {
const logPath = controller.getSessionLogPath();
if (logPath) {
await shell.openPath(logPath);
}
});
ipcMain.handle(IPC_CHANNELS.IMPORT_BACKUP, async () => {
const options = {
properties: ["openFile"] as Array<"openFile">,
filters: [
{ name: "Backup", extensions: ["json"] },
{ name: "Alle Dateien", extensions: ["*"] }
]
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
if (result.canceled || result.filePaths.length === 0) {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
controller.onState = (snapshot) => {
if (!mainWindow || mainWindow.isDestroyed()) {
@@ -487,7 +363,6 @@ app.on("second-instance", () => {
});
app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
@@ -500,9 +375,6 @@ app.whenReady().then(() => {
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});
app.on("window-all-closed", () => {
@@ -512,10 +384,8 @@ app.on("window-all-closed", () => {
});
app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {
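The `validateString`/`validateStringArray` helpers that guard the IPC handlers above are truncated in this diff; a plausible shape, shown here only as an assumed sketch of what such guards look like, not the file's actual code:

```typescript
// Assumed IPC input guards: reject anything that is not the expected shape
// before it reaches the controller.
function validateString(value: unknown, name: string): string {
  if (typeof value !== "string") {
    throw new Error(`${name} muss ein String sein`);
  }
  return value;
}

function validateStringArray(value: unknown, name: string): string[] {
  if (!Array.isArray(value) || value.some((v) => typeof v !== "string")) {
    throw new Error(`${name} muss ein String-Array sein`);
  }
  return value as string[];
}
```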

View File

@@ -42,45 +42,6 @@ function parseSetCookieFromHeaders(headers: Headers): string {
.join("; ");
}
const PERMANENT_HOSTER_ERRORS = [
"hosternotavailable",
"filenotfound",
"file_unavailable",
"file not found",
"link is dead",
"file has been removed",
"file has been deleted",
"file was deleted",
"file was removed",
"not available",
"file is no longer available"
];
function parsePageErrors(html: string): string[] {
const errors: string[] = [];
const errorRegex = /class=["'][^"']*\berror\b[^"']*["'][^>]*>([^<]+)</gi;
let m: RegExpExecArray | null;
while ((m = errorRegex.exec(html)) !== null) {
const text = m[1].replace(/^Fehler:\s*/i, "").trim();
if (text) {
errors.push(text);
}
}
return errors;
}
function isPermanentHosterError(errors: string[]): string | null {
for (const err of errors) {
const lower = err.toLowerCase();
for (const pattern of PERMANENT_HOSTER_ERRORS) {
if (lower.includes(pattern)) {
return err;
}
}
}
return null;
}
function parseCodes(html: string): CodeEntry[] {
const entries: CodeEntry[] = [];
const cardRegex = /<div[^>]*class=['"][^'"]*acp-box[^'"]*['"][^>]*>[\s\S]*?<\/div>/gi;
@@ -127,93 +88,6 @@ function parseDebridJson(text: string): { link: string; text: string } | null {
}
}
function abortError(): Error {
return new Error("aborted:mega-web");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
const timeoutSignal = AbortSignal.timeout(timeoutMs);
if (!signal) {
return timeoutSignal;
}
return AbortSignal.any([signal, timeoutSignal]);
}
function throwIfAborted(signal?: AbortSignal): void {
if (signal?.aborted) {
throw abortError();
}
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
if (!signal) {
await sleep(ms);
return;
}
if (signal.aborted) {
throw abortError();
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
signal.removeEventListener("abort", onAbort);
resolve();
}, Math.max(0, ms));
const onAbort = (): void => {
if (timer) {
clearTimeout(timer);
timer = null;
}
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
});
}
async function raceWithAbort<T>(promise: Promise<T>, signal?: AbortSignal): Promise<T> {
if (!signal) {
return promise;
}
if (signal.aborted) {
throw abortError();
}
return new Promise<T>((resolve, reject) => {
let settled = false;
const onAbort = (): void => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
promise.then((value) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
resolve(value);
}, (error) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(error);
});
});
}
export class MegaWebFallback {
private queue: Promise<unknown> = Promise.resolve();
@@ -227,24 +101,22 @@ export class MegaWebFallback {
this.getCredentials = getCredentials;
}
public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
public async unrestrict(link: string): Promise<UnrestrictedLink | null> {
return this.runExclusive(async () => {
throwIfAborted(overallSignal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}
if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, overallSignal);
await this.login(creds.login, creds.password);
}
const generated = await this.generate(link, overallSignal);
const generated = await this.generate(link);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
await this.login(creds.login, creds.password);
const retry = await this.generate(link);
if (!retry) {
return null;
}
@@ -262,32 +134,16 @@ export class MegaWebFallback {
fileSize: null,
retriesUsed: 0
};
}, overallSignal);
});
}
public invalidateSession(): void {
this.cookie = "";
this.cookieSetAt = 0;
}
private async runExclusive<T>(job: () => Promise<T>, signal?: AbortSignal): Promise<T> {
const queuedAt = Date.now();
const QUEUE_WAIT_TIMEOUT_MS = 90000;
const guardedJob = async (): Promise<T> => {
throwIfAborted(signal);
const waited = Date.now() - queuedAt;
if (waited > QUEUE_WAIT_TIMEOUT_MS) {
throw new Error(`Mega-Web Queue-Timeout (${Math.floor(waited / 1000)}s gewartet)`);
}
return job();
};
const run = this.queue.then(guardedJob, guardedJob);
private async runExclusive<T>(job: () => Promise<T>): Promise<T> {
const run = this.queue.then(job, job);
this.queue = run.then(() => undefined, () => undefined);
return raceWithAbort(run, signal);
return run;
}
private async login(login: string, password: string, signal?: AbortSignal): Promise<void> {
throwIfAborted(signal);
private async login(login: string, password: string): Promise<void> {
const response = await fetch(LOGIN_URL, {
method: "POST",
headers: {
@@ -300,7 +156,7 @@ export class MegaWebFallback {
remember: "on"
}),
redirect: "manual",
signal: withTimeoutSignal(signal, 30000)
signal: AbortSignal.timeout(30000)
});
const cookie = parseSetCookieFromHeaders(response.headers);
@@ -315,7 +171,7 @@ export class MegaWebFallback {
Cookie: cookie,
Referer: DEBRID_REFERER
},
signal: withTimeoutSignal(signal, 30000)
signal: AbortSignal.timeout(30000)
});
const verifyHtml = await verify.text();
const hasDebridForm = /id=["']debridForm["']/i.test(verifyHtml) || /name=["']links["']/i.test(verifyHtml);
@@ -327,8 +183,7 @@ export class MegaWebFallback {
this.cookieSetAt = Date.now();
}
private async generate(link: string, signal?: AbortSignal): Promise<{ directUrl: string; fileName: string } | null> {
throwIfAborted(signal);
private async generate(link: string): Promise<{ directUrl: string; fileName: string } | null> {
const page = await fetch(DEBRID_URL, {
method: "POST",
headers: {
@@ -342,25 +197,16 @@ export class MegaWebFallback {
password: "",
showLinks: "1"
}),
signal: withTimeoutSignal(signal, 30000)
signal: AbortSignal.timeout(30000)
});
const html = await page.text();
// Check for permanent hoster errors before looking for debrid codes
const pageErrors = parsePageErrors(html);
const permanentError = isPermanentHosterError(pageErrors);
if (permanentError) {
throw new Error(`Mega-Web: Link permanent ungültig (${permanentError})`);
}
const code = pickCode(parseCodes(html), link);
if (!code) {
return null;
}
for (let attempt = 1; attempt <= 60; attempt += 1) {
throwIfAborted(signal);
const res = await fetch(DEBRID_AJAX_URL, {
method: "POST",
headers: {
@@ -373,12 +219,12 @@ export class MegaWebFallback {
code,
autodl: "0"
}),
signal: withTimeoutSignal(signal, 15000)
signal: AbortSignal.timeout(15000)
});
const text = (await res.text()).trim();
if (text === "reload") {
await sleepWithSignal(650, signal);
await sleep(650);
continue;
}
if (text === "false") {
@@ -392,7 +238,7 @@ export class MegaWebFallback {
if (!parsed.link) {
if (/hoster does not respond correctly|could not be done for this moment/i.test(parsed.text || "")) {
await sleepWithSignal(1200, signal);
await sleep(1200);
continue;
}
return null;
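The abort plumbing in this file (`withTimeoutSignal`, `raceWithAbort`, `sleepWithSignal`) composes a caller-supplied signal with a hard timeout via `AbortSignal.any`, which needs Node 20.3+. A standalone sketch of the core composition:

```typescript
// Combine an optional caller signal with a hard timeout
// (the shape of withTimeoutSignal above). Requires Node 20.3+ for AbortSignal.any.
function combineWithTimeout(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
  const timeoutSignal = AbortSignal.timeout(timeoutMs);
  return signal ? AbortSignal.any([signal, timeoutSignal]) : timeoutSignal;
}

const controller = new AbortController();
controller.abort(); // caller cancels before the work starts
const combined = combineWithTimeout(controller.signal, 30_000);
// combined.aborted is already true: AbortSignal.any aborts immediately
// when any input signal is aborted.
```

Passing the combined signal to `fetch` gives each request both an individual timeout and cancellation from the outer operation.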

View File

@@ -8,7 +8,6 @@ export interface UnrestrictedLink {
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}
function shouldRetryStatus(status: number): boolean {
@@ -63,8 +62,7 @@ function isRetryableErrorText(text: string): boolean {
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout")
|| lower.includes("html statt json");
|| lower.includes("etimedout");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
@@ -79,11 +77,6 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
@@ -100,6 +93,10 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
reject(new Error("aborted"));
};
if (signal.aborted) {
onAbort();
return;
}
signal.addEventListener("abort", onAbort, { once: true });
});
}
@@ -168,15 +165,6 @@ export class RealDebridClient {
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}
const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
@@ -188,7 +176,7 @@
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -198,6 +186,6 @@
}
}
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "Unrestrict fehlgeschlagen");
}
}
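The `sleepWithSignal` hunks above move the aborted check so it runs before the timer is created, closing a race for already-aborted signals (notably with `ms = 0`). A standalone sketch of the corrected shape — an assumption of the intended behavior, not the file's exact code:

```typescript
// Abortable sleep: check the signal BEFORE arming the timer, so an
// already-aborted signal (including ms = 0) rejects instead of resolving.
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
  if (signal?.aborted) {
    throw new Error("aborted");
  }
  await new Promise<void>((resolve, reject) => {
    const timer = setTimeout(() => {
      signal?.removeEventListener("abort", onAbort);
      resolve();
    }, Math.max(0, ms));
    const onAbort = (): void => {
      clearTimeout(timer);
      signal?.removeEventListener("abort", onAbort);
      reject(new Error("aborted"));
    };
    signal?.addEventListener("abort", onAbort, { once: true });
  });
}
```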

View File

@@ -1,128 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { setLogListener } from "./logger";
const SESSION_LOG_FLUSH_INTERVAL_MS = 200;
let sessionLogPath: string | null = null;
let sessionLogsDir: string | null = null;
let pendingLines: string[] = [];
let flushTimer: NodeJS.Timeout | null = null;
function formatTimestamp(): string {
const now = new Date();
const y = now.getFullYear();
const mo = String(now.getMonth() + 1).padStart(2, "0");
const d = String(now.getDate()).padStart(2, "0");
const h = String(now.getHours()).padStart(2, "0");
const mi = String(now.getMinutes()).padStart(2, "0");
const s = String(now.getSeconds()).padStart(2, "0");
return `${y}-${mo}-${d}_${h}-${mi}-${s}`;
}
function flushPending(): void {
if (pendingLines.length === 0 || !sessionLogPath) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
try {
fs.appendFileSync(sessionLogPath, chunk, "utf8");
} catch {
// ignore write errors
}
}
function scheduleFlush(): void {
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
flushPending();
}, SESSION_LOG_FLUSH_INTERVAL_MS);
}
function appendToSessionLog(line: string): void {
if (!sessionLogPath) {
return;
}
pendingLines.push(line);
scheduleFlush();
}
async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<void> {
try {
const files = await fs.promises.readdir(dir);
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
for (const file of files) {
if (!file.startsWith("session_") || !file.endsWith(".txt")) {
continue;
}
const filePath = path.join(dir, file);
try {
const stat = await fs.promises.stat(filePath);
if (stat.mtimeMs < cutoff) {
await fs.promises.unlink(filePath);
}
} catch {
// ignore - file may be locked
}
}
} catch {
// ignore - dir may not exist
}
}
export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}
const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);
const isoTimestamp = new Date().toISOString();
try {
fs.writeFileSync(sessionLogPath, `=== Session gestartet: ${isoTimestamp} ===\n`, "utf8");
} catch {
sessionLogPath = null;
return;
}
setLogListener((line) => appendToSessionLog(line));
void cleanupOldSessionLogs(sessionLogsDir, 7);
}
export function getSessionLogPath(): string | null {
return sessionLogPath;
}
export function shutdownSessionLog(): void {
if (!sessionLogPath) {
return;
}
// Flush any pending lines
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
flushPending();
// Write closing line
const isoTimestamp = new Date().toISOString();
try {
fs.appendFileSync(sessionLogPath, `=== Session beendet: ${isoTimestamp} ===\n`, "utf8");
} catch {
// ignore
}
setLogListener(null);
sessionLogPath = null;
}

View File

@@ -1,24 +1,21 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, PackagePriority, SessionState } from "../shared/types";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, PackageEntry, SessionState } from "../shared/types";
import { defaultSettings } from "./constants";
import { logger } from "./logger";
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
const VALID_SPEED_MODES = new Set(["global", "per_download"]);
const VALID_THEMES = new Set(["dark", "light"]);
const VALID_EXTRACT_CPU_PRIORITIES = new Set(["high", "middle", "low"]);
const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
function asText(value: unknown): string {
return String(value ?? "").trim();
@@ -68,41 +65,6 @@ function normalizeAbsoluteDir(value: unknown, fallback: string): string {
return path.resolve(text);
}
const DEFAULT_COLUMN_ORDER = ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"];
const ALL_VALID_COLUMNS = new Set([...DEFAULT_COLUMN_ORDER, "added"]);
function normalizeColumnOrder(raw: unknown): string[] {
if (!Array.isArray(raw) || raw.length === 0) {
return [...DEFAULT_COLUMN_ORDER];
}
const valid = ALL_VALID_COLUMNS;
const seen = new Set<string>();
const result: string[] = [];
for (const col of raw) {
if (typeof col === "string" && valid.has(col) && !seen.has(col)) {
seen.add(col);
result.push(col);
}
}
// "name" is mandatory — ensure it's always present
if (!seen.has("name")) {
result.unshift("name");
}
return result;
}
const DEPRECATED_UPDATE_REPOS = new Set([
"sucukdeluxe/real-debrid-downloader"
]);
function migrateUpdateRepo(raw: string, fallback: string): string {
const trimmed = raw.trim();
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
return fallback;
}
return trimmed;
}
export function normalizeSettings(settings: AppSettings): AppSettings {
const defaults = defaultSettings();
const normalized: AppSettings = {
@@ -111,10 +73,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
megaPassword: asText(settings.megaPassword),
bestToken: asText(settings.bestToken),
allDebridToken: asText(settings.allDebridToken),
ddownloadLogin: asText(settings.ddownloadLogin),
ddownloadPassword: asText(settings.ddownloadPassword),
oneFichierApiKey: asText(settings.oneFichierApiKey),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n/g, "\n"),
rememberToken: Boolean(settings.rememberToken),
providerPrimary: settings.providerPrimary,
providerSecondary: settings.providerSecondary,
@@ -137,7 +96,6 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
autoResumeOnStart: Boolean(settings.autoResumeOnStart),
autoReconnect: Boolean(settings.autoReconnect),
maxParallel: clampNumber(settings.maxParallel, defaults.maxParallel, 1, 50),
maxParallelExtract: clampNumber(settings.maxParallelExtract, defaults.maxParallelExtract, 1, 8),
retryLimit: clampNumber(settings.retryLimit, defaults.retryLimit, 0, 99),
reconnectWaitSeconds: clampNumber(settings.reconnectWaitSeconds, defaults.reconnectWaitSeconds, 10, 600),
completedCleanupPolicy: settings.completedCleanupPolicy,
@@ -145,18 +103,11 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
speedLimitMode: settings.speedLimitMode,
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
updateRepo: asText(settings.updateRepo) || defaults.updateRepo,
clipboardWatch: Boolean(settings.clipboardWatch),
minimizeToTray: Boolean(settings.minimizeToTray),
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
autoSkipExtracted: settings.autoSkipExtracted !== undefined ? Boolean(settings.autoSkipExtracted) : defaults.autoSkipExtracted,
confirmDeleteSelection: settings.confirmDeleteSelection !== undefined ? Boolean(settings.confirmDeleteSelection) : defaults.confirmDeleteSelection,
totalDownloadedAllTime: typeof settings.totalDownloadedAllTime === "number" && settings.totalDownloadedAllTime >= 0 ? settings.totalDownloadedAllTime : defaults.totalDownloadedAllTime,
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
columnOrder: normalizeColumnOrder(settings.columnOrder),
extractCpuPriority: settings.extractCpuPriority,
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules)
};
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
@@ -186,9 +137,6 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
if (!VALID_SPEED_MODES.has(normalized.speedLimitMode)) {
normalized.speedLimitMode = defaults.speedLimitMode;
}
if (!VALID_EXTRACT_CPU_PRIORITIES.has(normalized.extractCpuPriority)) {
normalized.extractCpuPriority = defaults.extractCpuPriority;
}
return normalized;
}
@@ -204,9 +152,7 @@ function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: ""
archivePasswordList: ""
};
}
@@ -214,15 +160,13 @@ export interface StoragePaths {
baseDir: string;
configFile: string;
sessionFile: string;
historyFile: string;
}
export function createStoragePaths(baseDir: string): StoragePaths {
return {
baseDir,
configFile: path.join(baseDir, "rd_downloader_config.json"),
sessionFile: path.join(baseDir, "rd_session_state.json"),
historyFile: path.join(baseDir, "rd_history.json")
sessionFile: path.join(baseDir, "rd_session_state.json")
};
}
@@ -250,7 +194,7 @@ function readSettingsFile(filePath: string): AppSettings | null {
}
}
export function normalizeLoadedSession(raw: unknown): SessionState {
function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
@@ -276,8 +220,6 @@ export function normalizeLoadedSession(raw: unknown): SessionState {
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const providerRaw = asText(item.provider) as DebridProvider;
const onlineStatusRaw = asText(item.onlineStatus);
itemsById[id] = {
id,
packageId,
@@ -295,7 +237,6 @@ export function normalizeLoadedSession(raw: unknown): SessionState {
attempts: clampNumber(item.attempts, 0, 0, 10_000),
lastError: asText(item.lastError),
fullStatus: asText(item.fullStatus),
onlineStatus: VALID_ONLINE_STATUSES.has(onlineStatusRaw) ? onlineStatusRaw as "online" | "offline" | "checking" : undefined,
createdAt: clampNumber(item.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(item.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -326,7 +267,6 @@ export function normalizeLoadedSession(raw: unknown): SessionState {
.filter((value) => value.length > 0),
cancelled: Boolean(pkg.cancelled),
enabled: pkg.enabled === undefined ? true : Boolean(pkg.enabled),
priority: VALID_PACKAGE_PRIORITIES.has(asText(pkg.priority)) ? asText(pkg.priority) as PackagePriority : "normal",
createdAt: clampNumber(pkg.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(pkg.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -357,8 +297,7 @@ export function normalizeLoadedSession(raw: unknown): SessionState {
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
if (!packageOrder.includes(packageId)) {
packageOrder.push(packageId);
}
}
@@ -426,47 +365,6 @@ function sessionTempPath(sessionFile: string, kind: "sync" | "async"): string {
return `${sessionFile}.${kind}.tmp`;
}
function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
if (ACTIVE_STATUSES.has(item.status)) {
item.status = "queued";
item.lastError = "";
}
// Always clear stale speed values
item.speedBps = 0;
}
// Reset package-level active statuses to queued (mirrors item reset above)
const ACTIVE_PKG_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}
// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;
return session;
}
function readSessionFile(filePath: string): SessionState | null {
try {
const parsed = JSON.parse(fs.readFileSync(filePath, "utf8")) as unknown;
return normalizeLoadedSessionTransientFields(normalizeLoadedSession(parsed));
} catch {
return null;
}
}
export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
ensureBaseDir(paths.baseDir);
// Create a backup of the existing config before overwriting
@@ -480,56 +378,8 @@ export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;
async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.configFile, `${paths.configFile}.bak`).catch(() => {});
const tempPath = `${paths.configFile}.settings.tmp`;
await fsp.writeFile(tempPath, payload, "utf8");
try {
await fsp.rename(tempPath, paths.configFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
}
export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettings): Promise<void> {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, settings };
return;
}
asyncSettingsSaveRunning = true;
try {
await writeSettingsPayload(paths, payload);
} catch (error) {
logger.error(`Async Settings-Save fehlgeschlagen: ${String(error)}`);
} finally {
asyncSettingsSaveRunning = false;
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void saveSettingsAsync(queued.paths, queued.settings);
}
}
}
export function emptySession(): SessionState {
@@ -554,78 +404,50 @@ export function loadSession(paths: StoragePaths): SessionState {
if (!fs.existsSync(paths.sessionFile)) {
return emptySession();
}
const primary = readSessionFile(paths.sessionFile);
if (primary) {
return primary;
}
const backupFile = sessionBackupPath(paths.sessionFile);
const backup = fs.existsSync(backupFile) ? readSessionFile(backupFile) : null;
if (backup) {
logger.warn("Session defekt, Backup-Datei wird verwendet");
try {
const payload = JSON.stringify({ ...backup, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch {
// ignore restore write failure
const parsed = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as unknown;
const session = normalizeLoadedSession(parsed);
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
if (ACTIVE_STATUSES.has(item.status)) {
item.status = "queued";
item.lastError = "";
}
return backup;
// Always clear stale speed values
item.speedBps = 0;
}
logger.error("Session konnte nicht geladen werden (auch Backup fehlgeschlagen)");
return session;
} catch (error) {
logger.error(`Session konnte nicht geladen werden: ${String(error)}`);
return emptySession();
}
}
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
fs.copyFileSync(paths.sessionFile, sessionBackupPath(paths.sessionFile));
} catch {
// Best-effort backup; proceed even if it fails
}
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
async function writeSessionPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -637,9 +459,8 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload, gen);
await writeSessionPayload(paths, payload);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
@@ -652,98 +473,7 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
}
}
export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
}
const MAX_HISTORY_ENTRIES = 500;
function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null {
const entry = asRecord(raw);
if (!entry) return null;
const id = asText(entry.id) || `hist-${Date.now().toString(36)}-${index}`;
const name = asText(entry.name) || "Unbenannt";
const providerRaw = asText(entry.provider);
return {
id,
name,
totalBytes: clampNumber(entry.totalBytes, 0, 0, Number.MAX_SAFE_INTEGER),
downloadedBytes: clampNumber(entry.downloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
fileCount: clampNumber(entry.fileCount, 0, 0, 100000),
provider: VALID_ITEM_PROVIDERS.has(providerRaw as DebridProvider) ? providerRaw as DebridProvider : null,
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
};
}
export function loadHistory(paths: StoragePaths): HistoryEntry[] {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.historyFile)) {
return [];
}
try {
const raw = JSON.parse(fs.readFileSync(paths.historyFile, "utf8")) as unknown;
if (!Array.isArray(raw)) return [];
const entries: HistoryEntry[] = [];
for (let i = 0; i < raw.length && entries.length < MAX_HISTORY_ENTRIES; i++) {
const normalized = normalizeHistoryEntry(raw[i], i);
if (normalized) entries.push(normalized);
}
return entries;
} catch {
return [];
}
}
export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void {
ensureBaseDir(paths.baseDir);
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = [entry, ...existing].slice(0, MAX_HISTORY_ENTRIES);
saveHistory(paths, updated);
return updated;
}
export function removeHistoryEntry(paths: StoragePaths, entryId: string): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = existing.filter(e => e.id !== entryId);
saveHistory(paths, updated);
return updated;
}
export function clearHistory(paths: StoragePaths): void {
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.historyFile)) {
try {
fs.unlinkSync(paths.historyFile);
} catch {
// ignore
}
}
}


@@ -3,6 +3,9 @@ import os from "node:os";
import path from "node:path";
import crypto from "node:crypto";
import { spawn } from "node:child_process";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";
import { ReadableStream as NodeReadableStream } from "node:stream/web";
import { APP_VERSION, DEFAULT_UPDATE_REPO } from "./constants";
import { UpdateCheckResult, UpdateInstallProgress, UpdateInstallResult } from "../shared/types";
import { compactErrorText, humanSize } from "./utils";
@@ -14,32 +17,8 @@ const DOWNLOAD_BODY_IDLE_TIMEOUT_MS = 45000;
const RETRIES_PER_CANDIDATE = 3;
const RETRY_DELAY_MS = 1500;
const UPDATE_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
type UpdateSource = {
name: string;
webBase: string;
apiBase: string;
};
const UPDATE_SOURCES: UpdateSource[] = [
{
name: "git24",
webBase: "https://git.24-music.de",
apiBase: "https://git.24-music.de/api/v1"
},
{
name: "codeberg",
webBase: "https://codeberg.org",
apiBase: "https://codeberg.org/api/v1"
},
{
name: "github",
webBase: "https://github.com",
apiBase: "https://api.github.com"
}
];
const PRIMARY_UPDATE_SOURCE = UPDATE_SOURCES[0];
const UPDATE_WEB_BASE = PRIMARY_UPDATE_SOURCE.webBase;
const UPDATE_API_BASE = PRIMARY_UPDATE_SOURCE.apiBase;
const UPDATE_WEB_BASE = "https://codeberg.org";
const UPDATE_API_BASE = "https://codeberg.org/api/v1";
let activeUpdateAbortController: AbortController | null = null;
@@ -81,9 +60,9 @@ export function normalizeUpdateRepo(repo: string): string {
const normalizeParts = (input: string): string => {
const cleaned = input
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com|git\.24-music\.de):/i, "")
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com):/i, "")
.replace(/\.git$/i, "")
.replace(/^\/+|\/+$/g, "");
const parts = cleaned.split("/").filter(Boolean);
@@ -100,13 +79,7 @@ export function normalizeUpdateRepo(repo: string): string {
try {
const url = new URL(raw);
const host = url.hostname.toLowerCase();
if (
host === "codeberg.org"
|| host === "www.codeberg.org"
|| host === "github.com"
|| host === "www.github.com"
|| host === "git.24-music.de"
) {
if (host === "codeberg.org" || host === "www.codeberg.org" || host === "github.com" || host === "www.github.com") {
const normalized = normalizeParts(url.pathname);
if (normalized) {
return normalized;
@@ -336,8 +309,6 @@ function parseReleasePayload(payload: Record<string, unknown>, fallback: UpdateC
const releaseUrl = String(payload.html_url || fallback.releaseUrl);
const setup = pickSetupAsset(readReleaseAssets(payload));
const body = typeof payload.body === "string" ? payload.body.trim() : "";
return {
updateAvailable: isRemoteNewer(APP_VERSION, latestVersion),
currentVersion: APP_VERSION,
@@ -346,8 +317,7 @@
releaseUrl,
setupAssetUrl: setup?.browser_download_url || "",
setupAssetName: setup?.name || "",
setupAssetDigest: setup?.digest || "",
releaseNotes: body || undefined
setupAssetDigest: setup?.digest || ""
};
}
@@ -391,65 +361,17 @@ function uniqueStrings(values: string[]): string[] {
return out;
}
function deriveTagFromReleaseUrl(releaseUrl: string): string {
const raw = String(releaseUrl || "").trim();
if (!raw) {
return "";
}
try {
const parsed = new URL(raw);
const match = parsed.pathname.match(/\/releases\/tag\/([^/?#]+)/i);
return match?.[1] ? decodeURIComponent(match[1]) : "";
} catch {
return "";
}
}
function extractFileNameFromUrl(url: string): string {
const raw = String(url || "").trim();
if (!raw) {
return "";
}
try {
const parsed = new URL(raw);
const fileName = path.basename(parsed.pathname || "");
return fileName ? decodeURIComponent(fileName) : "";
} catch {
return "";
}
}
function deriveSetupNameVariants(setupAssetName: string, setupAssetUrl: string): string[] {
const directName = String(setupAssetName || "").trim();
const fromUrlName = extractFileNameFromUrl(setupAssetUrl);
const source = directName || fromUrlName;
if (!source) {
return [];
}
const ext = path.extname(source);
const stem = ext ? source.slice(0, -ext.length) : source;
const dashed = `${stem.replace(/\s+/g, "-")}${ext}`;
return uniqueStrings([source, fromUrlName, dashed]);
}
function buildDownloadCandidates(safeRepo: string, check: UpdateCheckResult): string[] {
const setupAssetName = String(check.setupAssetName || "").trim();
const setupAssetUrl = String(check.setupAssetUrl || "").trim();
const latestTag = String(check.latestTag || "").trim() || deriveTagFromReleaseUrl(String(check.releaseUrl || ""));
const latestTag = String(check.latestTag || "").trim();
const candidates = [setupAssetUrl];
const nameVariants = deriveSetupNameVariants(setupAssetName, setupAssetUrl);
if (latestTag && nameVariants.length > 0) {
for (const name of nameVariants) {
const encodedName = encodeURIComponent(name);
candidates.push(`${UPDATE_WEB_BASE}/${safeRepo}/releases/download/${encodeURIComponent(latestTag)}/${encodedName}`);
}
}
if (!latestTag && nameVariants.length > 0) {
for (const name of nameVariants) {
const encodedName = encodeURIComponent(name);
if (setupAssetName) {
const encodedName = encodeURIComponent(setupAssetName);
candidates.push(`${UPDATE_WEB_BASE}/${safeRepo}/releases/latest/download/${encodedName}`);
if (latestTag) {
candidates.push(`${UPDATE_WEB_BASE}/${safeRepo}/releases/download/${encodeURIComponent(latestTag)}/${encodedName}`);
}
}
@@ -719,7 +641,7 @@ export async function checkGitHubUpdate(repo: string): Promise<UpdateCheckResult
}
}
async function downloadFile(url: string, targetPath: string, onProgress?: UpdateProgressCallback): Promise<{ expectedBytes: number | null }> {
async function downloadFile(url: string, targetPath: string, onProgress?: UpdateProgressCallback): Promise<void> {
const shutdownSignal = activeUpdateAbortController?.signal;
if (shutdownSignal?.aborted) {
throw new Error("aborted:update_shutdown");
@@ -730,8 +652,7 @@ async function downloadFile(url: string, targetPath: string, onProgress?: Update
try {
response = await fetch(url, {
headers: {
"User-Agent": UPDATE_USER_AGENT,
"Accept-Encoding": "identity"
"User-Agent": UPDATE_USER_AGENT
},
redirect: "follow",
signal: combineSignals(timeout.signal, shutdownSignal)
@@ -772,78 +693,73 @@ async function downloadFile(url: string, targetPath: string, onProgress?: Update
emitDownloadProgress(true);
await fs.promises.mkdir(path.dirname(targetPath), { recursive: true });
const source = Readable.fromWeb(response.body as unknown as NodeReadableStream<Uint8Array>);
const target = fs.createWriteStream(targetPath);
const idleTimeoutMs = getDownloadBodyIdleTimeoutMs();
let idleTimer: NodeJS.Timeout | null = null;
let idleTimedOut = false;
const clearIdleTimer = (): void => {
if (idleTimer) {
clearTimeout(idleTimer);
idleTimer = null;
}
};
const onIdleTimeout = (): void => {
const timeoutError = new Error(`Update Download Body Timeout nach ${Math.ceil(idleTimeoutMs / 1000)}s`);
source.destroy(timeoutError);
target.destroy(timeoutError);
};
const resetIdleTimer = (): void => {
if (idleTimeoutMs <= 0) {
return;
}
clearIdleTimer();
idleTimer = setTimeout(() => {
idleTimedOut = true;
reader.cancel().catch(() => undefined);
}, idleTimeoutMs);
idleTimer = setTimeout(onIdleTimeout, idleTimeoutMs);
};
const reader = response.body.getReader();
const tempPath = targetPath + ".tmp";
const writeStream = fs.createWriteStream(tempPath);
try {
resetIdleTimer();
for (;;) {
if (shutdownSignal?.aborted) {
await reader.cancel().catch(() => undefined);
throw new Error("aborted:update_shutdown");
}
const { done, value } = await reader.read();
if (done) {
break;
}
const buf = Buffer.from(value.buffer, value.byteOffset, value.byteLength);
if (!writeStream.write(buf)) {
await new Promise<void>((resolve) => writeStream.once("drain", resolve));
}
downloadedBytes += buf.byteLength;
const onSourceData = (chunk: string | Buffer): void => {
downloadedBytes += typeof chunk === "string" ? Buffer.byteLength(chunk) : chunk.byteLength;
resetIdleTimer();
emitDownloadProgress(false);
};
const onSourceDone = (): void => {
clearIdleTimer();
};
if (idleTimeoutMs > 0) {
source.on("data", onSourceData);
source.on("end", onSourceDone);
source.on("close", onSourceDone);
source.on("error", onSourceDone);
target.on("close", onSourceDone);
target.on("error", onSourceDone);
resetIdleTimer();
}
try {
await pipeline(source, target);
} catch (error) {
writeStream.destroy();
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
try {
source.destroy();
} catch {
// ignore
}
try {
target.destroy();
} catch {
// ignore
}
throw error;
} finally {
clearIdleTimer();
source.off("data", onSourceData);
source.off("end", onSourceDone);
source.off("close", onSourceDone);
source.off("error", onSourceDone);
target.off("close", onSourceDone);
target.off("error", onSourceDone);
}
await new Promise<void>((resolve, reject) => {
writeStream.end(() => resolve());
writeStream.on("error", reject);
});
if (idleTimedOut) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download Body Timeout nach ${Math.ceil(idleTimeoutMs / 1000)}s`);
}
if (totalBytes && downloadedBytes !== totalBytes) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download unvollständig (${downloadedBytes} / ${totalBytes} Bytes)`);
}
await fs.promises.rename(tempPath, targetPath);
emitDownloadProgress(true);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${downloadedBytes} Bytes)`);
return { expectedBytes: totalBytes };
logger.info(`Update-Download abgeschlossen: ${targetPath}`);
}
async function sleep(ms: number, signal?: AbortSignal): Promise<void> {
@@ -956,7 +872,6 @@ export async function installLatestUpdate(
: await checkGitHubUpdate(safeRepo);
if (check.error) {
activeUpdateAbortController = null;
safeEmitProgress(onProgress, {
stage: "error",
percent: null,
@@ -967,7 +882,6 @@
return { started: false, message: check.error };
}
if (!check.updateAvailable) {
activeUpdateAbortController = null;
safeEmitProgress(onProgress, {
stage: "error",
percent: null,
@@ -1008,16 +922,8 @@
}
}
let candidates = buildDownloadCandidates(safeRepo, effectiveCheck);
const candidates = buildDownloadCandidates(safeRepo, effectiveCheck);
if (candidates.length === 0) {
activeUpdateAbortController = null;
safeEmitProgress(onProgress, {
stage: "error",
percent: null,
downloadedBytes: 0,
totalBytes: null,
message: "Setup-Asset nicht gefunden"
});
return { started: false, message: "Setup-Asset nicht gefunden" };
}
@@ -1035,16 +941,7 @@
if (updateAbortController.signal.aborted) {
throw new Error("aborted:update_shutdown");
}
let verified = false;
let lastVerifyError: unknown = null;
let integrityError: unknown = null;
for (let pass = 0; pass < 3 && !verified; pass += 1) {
logger.info(`Update-Download Kandidaten (${pass + 1}/3): ${candidates.join(" | ")}`);
lastVerifyError = null;
for (let index = 0; index < candidates.length; index += 1) {
const candidate = candidates[index];
try {
await downloadWithRetries(candidate, targetPath, onProgress);
await downloadFromCandidates(candidates, targetPath, onProgress);
if (updateAbortController.signal.aborted) {
throw new Error("aborted:update_shutdown");
}
@@ -1053,77 +950,9 @@
percent: 100,
downloadedBytes: 0,
totalBytes: null,
message: `Prüfe Installer-Integrität (${index + 1}/${candidates.length})`
message: "Prüfe Installer-Integrität"
});
await verifyDownloadedInstaller(targetPath, String(effectiveCheck.setupAssetDigest || ""));
verified = true;
break;
} catch (error) {
lastVerifyError = error;
const errorText = compactErrorText(error).toLowerCase();
if (!integrityError && (errorText.includes("integrit") || errorText.includes("mismatch"))) {
integrityError = error;
}
try {
await fs.promises.rm(targetPath, { force: true });
} catch {
// ignore
}
if (index < candidates.length - 1) {
logger.warn(`Update-Kandidat ${index + 1}/${candidates.length} verworfen: ${compactErrorText(error)}`);
}
}
}
if (verified) {
break;
}
const status = readHttpStatusFromError(lastVerifyError);
const wasIntegrityError = integrityError !== null;
let shouldRetryAfterRefresh = false;
if (pass < 2 && (status === 404 || wasIntegrityError)) {
const refreshed = await resolveSetupAssetFromApi(safeRepo, effectiveCheck.latestTag);
if (refreshed) {
effectiveCheck = {
...effectiveCheck,
setupAssetUrl: refreshed.setupAssetUrl || effectiveCheck.setupAssetUrl,
setupAssetName: refreshed.setupAssetName || effectiveCheck.setupAssetName,
setupAssetDigest: refreshed.setupAssetDigest || effectiveCheck.setupAssetDigest
};
}
const digestFromYml = await resolveSetupDigestFromLatestYml(safeRepo, effectiveCheck.latestTag, effectiveCheck.setupAssetName || "");
if (digestFromYml) {
effectiveCheck = {
...effectiveCheck,
setupAssetDigest: digestFromYml
};
logger.info("Update-Integritätsdigest aus latest.yml übernommen");
}
const refreshedCandidates = buildDownloadCandidates(safeRepo, effectiveCheck);
const changed = refreshedCandidates.length > 0
&& (refreshedCandidates.length !== candidates.length
|| refreshedCandidates.some((value, idx) => value !== candidates[idx]));
if (changed) {
logger.warn(`Update-Fehler erkannt (${wasIntegrityError ? "integrity" : "404"}), Kandidatenliste aus API neu geladen`);
candidates = refreshedCandidates;
}
shouldRetryAfterRefresh = true;
if (wasIntegrityError) {
integrityError = null;
logger.warn("SHA512-Mismatch erkannt, erneuter Download-Versuch");
}
}
if (!shouldRetryAfterRefresh) {
break;
}
}
if (!verified) {
throw integrityError || lastVerifyError || new Error("Update-Download fehlgeschlagen");
}
safeEmitProgress(onProgress, {
stage: "launching",
percent: 100,


@@ -3,9 +3,6 @@ import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
@@ -31,7 +28,6 @@ const api: ElectronApi = {
ipcRenderer.invoke(IPC_CHANNELS.RESOLVE_START_CONFLICT, packageId, policy),
clearAll: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_ALL),
start: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START),
startPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_PACKAGES, packageIds),
stop: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.STOP),
togglePause: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PAUSE),
cancelPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CANCEL_PACKAGE, packageId),
@@ -39,28 +35,11 @@
reorderPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REORDER_PACKAGES, packageIds),
removeItem: (itemId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_ITEM, itemId),
togglePackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PACKAGE, packageId),
exportQueue: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
exportQueue: (): Promise<string> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
importQueue: (json: string): Promise<{ addedPackages: number; addedLinks: number }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_QUEUE, json),
toggleClipboard: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD),
pickFolder: (): Promise<string | null> => ipcRenderer.invoke(IPC_CHANNELS.PICK_FOLDER),
pickContainers: (): Promise<string[]> => ipcRenderer.invoke(IPC_CHANNELS.PICK_CONTAINERS),
getSessionStats: (): Promise<SessionStats> => ipcRenderer.invoke(IPC_CHANNELS.GET_SESSION_STATS),
restart: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESTART),
quit: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.QUIT),
exportBackup: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_BACKUP),
importBackup: (): Promise<{ restored: boolean; message: string }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_BACKUP),
openLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_LOG),
openSessionLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_SESSION_LOG),
retryExtraction: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RETRY_EXTRACTION, packageId),
extractNow: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.EXTRACT_NOW, packageId),
resetPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_PACKAGE, packageId),
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),
onStateUpdate: (callback: (snapshot: UiSnapshot) => void): (() => void) => {
const listener = (_event: unknown, snapshot: UiSnapshot): void => callback(snapshot);
ipcRenderer.on(IPC_CHANNELS.STATE_UPDATE, listener);

File diff suppressed because it is too large

View File

@ -1,25 +0,0 @@
import type { PackageEntry } from "../shared/types";
export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}
export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}
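For context on the module deleted above, a self-contained sketch of the two helpers' behavior (logic copied from the removed `package-order.ts`; `PackageEntry` is reduced to the one field the sort actually reads):

```typescript
// Minimal restatement of the deleted helpers, trimmed to what the sketch needs.
type MinimalPackage = { name: string };

function reorderByDrop(order: string[], dragged: string, target: string): string[] {
  const from = order.indexOf(dragged);
  const to = order.indexOf(target);
  if (from < 0 || to < 0 || from === to) return order;
  const next = [...order];
  const [moved] = next.splice(from, 1);
  // Clamp the insert index to the shortened array, as the original did.
  next.splice(Math.max(0, Math.min(next.length, to)), 0, moved);
  return next;
}

function sortByName(order: string[], packages: Record<string, MinimalPackage>, descending: boolean): string[] {
  return [...order].sort((a, b) => {
    const cmp = (packages[a]?.name ?? "").toLowerCase()
      .localeCompare((packages[b]?.name ?? "").toLowerCase(), undefined, { numeric: true, sensitivity: "base" });
    return descending ? -cmp : cmp;
  });
}

// Dropping pkg1 onto pkg3 moves it behind pkg2 and pkg3.
const dropped = reorderByDrop(["pkg1", "pkg2", "pkg3"], "pkg1", "pkg3");
// Numeric-aware collation: "Alpha 2" sorts before "Alpha 10".
const sorted = sortByName(
  ["b", "a"],
  { a: { name: "Alpha 10" }, b: { name: "Alpha 2" } },
  false
);
```

The `numeric: true` collator option is what keeps release names with multi-digit counters in natural order.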

File diff suppressed because it is too large

View File

@ -12,7 +12,6 @@ export const IPC_CHANNELS = {
RESOLVE_START_CONFLICT: "queue:resolve-start-conflict",
CLEAR_ALL: "queue:clear-all",
START: "queue:start",
START_PACKAGES: "queue:start-packages",
STOP: "queue:stop",
TOGGLE_PAUSE: "queue:toggle-pause",
CANCEL_PACKAGE: "queue:cancel-package",
@ -26,22 +25,5 @@ export const IPC_CHANNELS = {
PICK_CONTAINERS: "dialog:pick-containers",
STATE_UPDATE: "state:update",
CLIPBOARD_DETECTED: "clipboard:detected",
TOGGLE_CLIPBOARD: "clipboard:toggle",
GET_SESSION_STATS: "stats:get-session-stats",
RESTART: "app:restart",
QUIT: "app:quit",
EXPORT_BACKUP: "app:export-backup",
IMPORT_BACKUP: "app:import-backup",
OPEN_LOG: "app:open-log",
OPEN_SESSION_LOG: "app:open-session-log",
RETRY_EXTRACTION: "queue:retry-extraction",
EXTRACT_NOW: "queue:extract-now",
RESET_PACKAGE: "queue:reset-package",
GET_HISTORY: "history:get",
CLEAR_HISTORY: "history:clear",
REMOVE_HISTORY_ENTRY: "history:remove-entry",
SET_PACKAGE_PRIORITY: "queue:set-package-priority",
SKIP_ITEMS: "queue:skip-items",
RESET_ITEMS: "queue:reset-items",
START_ITEMS: "queue:start-items"
TOGGLE_CLIPBOARD: "clipboard:toggle"
} as const;
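The channel constants above pair `ipcRenderer.invoke` calls in the preload with `ipcMain.handle` registrations in the main process. A hypothetical in-process stand-in (not the app's wiring; the real bus is Electron's, and `invoke` there returns a `Promise`) shows why typing the channel union catches typos at compile time:

```typescript
// Illustrative subset of the real channel map.
const IPC_CHANNELS = {
  START: "queue:start",
  STOP: "queue:stop",
  TOGGLE_CLIPBOARD: "clipboard:toggle"
} as const;

type Channel = (typeof IPC_CHANNELS)[keyof typeof IPC_CHANNELS];

// Tiny synchronous bus standing in for ipcMain.handle / ipcRenderer.invoke.
const handlers = new Map<Channel, (...args: unknown[]) => unknown>();

function handle(channel: Channel, fn: (...args: unknown[]) => unknown): void {
  handlers.set(channel, fn);
}

function invoke(channel: Channel, ...args: unknown[]): unknown {
  const fn = handlers.get(channel);
  if (!fn) throw new Error(`no handler for ${channel}`);
  return fn(...args);
}

// Main-process side registers; renderer side invokes by the same constant.
handle(IPC_CHANNELS.TOGGLE_CLIPBOARD, () => true);
const clipboardActive = invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD);
```

Passing a raw string that is not in the `Channel` union is a type error, which is the point of funneling every call through `IPC_CHANNELS`.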

View File

@ -2,9 +2,6 @@ import type {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
@ -26,7 +23,6 @@ export interface ElectronApi {
resolveStartConflict: (packageId: string, policy: DuplicatePolicy) => Promise<StartConflictResolutionResult>;
clearAll: () => Promise<void>;
start: () => Promise<void>;
startPackages: (packageIds: string[]) => Promise<void>;
stop: () => Promise<void>;
togglePause: () => Promise<boolean>;
cancelPackage: (packageId: string) => Promise<void>;
@ -34,28 +30,11 @@ export interface ElectronApi {
reorderPackages: (packageIds: string[]) => Promise<void>;
removeItem: (itemId: string) => Promise<void>;
togglePackage: (packageId: string) => Promise<void>;
exportQueue: () => Promise<{ saved: boolean }>;
exportQueue: () => Promise<string>;
importQueue: (json: string) => Promise<{ addedPackages: number; addedLinks: number }>;
toggleClipboard: () => Promise<boolean>;
pickFolder: () => Promise<string | null>;
pickContainers: () => Promise<string[]>;
getSessionStats: () => Promise<SessionStats>;
restart: () => Promise<void>;
quit: () => Promise<void>;
exportBackup: () => Promise<{ saved: boolean }>;
importBackup: () => Promise<{ restored: boolean; message: string }>;
openLog: () => Promise<void>;
openSessionLog: () => Promise<void>;
retryExtraction: (packageId: string) => Promise<void>;
extractNow: (packageId: string) => Promise<void>;
resetPackage: (packageId: string) => Promise<void>;
getHistory: () => Promise<HistoryEntry[]>;
clearHistory: () => Promise<void>;
removeHistoryEntry: (entryId: string) => Promise<void>;
setPackagePriority: (packageId: string, priority: PackagePriority) => Promise<void>;
skipItems: (itemIds: string[]) => Promise<void>;
resetItems: (itemIds: string[]) => Promise<void>;
startItems: (itemIds: string[]) => Promise<void>;
onStateUpdate: (callback: (snapshot: UiSnapshot) => void) => () => void;
onClipboardDetected: (callback: (links: string[]) => void) => () => void;
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void) => () => void;

View File

@ -14,11 +14,9 @@ export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload" | "onefichier";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
export type ExtractCpuPriority = "high" | "middle" | "low";
export interface BandwidthScheduleEntry {
id: string;
@ -30,7 +28,6 @@ export interface BandwidthScheduleEntry {
export interface DownloadStats {
totalDownloaded: number;
totalDownloadedAllTime: number;
totalFiles: number;
totalPackages: number;
sessionStartedAt: number;
@ -42,9 +39,6 @@ export interface AppSettings {
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
oneFichierApiKey: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
@ -70,7 +64,6 @@ export interface AppSettings {
reconnectWaitSeconds: number;
completedCleanupPolicy: FinishedCleanupPolicy;
maxParallel: number;
maxParallelExtract: number;
retryLimit: number;
speedLimitEnabled: boolean;
speedLimitKbps: number;
@ -80,14 +73,7 @@ export interface AppSettings {
clipboardWatch: boolean;
minimizeToTray: boolean;
theme: AppTheme;
collapseNewPackages: boolean;
autoSkipExtracted: boolean;
confirmDeleteSelection: boolean;
totalDownloadedAllTime: number;
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}
export interface DownloadItem {
@ -109,7 +95,6 @@ export interface DownloadItem {
fullStatus: string;
createdAt: number;
updatedAt: number;
onlineStatus?: "online" | "offline" | "checking";
}
export interface PackageEntry {
@ -121,8 +106,6 @@ export interface PackageEntry {
itemIds: string[];
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
@ -175,7 +158,6 @@ export interface UiSnapshot {
canPause: boolean;
clipboardActive: boolean;
reconnectSeconds: number;
packageSpeedBps: Record<string, number>;
}
export interface AddLinksPayload {
@ -223,7 +205,6 @@ export interface UpdateCheckResult {
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}
@ -245,45 +226,3 @@ export interface ParsedHashEntry {
algorithm: "crc32" | "md5" | "sha1";
digest: string;
}
export interface BandwidthSample {
timestamp: number;
speedBps: number;
}
export interface BandwidthStats {
samples: BandwidthSample[];
currentSpeedBps: number;
averageSpeedBps: number;
maxSpeedBps: number;
totalBytesSession: number;
sessionDurationSeconds: number;
}
export interface SessionStats {
bandwidth: BandwidthStats;
totalDownloads: number;
completedDownloads: number;
failedDownloads: number;
activeDownloads: number;
queuedDownloads: number;
}
export interface HistoryEntry {
id: string;
name: string;
totalBytes: number;
downloadedBytes: number;
fileCount: number;
provider: DebridProvider | null;
completedAt: number;
durationSeconds: number;
status: "completed" | "deleted";
outputDir: string;
urls?: string[];
}
export interface HistoryState {
entries: HistoryEntry[];
maxEntries: number;
}
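The removed `BandwidthStats` shape implies a straightforward aggregation over `BandwidthSample` entries. A hypothetical reducer along those lines (the app's real implementation is not shown in this diff and may weight samples differently):

```typescript
interface BandwidthSample {
  timestamp: number; // ms epoch
  speedBps: number;
}

interface BandwidthStats {
  samples: BandwidthSample[];
  currentSpeedBps: number;
  averageSpeedBps: number;
  maxSpeedBps: number;
  totalBytesSession: number;
  sessionDurationSeconds: number;
}

// Assumed aggregation: last sample = current, plain mean = average.
function summarize(samples: BandwidthSample[], totalBytesSession: number): BandwidthStats {
  const speeds = samples.map((s) => s.speedBps);
  const first = samples[0]?.timestamp ?? 0;
  const last = samples[samples.length - 1]?.timestamp ?? first;
  return {
    samples,
    currentSpeedBps: speeds[speeds.length - 1] ?? 0,
    averageSpeedBps: speeds.length ? speeds.reduce((a, b) => a + b, 0) / speeds.length : 0,
    maxSpeedBps: speeds.length ? Math.max(...speeds) : 0,
    totalBytesSession,
    sessionDurationSeconds: (last - first) / 1000
  };
}

const stats = summarize(
  [
    { timestamp: 0, speedBps: 1000 },
    { timestamp: 2000, speedBps: 3000 }
  ],
  4000
);
```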

View File

@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/App";
describe("reorderPackageOrderByDrop", () => {
it("moves adjacent package down by one on drop", () => {
@ -25,9 +25,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg3", "pkg1", "pkg2"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
},
false
);
@ -38,9 +38,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg1", "pkg2", "pkg3"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
},
true
);

View File

@ -62,22 +62,6 @@ describe("extractEpisodeToken", () => {
it("extracts from episode token at end of string", () => {
expect(extractEpisodeToken("show.s02e03")).toBe("S02E03");
});
it("extracts double episode token s01e01e02", () => {
expect(extractEpisodeToken("tvr-mammon-s01e01e02-720p")).toBe("S01E01E02");
});
it("extracts double episode with dot separators", () => {
expect(extractEpisodeToken("Show.S01E03E04.720p")).toBe("S01E03E04");
});
it("extracts double episode at end of string", () => {
expect(extractEpisodeToken("show.s02e05e06")).toBe("S02E05E06");
});
it("extracts double episode with single-digit numbers", () => {
expect(extractEpisodeToken("show-s1e1e2-720p")).toBe("S01E01E02");
});
});
describe("applyEpisodeTokenToFolderName", () => {
@ -112,21 +96,6 @@ describe("applyEpisodeTokenToFolderName", () => {
it("is case-insensitive for -4SF/-4SJ suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4SF", "S01E01")).toBe("Show.720p.S01E01-4SF");
});
it("applies double episode token to season-only folder", () => {
expect(applyEpisodeTokenToFolderName("Mammon.S01.German.1080P.Bluray.x264-SMAHD", "S01E01E02"))
.toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("replaces existing double episode in folder with new token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01E02.720p-4sf", "S01E03E04"))
.toBe("Show.S01E03E04.720p-4sf");
});
it("replaces existing single episode in folder with double episode token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S01E01E02"))
.toBe("Show.S01E01E02.720p-4sf");
});
});
describe("sourceHasRpToken", () => {
@ -269,7 +238,6 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
// SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
expect(result).not.toBeNull();
expect(result!).toContain("S99E999");
});
// Real-world scene release patterns
@ -344,7 +312,6 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
// "mkv" should not be treated as part of the filename match
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
});
it("does not match episode-like patterns in codec strings", () => {
@ -375,7 +342,6 @@ describe("buildAutoRenameBaseName", () => {
// Extreme edge case - sanitizeFilename trims leading dots
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
expect(result!).toContain("-4sf");
expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
});
@ -546,149 +512,4 @@ describe("buildAutoRenameBaseNameFromFolders", () => {
);
expect(result).toBe("Lethal.Weapon.S02E11.German.DD51.Dubbed.DL.720p.AmazonHD.x264-TVS");
});
it("maps episode-only token e01 via season folder hint and keeps REPACK", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e01-720p-repack",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.REPACK.720p.WEB.h264-TMSF");
});
it("maps episode-only token e02 via season folder hint", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E02.GERMAN.720p.WEB.h264-TMSF");
});
it("keeps renaming for odd source order like 4sf-bs-720p-s01e05", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"4sf-bs-720p-s01e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E05.GERMAN.720p.WEB.h264-TMSF");
});
it("accepts lowercase scene group suffixes", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-tmsf"
],
"tmsf-cheatderbetrug-e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.720p.WEB.h264-tmsf");
});
it("renames double episode file into season folder (Mammon style)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e01e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("renames second double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e03e04-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E03E04.German.1080P.Bluray.x264-SMAHD");
});
it("renames third double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e05e06-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E05E06.German.1080P.Bluray.x264-SMAHD");
});
// Last-resort fallback: folder has season but no scene group suffix (user-renamed packages)
it("renames when folder has season but no scene group suffix (Mystery Road case)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mystery Road S02E05");
});
it("renames with season-only folder and custom name without dots", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Meine Serie S03"],
"meine-serie-s03e10-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Meine Serie S03E10");
});
it("prefers scene-group folder over season-only fallback", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mystery Road S02",
"Mystery.Road.S02.GERMAN.DL.AC3.720p.HDTV.x264-hrs"
],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
// Should use the scene-group folder (hrs), not the custom one
expect(result).toBe("Mystery.Road.S02E05.GERMAN.DL.AC3.720p.HDTV.x264-hrs");
});
it("does not use season-only fallback when forceEpisodeForSeasonFolder is false", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: false }
);
expect(result).toBeNull();
});
it("renames Riviera S02 with single-digit episode s02e2", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
"tvp-riviera-s02e2-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
});
it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"r104.de.dl.web.7p-s04e02",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
it("renames Room 104 wayne source with episode", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"room.104.s04e01.german.dl.720p.web.h264-wayne",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
});
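The removed test cases above pin down the episode-token contract: single (`s02e03`) and double (`s01e01e02`) episodes, one- or two-digit inputs, always normalized to zero-padded uppercase. A minimal extractor satisfying just those cases (a sketch, not the app's `SCENE_EPISODE_RE`, which the comments say also allows 3-digit episodes and additional guards):

```typescript
// Matches s<season>e<episode> with an optional second episode, case-insensitively.
const EPISODE_RE = /s(\d{1,2})e(\d{1,3})(?:e(\d{1,3}))?/i;

function extractEpisodeToken(input: string): string | null {
  const m = EPISODE_RE.exec(input);
  if (!m) return null;
  // Zero-pad to at least two digits, mirroring "s1e1e2" -> "S01E01E02".
  const pad = (n: string): string => n.padStart(2, "0");
  const base = `S${pad(m[1])}E${pad(m[2])}`;
  return m[3] ? `${base}E${pad(m[3])}` : base;
}
```

Note the optional non-capturing group: for `s01e01e02` the first `\d{1,3}` stops at the literal `e`, so the double-episode suffix lands in the third capture instead of being swallowed by the second.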

View File

@ -25,15 +25,15 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(dir, "movie.mkv"))).toBe(true);
});
it("removes sample artifacts and link files", async () => {
it("removes sample artifacts and link files", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.mkdirSync(path.join(dir, "Samples"), { recursive: true });
fs.writeFileSync(path.join(dir, "Samples", "demo-sample.mkv"), "x");
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://example.com/a\n");
const links = await removeDownloadLinkArtifacts(dir);
const samples = await removeSampleArtifacts(dir);
const links = removeDownloadLinkArtifacts(dir);
const samples = removeSampleArtifacts(dir);
expect(links).toBeGreaterThan(0);
expect(samples.files + samples.dirs).toBeGreaterThan(0);
});
@ -66,7 +66,7 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(sub2, "subtitle.srt"))).toBe(true);
});
it("detects link artifacts by URL content in text files", async () => {
it("detects link artifacts by URL content in text files", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
@ -81,7 +81,7 @@ describe("cleanup", () => {
// .dlc files should always be removed
fs.writeFileSync(path.join(dir, "container.dlc"), "encrypted-data");
const removed = await removeDownloadLinkArtifacts(dir);
const removed = removeDownloadLinkArtifacts(dir);
expect(removed).toBeGreaterThanOrEqual(3); // download_links.txt + bookmark.url + container.dlc
expect(fs.existsSync(path.join(dir, "download_links.txt"))).toBe(false);
expect(fs.existsSync(path.join(dir, "bookmark.url"))).toBe(false);
@ -90,7 +90,7 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(dir, "readme.txt"))).toBe(true);
});
it("does not recurse into sample symlink or junction targets", async () => {
it("does not recurse into sample symlink or junction targets", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
const external = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-ext-"));
tempDirs.push(dir, external);
@ -102,7 +102,7 @@ describe("cleanup", () => {
const linkType: fs.symlink.Type = process.platform === "win32" ? "junction" : "dir";
fs.symlinkSync(external, linkedSampleDir, linkType);
const result = await removeSampleArtifacts(dir);
const result = removeSampleArtifacts(dir);
expect(result.files).toBe(0);
expect(fs.existsSync(outsideFile)).toBe(true);
});

View File

@ -290,44 +290,6 @@ describe("debrid service", () => {
expect(fetchSpy).toHaveBeenCalledTimes(0);
});
it("aborts Mega web unrestrict when caller signal is cancelled", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const megaWeb = vi.fn((_link: string, signal?: AbortSignal): Promise<never> => new Promise((_, reject) => {
const onAbort = (): void => reject(new Error("aborted:mega-web-test"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const controller = new AbortController();
const abortTimer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);
expect(megaWeb).toHaveBeenCalledTimes(1);
expect(megaWeb.mock.calls[0]?.[1]).toBe(controller.signal);
} finally {
clearTimeout(abortTimer);
}
});
it("respects provider selection and does not append hidden providers", async () => {
const settings = {
...defaultSettings(),

View File

@ -177,7 +177,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "retry", links: ["https://dummy/retry"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -253,7 +253,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "same-name", links: ["https://dummy/first", "https://dummy/second"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const items = Object.values(manager.getSnapshot().session.items);
@ -465,7 +465,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "stall", links: ["https://dummy/stall"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -563,7 +563,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "connect-stall", links: ["https://dummy/connect-stall"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -666,7 +666,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "stall-connect", links: ["https://dummy/stall-connect"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -765,7 +765,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "watchdog-stall", links: ["https://dummy/watchdog-stall"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -880,14 +880,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "drain-stall", links: ["https://dummy/drain-stall"] }]);
const queuedSnapshot = manager.getSnapshot();
const packageId = queuedSnapshot.session.packageOrder[0] || "";
const itemId = queuedSnapshot.session.packages[packageId]?.itemIds[0] || "";
manager.start();
await waitFor(() => {
const status = manager.getSnapshot().session.items[itemId]?.fullStatus || "";
return status.includes("Warte auf Festplatte");
}, 12000);
await waitFor(() => !manager.getSnapshot().session.running, 40000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -961,7 +954,6 @@ describe("download manager", () => {
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract"),
retryLimit: 1,
autoExtract: false
},
emptySession(),
@ -969,7 +961,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "content-name", links: ["https://rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -1092,14 +1084,13 @@ describe("download manager", () => {
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract"),
retryLimit: 1,
autoExtract: false
},
session,
createStoragePaths(path.join(root, "state"))
);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1225,14 +1216,13 @@ describe("download manager", () => {
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract"),
retryLimit: 1,
autoExtract: false
},
session,
createStoragePaths(path.join(root, "state"))
);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1369,14 +1359,13 @@ describe("download manager", () => {
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract"),
retryLimit: 1,
autoExtract: false
},
session,
createStoragePaths(path.join(root, "state"))
);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1491,7 +1480,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 45000);
const item = manager.getSnapshot().session.items[itemId];
@ -1505,7 +1494,7 @@ describe("download manager", () => {
server.close();
await once(server, "close");
}
}, 70000);
}, 30000);
it("retries non-retriable HTTP statuses and eventually succeeds", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
@ -1572,7 +1561,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "status-retry", links: ["https://dummy/status-retry"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -1792,7 +1781,7 @@ describe("download manager", () => {
expect(fs.existsSync(targetPath)).toBe(false);
});
it("detects start conflicts when extract output already exists", async () => {
it("detects start conflicts when extract output already exists", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
@ -1851,7 +1840,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
const conflicts = await manager.getStartConflicts();
const conflicts = manager.getStartConflicts();
expect(conflicts.length).toBe(1);
expect(conflicts[0]?.packageId).toBe(packageId);
});
@ -1917,104 +1906,8 @@ describe("download manager", () => {
const result = await manager.resolveStartConflict(packageId, "skip");
expect(result.skipped).toBe(true);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]).toBeDefined();
expect(snapshot.session.packages[packageId]?.status).toBe("queued");
expect(snapshot.session.items[itemId]).toBeDefined();
expect(snapshot.session.items[itemId]?.status).toBe("queued");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Wartet");
});
it("keeps already completed items when skipping start conflict", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
const packageId = "skip-partial-pkg";
const completedItemId = "skip-partial-completed";
const pendingItemId = "skip-partial-pending";
const now = Date.now() - 5000;
const outputDir = path.join(root, "downloads", "skip-partial");
const extractDir = path.join(root, "extract", "skip-partial");
fs.mkdirSync(outputDir, { recursive: true });
fs.mkdirSync(extractDir, { recursive: true });
fs.writeFileSync(path.join(extractDir, "existing.mkv"), "x", "utf8");
const completedTarget = path.join(outputDir, "skip-partial.part01.rar");
fs.writeFileSync(completedTarget, "part", "utf8");
const session = emptySession();
session.packageOrder = [packageId];
session.packages[packageId] = {
id: packageId,
name: "skip-partial",
outputDir,
extractDir,
status: "queued",
itemIds: [completedItemId, pendingItemId],
cancelled: false,
enabled: true,
createdAt: now,
updatedAt: now
};
session.items[completedItemId] = {
id: completedItemId,
packageId,
url: "https://dummy/skip-partial/completed",
provider: "realdebrid",
status: "completed",
retries: 0,
speedBps: 0,
downloadedBytes: 123,
totalBytes: 123,
progressPercent: 100,
fileName: "skip-partial.part01.rar",
targetPath: completedTarget,
resumable: true,
attempts: 1,
lastError: "",
fullStatus: "Entpackt",
createdAt: now,
updatedAt: now
};
session.items[pendingItemId] = {
id: pendingItemId,
packageId,
url: "https://dummy/skip-partial/pending",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "skip-partial.part02.rar",
targetPath: path.join(outputDir, "skip-partial.part02.rar"),
resumable: true,
attempts: 0,
lastError: "",
fullStatus: "Wartet",
createdAt: now,
updatedAt: now
};
const manager = new DownloadManager(
{
...defaultSettings(),
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract")
},
session,
createStoragePaths(path.join(root, "state"))
);
const result = await manager.resolveStartConflict(packageId, "skip");
expect(result.skipped).toBe(true);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]).toBeDefined();
expect(snapshot.session.items[completedItemId]?.status).toBe("completed");
expect(snapshot.session.items[completedItemId]?.fullStatus).toBe("Entpackt");
expect(snapshot.session.items[pendingItemId]?.status).toBe("queued");
expect(snapshot.session.items[pendingItemId]?.fullStatus).toBe("Wartet");
expect(manager.getSnapshot().session.packages[packageId]).toBeUndefined();
expect(manager.getSnapshot().session.items[itemId]).toBeUndefined();
});
it("resolves start conflict by overwriting and resetting queued package", async () => {
@@ -2495,7 +2388,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 5000);
const snapshot = manager.getSnapshot();
@@ -2727,7 +2620,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "new", links: ["https://dummy/new"] }]);
await manager.start();
manager.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const runningSnapshot = manager.getSnapshot();
@@ -2805,7 +2698,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "fresh-retry", links: ["https://dummy/fresh"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@@ -2890,7 +2783,7 @@ describe("download manager", () => {
expect(extractDir).toBeTruthy();
expect(fs.existsSync(extractDir)).toBe(false);
await manager.start();
manager.start();
await new Promise((resolve) => setTimeout(resolve, 140));
expect(fs.existsSync(extractDir)).toBe(false);
@@ -2899,14 +2792,14 @@ describe("download manager", () => {
const snapshot = manager.getSnapshot();
const item = Object.values(snapshot.session.items)[0];
expect(item?.status).toBe("completed");
expect(item?.fullStatus).toBe("Entpackt - Done");
expect(item?.fullStatus).toBe("Entpackt");
expect(fs.existsSync(extractDir)).toBe(true);
expect(fs.existsSync(path.join(extractDir, "inside.txt"))).toBe(true);
} finally {
server.close();
await once(server, "close");
}
}, 35000);
});
it("keeps accurate summary when completed items are cleaned immediately", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
@@ -2967,7 +2860,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "cleanup", links: ["https://dummy/cleanup"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const snapshot = manager.getSnapshot();
@@ -3048,7 +2941,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "cleanup-package", links: ["https://dummy/cleanup-package"] }]);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const snapshot = manager.getSnapshot();
@@ -3061,7 +2954,7 @@ describe("download manager", () => {
server.close();
await once(server, "close");
}
}, 35000);
});
it("counts queued package cancellations in run summary", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
@@ -3213,7 +3106,7 @@ describe("download manager", () => {
const disabledItemId = initial.session.packages[disabledPkgId]?.itemIds[0] || "";
manager.togglePackage(disabledPkgId);
await manager.start();
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const snapshot = manager.getSnapshot();
@@ -3705,7 +3598,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(path.join(extractDir, "episode.txt")), 25000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
});
it("does not fail startup post-processing when source package dir is missing but extract output exists", async () => {
@@ -3772,7 +3665,7 @@ describe("download manager", () => {
await waitFor(() => manager.getSnapshot().session.items[itemId]?.fullStatus.startsWith("Entpackt"), 12000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
});
it("marks missing source package dir as extracted instead of failed", async () => {
@@ -4012,7 +3905,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(expectedPath), 12000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(fs.existsSync(expectedPath)).toBe(true);
expect(fs.existsSync(originalExtractedPath)).toBe(false);
});
@@ -4135,7 +4028,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(flattenedPath), 12000);
expect(manager.getSnapshot().session.packages[packageId]?.status).toBe("completed");
expect(manager.getSnapshot().session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(manager.getSnapshot().session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(fs.existsSync(flattenedPath)).toBe(true);
expect(fs.existsSync(originalExtractedPath)).toBe(false);
});
@@ -4190,7 +4083,6 @@ describe("download manager", () => {
const sourceFileName = `${nestedFolder}/tvr-gotham-s03e11-720p.mkv`;
const zip = new AdmZip();
zip.addFile(sourceFileName, Buffer.from("video"));
zip.addFile(`${nestedFolder}/tvr-gotham-s03-720p.nfo`, Buffer.from("info"));
zip.addFile(`${nestedFolder}/Thumbs.db`, Buffer.from("thumbs"));
zip.addFile("desktop.ini", Buffer.from("system"));
const archivePath = path.join(outputDir, "episode.zip");
@@ -4306,7 +4198,7 @@ describe("download manager", () => {
expect(internal.globalSpeedLimitNextAt).toBeGreaterThan(start);
});
it("resets speed window head when start finds no runnable items", async () => {
it("resets speed window head when start finds no runnable items", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
@@ -4330,69 +4222,12 @@ describe("download manager", () => {
internal.speedEventsHead = 5;
internal.speedBytesLastWindow = 999;
await manager.start();
manager.start();
expect(internal.speedEventsHead).toBe(0);
expect(internal.speedEvents.length).toBe(0);
expect(internal.speedBytesLastWindow).toBe(0);
});
it("does not trigger global stall abort while write-buffer is disk-blocked", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
const previousGlobalWatchdog = process.env.RD_GLOBAL_STALL_TIMEOUT_MS;
process.env.RD_GLOBAL_STALL_TIMEOUT_MS = "2500";
try {
const manager = new DownloadManager(
{
...defaultSettings(),
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract")
},
emptySession(),
createStoragePaths(path.join(root, "state"))
);
manager.addPackages([{ name: "disk-block-guard", links: ["https://dummy/disk-block-guard"] }]);
const snapshot = manager.getSnapshot();
const packageId = snapshot.session.packageOrder[0] || "";
const itemId = snapshot.session.packages[packageId]?.itemIds[0] || "";
const internal = manager as unknown as any;
internal.session.running = true;
internal.session.paused = false;
internal.session.reconnectUntil = 0;
internal.session.totalDownloadedBytes = 0;
internal.session.items[itemId].status = "downloading";
internal.lastGlobalProgressBytes = 0;
internal.lastGlobalProgressAt = Date.now() - 10000;
const abortController = new AbortController();
internal.activeTasks.set(itemId, {
itemId,
packageId,
abortController,
abortReason: "none",
resumable: true,
nonResumableCounted: false,
blockedOnDiskWrite: true,
blockedOnDiskSince: Date.now() - 5000
});
internal.runGlobalStallWatchdog(Date.now());
expect(abortController.signal.aborted).toBe(false);
expect(internal.lastGlobalProgressAt).toBeGreaterThan(Date.now() - 2000);
} finally {
if (previousGlobalWatchdog === undefined) {
delete process.env.RD_GLOBAL_STALL_TIMEOUT_MS;
} else {
process.env.RD_GLOBAL_STALL_TIMEOUT_MS = previousGlobalWatchdog;
}
}
});
it("cleans run tracking when start conflict is skipped", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);


@@ -1,204 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { spawnSync } from "node:child_process";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalBackend = process.env.RD_EXTRACT_BACKEND;
function hasJavaRuntime(): boolean {
const result = spawnSync("java", ["-version"], { stdio: "ignore" });
return result.status === 0;
}
function hasJvmExtractorRuntime(): boolean {
const root = path.join(process.cwd(), "resources", "extractor-jvm");
const classesMain = path.join(root, "classes", "com", "sucukdeluxe", "extractor", "JBindExtractorMain.class");
const requiredLibs = [
path.join(root, "lib", "sevenzipjbinding.jar"),
path.join(root, "lib", "sevenzipjbinding-all-platforms.jar"),
path.join(root, "lib", "zip4j.jar")
];
return fs.existsSync(classesMain) && requiredLibs.every((libPath) => fs.existsSync(libPath));
}
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalBackend;
}
});
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zipPath = path.join(packageDir, "release.zip");
const zip = new AdmZip();
zip.addFile("episode.txt", Buffer.from("ok"));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("emits progress callbacks with archiveName and percent", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create a ZIP with some content to trigger progress
const zipPath = path.join(packageDir, "progress-test.zip");
const zip = new AdmZip();
zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
zip.writeZip(zipPath);
const progressUpdates: Array<{
archiveName: string;
percent: number;
phase: string;
archivePercent?: number;
}> = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
progressUpdates.push({
archiveName: update.archiveName,
percent: update.percent,
phase: update.phase,
archivePercent: update.archivePercent,
});
},
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
// Should have at least preparing, extracting, and done phases
const phases = new Set(progressUpdates.map((u) => u.phase));
expect(phases.has("preparing")).toBe(true);
expect(phases.has("extracting")).toBe(true);
// Extracting phase should include the archive name
const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
expect(extracting.length).toBeGreaterThan(0);
// Should end at 100%
const lastExtracting = extracting[extracting.length - 1];
expect(lastExtracting.archivePercent).toBe(100);
// Files should exist
expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
});
it("extracts multiple archives sequentially with progress for each", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create two separate ZIP archives
const zip1 = new AdmZip();
zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
zip1.writeZip(path.join(packageDir, "archive1.zip"));
const zip2 = new AdmZip();
zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
zip2.writeZip(path.join(packageDir, "archive2.zip"));
const archiveNames = new Set<string>();
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
if (update.phase === "extracting" && update.archiveName) {
archiveNames.add(update.archiveName);
}
},
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
// Both archive names should have appeared in progress
expect(archiveNames.has("archive1.zip")).toBe(true);
expect(archiveNames.has("archive2.zip")).toBe(true);
// Both files extracted
expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zipPath = path.join(packageDir, "conflict.zip");
const zip = new AdmZip();
zip.addFile("same.txt", Buffer.from("new"));
zip.writeZip(zipPath);
const existingPath = path.join(targetDir, "same.txt");
fs.writeFileSync(existingPath, "old", "utf8");
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "ask",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.readFileSync(existingPath, "utf8")).toBe("old");
});
});


@@ -2,41 +2,15 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import AdmZip from "adm-zip";
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import {
buildExternalExtractArgs,
collectArchiveCleanupTargets,
extractPackageArchives,
archiveFilenamePasswords,
detectArchiveSignature,
classifyExtractionError,
findArchiveCandidates,
} from "../src/main/extractor";
import { afterEach, describe, expect, it } from "vitest";
import { buildExternalExtractArgs, collectArchiveCleanupTargets, extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalExtractBackend = process.env.RD_EXTRACT_BACKEND;
const originalStatfs = fs.promises.statfs;
const originalZipEntryMemoryLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
beforeEach(() => {
process.env.RD_EXTRACT_BACKEND = "legacy";
});
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalExtractBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalExtractBackend;
}
(fs.promises as any).statfs = originalStatfs;
if (originalZipEntryMemoryLimit === undefined) {
delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
} else {
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = originalZipEntryMemoryLimit;
}
});
describe("extractor", () => {
@@ -582,6 +556,7 @@ describe("extractor", () => {
});
it("keeps original ZIP size guard error when external fallback is unavailable", async () => {
const previousLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = "8";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
@@ -595,6 +570,7 @@ describe("extractor", () => {
zip.addFile("large.bin", Buffer.alloc(9 * 1024 * 1024, 7));
zip.writeZip(zipPath);
try {
const result = await extractPackageArchives({
packageDir,
targetDir,
@@ -606,9 +582,20 @@ describe("extractor", () => {
expect(result.extracted).toBe(0);
expect(result.failed).toBe(1);
expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
} finally {
if (previousLimit === undefined) {
delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
} else {
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = previousLimit;
}
}
});
it.skipIf(process.platform !== "win32")("matches resume-state archive names case-insensitively on Windows", async () => {
it("matches resume-state archive names case-insensitively on Windows", async () => {
if (process.platform !== "win32") {
return;
}
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
@@ -631,459 +618,4 @@ describe("extractor", () => {
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
describe("disk space check", () => {
it("aborts extraction when disk space is insufficient", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
zip.writeZip(path.join(packageDir, "test.zip"));
(fs.promises as any).statfs = async () => ({ bfree: 1, bsize: 1 });
await expect(
extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
})
).rejects.toThrow(/Nicht genug Speicherplatz/);
});
it("proceeds when disk space is sufficient", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-ok-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
zip.writeZip(path.join(packageDir, "test.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
});
describe("nested extraction", () => {
it("extracts archives found inside extracted output", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const innerZip = new AdmZip();
innerZip.addFile("deep.txt", Buffer.from("deep content"));
const outerZip = new AdmZip();
outerZip.addFile("inner.zip", innerZip.toBuffer());
outerZip.writeZip(path.join(packageDir, "outer.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "deep.txt"))).toBe(true);
});
it("does not extract blacklisted extensions like .iso", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-bl-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("disc.iso", Buffer.alloc(64, 0));
zip.addFile("readme.txt", Buffer.from("hello"));
zip.writeZip(path.join(packageDir, "package.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(1);
expect(fs.existsSync(path.join(targetDir, "disc.iso"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "readme.txt"))).toBe(true);
});
});
describe("archiveFilenamePasswords", () => {
it("extracts stem and spaced variant from archive name", () => {
const result = archiveFilenamePasswords("MyRelease.S01E01.rar");
expect(result).toContain("MyRelease.S01E01");
expect(result).toContain("MyRelease S01E01");
});
it("strips multipart rar suffix", () => {
const result = archiveFilenamePasswords("Show.S02E03.part01.rar");
expect(result).toContain("Show.S02E03");
expect(result).toContain("Show S02E03");
});
it("strips .zip.001 suffix", () => {
const result = archiveFilenamePasswords("Movie.2024.zip.001");
expect(result).toContain("Movie.2024");
});
it("strips .tar.gz suffix", () => {
const result = archiveFilenamePasswords("backup.tar.gz");
expect(result).toContain("backup");
});
it("returns empty array for empty input", () => {
expect(archiveFilenamePasswords("")).toEqual([]);
});
it("returns single entry when no dots/underscores", () => {
const result = archiveFilenamePasswords("simple.zip");
expect(result).toEqual(["simple"]);
});
it("replaces underscores with spaces", () => {
const result = archiveFilenamePasswords("my_archive_name.7z");
expect(result).toContain("my_archive_name");
expect(result).toContain("my archive name");
});
});
describe(".rev cleanup", () => {
it("collects .rev files for single RAR cleanup", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-rev-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const mainRar = path.join(packageDir, "show.rar");
const rev = path.join(packageDir, "show.rev");
const r00 = path.join(packageDir, "show.r00");
fs.writeFileSync(mainRar, "a", "utf8");
fs.writeFileSync(rev, "b", "utf8");
fs.writeFileSync(r00, "c", "utf8");
const targets = new Set(collectArchiveCleanupTargets(mainRar));
expect(targets.has(mainRar)).toBe(true);
expect(targets.has(rev)).toBe(true);
expect(targets.has(r00)).toBe(true);
});
it("collects .rev files for multipart RAR cleanup", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-rev-mp-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const part1 = path.join(packageDir, "show.part01.rar");
const part2 = path.join(packageDir, "show.part02.rar");
const rev = path.join(packageDir, "show.rev");
fs.writeFileSync(part1, "a", "utf8");
fs.writeFileSync(part2, "b", "utf8");
fs.writeFileSync(rev, "c", "utf8");
const targets = new Set(collectArchiveCleanupTargets(part1));
expect(targets.has(part1)).toBe(true);
expect(targets.has(part2)).toBe(true);
expect(targets.has(rev)).toBe(true);
});
});
describe("generic .001 split cleanup", () => {
it("collects all numbered parts for generic splits", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-split-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const p001 = path.join(packageDir, "movie.001");
const p002 = path.join(packageDir, "movie.002");
const p003 = path.join(packageDir, "movie.003");
const other = path.join(packageDir, "other.001");
fs.writeFileSync(p001, "a", "utf8");
fs.writeFileSync(p002, "b", "utf8");
fs.writeFileSync(p003, "c", "utf8");
fs.writeFileSync(other, "x", "utf8");
const targets = new Set(collectArchiveCleanupTargets(p001));
expect(targets.has(p001)).toBe(true);
expect(targets.has(p002)).toBe(true);
expect(targets.has(p003)).toBe(true);
expect(targets.has(other)).toBe(false);
});
});
describe("detectArchiveSignature", () => {
it("detects RAR signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.rar");
// RAR4 signature: 52 61 72 21 1A 07 00 ("Rar!" magic; RAR5 would end in 01 00)
fs.writeFileSync(filePath, Buffer.from("526172211a0700", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("rar");
});
it("detects ZIP signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.zip");
fs.writeFileSync(filePath, Buffer.from("504b030414000000", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("zip");
});
it("detects 7z signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.7z");
fs.writeFileSync(filePath, Buffer.from("377abcaf271c0004", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("7z");
});
it("returns null for non-archive files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.txt");
fs.writeFileSync(filePath, "Hello World", "utf8");
const sig = await detectArchiveSignature(filePath);
expect(sig).toBeNull();
});
it("returns null for non-existent file", async () => {
const sig = await detectArchiveSignature("/nonexistent/file.rar");
expect(sig).toBeNull();
});
});
describe("findArchiveCandidates extended formats", () => {
it("finds .tar.gz files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-tar-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "backup.tar.gz"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "readme.txt"), "info", "utf8");
const candidates = await findArchiveCandidates(packageDir);
expect(candidates.map((c) => path.basename(c))).toContain("backup.tar.gz");
});
it("finds .tar.bz2 files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-tar-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "archive.tar.bz2"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
expect(candidates.map((c) => path.basename(c))).toContain("archive.tar.bz2");
});
it("finds generic .001 split files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-split-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "movie.001"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "movie.002"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
const names = candidates.map((c) => path.basename(c));
expect(names).toContain("movie.001");
// .002 should NOT be in candidates (only .001 is the entry point)
expect(names).not.toContain("movie.002");
});
it("does not duplicate .zip.001 as generic split", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dedup-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "movie.zip.001"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "movie.zip.002"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
const names = candidates.map((c) => path.basename(c));
// .zip.001 should appear once from zipSplit detection, not duplicated by genericSplit
expect(names.filter((n) => n === "movie.zip.001")).toHaveLength(1);
});
});
describe("classifyExtractionError", () => {
it("classifies CRC errors", () => {
expect(classifyExtractionError("CRC failed for file.txt")).toBe("crc_error");
expect(classifyExtractionError("Checksum error in data")).toBe("crc_error");
});
it("classifies wrong password", () => {
expect(classifyExtractionError("Wrong password")).toBe("wrong_password");
expect(classifyExtractionError("Falsches Passwort")).toBe("wrong_password");
});
it("classifies missing parts", () => {
expect(classifyExtractionError("Missing volume: part2.rar")).toBe("missing_parts");
expect(classifyExtractionError("Unexpected end of archive")).toBe("missing_parts");
});
it("classifies unsupported format", () => {
expect(classifyExtractionError("kein RAR-Archiv")).toBe("unsupported_format");
expect(classifyExtractionError("UNSUPPORTEDMETHOD")).toBe("unsupported_format");
});
it("classifies disk full", () => {
expect(classifyExtractionError("Nicht genug Speicherplatz")).toBe("disk_full");
expect(classifyExtractionError("No space left on device")).toBe("disk_full");
});
it("classifies timeout", () => {
expect(classifyExtractionError("Entpacken Timeout nach 360s")).toBe("timeout");
});
it("classifies abort", () => {
expect(classifyExtractionError("aborted:extract")).toBe("aborted");
});
it("classifies no extractor", () => {
expect(classifyExtractionError("WinRAR/UnRAR nicht gefunden")).toBe("no_extractor");
});
it("returns unknown for unrecognized errors", () => {
expect(classifyExtractionError("something weird happened")).toBe("unknown");
});
});
describe("password discovery", () => {
it("extracts first archive serially before parallel pool when multiple passwords", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create 3 zip archives
for (const name of ["ep01.zip", "ep02.zip", "ep03.zip"]) {
const zip = new AdmZip();
zip.addFile(`${name}.txt`, Buffer.from(name));
zip.writeZip(path.join(packageDir, name));
}
const seenOrder: string[] = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 2,
passwordList: "pw1|pw2|pw3",
onProgress: (update) => {
if (update.phase !== "extracting" || !update.archiveName) return;
if (seenOrder[seenOrder.length - 1] !== update.archiveName) {
seenOrder.push(update.archiveName);
}
}
});
expect(result.extracted).toBe(3);
expect(result.failed).toBe(0);
// First archive should be ep01 (natural order, extracted serially for discovery)
expect(seenOrder[0]).toBe("ep01.zip");
});
it("skips discovery when only one password candidate", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-skip-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
for (const name of ["a.zip", "b.zip"]) {
const zip = new AdmZip();
zip.addFile(`${name}.txt`, Buffer.from(name));
zip.writeZip(path.join(packageDir, name));
}
// No passwordList → only empty string → length=1 → no discovery phase
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 4
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
});
it("skips discovery when only one archive", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-one-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("single.txt", Buffer.from("single"));
zip.writeZip(path.join(packageDir, "only.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 4,
passwordList: "pw1|pw2|pw3"
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
});
});
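Taken together, these cases pin down a two-phase strategy: when several password candidates and several archives exist, the first archive is extracted serially to discover the working password, and only then are the remaining archives fanned out to a parallel worker pool. A minimal sketch of that control flow (all names and signatures here are assumptions, not the app's actual API):

```typescript
// Hypothetical sketch: serial password discovery on the first archive,
// then a parallel worker pool for the rest using the winning password.
async function extractAllSketch(
  archives: string[],
  passwords: string[],
  extractOne: (archive: string, password: string) => Promise<boolean>,
  maxParallel: number
): Promise<void> {
  let found = passwords[0] ?? "";
  if (passwords.length > 1 && archives.length > 1) {
    // Discovery phase: try each candidate on the first archive, serially.
    for (const pw of passwords) {
      if (await extractOne(archives[0], pw)) { found = pw; break; }
    }
  } else if (archives.length > 0) {
    // Only one candidate or one archive: no discovery phase needed.
    await extractOne(archives[0], found);
  }
  // Parallel pool for the remaining archives with the discovered password.
  const rest = archives.slice(1);
  let next = 0;
  const worker = async (): Promise<void> => {
    while (next < rest.length) {
      const archive = rest[next++]; // synchronous claim, no await in between
      await extractOne(archive, found);
    }
  };
  await Promise.all(
    Array.from({ length: Math.min(maxParallel, rest.length) }, worker)
  );
}
```

The ordering guarantee the first test asserts falls out naturally: no pool worker starts before the serial discovery loop has settled on a password.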

View File

@@ -123,56 +123,5 @@ describe("mega-web-fallback", () => {
// Generation fails -> resets cookie -> tries again -> fails again -> returns null
expect(result).toBeNull();
});
it("aborts pending Mega-Web polling when signal is cancelled", async () => {
globalThis.fetch = vi.fn((url: string | URL | Request, init?: RequestInit): Promise<Response> => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return Promise.resolve(new Response("", { headers, status: 200 }));
}
if (urlStr.includes("page=debrideur")) {
return Promise.resolve(new Response('<form id="debridForm"></form>', { status: 200 }));
}
if (urlStr.includes("form=debrid")) {
return Promise.resolve(new Response(`
<div class="acp-box">
<h3>Link: https://mega.debrid/link2</h3>
<a href="javascript:processDebrid(1,'secretcode456',0)">Download</a>
</div>
`, { status: 200 }));
}
if (urlStr.includes("ajax=debrid")) {
return new Promise<Response>((_resolve, reject) => {
const signal = init?.signal;
const onAbort = (): void => reject(new Error("aborted:ajax"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
});
}
return Promise.resolve(new Response("Not found", { status: 404 }));
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));
const controller = new AbortController();
const timer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);
} finally {
clearTimeout(timer);
}
});
});
});
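The cancellation test above relies on the fallback threading its AbortSignal into every pending fetch. The general pattern — pass the signal to `fetch` and make any in-between waits abort-aware — can be sketched like this (`pollUntilReady` and its URL semantics are illustrative assumptions, not the fallback's real code):

```typescript
// Hypothetical sketch: a polling loop that honors an external AbortSignal,
// as the cancellation test above expects.
async function pollUntilReady(
  url: string,
  signal: AbortSignal,
  intervalMs = 1000
): Promise<string> {
  for (;;) {
    if (signal.aborted) throw new Error("aborted:poll");
    // fetch rejects promptly if the signal fires mid-request.
    const res = await fetch(url, { signal });
    const body = await res.text();
    if (body.includes("ready")) return body;
    // The sleep between polls must also be abort-aware, otherwise a
    // cancellation arriving here would be ignored until the next request.
    await new Promise<void>((resolve, reject) => {
      const timer = setTimeout(resolve, intervalMs);
      signal.addEventListener("abort", () => {
        clearTimeout(timer);
        reject(new Error("aborted:poll"));
      }, { once: true });
    });
  }
}
```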

View File

@@ -1,188 +0,0 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";
type MinimalItem = {
targetPath?: string;
fileName?: string;
[key: string]: unknown;
};
function makeItems(names: string[]): MinimalItem[] {
return names.map((name) => ({
targetPath: `C:\\Downloads\\Package\\${name}`,
fileName: name,
id: name,
status: "completed",
}));
}
describe("resolveArchiveItemsFromList", () => {
// ── Multipart RAR (.partN.rar) ──
it("matches multipart .part1.rar archives", () => {
const items = makeItems([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
"Other.rar",
]);
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(3);
expect(result.map((i: any) => i.fileName)).toEqual([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
]);
});
it("matches multipart .part01.rar archives (zero-padded)", () => {
const items = makeItems([
"Film.part01.rar",
"Film.part02.rar",
"Film.part10.rar",
"Unrelated.zip",
]);
const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
expect(result).toHaveLength(3);
});
// ── Old-style RAR (.rar + .r00, .r01, etc.) ──
it("matches old-style .rar + .rNN volumes", () => {
const items = makeItems([
"Archive.rar",
"Archive.r00",
"Archive.r01",
"Archive.r02",
"Other.zip",
]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(4);
});
// ── Single RAR ──
it("matches a single .rar file", () => {
const items = makeItems(["SingleFile.rar", "Other.mkv"]);
const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("SingleFile.rar");
});
// ── Split ZIP ──
it("matches split .zip.NNN files", () => {
const items = makeItems([
"Data.zip",
"Data.zip.001",
"Data.zip.002",
"Data.zip.003",
]);
const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
expect(result).toHaveLength(4);
});
// ── Split 7z ──
it("matches split .7z.NNN files", () => {
const items = makeItems([
"Backup.7z.001",
"Backup.7z.002",
]);
const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
expect(result).toHaveLength(2);
});
// ── Generic .NNN splits ──
it("matches generic .NNN split files", () => {
const items = makeItems([
"video.001",
"video.002",
"video.003",
]);
const result = resolveArchiveItemsFromList("video.001", items as any);
expect(result).toHaveLength(3);
});
// ── Exact filename match ──
it("matches a single .zip by exact name", () => {
const items = makeItems(["myarchive.zip", "other.rar"]);
const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("myarchive.zip");
});
// ── Case insensitivity ──
it("matches case-insensitively", () => {
const items = makeItems([
"MOVIE.PART1.RAR",
"MOVIE.PART2.RAR",
]);
const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Stem-based fallback ──
it("uses stem-based fallback when exact patterns fail", () => {
// Simulate a debrid service that renames "Movie.part1.rar" to "Movie.part1_dl.rar"
// but the disk file is "Movie.part1.rar"
const items = makeItems([
"Movie.rar",
]);
// The archive on disk is "Movie.part1.rar" but there's no item matching the
// .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
// stem fallback: "movie" starts with "movie" and ends with .rar
expect(result).toHaveLength(1);
});
// ── Single item fallback ──
it("returns single archive item when no pattern matches", () => {
const items = makeItems(["totally-different-name.rar"]);
const result = resolveArchiveItemsFromList("Original.rar", items as any);
// Single item in list with archive extension → return it
expect(result).toHaveLength(1);
});
// ── Empty when no match ──
it("returns empty when items have no archive extensions", () => {
const items = makeItems(["video.mkv", "subtitle.srt"]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(0);
});
// ── Items without targetPath ──
it("falls back to fileName when targetPath is missing", () => {
const items = [
{ fileName: "Movie.part1.rar", id: "1", status: "completed" },
{ fileName: "Movie.part2.rar", id: "2", status: "completed" },
];
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Multiple archives, should not cross-match ──
it("does not cross-match different archive groups", () => {
const items = makeItems([
"Episode.S01E01.part1.rar",
"Episode.S01E01.part2.rar",
"Episode.S01E02.part1.rar",
"Episode.S01E02.part2.rar",
]);
const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
expect(result1).toHaveLength(2);
expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);
const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
expect(result2).toHaveLength(2);
expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
});
});
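The matching rules these cases describe — `.partN.rar` sets, old-style `.rar`/`.rNN` volumes, split `.zip.NNN`/`.7z.NNN`/`.NNN` sequences, and exact-name fallback — can be approximated with a few regexes. A rough sketch (hypothetical, not the module's actual implementation):

```typescript
// Hypothetical sketch of multipart-archive grouping as the cases above imply.
function groupArchiveParts(lead: string, names: string[]): string[] {
  const lower = lead.toLowerCase();
  // Multipart RAR: Name.partN.rar (N may be zero-padded)
  const partRar = lower.match(/^(.*)\.part\d+\.rar$/);
  if (partRar) {
    const stem = partRar[1];
    return names.filter(
      (n) => n.toLowerCase().match(/^(.*)\.part\d+\.rar$/)?.[1] === stem
    );
  }
  // Old-style RAR: Name.rar plus Name.r00, Name.r01, ...
  const oldRar = lower.match(/^(.*)\.rar$/);
  if (oldRar) {
    const stem = oldRar[1];
    return names.filter((n) => {
      const l = n.toLowerCase();
      return l === `${stem}.rar` || (/\.r\d{2}$/.test(l) && l.startsWith(`${stem}.r`));
    });
  }
  // Split archives: Name.zip.001, Name.7z.001, or generic Name.001
  const split = lower.match(/^(.*?)(\.(zip|7z))?\.\d{3}$/);
  if (split) {
    const stem = split[1] + (split[2] ?? "");
    return names.filter((n) => {
      const l = n.toLowerCase();
      return l === stem || (l.startsWith(`${stem}.`) && /\.\d{3}$/.test(l));
    });
  }
  // Fallback: exact case-insensitive match
  return names.filter((n) => n.toLowerCase() === lower);
}
```

The real `resolveArchiveItemsFromList` additionally handles the stem-based and single-item fallbacks the last tests cover; this sketch only shows the pattern layer.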

View File

@@ -153,7 +153,7 @@ async function main(): Promise<void> {
createStoragePaths(path.join(tempRoot, "state-pause"))
);
manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
await manager2.start();
manager2.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const paused = manager2.togglePause();
assert(paused, "Pause konnte nicht aktiviert werden");

View File

@@ -1,163 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "../src/main/session-log";
import { setLogListener } from "../src/main/logger";
const tempDirs: string[] = [];
afterEach(() => {
// Ensure session log is shut down between tests
shutdownSessionLog();
// Ensure listener is cleared between tests
setLogListener(null);
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("session-log", () => {
it("initSessionLog creates directory and file", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath();
expect(logPath).not.toBeNull();
expect(fs.existsSync(logPath!)).toBe(true);
expect(fs.existsSync(path.join(baseDir, "session-logs"))).toBe(true);
expect(path.basename(logPath!)).toMatch(/^session_\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}\.txt$/);
const content = fs.readFileSync(logPath!, "utf8");
expect(content).toContain("=== Session gestartet:");
shutdownSessionLog();
});
it("logger listener writes to session log", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
// Simulate a log line via the listener
const { logger } = await import("../src/main/logger");
logger.info("Test-Nachricht für Session-Log");
// Wait for flush (200ms interval + margin)
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("Test-Nachricht für Session-Log");
shutdownSessionLog();
});
it("shutdownSessionLog writes closing line", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("=== Session beendet:");
});
it("shutdownSessionLog removes listener", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
// Log after shutdown - should NOT appear in session log
const { logger } = await import("../src/main/logger");
logger.info("Nach-Shutdown-Nachricht");
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).not.toContain("Nach-Shutdown-Nachricht");
});
it("cleanupOldSessionLogs deletes old files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a fake old session log
const oldFile = path.join(logsDir, "session_2020-01-01_00-00-00.txt");
fs.writeFileSync(oldFile, "old session");
// Set mtime to 30 days ago
const oldTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
fs.utimesSync(oldFile, oldTime, oldTime);
// Create a recent file
const newFile = path.join(logsDir, "session_2099-01-01_00-00-00.txt");
fs.writeFileSync(newFile, "new session");
// initSessionLog triggers cleanup
initSessionLog(baseDir);
// Wait for async cleanup
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(oldFile)).toBe(false);
expect(fs.existsSync(newFile)).toBe(true);
shutdownSessionLog();
});
it("cleanupOldSessionLogs keeps recent files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a file from 2 days ago (should be kept)
const recentFile = path.join(logsDir, "session_2025-12-01_00-00-00.txt");
fs.writeFileSync(recentFile, "recent session");
const recentTime = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000);
fs.utimesSync(recentFile, recentTime, recentTime);
initSessionLog(baseDir);
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(recentFile)).toBe(true);
shutdownSessionLog();
});
it("multiple sessions create different files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const path1 = getSessionLogPath();
shutdownSessionLog();
// Small delay to ensure different timestamp
await new Promise((resolve) => setTimeout(resolve, 1100));
initSessionLog(baseDir);
const path2 = getSessionLogPath();
shutdownSessionLog();
expect(path1).not.toBeNull();
expect(path2).not.toBeNull();
expect(path1).not.toBe(path2);
expect(fs.existsSync(path1!)).toBe(true);
expect(fs.existsSync(path2!)).toBe(true);
});
});

View File

@@ -304,58 +304,6 @@ describe("settings storage", () => {
expect(loaded.packageOrder).toEqual(empty.packageOrder);
});
it("loads backup session when primary session is corrupted", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const backupSession = emptySession();
backupSession.packageOrder = ["pkg-backup"];
backupSession.packages["pkg-backup"] = {
id: "pkg-backup",
name: "Backup Package",
outputDir: path.join(dir, "out"),
extractDir: path.join(dir, "extract"),
status: "queued",
itemIds: ["item-backup"],
cancelled: false,
enabled: true,
createdAt: Date.now(),
updatedAt: Date.now()
};
backupSession.items["item-backup"] = {
id: "item-backup",
packageId: "pkg-backup",
url: "https://example.com/backup-file",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "backup-file.rar",
targetPath: path.join(dir, "out", "backup-file.rar"),
resumable: true,
attempts: 0,
lastError: "",
fullStatus: "Wartet",
createdAt: Date.now(),
updatedAt: Date.now()
};
fs.writeFileSync(`${paths.sessionFile}.bak`, JSON.stringify(backupSession), "utf8");
fs.writeFileSync(paths.sessionFile, "{broken-session-json", "utf8");
const loaded = loadSession(paths);
expect(loaded.packageOrder).toEqual(["pkg-backup"]);
expect(loaded.packages["pkg-backup"]?.name).toBe("Backup Package");
expect(loaded.items["item-backup"]?.fileName).toBe("backup-file.rar");
const restoredPrimary = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { packages?: Record<string, unknown> };
expect(restoredPrimary.packages && "pkg-backup" in restoredPrimary.packages).toBe(true);
});
it("returns defaults when config file contains invalid JSON", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
@@ -447,32 +395,6 @@ describe("settings storage", () => {
expect(persisted.summaryText).toBe("before-mutation");
});
it("creates session backup before sync and async session overwrites", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const first = emptySession();
first.summaryText = "first";
saveSession(paths, first);
const second = emptySession();
second.summaryText = "second";
saveSession(paths, second);
const backupAfterSync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
expect(backupAfterSync.summaryText).toBe("first");
const third = emptySession();
third.summaryText = "third";
await saveSessionAsync(paths, third);
const backupAfterAsync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
const primaryAfterAsync = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { summaryText?: string };
expect(backupAfterAsync.summaryText).toBe("second");
expect(primaryAfterAsync.summaryText).toBe("third");
});
it("applies defaults for missing fields when loading old config", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);

View File

@@ -22,7 +22,7 @@ afterEach(() => {
describe("update", () => {
it("normalizes update repo input", () => {
expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("")).toBe("Sucukdeluxe/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
@@ -31,14 +31,14 @@ describe("update", () => {
expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
});
it("uses normalized repo slug for API requests", async () => {
it("uses normalized repo slug for Codeberg API requests", async () => {
let requestedUrl = "";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
return new Response(
JSON.stringify({
tag_name: `v${APP_VERSION}`,
html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
html_url: "https://codeberg.org/owner/repo/releases/tag/v1.0.0",
assets: []
}),
{
@@ -48,8 +48,8 @@
);
}) as typeof fetch;
const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
const result = await checkGitHubUpdate("https://codeberg.org/owner/repo/releases");
expect(requestedUrl).toBe("https://codeberg.org/api/v1/repos/owner/repo/releases/latest");
expect(result.currentVersion).toBe(APP_VERSION);
expect(result.latestVersion).toBe(APP_VERSION);
expect(result.updateAvailable).toBe(false);
@@ -95,7 +95,7 @@ describe("update", () => {
if (url.includes("stale-setup.exe")) {
return new Response("missing", { status: 404 });
}
if (url.includes("/releases/download/v9.9.9/")) {
if (url.includes("/releases/latest/download/")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
@@ -117,7 +117,7 @@
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.includes("/releases/download/v9.9.9/"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("/releases/latest/download/"))).toBe(true);
expect(requestedUrls.filter((url) => url.includes("stale-setup.exe"))).toHaveLength(1);
});
@@ -362,73 +362,6 @@
expect(requestedUrls.some((url) => url.includes("latest.yml"))).toBe(true);
});
it("rejects installer when latest.yml SHA512 digest does not match", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const wrongDigestBase64 = Buffer.alloc(64, 0x13).toString("base64");
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup-no-digest.exe"
},
{
name: "latest.yml",
browser_download_url: "https://example.invalid/latest.yml"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("latest.yml")) {
return new Response(
`version: 9.9.9\npath: Real-Debrid-Downloader Setup 9.9.9.exe\nsha512: ${wrongDigestBase64}\n`,
{
status: 200,
headers: { "Content-Type": "text/yaml" }
}
);
}
if (url.includes("setup-no-digest.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/setup-no-digest.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/sha512|integrit|mismatch/i);
});
it("emits install progress events while downloading and launching update", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const digest = sha256Hex(executablePayload);
@@ -484,14 +417,14 @@ });
});
it("returns default for malformed inputs", () => {
expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("just-one-part")).toBe("Sucukdeluxe/real-debrid-downloader");
expect(normalizeUpdateRepo(" ")).toBe("Sucukdeluxe/real-debrid-downloader");
});
it("rejects traversal-like owner or repo segments", () => {
expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("../owner/repo")).toBe("Sucukdeluxe/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
});
it("handles www prefix", () => {

View File

@@ -12,5 +12,5 @@
"isolatedModules": true,
"types": ["node", "vite/client"]
},
"include": ["src", "tests", "vite.config.mts"]
"include": ["src", "tests", "vite.config.ts"]
}
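The `normalizeUpdateRepo` behavior exercised earlier in this diff — strip URL/SSH prefixes and `.git`, keep only `owner/repo`, and fall back to a default slug for empty, malformed, or traversal-like input — can be approximated by a sketch like the following (the default slug and all internals are assumptions, not the project's implementation):

```typescript
// Hypothetical default slug; the tests in this diff show it changing
// between releases, so treat it as configuration.
const DEFAULT_REPO = "owner/real-debrid-downloader";

function normalizeUpdateRepo(input: string): string {
  let value = input.trim();
  if (!value) return DEFAULT_REPO;
  // Strip URL prefixes (https://host/, optional www.) and SSH prefixes (git@host:)
  value = value
    .replace(/^https?:\/\/(www\.)?[^/]+\//i, "")
    .replace(/^git@[^:]+:/i, "");
  // Drop a trailing .git and keep only the first two path segments.
  const parts = value.replace(/\.git$/i, "").split("/").filter(Boolean);
  if (parts.length < 2) return DEFAULT_REPO;
  const [owner, repo] = parts;
  // Reject traversal-like segments outright.
  if (owner === ".." || owner === "." || repo === ".." || repo === ".") {
    return DEFAULT_REPO;
  }
  return `${owner}/${repo}`;
}
```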