Compare commits


No commits in common. "main" and "v1.0.3" have entirely different histories.
main ... v1.0.3

84 changed files with 1531 additions and 45004 deletions

59
.github/workflows/release.yml vendored Normal file
View File

@@ -0,0 +1,59 @@
name: Build and Release
permissions:
  contents: write
on:
  push:
    tags:
      - "v*"
jobs:
  build:
    runs-on: windows-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pyinstaller
      - name: Build exe
        run: |
          pyinstaller --noconfirm --onefile --windowed --name "Real-Debrid-Downloader" real_debrid_downloader_gui.py
      - name: Pack release zip
        shell: pwsh
        run: |
          New-Item -ItemType Directory -Path release -Force | Out-Null
          Copy-Item "dist/Real-Debrid-Downloader.exe" "release/Real-Debrid-Downloader.exe"
          Compress-Archive -Path "release/*" -DestinationPath "Real-Debrid-Downloader-win64.zip" -Force
      - name: Install Inno Setup
        shell: pwsh
        run: |
          choco install innosetup --no-progress -y
      - name: Build installer
        shell: pwsh
        run: |
          $version = "${{ github.ref_name }}".TrimStart('v')
          & "C:\Program Files (x86)\Inno Setup 6\ISCC.exe" "/DMyAppVersion=$version" "/DMySourceExe=..\\dist\\Real-Debrid-Downloader.exe" "/DMyOutputDir=..\\release" "installer\\RealDebridDownloader.iss"
      - name: Publish GitHub Release
        uses: softprops/action-gh-release@v2
        with:
          files: |
            Real-Debrid-Downloader-win64.zip
            release/Real-Debrid-Downloader-Setup-*.exe
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

24
.gitignore vendored
View File

@@ -8,32 +8,8 @@ venv/
build/
dist/
release/
*.spec
rd_downloader_config.json
rd_downloader.log
rd_download_manifest.json
_update_staging/
apply_update.cmd
.claude/
.github/
docs/plans/
CHANGELOG.md
node_modules/
.vite/
coverage/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Forgejo deployment runtime files
deploy/forgejo/.env
deploy/forgejo/forgejo/
deploy/forgejo/postgres/
deploy/forgejo/caddy/data/
deploy/forgejo/caddy/config/
deploy/forgejo/caddy/logs/
deploy/forgejo/backups/

View File

@@ -1,23 +0,0 @@
## Release + Update Source (Important)
- Primary platform is `https://git.24-music.de`
- Default repo: `Administrator/real-debrid-downloader`
- No longer release primarily via Codeberg/GitHub
## Releasing
1. Set the token:
   - PowerShell: `$env:GITEA_TOKEN="<token>"`
2. Run the release:
   - `npm run release:gitea -- <version> [notes]`
The script:
- bumps `package.json`
- builds the Windows artifacts
- pushes `main` + tag
- creates the release on `git.24-music.de`
- uploads the assets
## Auto-Update
- The updater currently uses `git.24-music.de` as its default source

246
README.md
View File

@@ -1,203 +1,93 @@
# Multi Debrid Downloader
# Real-Debrid Downloader GUI
Desktop downloader with fast queue management, automatic extraction, and robust error handling.
Small desktop app with a GUI (Tkinter) for pasting multiple links (e.g. 20+),
unrestricting them via Real-Debrid, and downloading them straight to your PC.
![Platform](https://img.shields.io/badge/platform-Windows%2010%2F11-0078D6)
![Electron](https://img.shields.io/badge/Electron-31.x-47848F)
![React](https://img.shields.io/badge/React-18.x-149ECA)
![TypeScript](https://img.shields.io/badge/TypeScript-5.x-3178C6)
![License](https://img.shields.io/badge/license-MIT-green)
## Features
## Why this tool?
- Multiple links at once (one link per line)
- Uses the Real-Debrid API (`/unrestrict/link`)
- Download status per link
- Download speed per link and in total
- Overall progress
- Selectable download folder and package name
- Configurable parallel downloads (e.g. 20 at once)
- Automatic extraction after download
- `Entpacken nach` + optional `Unterordner erstellen (Paketname)`, as in JDownloader
- ZIP password check using `serienfans.org` and `serienjunkies.net`
- Multi-part RAR is extracted via `part1` (only once all parts are present)
- Auto-update check via GitHub Releases (for the .exe)
- Optional local storage of the API token
- Familiar download-manager workflow: collect links, start, pause, resume, and finish cleanly.
- Multiple debrid providers in one app, including automatic fallback.
- Built for stability with large queues: session persistence, reconnect handling, resume support, and integrity verification.
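The `/unrestrict/link` call listed in the features above can be sketched with Python's standard library alone. This is a minimal illustration: `build_request` and `direct_url` are made-up helper names, not the app's actual API.

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://api.real-debrid.com/rest/1.0/unrestrict/link"

def build_request(link: str, token: str) -> urllib.request.Request:
    """Build the authenticated POST request for one hoster link."""
    data = urllib.parse.urlencode({"link": link}).encode()
    return urllib.request.Request(
        API_URL,
        data=data,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )

def direct_url(payload: dict) -> str:
    """Pull the direct download URL out of the API's JSON response."""
    return payload.get("download") or payload.get("link") or ""

# Sending the request (requires a valid token and network access):
# with urllib.request.urlopen(build_request(link, token)) as resp:
#     info = json.load(resp)
#     print(info["filename"], direct_url(info))
```

The response's `download` field holds the direct URL; `filename` the suggested file name.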
## Requirements
## Core features
### Queue and download engine
- Package-based queue with file status, progress, ETA, speed, and retry counters.
- Start, pause, stop, and cancel for both single items and full packages.
- Multi-select via Ctrl+Click for batch operations on packages and items.
- Duplicate handling when adding links: keep, skip, or overwrite.
- Session recovery after restart, including optional auto-resume.
- Circuit breaker with escalating backoff cooldowns to handle provider outages gracefully.
### Debrid and link handling
- Supported providers: `realdebrid`, `megadebrid`, `bestdebrid`, `alldebrid`.
- Configurable provider order: primary + secondary + tertiary.
- Optional automatic fallback to alternative providers on failures.
- `.dlc` import via file picker and drag-and-drop.
### Extraction, cleanup, and quality
- JVM-based extraction backend using SevenZipJBinding + Zip4j (supports RAR, 7z, ZIP, and more).
- Automatic fallback to legacy UnRAR/7z CLI tools when JVM is unavailable.
- Auto-extract with separate target directory and conflict strategies.
- Hybrid extraction: simultaneous downloading and extracting with smart I/O priority throttling.
- Nested extraction: archives within archives are automatically extracted (one level deep).
- Pre-extraction disk space validation to prevent incomplete extracts.
- Right-click "Extract now" on any package with at least one completed item.
- Post-download integrity checks (`CRC32`, `MD5`, `SHA1`) with auto-retry on failures.
- Completed-item cleanup policy: `never`, `immediate`, `on_start`, `package_done`.
- Optional removal of link artifacts and sample files after extraction.
### Auto-rename
- Automatic renaming of extracted files based on series/episode patterns.
- Multi-episode token parsing for batch renames.
### UI and progress
- Visual progress bars with percentage overlay for packages and individual items.
- Real-time bandwidth chart showing current download speeds.
- Persistent download counters: all-time totals and per-session statistics.
- Download history for completed packages.
- Vertical sidebar with organized settings tabs.
- Hoster display showing both the original source and the debrid provider used.
### Convenience and automation
- Clipboard watcher for automatic link detection.
- Minimize-to-tray with tray menu controls.
- Speed limits globally or per download.
- Bandwidth schedules for time-based speed profiles.
- Built-in auto-updater via `git.24-music.de` Releases.
- Long path support (>260 characters) on Windows.
- Python 3.10+
- Optional but recommended: 7-Zip in PATH for RAR/7Z extraction
- Alternative for RAR: WinRAR `UnRAR.exe` (detected automatically)
## Installation
### Option A: prebuilt releases (recommended)
```bash
python -m venv .venv
.venv\Scripts\activate
pip install -r requirements.txt
```
1. Download a release from the `git.24-music.de` Releases page.
2. Run the installer or portable build.
3. Add your debrid tokens in Settings.
Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
### Option B: build from source
Requirements:
- Node.js `20+` (recommended `22+`)
- npm
- Windows `10/11` (for packaging and regular desktop use)
- Java Runtime `8+` (for SevenZipJBinding sidecar backend)
- Optional fallback: 7-Zip/UnRAR if you force legacy extraction mode
## Start
```bash
npm install
npm run dev
python real_debrid_downloader_gui.py
```
## NPM scripts
## Usage
| Command | Description |
| --- | --- |
| `npm run dev` | Starts main process, renderer, and Electron in dev mode |
| `npm run build` | Builds main and renderer bundles |
| `npm run start` | Starts the app locally in production mode |
| `npm test` | Runs Vitest unit tests |
| `npm run self-check` | Runs integrated end-to-end self-checks |
| `npm run release:win` | Creates Windows installer and portable build |
| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
1. Enter your Real-Debrid API token (`https://real-debrid.com/apitoken`)
2. Choose a download folder
3. Optionally set a package name (otherwise one is generated automatically)
4. Optionally choose an extraction folder (`Entpacken nach`)
5. Optionally keep `Unterordner erstellen (Paketname)` enabled
6. Set the parallelism value (e.g. 20)
7. Paste links into the text field (one link per line)
8. Click `Download starten`
### One-command git.24-music release
If you paste 20 links, they are treated as one package. Downloads land in a package folder, and during extraction the same package name can automatically be used as a subfolder.
## Auto-Update (GitHub)
1. The default repo is already set: `Sucukdeluxe/real-debrid-downloader`
2. Optionally, you can override it in the app via `GitHub Repo (owner/name)`
3. Click `Update suchen` or enable `Beim Start auf Updates pruefen`
4. In the .exe, a new release is downloaded and installed on the next restart
Note: the Python script only shows a release notice; it does not replace itself.
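At its core, the update check compares the running version with the latest release tag. A minimal sketch of that comparison (the helper names are illustrative, not the app's actual code):

```python
def parse_version(tag: str) -> tuple[int, ...]:
    """Turn a release tag like 'v1.0.3' into a comparable tuple (1, 0, 3)."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

def update_available(current: str, latest: str) -> bool:
    """True when the latest release tag is newer than the running version."""
    return parse_version(latest) > parse_version(current)

print(update_available("v1.0.1", "v1.0.3"))  # True
```

Tuple comparison handles multi-digit components correctly (`v1.10.0` is newer than `v1.9.9`), which naive string comparison gets wrong.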
## Release Build (.exe)
```bash
npm run release:gitea -- 1.6.31 "- Maintenance update"
pip install pyinstaller
pyinstaller --noconfirm --onefile --windowed --name "Real-Debrid-Downloader" real_debrid_downloader_gui.py
```
This command will:
Afterwards, the EXE is in `dist/`.
1. Bump `package.json` version.
2. Build setup/portable artifacts (`npm run release:win`).
3. Commit and push `main` to your `git.24-music.de` remote.
4. Create and push tag `v<version>`.
5. Create/update the Gitea release and upload required assets.
## GitHub Release Workflow
Required once before release:
- Workflow file: `.github/workflows/release.yml`
- Pushing a tag such as `v1.0.1` automatically builds a Windows EXE
- Release asset for auto-update: `Real-Debrid-Downloader-win64.zip`
- Additionally, an installer is built: `Real-Debrid-Downloader-Setup-<version>.exe`
## Auto-Installer
- The GitHub release directly contains the setup file (`...Setup-<version>.exe`)
- The setup installs the app under `Programme/Real-Debrid Downloader`
- Optionally, the setup creates a desktop shortcut
Example:
```bash
git remote add gitea https://git.24-music.de/<user>/<repo>.git
git tag v1.0.1
git push origin v1.0.1
```
PowerShell token setup:
```powershell
$env:GITEA_TOKEN="<your-token>"
```
## Typical workflow
1. Add provider tokens in Settings.
2. Paste/import links or `.dlc` containers.
3. Optionally set package names, target folders, extraction, and cleanup rules.
4. Start the queue and monitor progress in the Downloads tab.
5. Review integrity results and summary after completion.
## Project structure
- `src/main` - Electron main process, queue/download/provider logic
- `src/preload` - secure IPC bridge between main and renderer
- `src/renderer` - React UI
- `src/shared` - shared types and IPC contracts
- `tests` - unit tests and self-check tests
- `resources/extractor-jvm` - SevenZipJBinding + Zip4j sidecar JAR and native libraries
## Data and logs
The app stores runtime files in Electron's `userData` directory, including:
- `rd_downloader_config.json`
- `rd_session_state.json`
- `rd_downloader.log`
## Troubleshooting
- Download does not start: verify token and selected provider in Settings.
- Extraction fails: check archive passwords and native extractor installation (7-Zip/WinRAR). Optional JVM extractor can be forced with `RD_EXTRACT_BACKEND=jvm`.
- Very slow downloads: check active speed limit and bandwidth schedules.
- Unexpected interruptions: enable reconnect and fallback providers.
- Stalled downloads: the app auto-detects stalls within 10 seconds and retries automatically.
## Changelog
Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
### v1.6.61 (2026-03-05)
- Fixed leftover empty package folders in `Downloader Unfertig` after successful extraction.
- Resume marker files (`.rd_extract_progress*.json`) are now treated as ignorable for empty-folder cleanup.
- Deferred post-processing now clears resume markers before running empty-directory removal.
### v1.6.60 (2026-03-05)
- Added package-scoped password cache for extraction: once the first archive in a package is solved, following archives in the same package reuse that password first.
- Kept fallback behavior intact (`""` and other candidates are still tested), but moved empty-password probing behind the learned password to reduce per-archive delays.
- Added cache invalidation on real `wrong_password` failures so stale passwords are automatically discarded.
### v1.6.59 (2026-03-05)
- Switched default extraction backend to native tools (`legacy`) for more stable archive-to-archive flow.
- Prioritized 7-Zip as primary native extractor, with WinRAR/UnRAR as fallback.
- JVM extractor remains available as opt-in via `RD_EXTRACT_BACKEND=jvm`.
### v1.6.58 (2026-03-05)
- Fixed extraction progress oscillation (`1% -> 100% -> 1%` loops) during password retries.
- Kept strict archive completion logic, but normalized in-progress archive percent to avoid false visual done states before real completion.
### v1.6.57 (2026-03-05)
- Fixed extraction flow so archives are marked done only on real completion, not on temporary `100%` progress spikes.
- Improved password handling: after the first successful archive, the discovered password is prioritized for subsequent archives.
- Fixed progress parsing for password retries (reset/restart handling), reducing visible and real gaps between archive extractions.
## License
MIT - see `LICENSE`.
Note: the app can only download links that are supported by Real-Debrid.

Binary file not shown.

Before

Width:  |  Height:  |  Size: 279 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 292 KiB

6
build_exe.ps1 Normal file
View File

@@ -0,0 +1,6 @@
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pyinstaller
pyinstaller --noconfirm --onefile --windowed --name "Real-Debrid-Downloader" real_debrid_downloader_gui.py
Write-Host "Build complete: dist/Real-Debrid-Downloader.exe"

View File

@@ -5,18 +5,14 @@
#define MyAppVersion "1.0.0"
#endif
#ifndef MySourceDir
#define MySourceDir "..\\dist\\Real-Debrid-Downloader"
#ifndef MySourceExe
#define MySourceExe "dist\\Real-Debrid-Downloader.exe"
#endif
#ifndef MyOutputDir
#define MyOutputDir "release"
#endif
#ifndef MyIconFile
#define MyIconFile "..\\assets\\app_icon.ico"
#endif
[Setup]
AppId={{C0E95B39-389E-4D2C-8E1E-12A44E8AE8E0}
AppName={#MyAppName}
@@ -25,26 +21,27 @@ AppPublisher=Sucukdeluxe
DefaultDirName={autopf}\{#MyAppName}
DefaultGroupName={#MyAppName}
OutputDir={#MyOutputDir}
OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
Compression=lzma
SolidCompression=yes
WizardStyle=modern
PrivilegesRequired=lowest
ArchitecturesInstallIn64BitMode=x64compatible
UninstallDisplayIcon={app}\{#MyAppExeName}
SetupIconFile={#MyIconFile}
[Languages]
Name: "german"; MessagesFile: "compiler:Languages\German.isl"
Name: "english"; MessagesFile: "compiler:Default.isl"
[Tasks]
Name: "desktopicon"; Description: "Desktop-Verknuepfung erstellen"; GroupDescription: "Zusaetzliche Aufgaben:"; Flags: unchecked
[Files]
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
Source: "{#MySourceExe}"; DestDir: "{app}"; Flags: ignoreversion
[Icons]
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"
Name: "{autodesktop}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"
Name: "{autodesktop}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; Tasks: desktopicon
[Run]
Filename: "{app}\{#MyAppExeName}"; Description: "{#MyAppName} starten"; Flags: nowait postinstall skipifsilent

9765
package-lock.json generated

File diff suppressed because it is too large

View File

@@ -1,80 +0,0 @@
{
  "name": "real-debrid-downloader",
  "version": "1.6.66",
  "description": "Desktop downloader",
  "main": "build/main/main/main.js",
  "author": "Sucukdeluxe",
  "license": "MIT",
  "scripts": {
    "dev": "concurrently -k \"npm:dev:main:watch\" \"npm:dev:renderer\" \"npm:dev:electron\"",
    "dev:renderer": "vite",
    "dev:main:watch": "tsup src/main/main.ts src/preload/preload.ts --out-dir build/main --format cjs --target node20 --external electron --sourcemap --watch",
    "dev:electron": "wait-on tcp:5173 file:build/main/main/main.js && cross-env NODE_ENV=development electron .",
    "build": "npm run build:main && npm run build:renderer",
    "build:main": "tsup src/main/main.ts src/preload/preload.ts --out-dir build/main --format cjs --target node20 --external electron --sourcemap",
    "build:renderer": "vite build",
    "start": "cross-env NODE_ENV=production electron .",
    "test": "vitest run",
    "self-check": "tsx tests/self-check.ts",
    "release:win": "npm run build && electron-builder --publish never --win nsis portable",
    "release:gitea": "node scripts/release_gitea.mjs",
    "release:forgejo": "node scripts/release_gitea.mjs"
  },
  "dependencies": {
    "adm-zip": "^0.5.16",
    "react": "^18.3.1",
    "react-dom": "^18.3.1",
    "uuid": "^11.1.0"
  },
  "devDependencies": {
    "@types/adm-zip": "^0.5.7",
    "@types/node": "^24.0.13",
    "@types/react": "^18.3.12",
    "@types/react-dom": "^18.3.1",
    "@types/uuid": "^10.0.0",
    "@vitejs/plugin-react": "^4.3.4",
    "concurrently": "^9.0.1",
    "cross-env": "^7.0.3",
    "electron": "^31.7.7",
    "electron-builder": "^25.1.8",
    "rcedit": "^5.0.2",
    "tsup": "^8.3.6",
    "tsx": "^4.19.2",
    "typescript": "^5.7.3",
    "vite": "^6.0.5",
    "vitest": "^2.1.8",
    "wait-on": "^8.0.1"
  },
  "build": {
    "appId": "com.sucukdeluxe.realdebrid",
    "productName": "Real-Debrid-Downloader",
    "directories": {
      "buildResources": "assets",
      "output": "release"
    },
    "files": [
      "build/main/**/*",
      "build/renderer/**/*",
      "resources/extractor-jvm/**/*",
      "package.json"
    ],
    "asarUnpack": [
      "resources/extractor-jvm/**/*"
    ],
    "win": {
      "target": [
        "nsis",
        "portable"
      ],
      "icon": "assets/app_icon.ico",
      "signAndEditExecutable": false
    },
    "nsis": {
      "oneClick": false,
      "perMachine": false,
      "allowToChangeInstallationDirectory": true,
      "createDesktopShortcut": true
    },
    "afterPack": "scripts/afterPack.cjs"
  }
}

File diff suppressed because it is too large

2
requirements.txt Normal file
View File

@@ -0,0 +1,2 @@
requests>=2.31.0
pyzipper>=0.3.6

View File

@@ -1,22 +0,0 @@
# JVM extractor runtime
This directory contains the Java sidecar runtime used by `src/main/extractor.ts`.
## Included backends
- `sevenzipjbinding` for the primary extraction path (RAR/7z/ZIP and others)
- `zip4j` for ZIP multipart handling (JD-style split ZIP behavior)
## Layout
- `classes/` compiled `JBindExtractorMain` classes
- `lib/` runtime jars required by the sidecar
- `src/` Java source for the sidecar
## Rebuild notes
The checked-in classes are Java 8 compatible and built from:
`resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java`
If you need to rebuild, compile against the jars in `lib/` with a Java 8-compatible compiler.

View File

@@ -1,12 +0,0 @@
Bundled JVM extractor dependencies:
1) sevenzipjbinding (16.02-2.01)
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding-all-platforms
- Upstream: https://sevenzipjbind.sourceforge.net/
2) zip4j (2.11.5)
- Maven artifact: net.lingala.zip4j:zip4j
- Upstream: https://github.com/srikanth-lingala/zip4j
Please review upstream licenses and notices before redistribution.

View File

@@ -1,18 +0,0 @@
const path = require("path");
const { rcedit } = require("rcedit");
module.exports = async function afterPack(context) {
  const productFilename = context.packager?.appInfo?.productFilename;
  if (!productFilename) {
    console.warn(" • rcedit: skipped — productFilename not available");
    return;
  }
  const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
  const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
  console.log(` • rcedit: patching icon → ${exePath}`);
  try {
    await rcedit(exePath, { icon: iconPath });
  } catch (error) {
    console.warn(` • rcedit: failed — ${String(error)}`);
  }
};

View File

@@ -1,51 +0,0 @@
import { DebridService } from "../src/main/debrid";
import { defaultSettings } from "../src/main/constants";
import { MegaWebFallback } from "../src/main/mega-web-fallback";
const links = [
  "https://rapidgator.net/file/837ef967aede4935e3e0374c4e663b40/GTHDERTPIIP7P401.part1.rar.html",
  "https://rapidgator.net/file/ef3c9d64c899f801d69d6888dad89dcd/GTHDERTPIIP7P401.part2.rar.html",
  "https://rapidgator.net/file/b38130fcf1e8448953250b9a1ed7958d/GTHDERTPIIP7P401.part3.rar.html"
];
const settings = {
  ...defaultSettings(),
  token: process.env.RD_TOKEN || "",
  megaLogin: process.env.MEGA_LOGIN || "",
  megaPassword: process.env.MEGA_PASSWORD || "",
  bestToken: process.env.BEST_TOKEN || "",
  allDebridToken: process.env.ALLDEBRID_TOKEN || "",
  providerPrimary: "alldebrid" as const,
  providerSecondary: "realdebrid" as const,
  providerTertiary: "megadebrid" as const,
  autoProviderFallback: true
};
if (!settings.token && !(settings.megaLogin && settings.megaPassword) && !settings.bestToken && !settings.allDebridToken) {
  console.error("No provider credentials set. Use RD_TOKEN or MEGA_LOGIN+MEGA_PASSWORD or BEST_TOKEN or ALLDEBRID_TOKEN.");
  process.exit(1);
}
async function main(): Promise<void> {
  const megaWeb = new MegaWebFallback(() => ({
    login: settings.megaLogin,
    password: settings.megaPassword
  }));
  try {
    const service = new DebridService(settings, {
      megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
    });
    for (const link of links) {
      try {
        const result = await service.unrestrictLink(link);
        console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
      } catch (error) {
        console.log(`[FAIL] ${String(error)}`);
      }
    }
  } finally {
    megaWeb.dispose();
  }
}
main().catch(e => { console.error(e); process.exit(1); });

View File

@@ -1,148 +0,0 @@
const LOGIN = process.env.MEGA_LOGIN || "";
const PASSWORD = process.env.MEGA_PASSWORD || "";
const LINKS = [
  "https://rapidgator.net/file/90b5397dfc3e1a0e561db7d6b89d5604/scnb-rrw7-S08E01.part1.rar.html",
  "https://rapidgator.net/file/8ddf856dc833310c5cae9db82caf9682/scnb-rrw7-S08E01.part2.rar.html",
  "https://rapidgator.net/file/440eed67d266476866332ae224c3fad5/scnb-rrw7-S08E01.part3.rar.html"
];
if (!LOGIN || !PASSWORD) {
  throw new Error("Set MEGA_LOGIN and MEGA_PASSWORD env vars");
}
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
function cookieFrom(headers) {
  const cookies = headers.getSetCookie();
  return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
function parseDebridCodes(html) {
  const re = /processDebrid\((\d+),'([^']+)',0\)/g;
  const out = [];
  let m;
  while ((m = re.exec(html)) !== null) {
    out.push({ id: Number(m[1]), code: m[2] });
  }
  return out;
}
async function resolveCode(cookie, code) {
  for (let attempt = 1; attempt <= 50; attempt += 1) {
    const res = await fetch("https://www.mega-debrid.eu/index.php?ajax=debrid&json", {
      method: "POST",
      headers: {
        "Content-Type": "application/x-www-form-urlencoded",
        "User-Agent": "Mozilla/5.0",
        Cookie: cookie,
        Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
      },
      body: new URLSearchParams({
        code,
        autodl: "0"
      })
    });
    const text = (await res.text()).trim();
    if (text === "reload") {
      if (attempt % 5 === 0) {
        console.log(`  [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
      }
      await sleep(800);
      continue;
    }
    if (text === "false") {
      return { ok: false, reason: "false" };
    }
    try {
      const parsed = JSON.parse(text);
      if (parsed?.link) {
        return { ok: true, link: String(parsed.link), text: String(parsed.text || "") };
      }
      return { ok: false, reason: text };
    } catch {
      return { ok: false, reason: text };
    }
  }
  return { ok: false, reason: "timeout" };
}
async function probeDownload(url) {
  const res = await fetch(url, {
    method: "GET",
    headers: {
      Range: "bytes=0-4095",
      "User-Agent": "Mozilla/5.0"
    },
    redirect: "manual"
  });
  return {
    status: res.status,
    location: res.headers.get("location") || "",
    contentType: res.headers.get("content-type") || "",
    contentLength: res.headers.get("content-length") || ""
  };
}
async function main() {
  const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "Mozilla/5.0"
    },
    body: new URLSearchParams({
      login: LOGIN,
      password: PASSWORD,
      remember: "on"
    }),
    redirect: "manual"
  });
  if (loginRes.status >= 400) {
    throw new Error(`Login failed with HTTP ${loginRes.status}`);
  }
  const cookie = cookieFrom(loginRes.headers);
  if (!cookie) {
    throw new Error("Login returned no session cookie");
  }
  console.log("login", loginRes.status, loginRes.headers.get("location") || "");
  const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "Mozilla/5.0",
      Cookie: cookie,
      Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
    },
    body: new URLSearchParams({
      links: LINKS.join("\n"),
      password: "",
      showLinks: "1"
    })
  });
  const html = await debridRes.text();
  const codes = parseDebridCodes(html);
  console.log("codes", codes.length);
  if (codes.length === 0) {
    throw new Error("No processDebrid codes found");
  }
  for (let i = 0; i < Math.min(3, codes.length); i += 1) {
    const c = codes[i];
    const resolved = await resolveCode(cookie, c.code);
    if (!resolved.ok) {
      console.log(`[FAIL] code ${c.code}: ${resolved.reason}`);
      continue;
    }
    console.log(`[OK] code ${c.code} -> ${resolved.link}`);
    const probe = await probeDownload(resolved.link);
    console.log(`  probe status=${probe.status} type=${probe.contentType} len=${probe.contentLength} loc=${probe.location}`);
  }
}
await main().catch((e) => { console.error(e); process.exit(1); });

View File

@@ -1,297 +0,0 @@
const RAPIDGATOR_LINKS = [
  "https://rapidgator.net/file/837ef967aede4935e3e0374c4e663b40/GTHDERTPIIP7P401.part1.rar.html",
  "https://rapidgator.net/file/ef3c9d64c899f801d69d6888dad89dcd/GTHDERTPIIP7P401.part2.rar.html",
  "https://rapidgator.net/file/b38130fcf1e8448953250b9a1ed7958d/GTHDERTPIIP7P401.part3.rar.html"
];
const rdToken = process.env.RD_TOKEN || "";
const megaLogin = process.env.MEGA_LOGIN || "";
const megaPassword = process.env.MEGA_PASSWORD || "";
const bestToken = process.env.BEST_TOKEN || "";
const allDebridToken = process.env.ALLDEBRID_TOKEN || "";
let megaCookie = "";
if (!rdToken && !(megaLogin && megaPassword) && !bestToken && !allDebridToken) {
  console.error("No provider credentials configured. Set RD_TOKEN and/or MEGA_LOGIN+MEGA_PASSWORD and/or BEST_TOKEN and/or ALLDEBRID_TOKEN.");
  process.exit(1);
}
function asRecord(value) {
  if (!value || typeof value !== "object" || Array.isArray(value)) {
    return null;
  }
  return value;
}
function pickString(payload, keys) {
  if (!payload) {
    return "";
  }
  for (const key of keys) {
    const value = payload[key];
    if (typeof value === "string" && value.trim()) {
      return value.trim();
    }
  }
  return "";
}
function parseResponseError(status, bodyText, payload) {
  return pickString(payload, ["response_text", "error", "message", "error_description"]) || bodyText || `HTTP ${status}`;
}
async function callRealDebrid(link) {
  const response = await fetch("https://api.real-debrid.com/rest/1.0/unrestrict/link", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${rdToken}`,
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "RD-Node-Downloader/1.1.12"
    },
    body: new URLSearchParams({ link })
  });
  const text = await response.text();
  const payload = asRecord(safeJson(text));
  if (!response.ok) {
    return { ok: false, error: parseResponseError(response.status, text, payload) };
  }
  const direct = pickString(payload, ["download", "link"]);
  if (!direct) {
    return { ok: false, error: "Real-Debrid returned no download URL" };
  }
  return {
    ok: true,
    direct,
    fileName: pickString(payload, ["filename", "fileName"])
  };
}
// megaCookie is intentionally cached at module scope so that multiple
// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
  if (!megaCookie) {
    const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
      method: "POST",
      headers: {
        "Content-Type": "application/x-www-form-urlencoded",
        "User-Agent": "Mozilla/5.0"
      },
      body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
      redirect: "manual"
    });
    if (loginRes.status >= 400) {
      return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
    }
    megaCookie = loginRes.headers.getSetCookie()
      .map((chunk) => chunk.split(";")[0].trim())
      .filter(Boolean)
      .join("; ");
    if (!megaCookie) {
      return { ok: false, error: "Mega-Web login returned no session cookie" };
    }
  }
  const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "Mozilla/5.0",
      Cookie: megaCookie,
      Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
    },
    body: new URLSearchParams({ links: link, password: "", showLinks: "1" })
  });
  const html = await debridRes.text();
  const code = html.match(/processDebrid\(\d+,'([^']+)',0\)/i)?.[1] || "";
  if (!code) {
    return { ok: false, error: "Mega-Web returned no processDebrid code" };
  }
  for (let attempt = 1; attempt <= 40; attempt += 1) {
    const ajaxRes = await fetch("https://www.mega-debrid.eu/index.php?ajax=debrid&json", {
      method: "POST",
      headers: {
        "Content-Type": "application/x-www-form-urlencoded",
        "User-Agent": "Mozilla/5.0",
        Cookie: megaCookie,
        Referer: "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de"
      },
      body: new URLSearchParams({ code, autodl: "0" })
    });
    const txt = (await ajaxRes.text()).trim();
    if (txt === "reload") {
      await new Promise((resolve) => setTimeout(resolve, 650));
      continue;
    }
    if (txt === "false") {
      return { ok: false, error: "Mega-Web returned false" };
    }
    const payload = safeJson(txt);
    const direct = String(payload?.link || "");
    if (!direct) {
      const msg = String(payload?.text || txt || "Mega-Web no link");
      if (/hoster does not respond correctly|could not be done for this moment/i.test(msg)) {
        await new Promise((resolve) => setTimeout(resolve, 1200));
        continue;
      }
      return { ok: false, error: msg };
    }
    return {
      ok: true,
      direct,
      fileName: pickString(asRecord(payload), ["filename"]) || ""
    };
  }
  return { ok: false, error: "Mega-Web timeout while generating link" };
}
async function callBestDebrid(link) {
  const encoded = encodeURIComponent(link);
  const requests = [
    {
      url: `https://bestdebrid.com/api/v1/generateLink?link=${encoded}`,
      useHeader: true
    },
    {
      url: `https://bestdebrid.com/api/v1/generateLink?auth=${encodeURIComponent(bestToken)}&link=${encoded}`,
      useHeader: false
    }
  ];
  let lastError = "Unknown BestDebrid error";
  for (const req of requests) {
    const headers = {
      "User-Agent": "RD-Node-Downloader/1.1.12"
    };
    if (req.useHeader) {
      headers.Authorization = bestToken;
    }
    const response = await fetch(req.url, {
      method: "GET",
      headers
    });
    const text = await response.text();
    const parsed = safeJson(text);
    const payload = Array.isArray(parsed) ? asRecord(parsed[0]) : asRecord(parsed);
    if (!response.ok) {
      lastError = parseResponseError(response.status, text, payload);
      continue;
    }
    const direct = pickString(payload, ["download", "debridLink", "link"]);
    if (!direct) {
      lastError = pickString(payload, ["response_text", "message", "error"]) || "BestDebrid returned no download URL";
      continue;
    }
    return {
      ok: true,
      direct,
      fileName: pickString(payload, ["filename", "fileName"])
    };
  }
  return { ok: false, error: lastError };
}
async function callAllDebrid(link) {
  const response = await fetch("https://api.alldebrid.com/v4/link/unlock", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${allDebridToken}`,
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "RD-Node-Downloader/1.1.12"
    },
    body: new URLSearchParams({ link })
  });
  const text = await response.text();
  const payload = asRecord(safeJson(text));
  if (!response.ok) {
    return { ok: false, error: parseResponseError(response.status, text, payload) };
  }
  if (pickString(payload, ["status"]) === "error") {
    const err = asRecord(payload?.error);
    return { ok: false, error: pickString(err, ["message", "code"]) || "AllDebrid API error" };
  }
  const data = asRecord(payload?.data);
  const direct = pickString(data, ["link"]);
  if (!direct) {
    return { ok: false, error: "AllDebrid returned no download URL" };
  }
  return {
    ok: true,
    direct,
    fileName: pickString(data, ["filename"])
  };
}
function safeJson(text) {
  try {
    return JSON.parse(text);
  } catch {
    return null;
  }
}
function hostFromUrl(url) {
  try {
    return new URL(url).host;
  } catch {
    return "invalid-url";
  }
}
async function main() {
  const providers = [];
  if (rdToken) {
    providers.push({ name: "Real-Debrid", run: callRealDebrid });
  }
  if (megaLogin && megaPassword) {
    providers.push({ name: "Mega-Debrid", run: callMegaDebrid });
  }
  if (bestToken) {
    providers.push({ name: "BestDebrid", run: callBestDebrid });
  }
  if (allDebridToken) {
    providers.push({ name: "AllDebrid", run: callAllDebrid });
  }
  let failures = 0;
  for (const link of RAPIDGATOR_LINKS) {
    console.log(`\nLink: ${link}`);
    const results = [];
    for (const provider of providers) {
      try {
        const result = await provider.run(link);
        results.push({ provider: provider.name, ...result });
      } catch (error) {
        results.push({ provider: provider.name, ok: false, error: String(error) });
      }
    }
    for (const result of results) {
if (result.ok) {
console.log(` [OK] ${result.provider} -> ${hostFromUrl(result.direct)} ${result.fileName ? `(${result.fileName})` : ""}`);
} else {
console.log(` [FAIL] ${result.provider} -> ${result.error}`);
}
}
const fallbackPick = results.find((entry) => entry.ok);
if (fallbackPick) {
console.log(` [AUTO] Selected by fallback order: ${fallbackPick.provider}`);
} else {
failures += 1;
console.log(" [AUTO] No provider could unrestrict this link");
}
}
if (failures > 0) {
process.exitCode = 2;
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
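The selection loop above can be reduced to a small sketch: every configured provider is tried, and the first result with `ok: true` wins the fallback order. The provider names and results below are hypothetical stand-ins, not output from the real services.

```javascript
// First successful provider wins; null when every provider failed.
function pickByFallbackOrder(results) {
  return results.find((entry) => entry.ok) || null;
}

const results = [
  { provider: "Real-Debrid", ok: false, error: "hoster not supported" },
  { provider: "Mega-Debrid", ok: true, direct: "https://cdn.example/file.bin" },
  { provider: "BestDebrid", ok: true, direct: "https://other.example/file.bin" }
];

console.log(pickByFallbackOrder(results).provider); // prints "Mega-Debrid"
```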


@@ -1,356 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const NPM_RELEASE_WIN = process.platform === "win32"
? {
command: process.env.ComSpec || "cmd.exe",
args: ["/d", "/s", "/c", "npm run release:win"]
}
: {
command: "npm",
args: ["run", "release:win"]
};
function run(command, args, options = {}) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
stdio: options.capture ? ["pipe", "pipe", "pipe"] : "inherit"
});
if (result.status !== 0) {
const stderr = result.stderr ? String(result.stderr).trim() : "";
const stdout = result.stdout ? String(result.stdout).trim() : "";
const details = [stderr, stdout].filter(Boolean).join("\n");
throw new Error(`Command failed: ${command} ${args.join(" ")}${details ? `\n${details}` : ""}`);
}
return options.capture ? String(result.stdout || "") : "";
}
function runCapture(command, args) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
stdio: ["pipe", "pipe", "pipe"]
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
throw new Error(stderr || `Command failed: ${command} ${args.join(" ")}`);
}
return String(result.stdout || "").trim();
}
function runWithInput(command, args, input) {
const result = spawnSync(command, args, {
cwd: process.cwd(),
encoding: "utf8",
input,
stdio: ["pipe", "pipe", "pipe"],
timeout: 10000
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
throw new Error(stderr || `Command failed: ${command} ${args.join(" ")}`);
}
return String(result.stdout || "");
}
function parseArgs(argv) {
const args = argv.slice(2);
if (args.includes("--help") || args.includes("-h")) {
return { help: true };
}
const dryRun = args.includes("--dry-run");
const cleaned = args.filter((arg) => arg !== "--dry-run");
const version = cleaned[0] || "";
const notes = cleaned.slice(1).join(" ").trim();
return { help: false, dryRun, version, notes };
}
function parseRemoteUrl(url) {
const raw = String(url || "").trim();
const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (httpsMatch) {
return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
}
const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshMatch) {
return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
}
const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshAltMatch) {
return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
}
throw new Error(`Cannot parse remote URL: ${raw}`);
}
function normalizeBaseUrl(url) {
const raw = String(url || "").trim().replace(/\/+$/, "");
if (!raw) {
return "";
}
if (!/^https?:\/\//i.test(raw)) {
throw new Error("GITEA_BASE_URL must start with http:// or https://");
}
return raw;
}
function getGiteaRepo() {
const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
const remotes = forcedRemote
? [forcedRemote]
: ["gitea", "forgejo", "origin", "github-new", "codeberg"];
const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
for (const remote of remotes) {
try {
const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
const parsed = parseRemoteUrl(remoteUrl);
const remoteBase = `https://${parsed.host}`.toLowerCase();
if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
continue;
}
return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
} catch {
// try next remote
}
}
if (preferredBase) {
throw new Error(
`No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
);
}
throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
}
function getAuthHeader(host) {
const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
if (explicitToken) {
return `token ${explicitToken}`;
}
const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
const map = new Map();
for (const line of credentialText.split(/\r?\n/)) {
const idx = line.indexOf("=");
if (idx === -1) {
continue;
}
// Split on the first "=" only; String.split(sep, 2) would truncate
// values that themselves contain "=" (e.g. tokens with padding).
map.set(line.slice(0, idx), line.slice(idx + 1));
}
const username = map.get("username") || "";
const password = map.get("password") || "";
if (!username || !password) {
throw new Error(
`Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
);
}
const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
return `Basic ${token}`;
}
async function apiRequest(method, url, authHeader, body, contentType = "application/json") {
const headers = {
Accept: "application/json",
Authorization: authHeader
};
if (body !== undefined) {
headers["Content-Type"] = contentType;
}
const response = await fetch(url, {
method,
headers,
body
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
return { ok: response.ok, status: response.status, body: parsed };
}
function ensureVersionString(version) {
const trimmed = String(version || "").trim();
if (!/^\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?$/.test(trimmed)) {
throw new Error("Invalid version format. Expected e.g. 1.4.42");
}
return trimmed;
}
function updatePackageVersion(rootDir, version) {
const packagePath = path.join(rootDir, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
if (String(packageJson.version || "") === version) {
process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
return;
}
packageJson.version = version;
fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
}
function patchLatestYml(releaseDir, version) {
const ymlPath = path.join(releaseDir, "latest.yml");
let content = fs.readFileSync(ymlPath, "utf8");
const setupName = `Real-Debrid-Downloader Setup ${version}.exe`;
const dashedName = `Real-Debrid-Downloader-Setup-${version}.exe`;
if (content.includes(dashedName)) {
content = content.split(dashedName).join(setupName);
fs.writeFileSync(ymlPath, content, "utf8");
process.stdout.write(`Patched latest.yml: replaced "${dashedName}" with "${setupName}"\n`);
}
}
function ensureAssetsExist(rootDir, version) {
const releaseDir = path.join(rootDir, "release");
const files = [
`Real-Debrid-Downloader Setup ${version}.exe`,
`Real-Debrid-Downloader ${version}.exe`,
"latest.yml",
`Real-Debrid-Downloader Setup ${version}.exe.blockmap`
];
for (const fileName of files) {
const fullPath = path.join(releaseDir, fileName);
if (!fs.existsSync(fullPath)) {
throw new Error(`Missing release artifact: ${fullPath}`);
}
}
patchLatestYml(releaseDir, version);
return { releaseDir, files };
}
function ensureNoTrackedChanges() {
const output = runCapture("git", ["status", "--porcelain"]);
const lines = output.split(/\r?\n/).filter(Boolean);
const tracked = lines.filter((line) => !line.startsWith("?? "));
if (tracked.length > 0) {
throw new Error(`Working tree has tracked changes:\n${tracked.join("\n")}`);
}
}
function ensureTagMissing(tag) {
const result = spawnSync("git", ["rev-parse", "--verify", `refs/tags/${tag}`], {
cwd: process.cwd(),
stdio: "ignore"
});
if (result.status === 0) {
throw new Error(`Tag already exists: ${tag}`);
}
}
async function createOrGetRelease(baseApi, tag, authHeader, notes) {
const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
if (byTag.ok) {
return byTag.body;
}
const payload = {
tag_name: tag,
target_commitish: "main",
name: tag,
body: notes || `Release ${tag}`,
draft: false,
prerelease: false
};
const created = await apiRequest("POST", `${baseApi}/releases`, authHeader, JSON.stringify(payload));
if (!created.ok) {
throw new Error(`Failed to create release (${created.status}): ${JSON.stringify(created.body)}`);
}
return created.body;
}
async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
for (const fileName of files) {
const filePath = path.join(releaseDir, fileName);
const fileSize = fs.statSync(filePath).size;
const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
// Stream large files instead of loading them entirely into memory
const fileStream = fs.createReadStream(filePath);
const response = await fetch(uploadUrl, {
method: "POST",
headers: {
Accept: "application/json",
Authorization: authHeader,
"Content-Type": "application/octet-stream",
"Content-Length": String(fileSize)
},
body: fileStream,
duplex: "half"
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
if (response.ok) {
process.stdout.write(`Uploaded: ${fileName}\n`);
continue;
}
if (response.status === 409 || response.status === 422) {
process.stdout.write(`Skipped existing asset: ${fileName}\n`);
continue;
}
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
}
}
async function main() {
const rootDir = process.cwd();
const args = parseArgs(process.argv);
if (args.help) {
process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
return;
}
const version = ensureVersionString(args.version);
const tag = `v${version}`;
const releaseNotes = args.notes || `- Release ${tag}`;
const repo = getGiteaRepo();
ensureNoTrackedChanges();
ensureTagMissing(tag);
if (args.dryRun) {
process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
return;
}
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_RELEASE_WIN.command, NPM_RELEASE_WIN.args);
const assets = ensureAssetsExist(rootDir, version);
run("git", ["add", "package.json"]);
run("git", ["commit", "-m", `Release ${tag}`]);
run("git", ["push", repo.remote, "main"]);
run("git", ["tag", tag]);
run("git", ["push", repo.remote, tag]);
const authHeader = getAuthHeader(repo.host);
const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);
process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
}
main().catch((error) => {
process.stderr.write(`${String(error?.message || error)}\n`);
process.exit(1);
});
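The remote-detection step of this release script hinges on `parseRemoteUrl`, which accepts https, scp-style ssh, and `ssh://` remotes. A trimmed sketch of the first two patterns follows; the host, owner, and repo names are invented examples (the `ssh://` variant is omitted for brevity):

```javascript
// Reduced form of parseRemoteUrl: https and scp-style ssh remotes only.
function parseRemoteUrl(url) {
  const raw = String(url || "").trim();
  const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (httpsMatch) {
    return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
  }
  const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (sshMatch) {
    return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
  }
  throw new Error(`Cannot parse remote URL: ${raw}`);
}

console.log(parseRemoteUrl("git@git.example.com:alice/tool.git"));
```

The lazy `([^/]+?)` before the optional `(?:\.git)?` is what keeps the `.git` suffix out of the repo name.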


@@ -1,384 +0,0 @@
import path from "node:path";
import { app } from "electron";
import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
ParsedPackageInput,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress,
UpdateInstallResult
} from "../shared/types";
import { importDlcContainers } from "./container";
import { APP_VERSION } from "./constants";
import { DownloadManager } from "./download-manager";
import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";
function sanitizeSettingsPatch(partial: Partial<AppSettings>): Partial<AppSettings> {
const entries = Object.entries(partial || {}).filter(([, value]) => value !== undefined);
return Object.fromEntries(entries) as Partial<AppSettings>;
}
function settingsFingerprint(settings: AppSettings): string {
return JSON.stringify(normalizeSettings(settings));
}
export class AppController {
private settings: AppSettings;
private manager: DownloadManager;
private megaWebFallback: MegaWebFallback;
private lastUpdateCheck: UpdateCheckResult | null = null;
private lastUpdateCheckAt = 0;
private storagePaths = createStoragePaths(path.join(app.getPath("userData"), "runtime"));
private onStateHandler: ((snapshot: UiSnapshot) => void) | null = null;
private autoResumePending = false;
public constructor() {
configureLogger(this.storagePaths.baseDir);
initSessionLog(this.storagePaths.baseDir);
this.settings = loadSettings(this.storagePaths);
const session = loadSession(this.storagePaths);
this.megaWebFallback = new MegaWebFallback(() => ({
login: this.settings.megaLogin,
password: this.settings.megaPassword
}));
this.manager = new DownloadManager(this.settings, session, this.storagePaths, {
megaWebUnrestrict: (link: string, signal?: AbortSignal) => this.megaWebFallback.unrestrict(link, signal),
invalidateMegaSession: () => this.megaWebFallback.invalidateSession(),
onHistoryEntry: (entry: HistoryEntry) => {
addHistoryEntry(this.storagePaths, entry);
}
});
this.manager.on("state", (snapshot: UiSnapshot) => {
this.onStateHandler?.(snapshot);
});
logger.info(`App gestartet v${APP_VERSION}`);
logger.info(`Log-Datei: ${getLogFilePath()}`);
startDebugServer(this.manager, this.storagePaths.baseDir);
if (this.settings.autoResumeOnStart) {
const snapshot = this.manager.getSnapshot();
const hasPending = Object.values(snapshot.session.items).some((item) => item.status === "queued" || item.status === "reconnect_wait");
if (hasPending) {
void this.manager.getStartConflicts().then((conflicts) => {
const hasConflicts = conflicts.length > 0;
if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
// If the onState handler is already set (renderer connected), start immediately.
// Otherwise mark as pending so the onState setter triggers the start.
if (this.onStateHandler) {
logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
} else {
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
}
} else if (hasConflicts) {
logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
}
}).catch((err) => logger.warn(`getStartConflicts Fehler (constructor): ${String(err)}`));
}
}
}
private hasAnyProviderToken(settings: AppSettings): boolean {
return Boolean(
settings.token.trim()
|| (settings.megaLogin.trim() && settings.megaPassword.trim())
|| settings.bestToken.trim()
|| settings.allDebridToken.trim()
|| (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
|| settings.oneFichierApiKey.trim()
);
}
public get onState(): ((snapshot: UiSnapshot) => void) | null {
return this.onStateHandler;
}
public set onState(handler: ((snapshot: UiSnapshot) => void) | null) {
this.onStateHandler = handler;
if (handler) {
handler(this.manager.getSnapshot());
if (this.autoResumePending) {
this.autoResumePending = false;
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
logger.info("Auto-Resume beim Start aktiviert");
} else {
// Trigger pending extractions without starting the session
this.manager.triggerIdleExtractions();
}
}
}
public getSnapshot(): UiSnapshot {
return this.manager.getSnapshot();
}
public getVersion(): string {
return APP_VERSION;
}
public getSettings(): AppSettings {
return this.settings;
}
public updateSettings(partial: Partial<AppSettings>): AppSettings {
const sanitizedPatch = sanitizeSettingsPatch(partial);
const nextSettings = normalizeSettings({
...this.settings,
...sanitizedPatch
});
if (settingsFingerprint(nextSettings) === settingsFingerprint(this.settings)) {
return this.settings;
}
// Preserve the live totalDownloadedAllTime from the download manager
const liveSettings = this.manager.getSettings();
nextSettings.totalDownloadedAllTime = Math.max(nextSettings.totalDownloadedAllTime || 0, liveSettings.totalDownloadedAllTime || 0);
this.settings = nextSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
return this.settings;
}
public async checkUpdates(): Promise<UpdateCheckResult> {
const result = await checkGitHubUpdate(this.settings.updateRepo);
if (!result.error) {
this.lastUpdateCheck = result;
this.lastUpdateCheckAt = Date.now();
}
return result;
}
public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
// Stop active downloads before installing. Extractions may continue briefly
// until prepareForShutdown() is called during app quit.
if (this.manager.isSessionRunning()) {
this.manager.stop();
}
const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
? this.lastUpdateCheck
: undefined;
const result = await installLatestUpdate(this.settings.updateRepo, cached, onProgress);
if (result.started) {
this.lastUpdateCheck = null;
this.lastUpdateCheckAt = 0;
}
return result;
}
public addLinks(payload: AddLinksPayload): { addedPackages: number; addedLinks: number; invalidCount: number } {
const parsed = parseCollectorInput(payload.rawText, payload.packageName || this.settings.packageName);
if (parsed.length === 0) {
return { addedPackages: 0, addedLinks: 0, invalidCount: 1 };
}
const result = this.manager.addPackages(parsed);
return { ...result, invalidCount: 0 };
}
public async addContainers(filePaths: string[]): Promise<{ addedPackages: number; addedLinks: number }> {
const packages = await importDlcContainers(filePaths);
const merged: ParsedPackageInput[] = packages.map((pkg) => ({
name: pkg.name,
links: pkg.links,
...(pkg.fileNames ? { fileNames: pkg.fileNames } : {})
}));
const result = this.manager.addPackages(merged);
return result;
}
public async getStartConflicts(): Promise<StartConflictEntry[]> {
return this.manager.getStartConflicts();
}
public async resolveStartConflict(packageId: string, policy: DuplicatePolicy): Promise<StartConflictResolutionResult> {
return this.manager.resolveStartConflict(packageId, policy);
}
public clearAll(): void {
this.manager.clearAll();
}
public async start(): Promise<void> {
await this.manager.start();
}
public async startPackages(packageIds: string[]): Promise<void> {
await this.manager.startPackages(packageIds);
}
public async startItems(itemIds: string[]): Promise<void> {
await this.manager.startItems(itemIds);
}
public stop(): void {
this.manager.stop();
}
public togglePause(): boolean {
return this.manager.togglePause();
}
public retryExtraction(packageId: string): void {
this.manager.retryExtraction(packageId);
}
public extractNow(packageId: string): void {
this.manager.extractNow(packageId);
}
public resetPackage(packageId: string): void {
this.manager.resetPackage(packageId);
}
public cancelPackage(packageId: string): void {
this.manager.cancelPackage(packageId);
}
public renamePackage(packageId: string, newName: string): void {
this.manager.renamePackage(packageId, newName);
}
public reorderPackages(packageIds: string[]): void {
this.manager.reorderPackages(packageIds);
}
public removeItem(itemId: string): void {
this.manager.removeItem(itemId);
}
public togglePackage(packageId: string): void {
this.manager.togglePackage(packageId);
}
public exportQueue(): string {
return this.manager.exportQueue();
}
public importQueue(json: string): { addedPackages: number; addedLinks: number } {
return this.manager.importQueue(json);
}
public getSessionStats(): SessionStats {
return this.manager.getSessionStats();
}
public exportBackup(): string {
const settings = { ...this.settings };
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = settings[key];
if (typeof val === "string" && val.length > 0) {
(settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
}
}
const session = this.manager.getSession();
return JSON.stringify({ version: 1, settings, session }, null, 2);
}
public importBackup(json: string): { restored: boolean; message: string } {
let parsed: Record<string, unknown>;
try {
parsed = JSON.parse(json) as Record<string, unknown>;
} catch {
return { restored: false, message: "Ungültiges JSON" };
}
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
}
const importedSettings = parsed.settings as AppSettings;
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = (importedSettings as Record<string, unknown>)[key];
if (typeof val === "string" && val.startsWith("***")) {
(importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
}
}
const restoredSettings = normalizeSettings(importedSettings);
this.settings = restoredSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
// Full stop including extraction abort — the old session is being replaced,
// so no extraction tasks from it should keep running.
this.manager.stop();
this.manager.abortAllPostProcessing();
// Cancel any deferred persist timer and queued async writes so the old
// in-memory session does not overwrite the restored session file on disk.
this.manager.clearPersistTimer();
cancelPendingAsyncSaves();
const restoredSession = normalizeLoadedSessionTransientFields(
normalizeLoadedSession(parsed.session)
);
saveSession(this.storagePaths, restoredSession);
// Prevent prepareForShutdown from overwriting the restored session file
// with the old in-memory session when the app quits after backup restore.
this.manager.skipShutdownPersist = true;
// Block all persistence (including persistSoon from any IPC operations
// the user might trigger before restarting) to protect the restored backup.
this.manager.blockAllPersistence = true;
return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
}
public getSessionLogPath(): string | null {
return getSessionLogPath();
}
public shutdown(): void {
stopDebugServer();
abortActiveUpdateDownload();
this.manager.prepareForShutdown();
this.megaWebFallback.dispose();
shutdownSessionLog();
logger.info("App beendet");
}
public getHistory(): HistoryEntry[] {
return loadHistory(this.storagePaths);
}
public clearHistory(): void {
clearHistory(this.storagePaths);
}
public setPackagePriority(packageId: string, priority: PackagePriority): void {
this.manager.setPackagePriority(packageId, priority);
}
public skipItems(itemIds: string[]): void {
this.manager.skipItems(itemIds);
}
public resetItems(itemIds: string[]): void {
this.manager.resetItems(itemIds);
}
public removeHistoryEntry(entryId: string): void {
removeHistoryEntry(this.storagePaths, entryId);
}
public addToHistory(entry: HistoryEntry): void {
addHistoryEntry(this.storagePaths, entry);
}
}


@@ -1,66 +0,0 @@
import crypto from "node:crypto";
export const SENSITIVE_KEYS = [
"token",
"megaLogin",
"megaPassword",
"bestToken",
"allDebridToken",
"archivePasswordList"
] as const;
export type SensitiveKey = (typeof SENSITIVE_KEYS)[number];
export interface EncryptedCredentials {
salt: string;
iv: string;
tag: string;
data: string;
}
const PBKDF2_ITERATIONS = 100_000;
const KEY_LENGTH = 32; // 256 bit
const IV_LENGTH = 12; // 96 bit for GCM
const SALT_LENGTH = 16;
function deriveKey(username: string, salt: Buffer): Buffer {
return crypto.pbkdf2Sync(username, salt, PBKDF2_ITERATIONS, KEY_LENGTH, "sha256");
}
export function encryptCredentials(
fields: Record<string, string>,
username: string
): EncryptedCredentials {
const salt = crypto.randomBytes(SALT_LENGTH);
const iv = crypto.randomBytes(IV_LENGTH);
const key = deriveKey(username, salt);
const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
const plaintext = JSON.stringify(fields);
const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();
return {
salt: salt.toString("hex"),
iv: iv.toString("hex"),
tag: tag.toString("hex"),
data: encrypted.toString("hex")
};
}
export function decryptCredentials(
encrypted: EncryptedCredentials,
username: string
): Record<string, string> {
const salt = Buffer.from(encrypted.salt, "hex");
const iv = Buffer.from(encrypted.iv, "hex");
const tag = Buffer.from(encrypted.tag, "hex");
const data = Buffer.from(encrypted.data, "hex");
const key = deriveKey(username, salt);
const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(data), decipher.final()]);
return JSON.parse(decrypted.toString("utf8")) as Record<string, string>;
}
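A round-trip of the scheme above (PBKDF2-derived key from the username, AES-256-GCM, auth tag verified on decrypt) can be demonstrated compactly. The username `"alice"` and the `token` field are placeholder values, and the iteration/length constants mirror the module's:

```javascript
import crypto from "node:crypto";

// Same derivation as the module: PBKDF2-SHA256, 100k iterations, 256-bit key.
function derive(username, salt) {
  return crypto.pbkdf2Sync(username, salt, 100_000, 32, "sha256");
}

const salt = crypto.randomBytes(16);
const iv = crypto.randomBytes(12); // 96-bit IV, as GCM expects

const cipher = crypto.createCipheriv("aes-256-gcm", derive("alice", salt), iv);
const sealed = Buffer.concat([
  cipher.update(JSON.stringify({ token: "secret" }), "utf8"),
  cipher.final()
]);
const tag = cipher.getAuthTag();

// Decrypting with the same username (and salt) recovers the fields;
// a wrong username would fail the auth-tag check in decipher.final().
const decipher = crypto.createDecipheriv("aes-256-gcm", derive("alice", salt), iv);
decipher.setAuthTag(tag);
const opened = JSON.parse(
  Buffer.concat([decipher.update(sealed), decipher.final()]).toString("utf8")
);

console.log(opened.token); // prints "secret"
```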


@@ -1,241 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { ARCHIVE_TEMP_EXTENSIONS, LINK_ARTIFACT_EXTENSIONS, MAX_LINK_ARTIFACT_BYTES, RAR_SPLIT_RE, SAMPLE_DIR_NAMES, SAMPLE_TOKEN_RE, SAMPLE_VIDEO_EXTENSIONS } from "./constants";
async function yieldToLoop(): Promise<void> {
await new Promise<void>((resolve) => {
setTimeout(resolve, 0);
});
}
export function isArchiveOrTempFile(filePath: string): boolean {
const lowerName = path.basename(filePath).toLowerCase();
const ext = path.extname(lowerName);
if (ARCHIVE_TEMP_EXTENSIONS.has(ext)) {
return true;
}
if (lowerName.includes(".part") && lowerName.endsWith(".rar")) {
return true;
}
return RAR_SPLIT_RE.test(lowerName);
}
export function cleanupCancelledPackageArtifacts(packageDir: string): number {
if (!fs.existsSync(packageDir)) {
return 0;
}
let removed = 0;
const stack = [packageDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
} else if (entry.isFile() && isArchiveOrTempFile(full)) {
try {
fs.rmSync(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
}
}
return removed;
}
export async function cleanupCancelledPackageArtifactsAsync(packageDir: string): Promise<number> {
try {
await fs.promises.access(packageDir, fs.constants.F_OK);
} catch {
return 0;
}
let removed = 0;
let touched = 0;
const stack = [packageDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try {
entries = await fs.promises.readdir(current, { withFileTypes: true });
} catch {
continue;
}
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
} else if (entry.isFile() && isArchiveOrTempFile(full)) {
try {
await fs.promises.rm(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
touched += 1;
if (touched % 80 === 0) {
await yieldToLoop();
}
}
}
return removed;
}
export async function removeDownloadLinkArtifacts(extractDir: string): Promise<number> {
try {
await fs.promises.access(extractDir);
} catch {
return 0;
}
let removed = 0;
const stack = [extractDir];
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() && !entry.isSymbolicLink()) {
stack.push(full);
continue;
}
if (!entry.isFile()) {
continue;
}
const ext = path.extname(entry.name).toLowerCase();
const name = entry.name.toLowerCase();
let shouldDelete = LINK_ARTIFACT_EXTENSIONS.has(ext);
if (!shouldDelete && [".txt", ".html", ".htm", ".nfo"].includes(ext)) {
if (/[._\- ](links?|downloads?|urls?|dlc)([._\- ]|$)/i.test(name)) {
try {
const stat = await fs.promises.stat(full);
if (stat.size <= MAX_LINK_ARTIFACT_BYTES) {
const text = await fs.promises.readFile(full, "utf8");
shouldDelete = /https?:\/\//i.test(text);
}
} catch {
shouldDelete = false;
}
}
}
if (shouldDelete) {
try {
await fs.promises.rm(full, { force: true });
removed += 1;
} catch {
// ignore
}
}
}
}
return removed;
}
export async function removeSampleArtifacts(extractDir: string): Promise<{ files: number; dirs: number }> {
try {
await fs.promises.access(extractDir);
} catch {
return { files: 0, dirs: 0 };
}
let removedFiles = 0;
let removedDirs = 0;
const sampleDirs: string[] = [];
const stack = [extractDir];
const countFilesRecursive = async (rootDir: string): Promise<number> => {
let count = 0;
const dirs = [rootDir];
while (dirs.length > 0) {
const current = dirs.pop() as string;
let entries: fs.Dirent[] = [];
try {
entries = await fs.promises.readdir(current, { withFileTypes: true });
} catch {
continue;
}
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory()) {
try {
const stat = await fs.promises.lstat(full);
if (stat.isSymbolicLink()) {
continue;
}
} catch {
continue;
}
dirs.push(full);
} else if (entry.isFile()) {
count += 1;
}
}
}
return count;
};
while (stack.length > 0) {
const current = stack.pop() as string;
let entries: fs.Dirent[] = [];
try { entries = await fs.promises.readdir(current, { withFileTypes: true }); } catch { continue; }
for (const entry of entries) {
const full = path.join(current, entry.name);
if (entry.isDirectory() || entry.isSymbolicLink()) {
const base = entry.name.toLowerCase();
if (SAMPLE_DIR_NAMES.has(base)) {
sampleDirs.push(full);
continue;
}
if (entry.isDirectory()) {
stack.push(full);
}
continue;
}
if (!entry.isFile()) {
continue;
}
const stem = path.parse(entry.name).name.toLowerCase();
const ext = path.extname(entry.name).toLowerCase();
const isSampleVideo = SAMPLE_VIDEO_EXTENSIONS.has(ext) && SAMPLE_TOKEN_RE.test(stem);
if (isSampleVideo) {
try {
await fs.promises.rm(full, { force: true });
removedFiles += 1;
} catch {
// ignore
}
}
}
}
sampleDirs.sort((a, b) => b.length - a.length);
for (const dir of sampleDirs) {
try {
const stat = await fs.promises.lstat(dir);
if (stat.isSymbolicLink()) {
await fs.promises.rm(dir, { force: true });
removedDirs += 1;
continue;
}
const filesInDir = await countFilesRecursive(dir);
await fs.promises.rm(dir, { recursive: true, force: true });
removedFiles += filesInDir;
removedDirs += 1;
} catch {
// ignore
}
}
return { files: removedFiles, dirs: removedDirs };
}
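For reference, the `SAMPLE_TOKEN_RE` pattern used above (defined in the constants file below) only matches `sample` as a delimited token, so stems that merely contain the substring are kept. A small standalone check:

```typescript
// Demonstrates which file stems the sample cleanup above treats as samples.
const SAMPLE_TOKEN_RE = /(^|[._\-\s])sample([._\-\s]|$)/i;

const isSampleStem = (stem: string): boolean => SAMPLE_TOKEN_RE.test(stem.toLowerCase());
```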

View File

@@ -1,95 +0,0 @@
import path from "node:path";
import os from "node:os";
import { AppSettings } from "../shared/types";
import packageJson from "../../package.json";
export const APP_NAME = "Multi Debrid Downloader";
export const APP_VERSION: string = packageJson.version;
export const API_BASE_URL = "https://api.real-debrid.com/rest/1.0";
export const DCRYPT_UPLOAD_URL = "https://dcrypt.it/decrypt/upload";
export const DCRYPT_PASTE_URL = "https://dcrypt.it/decrypt/paste";
export const DLC_SERVICE_URL = "https://service.jdownloader.org/dlcrypt/service.php?srcType=dlc&destType=pylo&data={KEY}";
export const DLC_AES_KEY = Buffer.from("cb99b5cbc24db398", "utf8");
export const DLC_AES_IV = Buffer.from("9bc24cb995cb8db3", "utf8");
export const REQUEST_RETRIES = 3;
export const CHUNK_SIZE = 512 * 1024;
export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long
export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
export const LINK_ARTIFACT_EXTENSIONS = new Set([".url", ".webloc", ".dlc", ".rsdf", ".ccf"]);
export const SAMPLE_TOKEN_RE = /(^|[._\-\s])sample([._\-\s]|$)/i;
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz", ".rev"]);
export const RAR_SPLIT_RE = /\.r\d{2,3}$/i;
export const MAX_MANIFEST_FILE_BYTES = 5 * 1024 * 1024;
export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 1;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";
export function defaultSettings(): AppSettings {
const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
return {
token: "",
megaLogin: "",
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: "",
archivePasswordList: "",
rememberToken: true,
providerPrimary: "realdebrid",
providerSecondary: "megadebrid",
providerTertiary: "bestdebrid",
autoProviderFallback: true,
outputDir: baseDir,
packageName: "",
autoExtract: true,
autoRename4sf4sj: false,
extractDir: path.join(baseDir, "_entpackt"),
collectMkvToLibrary: false,
mkvLibraryDir: path.join(baseDir, "_mkv"),
createExtractSubfolder: true,
hybridExtract: true,
cleanupMode: "none",
extractConflictMode: "overwrite",
removeLinkFilesAfterExtract: false,
removeSamplesAfterExtract: false,
enableIntegrityCheck: true,
autoResumeOnStart: true,
autoReconnect: false,
reconnectWaitSeconds: 45,
completedCleanupPolicy: "never",
maxParallel: 4,
maxParallelExtract: 2,
retryLimit: 0,
speedLimitEnabled: false,
speedLimitKbps: 0,
speedLimitMode: "global",
updateRepo: DEFAULT_UPDATE_REPO,
autoUpdateCheck: true,
clipboardWatch: false,
minimizeToTray: false,
theme: "dark" as const,
collapseNewPackages: true,
autoSkipExtracted: false,
confirmDeleteSelection: true,
totalDownloadedAllTime: 0,
bandwidthSchedules: [],
columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
extractCpuPriority: "high",
autoExtractWhenStopped: true
};
}

View File

@@ -1,319 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import crypto from "node:crypto";
import { DCRYPT_PASTE_URL, DCRYPT_UPLOAD_URL, DLC_AES_IV, DLC_AES_KEY, DLC_SERVICE_URL } from "./constants";
import { compactErrorText, inferPackageNameFromLinks, isHttpLink, sanitizeFilename, uniquePreserveOrder } from "./utils";
import { ParsedPackageInput } from "../shared/types";
const MAX_DLC_FILE_BYTES = 8 * 1024 * 1024;
function isContainerSizeValidationError(error: unknown): boolean {
const text = compactErrorText(error);
return /zu groß/i.test(text) || /DLC-Datei ungültig oder zu groß/i.test(text);
}
function decodeDcryptPayload(responseText: string): unknown {
let text = String(responseText || "").trim();
const m = text.match(/<textarea[^>]*>([\s\S]*?)<\/textarea>/i);
if (m) {
text = m[1].replace(/&quot;/g, '"').replace(/&amp;/g, "&").trim();
}
if (!text) {
return "";
}
try {
return JSON.parse(text);
} catch {
return text;
}
}
function extractUrlsRecursive(data: unknown): string[] {
if (typeof data === "string") {
const found = data.match(/https?:\/\/[^\s"'<>]+/gi) ?? [];
return uniquePreserveOrder(found.filter((url) => isHttpLink(url)));
}
if (Array.isArray(data)) {
return uniquePreserveOrder(data.flatMap((item) => extractUrlsRecursive(item)));
}
if (data && typeof data === "object") {
return uniquePreserveOrder(Object.values(data as Record<string, unknown>).flatMap((value) => extractUrlsRecursive(value)));
}
return [];
}
function groupLinksByName(links: string[]): ParsedPackageInput[] {
const unique = uniquePreserveOrder(links.filter((link) => isHttpLink(link)));
const grouped = new Map<string, string[]>();
for (const link of unique) {
const name = sanitizeFilename(inferPackageNameFromLinks([link]) || "Paket");
const current = grouped.get(name) ?? [];
current.push(link);
grouped.set(name, current);
}
return Array.from(grouped.entries()).map(([name, packageLinks]) => ({ name, links: packageLinks }));
}
function extractPackagesFromPayload(payload: unknown): ParsedPackageInput[] {
const urls = extractUrlsRecursive(payload);
if (urls.length === 0) {
return [];
}
return groupLinksByName(urls);
}
function decryptRcPayload(base64Rc: string): Buffer {
const rcBytes = Buffer.from(base64Rc, "base64");
const decipher = crypto.createDecipheriv("aes-128-cbc", DLC_AES_KEY, DLC_AES_IV);
decipher.setAutoPadding(false);
return Buffer.concat([decipher.update(rcBytes), decipher.final()]);
}
function readDlcFileWithLimit(filePath: string): Buffer {
const stat = fs.statSync(filePath);
if (stat.size <= 0 || stat.size > MAX_DLC_FILE_BYTES) {
throw new Error(`DLC-Datei ungültig oder zu groß (${Math.floor(stat.size)} B)`);
}
return fs.readFileSync(filePath);
}
function parsePackagesFromDlcXml(xml: string): ParsedPackageInput[] {
const packages: ParsedPackageInput[] = [];
const packageRegex = /<package\s+[^>]*name="([^"]*)"[^>]*>([\s\S]*?)<\/package>/gi;
for (let m = packageRegex.exec(xml); m; m = packageRegex.exec(xml)) {
const encodedName = m[1] || "";
const packageBody = m[2] || "";
let packageName = "";
if (encodedName) {
try {
packageName = Buffer.from(encodedName, "base64").toString("utf8");
} catch {
packageName = encodedName;
}
}
const links: string[] = [];
const fileNames: string[] = [];
const fileRegex = /<file>([\s\S]*?)<\/file>/gi;
for (let fm = fileRegex.exec(packageBody); fm; fm = fileRegex.exec(packageBody)) {
const fileBody = fm[1] || "";
const urlMatch = fileBody.match(/<url>(.*?)<\/url>/i);
if (!urlMatch) {
continue;
}
try {
const url = Buffer.from((urlMatch[1] || "").trim(), "base64").toString("utf8").trim();
if (!isHttpLink(url)) {
continue;
}
let fileName = "";
const fnMatch = fileBody.match(/<filename>(.*?)<\/filename>/i);
if (fnMatch?.[1]) {
try {
fileName = Buffer.from(fnMatch[1].trim(), "base64").toString("utf8").trim();
} catch {
// ignore
}
}
links.push(url);
fileNames.push(sanitizeFilename(fileName));
} catch {
// skip broken entries
}
}
if (links.length === 0) {
const urlRegex = /<url>(.*?)<\/url>/gi;
for (let um = urlRegex.exec(packageBody); um; um = urlRegex.exec(packageBody)) {
try {
const url = Buffer.from((um[1] || "").trim(), "base64").toString("utf8").trim();
if (isHttpLink(url)) {
links.push(url);
}
} catch {
// skip broken entries
}
}
}
const uniqueLinks = uniquePreserveOrder(links);
const hasFileNames = fileNames.some((fn) => fn.length > 0);
if (uniqueLinks.length > 0) {
const pkg: ParsedPackageInput = {
name: sanitizeFilename(packageName || inferPackageNameFromLinks(uniqueLinks) || `Paket-${packages.length + 1}`),
links: uniqueLinks
};
if (hasFileNames) {
pkg.fileNames = fileNames;
}
packages.push(pkg);
}
}
return packages;
}
async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]> {
const content = readDlcFileWithLimit(filePath).toString("ascii").trim();
if (content.length < 89) {
return [];
}
  // A DLC container ends with an 88-character base64 key blob; everything before it is the payload.
  const dlcKey = content.slice(-88);
  const dlcData = content.slice(0, -88);
const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
if (!rcResponse.ok) {
return [];
}
const rcText = await rcResponse.text();
const rcMatch = rcText.match(/<rc>(.*?)<\/rc>/i);
if (!rcMatch) {
return [];
}
const realKey = decryptRcPayload(rcMatch[1]).subarray(0, 16);
const encrypted = Buffer.from(dlcData, "base64");
const decipher = crypto.createDecipheriv("aes-128-cbc", realKey, realKey);
decipher.setAutoPadding(false);
let decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
if (decrypted.length === 0) {
return [];
}
  // Strip PKCS#7-style padding manually (the decipher runs with autoPadding off):
  // the last byte encodes the pad length, and every pad byte must equal that value.
  const pad = decrypted[decrypted.length - 1];
if (pad > 0 && pad <= 16 && pad <= decrypted.length) {
let validPad = true;
for (let index = 1; index <= pad; index += 1) {
if (decrypted[decrypted.length - index] !== pad) {
validPad = false;
break;
}
}
if (validPad) {
decrypted = decrypted.subarray(0, decrypted.length - pad);
}
}
const xmlData = Buffer.from(decrypted.toString("utf8"), "base64").toString("utf8");
return parsePackagesFromDlcXml(xmlData);
}
function extractLinksFromResponse(text: string): string[] {
const payload = decodeDcryptPayload(text);
let links = extractUrlsRecursive(payload);
if (links.length === 0) {
links = extractUrlsRecursive(text);
}
return uniquePreserveOrder(links.filter((l) => isHttpLink(l)));
}
async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<string[] | null> {
const blob = new Blob([new Uint8Array(fileContent)]);
const form = new FormData();
form.set("dlcfile", blob, fileName);
const response = await fetch(DCRYPT_UPLOAD_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;
}
const text = await response.text();
if (!response.ok) {
throw new Error(compactErrorText(text));
}
return extractLinksFromResponse(text);
}
async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {
const form = new FormData();
form.set("content", fileContent.toString("ascii"));
const response = await fetch(DCRYPT_PASTE_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;
}
const text = await response.text();
if (!response.ok) {
throw new Error(compactErrorText(text));
}
return extractLinksFromResponse(text);
}
async function decryptDlcViaDcrypt(filePath: string): Promise<ParsedPackageInput[]> {
const fileContent = readDlcFileWithLimit(filePath);
const fileName = path.basename(filePath);
const packageName = sanitizeFilename(path.basename(filePath, ".dlc")) || "Paket";
let links = await tryDcryptUpload(fileContent, fileName);
if (links === null) {
links = await tryDcryptPaste(fileContent);
}
if (links === null) {
throw new Error("DLC-Datei zu groß für dcrypt.it");
}
if (links.length === 0) {
return [];
}
return [{ name: packageName, links }];
}
export async function importDlcContainers(filePaths: string[]): Promise<ParsedPackageInput[]> {
const out: ParsedPackageInput[] = [];
const failures: string[] = [];
let sawDlc = false;
for (const filePath of filePaths) {
if (path.extname(filePath).toLowerCase() !== ".dlc") {
continue;
}
sawDlc = true;
let packages: ParsedPackageInput[] = [];
let fileFailed = false;
      const fileFailureReasons: string[] = [];
try {
packages = await decryptDlcLocal(filePath);
} catch (error) {
if (isContainerSizeValidationError(error)) {
failures.push(`${path.basename(filePath)}: ${compactErrorText(error)}`);
continue;
}
fileFailed = true;
fileFailureReasons.push(`lokal: ${compactErrorText(error)}`);
packages = [];
}
if (packages.length === 0) {
try {
packages = await decryptDlcViaDcrypt(filePath);
} catch (error) {
if (isContainerSizeValidationError(error)) {
failures.push(`${path.basename(filePath)}: ${compactErrorText(error)}`);
continue;
}
fileFailed = true;
fileFailureReasons.push(`dcrypt: ${compactErrorText(error)}`);
packages = [];
}
}
if (packages.length === 0 && fileFailed) {
failures.push(`${path.basename(filePath)}: ${fileFailureReasons.join("; ")}`);
}
out.push(...packages);
}
if (out.length === 0 && sawDlc && failures.length > 0) {
const details = failures.slice(0, 2).join(" | ");
const suffix = failures.length > 2 ? ` (+${failures.length - 2} weitere)` : "";
throw new Error(`DLC konnte nicht importiert werden: ${details}${suffix}`);
}
return out;
}
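The tolerant padding check in `decryptDlcLocal` can be isolated as a pure function. This is a sketch of the same logic for illustration, not code from the commit:

```typescript
// Strips PKCS#7-style padding only when every pad byte matches the pad length,
// mirroring the tolerant check in decryptDlcLocal above; otherwise the data
// is returned unchanged.
function stripPkcs7Padding(data: Uint8Array): Uint8Array {
  if (data.length === 0) {
    return data;
  }
  const pad = data[data.length - 1];
  if (pad === 0 || pad > 16 || pad > data.length) {
    return data;
  }
  for (let i = 1; i <= pad; i += 1) {
    if (data[data.length - i] !== pad) {
      return data;
    }
  }
  return data.subarray(0, data.length - pad);
}
```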

File diff suppressed because it is too large

View File

@@ -1,279 +0,0 @@
import http from "node:http";
import fs from "node:fs";
import path from "node:path";
import { logger, getLogFilePath } from "./logger";
import type { DownloadManager } from "./download-manager";
const DEFAULT_PORT = 9868;
const MAX_LOG_LINES = 10000;
let server: http.Server | null = null;
let manager: DownloadManager | null = null;
let authToken = "";
function loadToken(baseDir: string): string {
const tokenPath = path.join(baseDir, "debug_token.txt");
try {
return fs.readFileSync(tokenPath, "utf8").trim();
} catch {
return "";
}
}
function getPort(baseDir: string): number {
const portPath = path.join(baseDir, "debug_port.txt");
try {
const n = Number(fs.readFileSync(portPath, "utf8").trim());
if (Number.isFinite(n) && n >= 1024 && n <= 65535) {
return n;
}
} catch {
// ignore
}
return DEFAULT_PORT;
}
function checkAuth(req: http.IncomingMessage): boolean {
if (!authToken) {
return false;
}
const header = req.headers.authorization || "";
if (header === `Bearer ${authToken}`) {
return true;
}
const url = new URL(req.url || "/", "http://localhost");
return url.searchParams.get("token") === authToken;
}
function jsonResponse(res: http.ServerResponse, status: number, data: unknown): void {
const body = JSON.stringify(data, null, 2);
res.writeHead(status, {
"Content-Type": "application/json; charset=utf-8",
"Access-Control-Allow-Origin": "*",
"Cache-Control": "no-cache"
});
res.end(body);
}
function readLogTail(lines: number): string[] {
const logPath = getLogFilePath();
try {
const content = fs.readFileSync(logPath, "utf8");
const allLines = content.split("\n").filter((l) => l.trim().length > 0);
return allLines.slice(-Math.min(lines, MAX_LOG_LINES));
} catch {
return ["(Log-Datei nicht lesbar)"];
}
}
function handleRequest(req: http.IncomingMessage, res: http.ServerResponse): void {
if (req.method === "OPTIONS") {
res.writeHead(204, {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Headers": "Authorization"
});
res.end();
return;
}
if (!checkAuth(req)) {
jsonResponse(res, 401, { error: "Unauthorized" });
return;
}
const url = new URL(req.url || "/", "http://localhost");
const pathname = url.pathname;
if (pathname === "/health") {
jsonResponse(res, 200, {
status: "ok",
uptime: Math.floor(process.uptime()),
memoryMB: Math.round(process.memoryUsage().rss / 1024 / 1024)
});
return;
}
if (pathname === "/log") {
const count = Math.min(Number(url.searchParams.get("lines") || "100"), MAX_LOG_LINES);
const grep = url.searchParams.get("grep") || "";
let lines = readLogTail(count);
if (grep) {
const pattern = grep.toLowerCase();
lines = lines.filter((l) => l.toLowerCase().includes(pattern));
}
jsonResponse(res, 200, { lines, count: lines.length });
return;
}
if (pathname === "/status") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const items = Object.values(snapshot.session.items);
const packages = Object.values(snapshot.session.packages);
const byStatus: Record<string, number> = {};
for (const item of items) {
byStatus[item.status] = (byStatus[item.status] || 0) + 1;
}
const activeItems = items
.filter((i) => i.status === "downloading" || i.status === "validating")
.map((i) => ({
id: i.id,
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}));
const failedItems = items
.filter((i) => i.status === "failed")
.map((i) => ({
fileName: i.fileName,
lastError: i.lastError,
retries: i.retries,
provider: i.provider
}));
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
speed: snapshot.speedText,
eta: snapshot.etaText,
itemCounts: byStatus,
totalItems: items.length,
packages: packages.map((p) => ({
name: p.name,
status: p.status,
items: p.itemIds.length
})),
activeItems,
failedItems: failedItems.length > 0 ? failedItems : undefined
});
return;
}
if (pathname === "/items") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const filter = url.searchParams.get("status");
const pkg = url.searchParams.get("package");
let items = Object.values(snapshot.session.items);
if (filter) {
items = items.filter((i) => i.status === filter);
}
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
items = items.filter((i) => ids.has(i.id));
}
}
jsonResponse(res, 200, {
count: items.length,
items: items.map((i) => ({
fileName: i.fileName,
status: i.status,
fullStatus: i.fullStatus,
provider: i.provider,
progress: i.progressPercent,
speedMBs: +(i.speedBps / 1024 / 1024).toFixed(2),
downloadedMB: +(i.downloadedBytes / 1024 / 1024).toFixed(1),
totalMB: i.totalBytes ? +(i.totalBytes / 1024 / 1024).toFixed(1) : null,
retries: i.retries,
lastError: i.lastError
}))
});
return;
}
if (pathname === "/session") {
if (!manager) {
jsonResponse(res, 503, { error: "Manager not initialized" });
return;
}
const snapshot = manager.getSnapshot();
const pkg = url.searchParams.get("package");
if (pkg) {
const pkgLower = pkg.toLowerCase();
const matchedPkg = Object.values(snapshot.session.packages)
.find((p) => p.name.toLowerCase().includes(pkgLower));
if (matchedPkg) {
const ids = new Set(matchedPkg.itemIds);
const pkgItems = Object.values(snapshot.session.items)
.filter((i) => ids.has(i.id));
jsonResponse(res, 200, {
package: matchedPkg,
items: pkgItems
});
return;
}
}
jsonResponse(res, 200, {
running: snapshot.session.running,
paused: snapshot.session.paused,
packageCount: Object.keys(snapshot.session.packages).length,
itemCount: Object.keys(snapshot.session.items).length,
packages: Object.values(snapshot.session.packages).map((p) => ({
id: p.id,
name: p.name,
status: p.status,
items: p.itemIds.length
}))
});
return;
}
jsonResponse(res, 404, {
error: "Not found",
endpoints: [
"GET /health",
"GET /log?lines=100&grep=keyword",
"GET /status",
"GET /items?status=downloading&package=Bloodline",
"GET /session?package=Criminal"
]
});
}
export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
authToken = loadToken(baseDir);
if (!authToken) {
logger.info("Debug-Server: Kein Token in debug_token.txt, Server wird nicht gestartet");
return;
}
manager = mgr;
const port = getPort(baseDir);
server = http.createServer(handleRequest);
server.listen(port, "127.0.0.1", () => {
logger.info(`Debug-Server gestartet auf Port ${port}`);
});
server.on("error", (err) => {
logger.warn(`Debug-Server Fehler: ${String(err)}`);
server = null;
});
}
export function stopDebugServer(): void {
if (server) {
server.close();
server = null;
logger.info("Debug-Server gestoppt");
}
}
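The auth scheme above accepts the token either as a `Bearer` header or as a `?token=` query parameter, and refuses everything when no token is configured. A standalone sketch of that check, using the same WHATWG `URL` parsing as the server:

```typescript
// Mirrors checkAuth above: Bearer header first, then ?token= query fallback.
function isAuthorized(authHeader: string, requestUrl: string, expectedToken: string): boolean {
  if (!expectedToken) {
    return false; // no token configured: the debug server rejects every request
  }
  if (authHeader === `Bearer ${expectedToken}`) {
    return true;
  }
  const url = new URL(requestUrl, "http://localhost");
  return url.searchParams.get("token") === expectedToken;
}
```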

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,163 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import crypto from "node:crypto";
import { ParsedHashEntry } from "../shared/types";
import { MAX_MANIFEST_FILE_BYTES } from "./constants";
const manifestCache = new Map<string, { at: number; entries: Map<string, ParsedHashEntry> }>();
const MANIFEST_CACHE_TTL_MS = 15000;
function normalizeManifestKey(value: string): string {
return String(value || "")
.replace(/\\/g, "/")
.replace(/^\.\//, "")
.trim()
.toLowerCase();
}
export function parseHashLine(line: string): ParsedHashEntry | null {
const text = String(line || "").trim();
if (!text || text.startsWith(";")) {
return null;
}
const md = text.match(/^([0-9a-fA-F]{32}|[0-9a-fA-F]{40})\s+\*?(.+)$/);
if (md) {
const digest = md[1].toLowerCase();
return {
fileName: md[2].trim(),
algorithm: digest.length === 32 ? "md5" : "sha1",
digest
};
}
const sfv = text.match(/^(.+?)\s+([0-9A-Fa-f]{8})$/);
if (sfv) {
return {
fileName: sfv[1].trim(),
algorithm: "crc32",
digest: sfv[2].toLowerCase()
};
}
return null;
}
export function readHashManifest(packageDir: string): Map<string, ParsedHashEntry> {
const cacheKey = path.resolve(packageDir);
const cached = manifestCache.get(cacheKey);
if (cached && Date.now() - cached.at <= MANIFEST_CACHE_TTL_MS) {
return new Map(cached.entries);
}
const map = new Map<string, ParsedHashEntry>();
const patterns: Array<[string, "crc32" | "md5" | "sha1"]> = [
[".sfv", "crc32"],
[".md5", "md5"],
[".sha1", "sha1"]
];
if (!fs.existsSync(packageDir)) {
return map;
}
const manifestFiles = fs.readdirSync(packageDir, { withFileTypes: true })
.filter((entry) => {
if (!entry.isFile()) {
return false;
}
const ext = path.extname(entry.name).toLowerCase();
return patterns.some(([pattern]) => pattern === ext);
})
.sort((a, b) => a.name.localeCompare(b.name, undefined, { numeric: true, sensitivity: "base" }));
for (const entry of manifestFiles) {
if (!entry.isFile()) {
continue;
}
const ext = path.extname(entry.name).toLowerCase();
const hit = patterns.find(([pattern]) => pattern === ext);
if (!hit) {
continue;
}
const filePath = path.join(packageDir, entry.name);
let lines: string[];
try {
const stat = fs.statSync(filePath);
if (stat.size > MAX_MANIFEST_FILE_BYTES) {
continue;
}
lines = fs.readFileSync(filePath, "utf8").split(/\r?\n/);
} catch {
continue;
}
for (const line of lines) {
const parsed = parseHashLine(line);
if (!parsed) {
continue;
}
const normalized: ParsedHashEntry = {
...parsed,
algorithm: hit[1]
};
const key = normalizeManifestKey(parsed.fileName);
if (map.has(key)) {
continue;
}
map.set(key, normalized);
}
}
manifestCache.set(cacheKey, { at: Date.now(), entries: new Map(map) });
return map;
}
// Precomputed CRC-32 lookup table (reflected polynomial 0xEDB88320, as used by SFV files).
const crcTable = new Int32Array(256);
for (let i = 0; i < 256; i++) {
  let c = i;
  for (let j = 0; j < 8; j++) {
    c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
  }
  crcTable[i] = c;
}
function crc32Buffer(data: Buffer, seed = 0): number {
let crc = seed ^ -1;
for (let i = 0; i < data.length; i++) {
crc = (crc >>> 8) ^ crcTable[(crc ^ data[i]) & 0xff];
}
return crc ^ -1;
}
async function hashFile(filePath: string, algorithm: "crc32" | "md5" | "sha1"): Promise<string> {
if (algorithm === "crc32") {
const stream = fs.createReadStream(filePath, { highWaterMark: 1024 * 1024 });
let crc = 0;
for await (const chunk of stream) {
crc = crc32Buffer(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk), crc);
await new Promise(r => setImmediate(r));
}
return (crc >>> 0).toString(16).padStart(8, "0").toLowerCase();
}
const hash = crypto.createHash(algorithm);
const stream = fs.createReadStream(filePath, { highWaterMark: 1024 * 1024 });
return await new Promise<string>((resolve, reject) => {
stream.on("data", (chunk: string | Buffer) => hash.update(typeof chunk === "string" ? Buffer.from(chunk) : chunk));
stream.on("error", reject);
stream.on("end", () => resolve(hash.digest("hex").toLowerCase()));
});
}
export async function validateFileAgainstManifest(filePath: string, packageDir: string): Promise<{ ok: boolean; message: string }> {
const manifest = readHashManifest(packageDir);
if (manifest.size === 0) {
return { ok: true, message: "Kein Hash verfügbar" };
}
const keyByBaseName = normalizeManifestKey(path.basename(filePath));
const keyByRelativePath = normalizeManifestKey(path.relative(packageDir, filePath));
const entry = manifest.get(keyByRelativePath) || manifest.get(keyByBaseName);
if (!entry) {
return { ok: true, message: "Kein Hash für Datei" };
}
const actual = await hashFile(filePath, entry.algorithm);
if (actual === entry.digest.toLowerCase()) {
return { ok: true, message: `${entry.algorithm.toUpperCase()} ok` };
}
return { ok: false, message: `${entry.algorithm.toUpperCase()} mismatch` };
}
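The CRC-32 table and `crc32Buffer` above implement the standard reflected CRC-32 that SFV files use; its well-known check value for the ASCII string `123456789` is `cbf43926`. A self-contained copy of the routine reproduces it:

```typescript
// Self-contained copy of the CRC-32 routine above, for verification.
const table = new Int32Array(256);
for (let i = 0; i < 256; i++) {
  let c = i;
  for (let j = 0; j < 8; j++) {
    c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
  }
  table[i] = c;
}

function crc32Hex(data: Uint8Array): string {
  let crc = -1; // seed 0 XOR -1, as in crc32Buffer above
  for (let i = 0; i < data.length; i++) {
    crc = (crc >>> 8) ^ table[(crc ^ data[i]) & 0xff];
  }
  return ((crc ^ -1) >>> 0).toString(16).padStart(8, "0");
}
```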

View File

@@ -1,26 +0,0 @@
import { ParsedPackageInput } from "../shared/types";
import { inferPackageNameFromLinks, parsePackagesFromLinksText, sanitizeFilename, uniquePreserveOrder } from "./utils";
export function mergePackageInputs(packages: ParsedPackageInput[]): ParsedPackageInput[] {
const grouped = new Map<string, string[]>();
for (const pkg of packages) {
const name = sanitizeFilename(pkg.name || inferPackageNameFromLinks(pkg.links));
const list = grouped.get(name) ?? [];
for (const link of pkg.links) {
list.push(link);
}
grouped.set(name, list);
}
return Array.from(grouped.entries()).map(([name, links]) => ({
name,
links: uniquePreserveOrder(links)
}));
}
export function parseCollectorInput(rawText: string, packageName = ""): ParsedPackageInput[] {
const parsed = parsePackagesFromLinksText(rawText, packageName);
if (parsed.length === 0) {
return [];
}
return mergePackageInputs(parsed);
}

View File

@@ -1,225 +0,0 @@
import fs from "node:fs";
import path from "node:path";
let logFilePath = path.resolve(process.cwd(), "rd_downloader.log");
let fallbackLogFilePath: string | null = null;
const LOG_FLUSH_INTERVAL_MS = 120;
const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
const rotateCheckAtByFile = new Map<string, number>();
type LogListener = (line: string) => void;
let logListener: LogListener | null = null;
let pendingLines: string[] = [];
let pendingChars = 0;
let flushTimer: NodeJS.Timeout | null = null;
let flushInFlight = false;
let exitHookAttached = false;
export function setLogListener(listener: LogListener | null): void {
logListener = listener;
}
export function configureLogger(baseDir: string): void {
logFilePath = path.join(baseDir, "rd_downloader.log");
const cwdLogPath = path.resolve(process.cwd(), "rd_downloader.log");
fallbackLogFilePath = cwdLogPath === logFilePath ? null : cwdLogPath;
}
function appendLine(filePath: string, line: string): { ok: boolean; errorText: string } {
try {
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.appendFileSync(filePath, line, "utf8");
return { ok: true, errorText: "" };
} catch (error) {
return { ok: false, errorText: String(error) };
}
}
async function appendChunk(filePath: string, chunk: string): Promise<{ ok: boolean; errorText: string }> {
try {
await fs.promises.mkdir(path.dirname(filePath), { recursive: true });
await fs.promises.appendFile(filePath, chunk, "utf8");
return { ok: true, errorText: "" };
} catch (error) {
return { ok: false, errorText: String(error) };
}
}
function writeStderr(text: string): void {
try {
process.stderr.write(text);
} catch {
// ignore stderr failures
}
}
function flushSyncPending(): void {
if (pendingLines.length === 0) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
pendingChars = 0;
rotateIfNeeded(logFilePath);
const primary = appendLine(logFilePath, chunk);
if (fallbackLogFilePath) {
rotateIfNeeded(fallbackLogFilePath);
const fallback = appendLine(fallbackLogFilePath, chunk);
if (!primary.ok && !fallback.ok) {
writeStderr(`LOGGER write failed (primary+fallback): ${primary.errorText} | ${fallback.errorText}\n`);
}
return;
}
if (!primary.ok) {
writeStderr(`LOGGER write failed: ${primary.errorText}\n`);
}
}
function scheduleFlush(immediate = false): void {
if (flushInFlight) {
return;
}
if (immediate) {
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
void flushAsync();
return;
}
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
void flushAsync();
}, LOG_FLUSH_INTERVAL_MS);
}
function rotateIfNeeded(filePath: string): void {
try {
const now = Date.now();
const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
if (now - lastRotateCheckAt < 60_000) {
return;
}
rotateCheckAtByFile.set(filePath, now);
const stat = fs.statSync(filePath);
if (stat.size < LOG_MAX_FILE_BYTES) {
return;
}
const backup = `${filePath}.old`;
try {
fs.rmSync(backup, { force: true });
} catch {
// ignore
}
fs.renameSync(filePath, backup);
} catch {
// ignore - file may not exist yet
}
}
async function rotateIfNeededAsync(filePath: string): Promise<void> {
try {
const now = Date.now();
const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
if (now - lastRotateCheckAt < 60_000) {
return;
}
rotateCheckAtByFile.set(filePath, now);
const stat = await fs.promises.stat(filePath);
if (stat.size < LOG_MAX_FILE_BYTES) {
return;
}
const backup = `${filePath}.old`;
await fs.promises.rm(backup, { force: true }).catch(() => {});
await fs.promises.rename(filePath, backup);
} catch {
// ignore - file may not exist yet
}
}
async function flushAsync(): Promise<void> {
if (flushInFlight || pendingLines.length === 0) {
return;
}
flushInFlight = true;
const linesSnapshot = pendingLines.slice();
const chunk = linesSnapshot.join("");
try {
await rotateIfNeededAsync(logFilePath);
const primary = await appendChunk(logFilePath, chunk);
let wroteAny = primary.ok;
if (fallbackLogFilePath) {
await rotateIfNeededAsync(fallbackLogFilePath);
const fallback = await appendChunk(fallbackLogFilePath, chunk);
wroteAny = wroteAny || fallback.ok;
if (!primary.ok && !fallback.ok) {
writeStderr(`LOGGER write failed (primary+fallback): ${primary.errorText} | ${fallback.errorText}\n`);
}
} else if (!primary.ok) {
writeStderr(`LOGGER write failed: ${primary.errorText}\n`);
}
if (wroteAny) {
// Drop only the flushed snapshot; lines enqueued during the flush stay pending.
pendingLines = pendingLines.slice(linesSnapshot.length);
pendingChars = Math.max(0, pendingChars - chunk.length);
}
} finally {
flushInFlight = false;
if (pendingLines.length > 0) {
scheduleFlush();
}
}
}
function ensureExitHook(): void {
if (exitHookAttached) {
return;
}
exitHookAttached = true;
process.once("beforeExit", flushSyncPending);
process.once("exit", flushSyncPending);
}
function write(level: "INFO" | "WARN" | "ERROR", message: string): void {
ensureExitHook();
const line = `${new Date().toISOString()} [${level}] ${message}\n`;
pendingLines.push(line);
pendingChars += line.length;
if (logListener) {
try { logListener(line); } catch { /* ignore */ }
}
while (pendingChars > LOG_BUFFER_LIMIT_CHARS && pendingLines.length > 1) {
const removed = pendingLines.shift();
if (!removed) {
break;
}
pendingChars = Math.max(0, pendingChars - removed.length);
}
if (level === "ERROR") {
scheduleFlush(true);
return;
}
scheduleFlush();
}
export const logger = {
info: (msg: string): void => write("INFO", msg),
warn: (msg: string): void => write("WARN", msg),
error: (msg: string): void => write("ERROR", msg)
};
export function getLogFilePath(): string {
return logFilePath;
}


@ -1,524 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { app, BrowserWindow, clipboard, dialog, ipcMain, IpcMainInvokeEvent, Menu, shell, Tray } from "electron";
import { AddLinksPayload, AppSettings, UpdateInstallProgress } from "../shared/types";
import { AppController } from "./app-controller";
import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";
/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
if (typeof value !== "string") {
throw new Error(`${name} muss ein String sein`);
}
return value;
}
function validatePlainObject(value: unknown, name: string): Record<string, unknown> {
if (!value || typeof value !== "object" || Array.isArray(value)) {
throw new Error(`${name} muss ein Objekt sein`);
}
return value as Record<string, unknown>;
}
const IMPORT_QUEUE_MAX_BYTES = 10 * 1024 * 1024;
const RENAME_PACKAGE_MAX_CHARS = 240;
function validateStringArray(value: unknown, name: string): string[] {
if (!Array.isArray(value) || !value.every(v => typeof v === "string")) {
throw new Error(`${name} muss ein String-Array sein`);
}
return value as string[];
}
/* ── Single Instance Lock ───────────────────────────────────────── */
const gotLock = app.requestSingleInstanceLock();
if (!gotLock) {
app.exit(0);
process.exit(0);
}
/* ── Unhandled error protection ─────────────────────────────────── */
process.on("uncaughtException", (error) => {
logger.error(`Uncaught Exception: ${String(error?.stack || error)}`);
});
process.on("unhandledRejection", (reason) => {
logger.error(`Unhandled Rejection: ${String(reason)}`);
});
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
function isDevMode(): boolean {
return process.env.NODE_ENV === "development";
}
function createWindow(): BrowserWindow {
const window = new BrowserWindow({
width: 1440,
height: 940,
minWidth: 1120,
minHeight: 760,
backgroundColor: "#070b14",
title: `${APP_NAME} - v${controller.getVersion()}`,
icon: path.join(app.getAppPath(), "assets", "app_icon.ico"),
webPreferences: {
contextIsolation: true,
nodeIntegration: false,
preload: path.join(__dirname, "../preload/preload.js")
}
});
if (!isDevMode()) {
window.webContents.session.webRequest.onHeadersReceived((details, callback) => {
callback({
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
]
}
});
});
}
window.setMenuBarVisibility(false);
window.setAutoHideMenuBar(true);
if (isDevMode()) {
void window.loadURL("http://localhost:5173");
} else {
void window.loadFile(path.join(app.getAppPath(), "build", "renderer", "index.html"));
}
return window;
}
function bindMainWindowLifecycle(window: BrowserWindow): void {
window.on("close", (event) => {
const settings = controller.getSettings();
if (settings.minimizeToTray && tray) {
event.preventDefault();
window.hide();
}
});
window.on("closed", () => {
if (mainWindow === window) {
mainWindow = null;
}
});
}
function createTray(): void {
if (tray) {
return;
}
const iconPath = path.join(app.getAppPath(), "assets", "app_icon.ico");
try {
tray = new Tray(iconPath);
} catch {
return;
}
tray.setToolTip(APP_NAME);
const contextMenu = Menu.buildFromTemplate([
{ label: "Anzeigen", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray Start Fehler: ${String(err)}`)); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Beenden", click: () => { app.quit(); } }
]);
tray.setContextMenu(contextMenu);
tray.on("double-click", () => {
mainWindow?.show();
mainWindow?.focus();
});
}
function destroyTray(): void {
if (tray) {
tray.destroy();
tray = null;
}
}
function extractLinksFromText(text: string): string[] {
return extractHttpLinksFromText(text);
}
function normalizeClipboardText(text: string): string {
const truncateUnicodeSafe = (value: string, maxChars: number): string => {
if (value.length <= maxChars) {
return value;
}
const points = Array.from(value);
if (points.length <= maxChars) {
return value;
}
return points.slice(0, maxChars).join("");
};
const normalized = String(text || "");
if (normalized.length <= CLIPBOARD_MAX_TEXT_CHARS) {
return normalized;
}
const truncated = truncateUnicodeSafe(normalized, CLIPBOARD_MAX_TEXT_CHARS);
const lastBreak = Math.max(
truncated.lastIndexOf("\n"),
truncated.lastIndexOf("\r"),
truncated.lastIndexOf("\t"),
truncated.lastIndexOf(" ")
);
// Prefer cutting at a whitespace boundary if that keeps at least 70% of the budget.
if (lastBreak >= Math.floor(CLIPBOARD_MAX_TEXT_CHARS * 0.7)) {
return truncated.slice(0, lastBreak);
}
return truncated;
}
function startClipboardWatcher(): void {
if (clipboardTimer) {
return;
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
if (text === lastClipboardText || !text.trim()) {
return;
}
lastClipboardText = text;
const links = extractLinksFromText(text);
if (links.length > 0 && mainWindow && !mainWindow.isDestroyed()) {
mainWindow.webContents.send(IPC_CHANNELS.CLIPBOARD_DETECTED, links);
}
}, 2000);
}
function stopClipboardWatcher(): void {
if (clipboardTimer) {
clearInterval(clipboardTimer);
clipboardTimer = null;
}
}
function updateClipboardWatcher(): void {
const settings = controller.getSettings();
if (settings.clipboardWatch) {
startClipboardWatcher();
} else {
stopClipboardWatcher();
}
}
function updateTray(): void {
const settings = controller.getSettings();
if (settings.minimizeToTray) {
createTray();
} else {
destroyTray();
}
}
function registerIpcHandlers(): void {
ipcMain.handle(IPC_CHANNELS.GET_SNAPSHOT, () => controller.getSnapshot());
ipcMain.handle(IPC_CHANNELS.GET_VERSION, () => controller.getVersion());
ipcMain.handle(IPC_CHANNELS.CHECK_UPDATES, async () => controller.checkUpdates());
ipcMain.handle(IPC_CHANNELS.INSTALL_UPDATE, async () => {
const result = await controller.installUpdate((progress: UpdateInstallProgress) => {
if (!mainWindow || mainWindow.isDestroyed()) {
return;
}
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
updateQuitTimer = setTimeout(() => {
app.quit();
}, 2500);
}
return result;
});
ipcMain.handle(IPC_CHANNELS.OPEN_EXTERNAL, async (_event: IpcMainInvokeEvent, rawUrl: string) => {
try {
const parsed = new URL(String(rawUrl || "").trim());
if (parsed.protocol !== "https:" && parsed.protocol !== "http:") {
return false;
}
await shell.openExternal(parsed.toString());
return true;
} catch {
return false;
}
});
ipcMain.handle(IPC_CHANNELS.UPDATE_SETTINGS, (_event: IpcMainInvokeEvent, partial: Partial<AppSettings>) => {
const validated = validatePlainObject(partial ?? {}, "partial");
const result = controller.updateSettings(validated as Partial<AppSettings>);
updateClipboardWatcher();
updateTray();
return result;
});
ipcMain.handle(IPC_CHANNELS.ADD_LINKS, (_event: IpcMainInvokeEvent, payload: AddLinksPayload) => {
validatePlainObject(payload ?? {}, "payload");
validateString(payload?.rawText, "rawText");
if (payload.packageName !== undefined) {
validateString(payload.packageName, "packageName");
}
if (payload.duplicatePolicy !== undefined && payload.duplicatePolicy !== "keep" && payload.duplicatePolicy !== "skip" && payload.duplicatePolicy !== "overwrite") {
throw new Error("duplicatePolicy muss 'keep', 'skip' oder 'overwrite' sein");
}
return controller.addLinks(payload);
});
ipcMain.handle(IPC_CHANNELS.ADD_CONTAINERS, async (_event: IpcMainInvokeEvent, filePaths: string[]) => {
const validPaths = validateStringArray(filePaths ?? [], "filePaths");
const safePaths = validPaths.filter((p) => path.isAbsolute(p));
return controller.addContainers(safePaths);
});
ipcMain.handle(IPC_CHANNELS.GET_START_CONFLICTS, () => controller.getStartConflicts());
ipcMain.handle(IPC_CHANNELS.RESOLVE_START_CONFLICT, (_event: IpcMainInvokeEvent, packageId: string, policy: "keep" | "skip" | "overwrite") => {
validateString(packageId, "packageId");
validateString(policy, "policy");
if (policy !== "keep" && policy !== "skip" && policy !== "overwrite") {
throw new Error("policy muss 'keep', 'skip' oder 'overwrite' sein");
}
return controller.resolveStartConflict(packageId, policy);
});
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
ipcMain.handle(IPC_CHANNELS.CANCEL_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.cancelPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.RENAME_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string, newName: string) => {
validateString(packageId, "packageId");
validateString(newName, "newName");
if (newName.length > RENAME_PACKAGE_MAX_CHARS) {
throw new Error(`newName zu lang (max ${RENAME_PACKAGE_MAX_CHARS} Zeichen)`);
}
return controller.renamePackage(packageId, newName);
});
ipcMain.handle(IPC_CHANNELS.REORDER_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds, "packageIds");
return controller.reorderPackages(packageIds);
});
ipcMain.handle(IPC_CHANNELS.REMOVE_ITEM, (_event: IpcMainInvokeEvent, itemId: string) => {
validateString(itemId, "itemId");
return controller.removeItem(itemId);
});
ipcMain.handle(IPC_CHANNELS.TOGGLE_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.togglePackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.RETRY_EXTRACTION, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.retryExtraction(packageId);
});
ipcMain.handle(IPC_CHANNELS.EXTRACT_NOW, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.extractNow(packageId);
});
ipcMain.handle(IPC_CHANNELS.RESET_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.resetPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority muss 'high', 'normal' oder 'low' sein");
}
return controller.setPackagePriority(packageId, priority);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
ipcMain.handle(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, (_event: IpcMainInvokeEvent, entryId: string) => {
validateString(entryId, "entryId");
return controller.removeHistoryEntry(entryId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, async () => {
const options = {
defaultPath: `rd-queue-export.json`,
filters: [{ name: "Queue Export", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportQueue();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.IMPORT_QUEUE, (_event: IpcMainInvokeEvent, json: string) => {
validateString(json, "json");
const bytes = Buffer.byteLength(json, "utf8");
if (bytes > IMPORT_QUEUE_MAX_BYTES) {
throw new Error(`Queue-Import zu groß (max ${IMPORT_QUEUE_MAX_BYTES} Bytes)`);
}
return controller.importQueue(json);
});
ipcMain.handle(IPC_CHANNELS.TOGGLE_CLIPBOARD, () => {
const settings = controller.getSettings();
const next = !settings.clipboardWatch;
controller.updateSettings({ clipboardWatch: next });
updateClipboardWatcher();
return next;
});
ipcMain.handle(IPC_CHANNELS.PICK_FOLDER, async () => {
const options = {
properties: ["openDirectory", "createDirectory"] as Array<"openDirectory" | "createDirectory">
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
return result.canceled ? null : result.filePaths[0] || null;
});
ipcMain.handle(IPC_CHANNELS.PICK_CONTAINERS, async () => {
const options = {
properties: ["openFile", "multiSelections"] as Array<"openFile" | "multiSelections">,
filters: [
{ name: "Container", extensions: ["dlc"] },
{ name: "Alle Dateien", extensions: ["*"] }
]
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
return result.canceled ? [] : result.filePaths;
});
ipcMain.handle(IPC_CHANNELS.GET_SESSION_STATS, () => controller.getSessionStats());
ipcMain.handle(IPC_CHANNELS.RESTART, () => {
app.relaunch();
app.quit();
});
ipcMain.handle(IPC_CHANNELS.QUIT, () => {
app.quit();
});
ipcMain.handle(IPC_CHANNELS.EXPORT_BACKUP, async () => {
const options = {
defaultPath: `mdd-backup-${new Date().toISOString().slice(0, 10)}.json`,
filters: [{ name: "Backup", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportBackup();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.OPEN_LOG, async () => {
const logPath = getLogFilePath();
await shell.openPath(logPath);
});
ipcMain.handle(IPC_CHANNELS.OPEN_SESSION_LOG, async () => {
const logPath = controller.getSessionLogPath();
if (logPath) {
await shell.openPath(logPath);
}
});
ipcMain.handle(IPC_CHANNELS.IMPORT_BACKUP, async () => {
const options = {
properties: ["openFile"] as Array<"openFile">,
filters: [
{ name: "Backup", extensions: ["json"] },
{ name: "Alle Dateien", extensions: ["*"] }
]
};
const result = mainWindow ? await dialog.showOpenDialog(mainWindow, options) : await dialog.showOpenDialog(options);
if (result.canceled || result.filePaths.length === 0) {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
controller.onState = (snapshot) => {
if (!mainWindow || mainWindow.isDestroyed()) {
return;
}
mainWindow.webContents.send(IPC_CHANNELS.STATE_UPDATE, snapshot);
};
}
app.on("second-instance", () => {
if (mainWindow) {
if (mainWindow.isMinimized()) {
mainWindow.restore();
}
mainWindow.show();
mainWindow.focus();
}
});
app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
updateClipboardWatcher();
updateTray();
app.on("activate", () => {
if (BrowserWindow.getAllWindows().length === 0) {
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});
app.on("window-all-closed", () => {
if (process.platform !== "darwin") {
app.quit();
}
});
app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {
logger.error(`Fehler beim Shutdown: ${String(error)}`);
}
});


@ -1,424 +0,0 @@
import { UnrestrictedLink } from "./realdebrid";
import { compactErrorText, filenameFromUrl, sleep } from "./utils";
type MegaCredentials = {
login: string;
password: string;
};
type CodeEntry = {
code: string;
linkHint: string;
};
const LOGIN_URL = "https://www.mega-debrid.eu/index.php?form=login";
const DEBRID_URL = "https://www.mega-debrid.eu/index.php?form=debrid";
const DEBRID_AJAX_URL = "https://www.mega-debrid.eu/index.php?ajax=debrid&json";
const DEBRID_REFERER = "https://www.mega-debrid.eu/index.php?page=debrideur&lang=de";
function normalizeLink(link: string): string {
return link.trim().toLowerCase();
}
function parseSetCookieFromHeaders(headers: Headers): string {
const getSetCookie = (headers as unknown as { getSetCookie?: () => string[] }).getSetCookie;
if (typeof getSetCookie === "function") {
const values = getSetCookie.call(headers)
.map((entry) => entry.split(";")[0].trim())
.filter(Boolean);
if (values.length > 0) {
return values.join("; ");
}
}
const raw = headers.get("set-cookie") || "";
if (!raw) {
return "";
}
// Some fetch implementations merge multiple Set-Cookie headers with commas;
// split only at commas followed by a new "name=" pair, then keep the cookie value.
return raw
.split(/,(?=[^;=]+?=)/g)
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
}
const PERMANENT_HOSTER_ERRORS = [
"hosternotavailable",
"filenotfound",
"file_unavailable",
"file not found",
"link is dead",
"file has been removed",
"file has been deleted",
"file was deleted",
"file was removed",
"not available",
"file is no longer available"
];
function parsePageErrors(html: string): string[] {
const errors: string[] = [];
const errorRegex = /class=["'][^"']*\berror\b[^"']*["'][^>]*>([^<]+)</gi;
let m: RegExpExecArray | null;
while ((m = errorRegex.exec(html)) !== null) {
const text = m[1].replace(/^Fehler:\s*/i, "").trim();
if (text) {
errors.push(text);
}
}
return errors;
}
function isPermanentHosterError(errors: string[]): string | null {
for (const err of errors) {
const lower = err.toLowerCase();
for (const pattern of PERMANENT_HOSTER_ERRORS) {
if (lower.includes(pattern)) {
return err;
}
}
}
return null;
}
function parseCodes(html: string): CodeEntry[] {
const entries: CodeEntry[] = [];
const cardRegex = /<div[^>]*class=['"][^'"]*acp-box[^'"]*['"][^>]*>[\s\S]*?<\/div>/gi;
let cardMatch: RegExpExecArray | null;
while ((cardMatch = cardRegex.exec(html)) !== null) {
const block = cardMatch[0];
const linkTitle = (block.match(/<h3>\s*Link:\s*([^<]+)<\/h3>/i)?.[1] || "").trim();
const code = block.match(/processDebrid\(\d+,'([^']+)',0\)/i)?.[1] || "";
if (!code) {
continue;
}
entries.push({ code, linkHint: normalizeLink(linkTitle) });
}
if (entries.length === 0) {
const fallbackRegex = /processDebrid\(\d+,'([^']+)',0\)/gi;
let m: RegExpExecArray | null;
while ((m = fallbackRegex.exec(html)) !== null) {
entries.push({ code: m[1], linkHint: "" });
}
}
return entries;
}
function pickCode(entries: CodeEntry[], link: string): string {
if (entries.length === 0) {
return "";
}
const target = normalizeLink(link);
const match = entries.find((entry) => entry.linkHint && entry.linkHint.includes(target));
return (match?.code || entries[0].code || "").trim();
}
function parseDebridJson(text: string): { link: string; text: string } | null {
try {
const parsed = JSON.parse(text) as { link?: string; text?: string };
return {
link: String(parsed.link || ""),
text: String(parsed.text || "")
};
} catch {
return null;
}
}
function abortError(): Error {
return new Error("aborted:mega-web");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
const timeoutSignal = AbortSignal.timeout(timeoutMs);
if (!signal) {
return timeoutSignal;
}
return AbortSignal.any([signal, timeoutSignal]);
}
function throwIfAborted(signal?: AbortSignal): void {
if (signal?.aborted) {
throw abortError();
}
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
if (!signal) {
await sleep(ms);
return;
}
if (signal.aborted) {
throw abortError();
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
signal.removeEventListener("abort", onAbort);
resolve();
}, Math.max(0, ms));
const onAbort = (): void => {
if (timer) {
clearTimeout(timer);
timer = null;
}
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
});
}
async function raceWithAbort<T>(promise: Promise<T>, signal?: AbortSignal): Promise<T> {
if (!signal) {
return promise;
}
if (signal.aborted) {
throw abortError();
}
return new Promise<T>((resolve, reject) => {
let settled = false;
const onAbort = (): void => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(abortError());
};
signal.addEventListener("abort", onAbort, { once: true });
promise.then((value) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
resolve(value);
}, (error) => {
if (settled) {
return;
}
settled = true;
signal.removeEventListener("abort", onAbort);
reject(error);
});
});
}
export class MegaWebFallback {
private queue: Promise<unknown> = Promise.resolve();
private getCredentials: () => MegaCredentials;
private cookie = "";
private cookieSetAt = 0;
public constructor(getCredentials: () => MegaCredentials) {
this.getCredentials = getCredentials;
}
public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
return this.runExclusive(async () => {
throwIfAborted(overallSignal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}
// Re-login when there is no session cookie yet or it is older than 20 minutes.
if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, overallSignal);
}
const generated = await this.generate(link, overallSignal);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
if (!retry) {
return null;
}
return {
directUrl: retry.directUrl,
fileName: retry.fileName || filenameFromUrl(link),
fileSize: null,
retriesUsed: 0
};
}
return {
directUrl: generated.directUrl,
fileName: generated.fileName || filenameFromUrl(link),
fileSize: null,
retriesUsed: 0
};
}, overallSignal);
}
public invalidateSession(): void {
this.cookie = "";
this.cookieSetAt = 0;
}
private async runExclusive<T>(job: () => Promise<T>, signal?: AbortSignal): Promise<T> {
const queuedAt = Date.now();
const QUEUE_WAIT_TIMEOUT_MS = 90000;
const guardedJob = async (): Promise<T> => {
throwIfAborted(signal);
const waited = Date.now() - queuedAt;
if (waited > QUEUE_WAIT_TIMEOUT_MS) {
throw new Error(`Mega-Web Queue-Timeout (${Math.floor(waited / 1000)}s gewartet)`);
}
return job();
};
// Chain onto the queue regardless of the previous job's outcome, and keep the
// queue tail always-resolved so one failure cannot poison later jobs.
const run = this.queue.then(guardedJob, guardedJob);
this.queue = run.then(() => undefined, () => undefined);
return raceWithAbort(run, signal);
}
private async login(login: string, password: string, signal?: AbortSignal): Promise<void> {
throwIfAborted(signal);
const response = await fetch(LOGIN_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0"
},
body: new URLSearchParams({
login,
password,
remember: "on"
}),
redirect: "manual",
signal: withTimeoutSignal(signal, 30000)
});
const cookie = parseSetCookieFromHeaders(response.headers);
if (!cookie) {
throw new Error("Mega-Web Login liefert kein Session-Cookie");
}
const verify = await fetch(DEBRID_REFERER, {
method: "GET",
headers: {
"User-Agent": "Mozilla/5.0",
Cookie: cookie,
Referer: DEBRID_REFERER
},
signal: withTimeoutSignal(signal, 30000)
});
const verifyHtml = await verify.text();
const hasDebridForm = /id=["']debridForm["']/i.test(verifyHtml) || /name=["']links["']/i.test(verifyHtml);
if (!hasDebridForm) {
throw new Error("Mega-Web Login ungültig oder Session blockiert");
}
this.cookie = cookie;
this.cookieSetAt = Date.now();
}
private async generate(link: string, signal?: AbortSignal): Promise<{ directUrl: string; fileName: string } | null> {
throwIfAborted(signal);
const page = await fetch(DEBRID_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: this.cookie,
Referer: DEBRID_REFERER
},
body: new URLSearchParams({
links: link,
password: "",
showLinks: "1"
}),
signal: withTimeoutSignal(signal, 30000)
});
const html = await page.text();
// Check for permanent hoster errors before looking for debrid codes
const pageErrors = parsePageErrors(html);
const permanentError = isPermanentHosterError(pageErrors);
if (permanentError) {
throw new Error(`Mega-Web: Link permanent ungültig (${permanentError})`);
}
const code = pickCode(parseCodes(html), link);
if (!code) {
return null;
}
for (let attempt = 1; attempt <= 60; attempt += 1) {
throwIfAborted(signal);
const res = await fetch(DEBRID_AJAX_URL, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": "Mozilla/5.0",
Cookie: this.cookie,
Referer: DEBRID_REFERER
},
body: new URLSearchParams({
code,
autodl: "0"
}),
signal: withTimeoutSignal(signal, 15000)
});
const text = (await res.text()).trim();
if (text === "reload") {
await sleepWithSignal(650, signal);
continue;
}
if (text === "false") {
return null;
}
const parsed = parseDebridJson(text);
if (!parsed) {
return null;
}
if (!parsed.link) {
if (/hoster does not respond correctly|could not be done for this moment/i.test(parsed.text || "")) {
await sleepWithSignal(1200, signal);
continue;
}
return null;
}
const fromText = parsed.text
.replace(/<[^>]*>/g, " ")
.replace(/\s+/g, " ")
.trim();
const nameMatch = fromText.match(/([\w .\-\[\]\(\)]+\.(?:rar|r\d{2}|zip|7z|mkv|mp4|avi|mp3|flac))/i);
const fileName = (nameMatch?.[1] || filenameFromUrl(link)).trim();
return {
directUrl: parsed.link,
fileName
};
}
return null;
}
public dispose(): void {
this.cookie = "";
}
}
export function compactMegaWebError(error: unknown): string {
return compactErrorText(error);
}


@ -1,203 +0,0 @@
import { API_BASE_URL, APP_VERSION, REQUEST_RETRIES } from "./constants";
import { compactErrorText, sleep } from "./utils";
const DEBRID_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
export interface UnrestrictedLink {
fileName: string;
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}
function shouldRetryStatus(status: number): boolean {
return status === 429 || status >= 500;
}
function retryDelay(attempt: number): number {
return Math.min(5000, 400 * 2 ** attempt);
}
function parseRetryAfterMs(value: string | null): number {
const text = String(value || "").trim();
if (!text) {
return 0;
}
const asSeconds = Number(text);
if (Number.isFinite(asSeconds) && asSeconds >= 0) {
return Math.min(120000, Math.floor(asSeconds * 1000));
}
const asDate = Date.parse(text);
if (Number.isFinite(asDate)) {
return Math.min(120000, Math.max(0, asDate - Date.now()));
}
return 0;
}
function retryDelayForResponse(response: Response, attempt: number): number {
if (response.status !== 429) {
return retryDelay(attempt);
}
const fromHeader = parseRetryAfterMs(response.headers.get("retry-after"));
return fromHeader > 0 ? fromHeader : retryDelay(attempt);
}
function readHttpStatusFromErrorText(text: string): number {
const match = String(text || "").match(/HTTP\s+(\d{3})/i);
return match ? Number(match[1]) : 0;
}
function isRetryableErrorText(text: string): boolean {
const status = readHttpStatusFromErrorText(text);
if (status === 429 || status >= 500) {
return true;
}
const lower = String(text || "").toLowerCase();
return lower.includes("timeout")
|| lower.includes("network")
|| lower.includes("fetch failed")
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout")
|| lower.includes("html statt json");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
if (!signal) {
return AbortSignal.timeout(timeoutMs);
}
return AbortSignal.any([signal, AbortSignal.timeout(timeoutMs)]);
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
if (!signal) {
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
signal.removeEventListener("abort", onAbort);
resolve();
}, Math.max(0, ms));
const onAbort = (): void => {
if (timer) {
clearTimeout(timer);
timer = null;
}
signal.removeEventListener("abort", onAbort);
reject(new Error("aborted"));
};
signal.addEventListener("abort", onAbort, { once: true });
});
}
function looksLikeHtmlResponse(contentType: string, body: string): boolean {
const type = String(contentType || "").toLowerCase();
if (type.includes("text/html") || type.includes("application/xhtml+xml")) {
return true;
}
return /^\s*<(!doctype\s+html|html\b)/i.test(String(body || ""));
}
function parseErrorBody(status: number, body: string, contentType: string): string {
if (looksLikeHtmlResponse(contentType, body)) {
return `Real-Debrid lieferte HTML statt JSON (HTTP ${status})`;
}
const clean = compactErrorText(body);
return clean || `HTTP ${status}`;
}
export class RealDebridClient {
private token: string;
public constructor(token: string) {
this.token = token;
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
try {
const body = new URLSearchParams({ link });
const response = await fetch(`${API_BASE_URL}/unrestrict/link`, {
method: "POST",
headers: {
Authorization: `Bearer ${this.token}`,
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": DEBRID_USER_AGENT
},
body,
signal: withTimeoutSignal(signal, 30000)
});
const text = await response.text();
const contentType = String(response.headers.get("content-type") || "");
if (!response.ok) {
const parsed = parseErrorBody(response.status, text, contentType);
if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
throw new Error(parsed);
}
if (looksLikeHtmlResponse(contentType, text)) {
throw new Error("Real-Debrid lieferte HTML statt JSON");
}
let payload: Record<string, unknown>;
try {
payload = JSON.parse(text) as Record<string, unknown>;
} catch {
throw new Error("Ungültige JSON-Antwort von Real-Debrid");
}
const directUrl = String(payload.download || payload.link || "").trim();
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}
const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
return {
fileName,
directUrl,
fileSize: Number.isFinite(fileSizeRaw) && fileSizeRaw > 0 ? Math.floor(fileSizeRaw) : null,
retriesUsed: attempt - 1
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
break;
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
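The signal composition used by `withTimeoutSignal` can be exercised standalone. This is a minimal sketch, assuming Node 20+ (where `AbortSignal.any` and `AbortSignal.timeout` are available); it shows that aborting the caller's own controller also aborts the combined signal passed to `fetch`:

```typescript
// Sketch: combine a caller-owned signal with a request timeout (Node 20+).
const controller = new AbortController();
// Either source aborting aborts the combined signal; the timeout's internal
// timer does not keep the Node event loop alive.
const combined = AbortSignal.any([controller.signal, AbortSignal.timeout(30_000)]);

controller.abort();
console.log(combined.aborted); // true — the caller abort propagated
```

This is why the client only needs to pass one signal into `fetch`: cancellation and timeout are folded into a single abort source.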



@@ -1,128 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { setLogListener } from "./logger";
const SESSION_LOG_FLUSH_INTERVAL_MS = 200;
let sessionLogPath: string | null = null;
let sessionLogsDir: string | null = null;
let pendingLines: string[] = [];
let flushTimer: NodeJS.Timeout | null = null;
function formatTimestamp(): string {
const now = new Date();
const y = now.getFullYear();
const mo = String(now.getMonth() + 1).padStart(2, "0");
const d = String(now.getDate()).padStart(2, "0");
const h = String(now.getHours()).padStart(2, "0");
const mi = String(now.getMinutes()).padStart(2, "0");
const s = String(now.getSeconds()).padStart(2, "0");
return `${y}-${mo}-${d}_${h}-${mi}-${s}`;
}
function flushPending(): void {
if (pendingLines.length === 0 || !sessionLogPath) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
try {
fs.appendFileSync(sessionLogPath, chunk, "utf8");
} catch {
// ignore write errors
}
}
function scheduleFlush(): void {
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
flushPending();
}, SESSION_LOG_FLUSH_INTERVAL_MS);
}
function appendToSessionLog(line: string): void {
if (!sessionLogPath) {
return;
}
pendingLines.push(line);
scheduleFlush();
}
async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<void> {
try {
const files = await fs.promises.readdir(dir);
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
for (const file of files) {
if (!file.startsWith("session_") || !file.endsWith(".txt")) {
continue;
}
const filePath = path.join(dir, file);
try {
const stat = await fs.promises.stat(filePath);
if (stat.mtimeMs < cutoff) {
await fs.promises.unlink(filePath);
}
} catch {
// ignore - file may be locked
}
}
} catch {
// ignore - dir may not exist
}
}
export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}
const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);
const isoTimestamp = new Date().toISOString();
try {
fs.writeFileSync(sessionLogPath, `=== Session gestartet: ${isoTimestamp} ===\n`, "utf8");
} catch {
sessionLogPath = null;
return;
}
setLogListener((line) => appendToSessionLog(line));
void cleanupOldSessionLogs(sessionLogsDir, 7);
}
export function getSessionLogPath(): string | null {
return sessionLogPath;
}
export function shutdownSessionLog(): void {
if (!sessionLogPath) {
return;
}
// Flush any pending lines
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
flushPending();
// Write closing line
const isoTimestamp = new Date().toISOString();
try {
fs.appendFileSync(sessionLogPath, `=== Session beendet: ${isoTimestamp} ===\n`, "utf8");
} catch {
// ignore
}
setLogListener(null);
sessionLogPath = null;
}
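The session logger above batches lines and flushes at most once per interval instead of hitting the filesystem on every log call. A hypothetical standalone miniature of that batched-flush pattern (here `written` stands in for `fs.appendFileSync`, and the final manual flush replaces the timer firing):

```typescript
// Miniature of the batch-and-flush idea: collect lines, write them as one chunk.
const FLUSH_INTERVAL_MS = 200;
let pending: string[] = [];
let timer: NodeJS.Timeout | null = null;
const written: string[] = []; // stand-in for the real file append

function flush(): void {
  if (pending.length === 0) return;
  written.push(pending.join("")); // one write for many log lines
  pending = [];
}

function append(line: string): void {
  pending.push(line);
  if (!timer) {
    timer = setTimeout(() => { timer = null; flush(); }, FLUSH_INTERVAL_MS);
  }
}

append("a\n");
append("b\n");
// Shutdown path: cancel the pending timer and flush synchronously,
// mirroring shutdownSessionLog() above.
if (timer) { clearTimeout(timer); timer = null; }
flush();
console.log(written); // [ "a\nb\n" ] — both lines landed in a single chunk
```

The design trades a small delay (up to one interval) for far fewer write syscalls, which is why `shutdownSessionLog` must flush explicitly before closing.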


@@ -1,749 +0,0 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, PackagePriority, SessionState } from "../shared/types";
import { defaultSettings } from "./constants";
import { logger } from "./logger";
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
const VALID_SPEED_MODES = new Set(["global", "per_download"]);
const VALID_THEMES = new Set(["dark", "light"]);
const VALID_EXTRACT_CPU_PRIORITIES = new Set(["high", "middle", "low"]);
const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
function asText(value: unknown): string {
return String(value ?? "").trim();
}
function clampNumber(value: unknown, fallback: number, min: number, max: number): number {
const num = Number(value);
if (!Number.isFinite(num)) {
return fallback;
}
return Math.max(min, Math.min(max, Math.floor(num)));
}
function createScheduleId(index: number): string {
return `sched-${Date.now().toString(36)}-${index.toString(36)}-${Math.random().toString(36).slice(2, 8)}`;
}
function normalizeBandwidthSchedules(raw: unknown): BandwidthScheduleEntry[] {
if (!Array.isArray(raw)) {
return [];
}
const normalized: BandwidthScheduleEntry[] = [];
for (let index = 0; index < raw.length; index += 1) {
const entry = raw[index];
if (!entry || typeof entry !== "object") {
continue;
}
const value = entry as Partial<BandwidthScheduleEntry>;
const rawId = typeof value.id === "string" ? value.id.trim() : "";
normalized.push({
id: rawId || createScheduleId(index),
startHour: clampNumber(value.startHour, 0, 0, 23),
endHour: clampNumber(value.endHour, 8, 0, 23),
speedLimitKbps: clampNumber(value.speedLimitKbps, 0, 0, 500000),
enabled: value.enabled === undefined ? true : Boolean(value.enabled)
});
}
return normalized;
}
function normalizeAbsoluteDir(value: unknown, fallback: string): string {
const text = asText(value);
if (!text || !path.isAbsolute(text)) {
return path.resolve(fallback);
}
return path.resolve(text);
}
const DEFAULT_COLUMN_ORDER = ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"];
const ALL_VALID_COLUMNS = new Set([...DEFAULT_COLUMN_ORDER, "added"]);
function normalizeColumnOrder(raw: unknown): string[] {
if (!Array.isArray(raw) || raw.length === 0) {
return [...DEFAULT_COLUMN_ORDER];
}
const valid = ALL_VALID_COLUMNS;
const seen = new Set<string>();
const result: string[] = [];
for (const col of raw) {
if (typeof col === "string" && valid.has(col) && !seen.has(col)) {
seen.add(col);
result.push(col);
}
}
// "name" is mandatory — ensure it's always present
if (!seen.has("name")) {
result.unshift("name");
}
return result;
}
const DEPRECATED_UPDATE_REPOS = new Set([
"sucukdeluxe/real-debrid-downloader"
]);
function migrateUpdateRepo(raw: string, fallback: string): string {
const trimmed = raw.trim();
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
return fallback;
}
return trimmed;
}
export function normalizeSettings(settings: AppSettings): AppSettings {
const defaults = defaultSettings();
const normalized: AppSettings = {
token: asText(settings.token),
megaLogin: asText(settings.megaLogin),
megaPassword: asText(settings.megaPassword),
bestToken: asText(settings.bestToken),
allDebridToken: asText(settings.allDebridToken),
ddownloadLogin: asText(settings.ddownloadLogin),
ddownloadPassword: asText(settings.ddownloadPassword),
oneFichierApiKey: asText(settings.oneFichierApiKey),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
rememberToken: Boolean(settings.rememberToken),
providerPrimary: settings.providerPrimary,
providerSecondary: settings.providerSecondary,
providerTertiary: settings.providerTertiary,
autoProviderFallback: Boolean(settings.autoProviderFallback),
outputDir: normalizeAbsoluteDir(settings.outputDir, defaults.outputDir),
packageName: asText(settings.packageName),
autoExtract: Boolean(settings.autoExtract),
autoRename4sf4sj: Boolean(settings.autoRename4sf4sj),
extractDir: normalizeAbsoluteDir(settings.extractDir, defaults.extractDir),
collectMkvToLibrary: Boolean(settings.collectMkvToLibrary),
mkvLibraryDir: normalizeAbsoluteDir(settings.mkvLibraryDir, defaults.mkvLibraryDir),
createExtractSubfolder: Boolean(settings.createExtractSubfolder),
hybridExtract: Boolean(settings.hybridExtract),
cleanupMode: settings.cleanupMode,
extractConflictMode: settings.extractConflictMode,
removeLinkFilesAfterExtract: Boolean(settings.removeLinkFilesAfterExtract),
removeSamplesAfterExtract: Boolean(settings.removeSamplesAfterExtract),
enableIntegrityCheck: Boolean(settings.enableIntegrityCheck),
autoResumeOnStart: Boolean(settings.autoResumeOnStart),
autoReconnect: Boolean(settings.autoReconnect),
maxParallel: clampNumber(settings.maxParallel, defaults.maxParallel, 1, 50),
maxParallelExtract: clampNumber(settings.maxParallelExtract, defaults.maxParallelExtract, 1, 8),
retryLimit: clampNumber(settings.retryLimit, defaults.retryLimit, 0, 99),
reconnectWaitSeconds: clampNumber(settings.reconnectWaitSeconds, defaults.reconnectWaitSeconds, 10, 600),
completedCleanupPolicy: settings.completedCleanupPolicy,
speedLimitEnabled: Boolean(settings.speedLimitEnabled),
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
speedLimitMode: settings.speedLimitMode,
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
clipboardWatch: Boolean(settings.clipboardWatch),
minimizeToTray: Boolean(settings.minimizeToTray),
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
autoSkipExtracted: settings.autoSkipExtracted !== undefined ? Boolean(settings.autoSkipExtracted) : defaults.autoSkipExtracted,
confirmDeleteSelection: settings.confirmDeleteSelection !== undefined ? Boolean(settings.confirmDeleteSelection) : defaults.confirmDeleteSelection,
totalDownloadedAllTime: typeof settings.totalDownloadedAllTime === "number" && settings.totalDownloadedAllTime >= 0 ? settings.totalDownloadedAllTime : defaults.totalDownloadedAllTime,
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
columnOrder: normalizeColumnOrder(settings.columnOrder),
extractCpuPriority: settings.extractCpuPriority,
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
};
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
normalized.providerPrimary = defaults.providerPrimary;
}
if (!VALID_FALLBACK_PROVIDERS.has(normalized.providerSecondary)) {
normalized.providerSecondary = "none";
}
if (!VALID_FALLBACK_PROVIDERS.has(normalized.providerTertiary)) {
normalized.providerTertiary = "none";
}
if (normalized.providerSecondary === normalized.providerPrimary) {
normalized.providerSecondary = "none";
}
if (normalized.providerTertiary === normalized.providerPrimary || normalized.providerTertiary === normalized.providerSecondary) {
normalized.providerTertiary = "none";
}
if (!VALID_CLEANUP_MODES.has(normalized.cleanupMode)) {
normalized.cleanupMode = defaults.cleanupMode;
}
if (!VALID_CONFLICT_MODES.has(normalized.extractConflictMode)) {
normalized.extractConflictMode = defaults.extractConflictMode;
}
if (!VALID_FINISHED_POLICIES.has(normalized.completedCleanupPolicy)) {
normalized.completedCleanupPolicy = defaults.completedCleanupPolicy;
}
if (!VALID_SPEED_MODES.has(normalized.speedLimitMode)) {
normalized.speedLimitMode = defaults.speedLimitMode;
}
if (!VALID_EXTRACT_CPU_PRIORITIES.has(normalized.extractCpuPriority)) {
normalized.extractCpuPriority = defaults.extractCpuPriority;
}
return normalized;
}
function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
if (settings.rememberToken) {
return settings;
}
return {
...settings,
token: "",
megaLogin: "",
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: ""
};
}
export interface StoragePaths {
baseDir: string;
configFile: string;
sessionFile: string;
historyFile: string;
}
export function createStoragePaths(baseDir: string): StoragePaths {
return {
baseDir,
configFile: path.join(baseDir, "rd_downloader_config.json"),
sessionFile: path.join(baseDir, "rd_session_state.json"),
historyFile: path.join(baseDir, "rd_history.json")
};
}
function ensureBaseDir(baseDir: string): void {
fs.mkdirSync(baseDir, { recursive: true });
}
function asRecord(value: unknown): Record<string, unknown> | null {
if (!value || typeof value !== "object" || Array.isArray(value)) {
return null;
}
return value as Record<string, unknown>;
}
function readSettingsFile(filePath: string): AppSettings | null {
try {
const parsed = JSON.parse(fs.readFileSync(filePath, "utf8")) as AppSettings;
const merged = normalizeSettings({
...defaultSettings(),
...parsed
});
return sanitizeCredentialPersistence(merged);
} catch {
return null;
}
}
export function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
return fallback;
}
const now = Date.now();
const itemsById: Record<string, DownloadItem> = {};
const rawItems = asRecord(parsed.items) ?? {};
for (const [entryId, rawItem] of Object.entries(rawItems)) {
const item = asRecord(rawItem);
if (!item) {
continue;
}
const id = asText(item.id) || entryId;
const packageId = asText(item.packageId);
const url = asText(item.url);
if (!id || !packageId || !url) {
continue;
}
const statusRaw = asText(item.status) as DownloadStatus;
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const providerRaw = asText(item.provider) as DebridProvider;
const onlineStatusRaw = asText(item.onlineStatus);
itemsById[id] = {
id,
packageId,
url,
provider: VALID_ITEM_PROVIDERS.has(providerRaw) ? providerRaw : null,
status,
retries: clampNumber(item.retries, 0, 0, 1_000_000),
speedBps: clampNumber(item.speedBps, 0, 0, 10_000_000_000),
downloadedBytes: clampNumber(item.downloadedBytes, 0, 0, 10_000_000_000_000),
totalBytes: item.totalBytes == null ? null : clampNumber(item.totalBytes, 0, 0, 10_000_000_000_000),
progressPercent: clampNumber(item.progressPercent, 0, 0, 100),
fileName: asText(item.fileName) || "download.bin",
targetPath: asText(item.targetPath),
resumable: item.resumable === undefined ? true : Boolean(item.resumable),
attempts: clampNumber(item.attempts, 0, 0, 10_000),
lastError: asText(item.lastError),
fullStatus: asText(item.fullStatus),
onlineStatus: VALID_ONLINE_STATUSES.has(onlineStatusRaw) ? onlineStatusRaw as "online" | "offline" | "checking" : undefined,
createdAt: clampNumber(item.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(item.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
const packagesById: Record<string, PackageEntry> = {};
const rawPackages = asRecord(parsed.packages) ?? {};
for (const [entryId, rawPkg] of Object.entries(rawPackages)) {
const pkg = asRecord(rawPkg);
if (!pkg) {
continue;
}
const id = asText(pkg.id) || entryId;
if (!id) {
continue;
}
const statusRaw = asText(pkg.status) as DownloadStatus;
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const rawItemIds = Array.isArray(pkg.itemIds) ? pkg.itemIds : [];
packagesById[id] = {
id,
name: asText(pkg.name) || "Paket",
outputDir: asText(pkg.outputDir),
extractDir: asText(pkg.extractDir),
status,
itemIds: rawItemIds
.map((value) => asText(value))
.filter((value) => value.length > 0),
cancelled: Boolean(pkg.cancelled),
enabled: pkg.enabled === undefined ? true : Boolean(pkg.enabled),
priority: VALID_PACKAGE_PRIORITIES.has(asText(pkg.priority)) ? asText(pkg.priority) as PackagePriority : "normal",
createdAt: clampNumber(pkg.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(pkg.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
for (const [itemId, item] of Object.entries(itemsById)) {
if (!packagesById[item.packageId]) {
delete itemsById[itemId];
}
}
for (const pkg of Object.values(packagesById)) {
pkg.itemIds = pkg.itemIds.filter((itemId) => {
const item = itemsById[itemId];
return Boolean(item) && item.packageId === pkg.id;
});
}
const rawOrder = Array.isArray(parsed.packageOrder) ? parsed.packageOrder : [];
const seenOrder = new Set<string>();
const packageOrder = rawOrder
.map((entry) => asText(entry))
.filter((id) => {
if (!(id in packagesById) || seenOrder.has(id)) {
return false;
}
seenOrder.add(id);
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
packageOrder.push(packageId);
}
}
return {
...fallback,
version: clampNumber(parsed.version, fallback.version, 1, 10),
packageOrder,
packages: packagesById,
items: itemsById,
runStartedAt: clampNumber(parsed.runStartedAt, 0, 0, Number.MAX_SAFE_INTEGER),
totalDownloadedBytes: clampNumber(parsed.totalDownloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
summaryText: asText(parsed.summaryText),
reconnectUntil: clampNumber(parsed.reconnectUntil, 0, 0, Number.MAX_SAFE_INTEGER),
reconnectReason: asText(parsed.reconnectReason),
paused: Boolean(parsed.paused),
running: Boolean(parsed.running),
updatedAt: clampNumber(parsed.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
}
export function loadSettings(paths: StoragePaths): AppSettings {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.configFile)) {
return defaultSettings();
}
const loaded = readSettingsFile(paths.configFile);
if (loaded) {
return loaded;
}
const backupFile = `${paths.configFile}.bak`;
const backupLoaded = fs.existsSync(backupFile) ? readSettingsFile(backupFile) : null;
if (backupLoaded) {
logger.warn("Konfiguration defekt, Backup-Datei wird verwendet");
try {
const payload = JSON.stringify(backupLoaded, null, 2);
const tempPath = `${paths.configFile}.tmp`;
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch {
// ignore restore write failure
}
return backupLoaded;
}
logger.error("Konfiguration konnte nicht geladen werden (auch Backup fehlgeschlagen)");
return defaultSettings();
}
function syncRenameWithExdevFallback(tempPath: string, targetPath: string): void {
try {
fs.renameSync(tempPath, targetPath);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
fs.copyFileSync(tempPath, targetPath);
try { fs.rmSync(tempPath, { force: true }); } catch {}
} else {
throw renameError;
}
}
}
function sessionTempPath(sessionFile: string, kind: "sync" | "async"): string {
return `${sessionFile}.${kind}.tmp`;
}
function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
if (ACTIVE_STATUSES.has(item.status)) {
item.status = "queued";
item.lastError = "";
}
// Always clear stale speed values
item.speedBps = 0;
}
// Reset package-level active statuses to queued (mirrors item reset above)
const ACTIVE_PKG_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}
// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;
return session;
}
function readSessionFile(filePath: string): SessionState | null {
try {
const parsed = JSON.parse(fs.readFileSync(filePath, "utf8")) as unknown;
return normalizeLoadedSessionTransientFields(normalizeLoadedSession(parsed));
} catch {
return null;
}
}
export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
ensureBaseDir(paths.baseDir);
// Create a backup of the existing config before overwriting
if (fs.existsSync(paths.configFile)) {
try {
fs.copyFileSync(paths.configFile, `${paths.configFile}.bak`);
} catch {
// Best-effort backup; proceed even if it fails
}
}
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;
async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.configFile, `${paths.configFile}.bak`).catch(() => {});
const tempPath = `${paths.configFile}.settings.tmp`;
await fsp.writeFile(tempPath, payload, "utf8");
try {
await fsp.rename(tempPath, paths.configFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
}
export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettings): Promise<void> {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, settings };
return;
}
asyncSettingsSaveRunning = true;
try {
await writeSettingsPayload(paths, payload);
} catch (error) {
logger.error(`Async Settings-Save fehlgeschlagen: ${String(error)}`);
} finally {
asyncSettingsSaveRunning = false;
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void saveSettingsAsync(queued.paths, queued.settings);
}
}
}
export function emptySession(): SessionState {
return {
version: 2,
packageOrder: [],
packages: {},
items: {},
runStartedAt: 0,
totalDownloadedBytes: 0,
summaryText: "",
reconnectUntil: 0,
reconnectReason: "",
paused: false,
running: false,
updatedAt: Date.now()
};
}
export function loadSession(paths: StoragePaths): SessionState {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.sessionFile)) {
return emptySession();
}
const primary = readSessionFile(paths.sessionFile);
if (primary) {
return primary;
}
const backupFile = sessionBackupPath(paths.sessionFile);
const backup = fs.existsSync(backupFile) ? readSessionFile(backupFile) : null;
if (backup) {
logger.warn("Session defekt, Backup-Datei wird verwendet");
try {
const payload = JSON.stringify({ ...backup, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch {
// ignore restore write failure
}
return backup;
}
logger.error("Session konnte nicht geladen werden (auch Backup fehlgeschlagen)");
return emptySession();
}
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
fs.copyFileSync(paths.sessionFile, sessionBackupPath(paths.sessionFile));
} catch {
// Best-effort backup; proceed even if it fails
}
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
}
async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Promise<void> {
if (asyncSaveRunning) {
asyncSaveQueued = { paths, payload };
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload, gen);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
asyncSaveRunning = false;
if (asyncSaveQueued) {
const queued = asyncSaveQueued;
asyncSaveQueued = null;
void saveSessionPayloadAsync(queued.paths, queued.payload);
}
}
}
export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
}
const MAX_HISTORY_ENTRIES = 500;
function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null {
const entry = asRecord(raw);
if (!entry) return null;
const id = asText(entry.id) || `hist-${Date.now().toString(36)}-${index}`;
const name = asText(entry.name) || "Unbenannt";
const providerRaw = asText(entry.provider);
return {
id,
name,
totalBytes: clampNumber(entry.totalBytes, 0, 0, Number.MAX_SAFE_INTEGER),
downloadedBytes: clampNumber(entry.downloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
fileCount: clampNumber(entry.fileCount, 0, 0, 100000),
provider: VALID_ITEM_PROVIDERS.has(providerRaw as DebridProvider) ? providerRaw as DebridProvider : null,
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
};
}
export function loadHistory(paths: StoragePaths): HistoryEntry[] {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.historyFile)) {
return [];
}
try {
const raw = JSON.parse(fs.readFileSync(paths.historyFile, "utf8")) as unknown;
if (!Array.isArray(raw)) return [];
const entries: HistoryEntry[] = [];
for (let i = 0; i < raw.length && entries.length < MAX_HISTORY_ENTRIES; i++) {
const normalized = normalizeHistoryEntry(raw[i], i);
if (normalized) entries.push(normalized);
}
return entries;
} catch {
return [];
}
}
export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void {
ensureBaseDir(paths.baseDir);
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = [entry, ...existing].slice(0, MAX_HISTORY_ENTRIES);
saveHistory(paths, updated);
return updated;
}
export function removeHistoryEntry(paths: StoragePaths, entryId: string): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = existing.filter(e => e.id !== entryId);
saveHistory(paths, updated);
return updated;
}
export function clearHistory(paths: StoragePaths): void {
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.historyFile)) {
try {
fs.unlinkSync(paths.historyFile);
} catch {
// ignore
}
}
}
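All of the save paths above (`saveSettings`, `saveSession`, `saveHistory`) share one pattern: write the payload to a sibling `.tmp` file, then rename it over the target, so readers never observe a half-written JSON file. A standalone sketch of that write-to-temp-then-rename step (the directory and file names here are illustrative, not the module's real paths):

```typescript
// Sketch: atomic-replace write, as used by the storage module's save functions.
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

const dir = fs.mkdtempSync(path.join(os.tmpdir(), "atomic-write-"));
const target = path.join(dir, "config.json");
const temp = `${target}.tmp`;

// 1. Write the full payload to a temp file first.
fs.writeFileSync(temp, JSON.stringify({ ok: true }), "utf8");
// 2. Rename over the target — atomic on the same filesystem, so the target
//    is always either the old complete file or the new complete file.
fs.renameSync(temp, target);

console.log(fs.readFileSync(target, "utf8"));
```

When temp and target live on different filesystems the rename fails with `EXDEV`, which is exactly the case `syncRenameWithExdevFallback` handles by falling back to copy-then-delete (losing atomicity but keeping correctness).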

File diff suppressed because it is too large


@@ -1,268 +0,0 @@
import path from "node:path";
import { ParsedPackageInput } from "../shared/types";
function safeDecodeURIComponent(value: string): string {
try {
return decodeURIComponent(value);
} catch {
return value;
}
}
const WINDOWS_RESERVED_BASENAMES = new Set([
"con", "prn", "aux", "nul",
"com1", "com2", "com3", "com4", "com5", "com6", "com7", "com8", "com9",
"lpt1", "lpt2", "lpt3", "lpt4", "lpt5", "lpt6", "lpt7", "lpt8", "lpt9"
]);
export function compactErrorText(message: unknown, maxLen = 220): string {
const raw = String(message ?? "").replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();
if (!raw) {
return "Unbekannter Fehler";
}
const safeMaxLen = Number.isFinite(maxLen) ? Math.max(4, Math.floor(maxLen)) : 220;
if (raw.length <= safeMaxLen) {
return raw;
}
return `${raw.slice(0, safeMaxLen - 3)}...`;
}
export function sanitizeFilename(name: string): string {
const cleaned = String(name || "")
.replace(/\0/g, "")
.replace(/[\\/:*?"<>|]/g, " ")
.replace(/\s+/g, " ")
.trim();
let normalized = cleaned
.replace(/^[.\s]+/g, "")
.replace(/[.\s]+$/g, "")
.trim();
if (!normalized || normalized === "." || normalized === ".." || /^\.+$/.test(normalized)) {
return "Paket";
}
const parsed = path.parse(normalized);
const reservedBase = (parsed.name.split(".")[0] || parsed.name).toLowerCase();
if (WINDOWS_RESERVED_BASENAMES.has(reservedBase)) {
normalized = `${parsed.name.replace(/^([^.]*)/, "$1_")}${parsed.ext}`;
}
return normalized || "Paket";
}
export function isHttpLink(value: string): boolean {
const text = String(value || "").trim();
if (!text) {
return false;
}
try {
const url = new URL(text);
return (url.protocol === "http:" || url.protocol === "https:") && !!url.hostname;
} catch {
return false;
}
}
export function extractHttpLinksFromText(text: string): string[] {
const matches = String(text || "").match(/https?:\/\/[^\s<>"']+/gi) ?? [];
const seen = new Set<string>();
const links: string[] = [];
for (const match of matches) {
let candidate = String(match || "").trim();
let openParen = 0;
let closeParen = 0;
let openBracket = 0;
let closeBracket = 0;
for (const char of candidate) {
if (char === "(") {
openParen += 1;
} else if (char === ")") {
closeParen += 1;
} else if (char === "[") {
openBracket += 1;
} else if (char === "]") {
closeBracket += 1;
}
}
while (candidate.length > 0) {
const lastChar = candidate[candidate.length - 1];
if (![")", "]", ",", ".", "!", "?", ";", ":"].includes(lastChar)) {
break;
}
if (lastChar === ")") {
if (closeParen <= openParen) {
break;
}
}
if (lastChar === "]") {
if (closeBracket <= openBracket) {
break;
}
}
if (lastChar === ")") {
closeParen = Math.max(0, closeParen - 1);
} else if (lastChar === "]") {
closeBracket = Math.max(0, closeBracket - 1);
}
candidate = candidate.slice(0, -1);
}
if (!candidate || !isHttpLink(candidate) || seen.has(candidate)) {
continue;
}
seen.add(candidate);
links.push(candidate);
}
return links;
}
export function humanSize(bytes: number): string {
const value = Number(bytes);
if (!Number.isFinite(value) || value < 0) {
return "0 B";
}
if (value < 1024) {
return `${Math.round(value)} B`;
}
const units = ["KB", "MB", "GB", "TB"];
let size = value / 1024;
let unit = 0;
while (size >= 1024 && unit < units.length - 1) {
size /= 1024;
unit += 1;
}
return `${size.toFixed(size < 10 ? 1 : 0)} ${units[unit]}`;
}
export function filenameFromUrl(url: string): string {
try {
const parsed = new URL(url);
if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
return "download.bin";
}
const queryName = parsed.searchParams.get("filename")
|| parsed.searchParams.get("file")
|| parsed.searchParams.get("name")
|| parsed.searchParams.get("download")
|| parsed.searchParams.get("title")
|| "";
const rawName = queryName || path.basename(parsed.pathname || "");
const decoded = safeDecodeURIComponent(rawName || "").trim();
const normalized = decoded
.replace(/\.(rar|zip|7z|tar|gz|bz2|xz|iso|part\d+\.rar|r\d{2,3})\.html$/i, ".$1")
.replace(/\.(mp4|mkv|avi|mp3|flac|srt)\.html$/i, ".$1");
return sanitizeFilename(normalized || "download.bin");
} catch {
return "download.bin";
}
}
export function looksLikeOpaqueFilename(name: string): boolean {
const cleaned = sanitizeFilename(name || "").toLowerCase();
if (!cleaned || cleaned === "download.bin") {
return true;
}
const parsed = path.parse(cleaned);
return /^[a-f0-9]{24,}$/i.test(parsed.name || cleaned);
}
export function inferPackageNameFromLinks(links: string[]): string {
if (links.length === 0) {
return "Paket";
}
const names = links.map((link) => filenameFromUrl(link).toLowerCase());
const first = names[0];
const match = first.match(/^([a-z0-9._\- ]{3,80}?)(?:\.|-|_)(?:part\d+|r\d{2}|s\d{2}e\d{2})/i);
if (match) {
return sanitizeFilename(match[1]);
}
return sanitizeFilename(path.parse(first).name || "Paket");
}
export function uniquePreserveOrder(items: string[]): string[] {
const seen = new Set<string>();
const out: string[] = [];
for (const item of items) {
const trimmed = item.trim();
if (!trimmed || seen.has(trimmed)) {
continue;
}
seen.add(trimmed);
out.push(trimmed);
}
return out;
}
export function parsePackagesFromLinksText(rawText: string, defaultPackageName: string): ParsedPackageInput[] {
const lines = String(rawText || "").split(/\r?\n/);
const packages: ParsedPackageInput[] = [];
let currentName = String(defaultPackageName || "").trim();
let currentLinks: string[] = [];
const flush = (): void => {
const links = uniquePreserveOrder(currentLinks.filter((line) => isHttpLink(line)));
if (links.length > 0) {
const normalizedCurrentName = String(currentName || "").trim();
packages.push({
name: normalizedCurrentName
? sanitizeFilename(normalizedCurrentName)
: inferPackageNameFromLinks(links),
links
});
}
currentLinks = [];
};
for (const line of lines) {
const text = line.trim();
if (!text) {
continue;
}
const marker = text.match(/^#\s*package\s*:\s*(.+)$/i);
if (marker) {
flush();
currentName = String(marker[1] || "").trim();
continue;
}
currentLinks.push(text);
}
flush();
if (packages.length === 0) {
return [];
}
return packages;
}
export function ensureDirPath(baseDir: string, packageName: string): string {
if (!path.isAbsolute(baseDir)) {
throw new Error("baseDir muss ein absoluter Pfad sein");
}
return path.join(baseDir, sanitizeFilename(packageName));
}
export function nowMs(): number {
return Date.now();
}
export function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
}
export function formatEta(seconds: number): string {
if (!Number.isFinite(seconds) || seconds < 0) {
return "--";
}
const s = Math.floor(seconds);
const sec = s % 60;
const minTotal = Math.floor(s / 60);
const min = minTotal % 60;
const hr = Math.floor(minTotal / 60);
if (hr > 0) {
return `${String(hr).padStart(2, "0")}:${String(min).padStart(2, "0")}:${String(sec).padStart(2, "0")}`;
}
return `${String(min).padStart(2, "0")}:${String(sec).padStart(2, "0")}`;
}
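`parsePackagesFromLinksText` above groups pasted links under `# package: Name` marker lines. A stripped-down standalone sketch of just that grouping rule (function name is illustrative; it deliberately skips the `sanitizeFilename` and `inferPackageNameFromLinks` steps):

```typescript
// Minimal sketch of the "# package:" grouping convention consumed by
// parsePackagesFromLinksText above: marker lines start a new group,
// http(s) lines join the current group, everything else is ignored.
function groupLinksByMarker(raw: string): Array<{ name: string; links: string[] }> {
  const packages: Array<{ name: string; links: string[] }> = [];
  let name = "";
  let links: string[] = [];
  const flush = (): void => {
    if (links.length > 0) {
      packages.push({ name, links });
    }
    links = [];
  };
  for (const line of raw.split(/\r?\n/)) {
    const text = line.trim();
    if (!text) continue;
    const marker = text.match(/^#\s*package\s*:\s*(.+)$/i);
    if (marker) {
      flush();
      name = marker[1].trim();
      continue;
    }
    if (/^https?:\/\//i.test(text)) {
      links.push(text);
    }
  }
  flush();
  return packages;
}
```

Links that appear before the first marker end up in a group with an empty name; the real implementation resolves that via the default package name or `inferPackageNameFromLinks`.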


@@ -1,87 +0,0 @@
import { contextBridge, ipcRenderer } from "electron";
import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress
} from "../shared/types";
import { IPC_CHANNELS } from "../shared/ipc";
import { ElectronApi } from "../shared/preload-api";
const api: ElectronApi = {
getSnapshot: (): Promise<UiSnapshot> => ipcRenderer.invoke(IPC_CHANNELS.GET_SNAPSHOT),
getVersion: (): Promise<string> => ipcRenderer.invoke(IPC_CHANNELS.GET_VERSION),
checkUpdates: (): Promise<UpdateCheckResult> => ipcRenderer.invoke(IPC_CHANNELS.CHECK_UPDATES),
installUpdate: () => ipcRenderer.invoke(IPC_CHANNELS.INSTALL_UPDATE),
openExternal: (url: string): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_EXTERNAL, url),
updateSettings: (settings: Partial<AppSettings>): Promise<AppSettings> => ipcRenderer.invoke(IPC_CHANNELS.UPDATE_SETTINGS, settings),
addLinks: (payload: AddLinksPayload): Promise<{ addedPackages: number; addedLinks: number; invalidCount: number }> =>
ipcRenderer.invoke(IPC_CHANNELS.ADD_LINKS, payload),
addContainers: (filePaths: string[]): Promise<{ addedPackages: number; addedLinks: number }> =>
ipcRenderer.invoke(IPC_CHANNELS.ADD_CONTAINERS, filePaths),
getStartConflicts: (): Promise<StartConflictEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_START_CONFLICTS),
resolveStartConflict: (packageId: string, policy: DuplicatePolicy): Promise<StartConflictResolutionResult> =>
ipcRenderer.invoke(IPC_CHANNELS.RESOLVE_START_CONFLICT, packageId, policy),
clearAll: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_ALL),
start: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START),
startPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_PACKAGES, packageIds),
stop: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.STOP),
togglePause: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PAUSE),
cancelPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CANCEL_PACKAGE, packageId),
renamePackage: (packageId: string, newName: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RENAME_PACKAGE, packageId, newName),
reorderPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REORDER_PACKAGES, packageIds),
removeItem: (itemId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_ITEM, itemId),
togglePackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PACKAGE, packageId),
exportQueue: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
importQueue: (json: string): Promise<{ addedPackages: number; addedLinks: number }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_QUEUE, json),
toggleClipboard: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD),
pickFolder: (): Promise<string | null> => ipcRenderer.invoke(IPC_CHANNELS.PICK_FOLDER),
pickContainers: (): Promise<string[]> => ipcRenderer.invoke(IPC_CHANNELS.PICK_CONTAINERS),
getSessionStats: (): Promise<SessionStats> => ipcRenderer.invoke(IPC_CHANNELS.GET_SESSION_STATS),
restart: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESTART),
quit: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.QUIT),
exportBackup: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_BACKUP),
importBackup: (): Promise<{ restored: boolean; message: string }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_BACKUP),
openLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_LOG),
openSessionLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_SESSION_LOG),
retryExtraction: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RETRY_EXTRACTION, packageId),
extractNow: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.EXTRACT_NOW, packageId),
resetPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_PACKAGE, packageId),
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),
onStateUpdate: (callback: (snapshot: UiSnapshot) => void): (() => void) => {
const listener = (_event: unknown, snapshot: UiSnapshot): void => callback(snapshot);
ipcRenderer.on(IPC_CHANNELS.STATE_UPDATE, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.STATE_UPDATE, listener);
};
},
onClipboardDetected: (callback: (links: string[]) => void): (() => void) => {
const listener = (_event: unknown, links: string[]): void => callback(links);
ipcRenderer.on(IPC_CHANNELS.CLIPBOARD_DETECTED, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.CLIPBOARD_DETECTED, listener);
};
},
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void): (() => void) => {
const listener = (_event: unknown, progress: UpdateInstallProgress): void => callback(progress);
ipcRenderer.on(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, listener);
return () => {
ipcRenderer.removeListener(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, listener);
};
}
};
contextBridge.exposeInMainWorld("rd", api);

File diff suppressed because it is too large


@@ -1,12 +0,0 @@
<!doctype html>
<html lang="de">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Multi Debrid Downloader</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="./main.tsx"></script>
</body>
</html>


@@ -1,15 +0,0 @@
import React from "react";
import { createRoot } from "react-dom/client";
import { App } from "./App";
import "./styles.css";
const rootElement = document.getElementById("root");
if (!rootElement) {
throw new Error("Root element fehlt");
}
createRoot(rootElement).render(
<React.StrictMode>
<App />
</React.StrictMode>
);


@@ -1,25 +0,0 @@
import type { PackageEntry } from "../shared/types";
export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}
export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}

File diff suppressed because it is too large


@@ -1,11 +0,0 @@
/// <reference types="vite/client" />
import type { ElectronApi } from "../shared/preload-api";
declare global {
interface Window {
rd: ElectronApi;
}
}
export {};


@@ -1,47 +0,0 @@
export const IPC_CHANNELS = {
GET_SNAPSHOT: "app:get-snapshot",
GET_VERSION: "app:get-version",
CHECK_UPDATES: "app:check-updates",
INSTALL_UPDATE: "app:install-update",
UPDATE_INSTALL_PROGRESS: "app:update-install-progress",
OPEN_EXTERNAL: "app:open-external",
UPDATE_SETTINGS: "app:update-settings",
ADD_LINKS: "queue:add-links",
ADD_CONTAINERS: "queue:add-containers",
GET_START_CONFLICTS: "queue:get-start-conflicts",
RESOLVE_START_CONFLICT: "queue:resolve-start-conflict",
CLEAR_ALL: "queue:clear-all",
START: "queue:start",
START_PACKAGES: "queue:start-packages",
STOP: "queue:stop",
TOGGLE_PAUSE: "queue:toggle-pause",
CANCEL_PACKAGE: "queue:cancel-package",
RENAME_PACKAGE: "queue:rename-package",
REORDER_PACKAGES: "queue:reorder-packages",
REMOVE_ITEM: "queue:remove-item",
TOGGLE_PACKAGE: "queue:toggle-package",
EXPORT_QUEUE: "queue:export",
IMPORT_QUEUE: "queue:import",
PICK_FOLDER: "dialog:pick-folder",
PICK_CONTAINERS: "dialog:pick-containers",
STATE_UPDATE: "state:update",
CLIPBOARD_DETECTED: "clipboard:detected",
TOGGLE_CLIPBOARD: "clipboard:toggle",
GET_SESSION_STATS: "stats:get-session-stats",
RESTART: "app:restart",
QUIT: "app:quit",
EXPORT_BACKUP: "app:export-backup",
IMPORT_BACKUP: "app:import-backup",
OPEN_LOG: "app:open-log",
OPEN_SESSION_LOG: "app:open-session-log",
RETRY_EXTRACTION: "queue:retry-extraction",
EXTRACT_NOW: "queue:extract-now",
RESET_PACKAGE: "queue:reset-package",
GET_HISTORY: "history:get",
CLEAR_HISTORY: "history:clear",
REMOVE_HISTORY_ENTRY: "history:remove-entry",
SET_PACKAGE_PRIORITY: "queue:set-package-priority",
SKIP_ITEMS: "queue:skip-items",
RESET_ITEMS: "queue:reset-items",
START_ITEMS: "queue:start-items"
} as const;


@@ -1,62 +0,0 @@
import type {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
UiSnapshot,
UpdateCheckResult,
UpdateInstallProgress,
UpdateInstallResult
} from "./types";
export interface ElectronApi {
getSnapshot: () => Promise<UiSnapshot>;
getVersion: () => Promise<string>;
checkUpdates: () => Promise<UpdateCheckResult>;
installUpdate: () => Promise<UpdateInstallResult>;
openExternal: (url: string) => Promise<boolean>;
updateSettings: (settings: Partial<AppSettings>) => Promise<AppSettings>;
addLinks: (payload: AddLinksPayload) => Promise<{ addedPackages: number; addedLinks: number; invalidCount: number }>;
addContainers: (filePaths: string[]) => Promise<{ addedPackages: number; addedLinks: number }>;
getStartConflicts: () => Promise<StartConflictEntry[]>;
resolveStartConflict: (packageId: string, policy: DuplicatePolicy) => Promise<StartConflictResolutionResult>;
clearAll: () => Promise<void>;
start: () => Promise<void>;
startPackages: (packageIds: string[]) => Promise<void>;
stop: () => Promise<void>;
togglePause: () => Promise<boolean>;
cancelPackage: (packageId: string) => Promise<void>;
renamePackage: (packageId: string, newName: string) => Promise<void>;
reorderPackages: (packageIds: string[]) => Promise<void>;
removeItem: (itemId: string) => Promise<void>;
togglePackage: (packageId: string) => Promise<void>;
exportQueue: () => Promise<{ saved: boolean }>;
importQueue: (json: string) => Promise<{ addedPackages: number; addedLinks: number }>;
toggleClipboard: () => Promise<boolean>;
pickFolder: () => Promise<string | null>;
pickContainers: () => Promise<string[]>;
getSessionStats: () => Promise<SessionStats>;
restart: () => Promise<void>;
quit: () => Promise<void>;
exportBackup: () => Promise<{ saved: boolean }>;
importBackup: () => Promise<{ restored: boolean; message: string }>;
openLog: () => Promise<void>;
openSessionLog: () => Promise<void>;
retryExtraction: (packageId: string) => Promise<void>;
extractNow: (packageId: string) => Promise<void>;
resetPackage: (packageId: string) => Promise<void>;
getHistory: () => Promise<HistoryEntry[]>;
clearHistory: () => Promise<void>;
removeHistoryEntry: (entryId: string) => Promise<void>;
setPackagePriority: (packageId: string, priority: PackagePriority) => Promise<void>;
skipItems: (itemIds: string[]) => Promise<void>;
resetItems: (itemIds: string[]) => Promise<void>;
startItems: (itemIds: string[]) => Promise<void>;
onStateUpdate: (callback: (snapshot: UiSnapshot) => void) => () => void;
onClipboardDetected: (callback: (links: string[]) => void) => () => void;
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void) => () => void;
}


@@ -1,289 +0,0 @@
export type DownloadStatus =
| "queued"
| "validating"
| "downloading"
| "paused"
| "reconnect_wait"
| "extracting"
| "integrity_check"
| "completed"
| "failed"
| "cancelled";
export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload" | "onefichier";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
export type ExtractCpuPriority = "high" | "middle" | "low";
export interface BandwidthScheduleEntry {
id: string;
startHour: number;
endHour: number;
speedLimitKbps: number;
enabled: boolean;
}
export interface DownloadStats {
totalDownloaded: number;
totalDownloadedAllTime: number;
totalFiles: number;
totalPackages: number;
sessionStartedAt: number;
}
export interface AppSettings {
token: string;
megaLogin: string;
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
oneFichierApiKey: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
providerSecondary: DebridFallbackProvider;
providerTertiary: DebridFallbackProvider;
autoProviderFallback: boolean;
outputDir: string;
packageName: string;
autoExtract: boolean;
autoRename4sf4sj: boolean;
extractDir: string;
collectMkvToLibrary: boolean;
mkvLibraryDir: string;
createExtractSubfolder: boolean;
hybridExtract: boolean;
cleanupMode: CleanupMode;
extractConflictMode: ConflictMode;
removeLinkFilesAfterExtract: boolean;
removeSamplesAfterExtract: boolean;
enableIntegrityCheck: boolean;
autoResumeOnStart: boolean;
autoReconnect: boolean;
reconnectWaitSeconds: number;
completedCleanupPolicy: FinishedCleanupPolicy;
maxParallel: number;
maxParallelExtract: number;
retryLimit: number;
speedLimitEnabled: boolean;
speedLimitKbps: number;
speedLimitMode: SpeedMode;
updateRepo: string;
autoUpdateCheck: boolean;
clipboardWatch: boolean;
minimizeToTray: boolean;
theme: AppTheme;
collapseNewPackages: boolean;
autoSkipExtracted: boolean;
confirmDeleteSelection: boolean;
totalDownloadedAllTime: number;
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}
export interface DownloadItem {
id: string;
packageId: string;
url: string;
provider: DebridProvider | null;
status: DownloadStatus;
retries: number;
speedBps: number;
downloadedBytes: number;
totalBytes: number | null;
progressPercent: number;
fileName: string;
targetPath: string;
resumable: boolean;
attempts: number;
lastError: string;
fullStatus: string;
createdAt: number;
updatedAt: number;
onlineStatus?: "online" | "offline" | "checking";
}
export interface PackageEntry {
id: string;
name: string;
outputDir: string;
extractDir: string;
status: DownloadStatus;
itemIds: string[];
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
export interface SessionState {
version: number;
packageOrder: string[];
packages: Record<string, PackageEntry>;
items: Record<string, DownloadItem>;
runStartedAt: number;
totalDownloadedBytes: number;
summaryText: string;
reconnectUntil: number;
reconnectReason: string;
paused: boolean;
running: boolean;
updatedAt: number;
}
export interface DownloadSummary {
total: number;
success: number;
failed: number;
cancelled: number;
extracted: number;
durationSeconds: number;
averageSpeedBps: number;
}
export interface ParsedPackageInput {
name: string;
links: string[];
fileNames?: string[];
}
export interface ContainerImportResult {
packages: ParsedPackageInput[];
source: "dlc";
}
export interface UiSnapshot {
settings: AppSettings;
session: SessionState;
summary: DownloadSummary | null;
stats: DownloadStats;
speedText: string;
etaText: string;
canStart: boolean;
canStop: boolean;
canPause: boolean;
clipboardActive: boolean;
reconnectSeconds: number;
packageSpeedBps: Record<string, number>;
}
export interface AddLinksPayload {
rawText: string;
packageName?: string;
duplicatePolicy?: DuplicatePolicy;
}
export interface AddContainerPayload {
filePaths: string[];
}
export type DuplicatePolicy = "keep" | "skip" | "overwrite";
export interface QueueAddResult {
addedPackages: number;
addedLinks: number;
skippedExistingPackages: string[];
overwrittenPackages: string[];
}
export interface ContainerConflictResult {
conflicts: string[];
packageCount: number;
linkCount: number;
}
export interface StartConflictEntry {
packageId: string;
packageName: string;
extractDir: string;
}
export interface StartConflictResolutionResult {
skipped: boolean;
overwritten: boolean;
}
export interface UpdateCheckResult {
updateAvailable: boolean;
currentVersion: string;
latestVersion: string;
latestTag: string;
releaseUrl: string;
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}
export interface UpdateInstallResult {
started: boolean;
message: string;
}
export interface UpdateInstallProgress {
stage: "starting" | "downloading" | "verifying" | "launching" | "done" | "error";
percent: number | null;
downloadedBytes: number;
totalBytes: number | null;
message: string;
}
export interface ParsedHashEntry {
fileName: string;
algorithm: "crc32" | "md5" | "sha1";
digest: string;
}
export interface BandwidthSample {
timestamp: number;
speedBps: number;
}
export interface BandwidthStats {
samples: BandwidthSample[];
currentSpeedBps: number;
averageSpeedBps: number;
maxSpeedBps: number;
totalBytesSession: number;
sessionDurationSeconds: number;
}
export interface SessionStats {
bandwidth: BandwidthStats;
totalDownloads: number;
completedDownloads: number;
failedDownloads: number;
activeDownloads: number;
queuedDownloads: number;
}
export interface HistoryEntry {
id: string;
name: string;
totalBytes: number;
downloadedBytes: number;
fileCount: number;
provider: DebridProvider | null;
completedAt: number;
durationSeconds: number;
status: "completed" | "deleted";
outputDir: string;
urls?: string[];
}
export interface HistoryState {
entries: HistoryEntry[];
maxEntries: number;
}


@@ -1,49 +0,0 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";
describe("reorderPackageOrderByDrop", () => {
it("moves adjacent package down by one on drop", () => {
const next = reorderPackageOrderByDrop(["a", "b", "c"], "b", "c");
expect(next).toEqual(["a", "c", "b"]);
});
it("moves package after lower drop target", () => {
const next = reorderPackageOrderByDrop(["a", "b", "c", "d"], "a", "c");
expect(next).toEqual(["b", "c", "a", "d"]);
});
it("returns original order when ids are invalid", () => {
const order = ["a", "b", "c"];
expect(reorderPackageOrderByDrop(order, "x", "b")).toEqual(order);
expect(reorderPackageOrderByDrop(order, "a", "x")).toEqual(order);
expect(reorderPackageOrderByDrop(order, "a", "a")).toEqual(order);
});
});
describe("sortPackageOrderByName", () => {
it("sorts package IDs alphabetically ascending", () => {
const sorted = sortPackageOrderByName(
["pkg3", "pkg1", "pkg2"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
false
);
expect(sorted).toEqual(["pkg1", "pkg2", "pkg3"]);
});
it("sorts package IDs alphabetically descending", () => {
const sorted = sortPackageOrderByName(
["pkg1", "pkg2", "pkg3"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
true
);
expect(sorted).toEqual(["pkg3", "pkg2", "pkg1"]);
});
});


@@ -1,694 +0,0 @@
import { describe, it, expect } from "vitest";
import {
extractEpisodeToken,
applyEpisodeTokenToFolderName,
sourceHasRpToken,
ensureRepackToken,
buildAutoRenameBaseName,
buildAutoRenameBaseNameFromFolders,
buildAutoRenameBaseNameFromFoldersWithOptions
} from "../src/main/download-manager";
describe("extractEpisodeToken", () => {
it("extracts S01E01 from standard scene format", () => {
expect(extractEpisodeToken("show.name.s01e01.720p")).toBe("S01E01");
});
it("extracts episode with dot separators", () => {
expect(extractEpisodeToken("Show.S02E15.1080p")).toBe("S02E15");
});
it("extracts episode with dash separators", () => {
expect(extractEpisodeToken("show-s3e5-720p")).toBe("S03E05");
});
it("extracts episode with underscore separators", () => {
expect(extractEpisodeToken("show_s10e100_hdtv")).toBe("S10E100");
});
it("extracts episode with space separators", () => {
expect(extractEpisodeToken("Show Name s1e2 720p")).toBe("S01E02");
});
it("pads single-digit season and episode to 2 digits", () => {
expect(extractEpisodeToken("show.s1e3.720p")).toBe("S01E03");
});
it("handles 3-digit episode numbers", () => {
expect(extractEpisodeToken("show.s01e123")).toBe("S01E123");
});
it("returns null for no episode token", () => {
expect(extractEpisodeToken("some.random.file.720p")).toBeNull();
});
it("returns null for season-only pattern (no episode)", () => {
expect(extractEpisodeToken("show.s01.720p")).toBeNull();
});
it("returns null for empty string", () => {
expect(extractEpisodeToken("")).toBeNull();
});
it("is case-insensitive", () => {
expect(extractEpisodeToken("Show.S05E10.1080p")).toBe("S05E10");
expect(extractEpisodeToken("show.s05e10.1080p")).toBe("S05E10");
});
it("extracts from episode token at start of string", () => {
expect(extractEpisodeToken("s01e01.720p")).toBe("S01E01");
});
it("extracts from episode token at end of string", () => {
expect(extractEpisodeToken("show.s02e03")).toBe("S02E03");
});
it("extracts double episode token s01e01e02", () => {
expect(extractEpisodeToken("tvr-mammon-s01e01e02-720p")).toBe("S01E01E02");
});
it("extracts double episode with dot separators", () => {
expect(extractEpisodeToken("Show.S01E03E04.720p")).toBe("S01E03E04");
});
it("extracts double episode at end of string", () => {
expect(extractEpisodeToken("show.s02e05e06")).toBe("S02E05E06");
});
it("extracts double episode with single-digit numbers", () => {
expect(extractEpisodeToken("show-s1e1e2-720p")).toBe("S01E01E02");
});
});
describe("applyEpisodeTokenToFolderName", () => {
it("replaces existing episode token in folder name", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S02E05")).toBe("Show.S02E05.720p-4sf");
});
it("replaces season-only token when no episode in folder", () => {
expect(applyEpisodeTokenToFolderName("Show.S01.720p-4sf", "S01E03")).toBe("Show.S01E03.720p-4sf");
});
it("inserts before -4sf suffix when no season/episode in folder", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4sf", "S01E05")).toBe("Show.720p.S01E05-4sf");
});
it("inserts before -4sj suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4sj", "S01E05")).toBe("Show.720p.S01E05-4sj");
});
it("appends episode token when no recognized pattern", () => {
expect(applyEpisodeTokenToFolderName("SomeFolder", "S01E01")).toBe("SomeFolder.S01E01");
});
it("returns episode token when folder name is empty", () => {
expect(applyEpisodeTokenToFolderName("", "S01E01")).toBe("S01E01");
});
it("handles folder with existing multi-digit episode", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E99.720p-4sf", "S01E05")).toBe("Show.S01E05.720p-4sf");
});
it("is case-insensitive for -4SF/-4SJ suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4SF", "S01E01")).toBe("Show.720p.S01E01-4SF");
});
it("applies double episode token to season-only folder", () => {
expect(applyEpisodeTokenToFolderName("Mammon.S01.German.1080P.Bluray.x264-SMAHD", "S01E01E02"))
.toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("replaces existing double episode in folder with new token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01E02.720p-4sf", "S01E03E04"))
.toBe("Show.S01E03E04.720p-4sf");
});
it("replaces existing single episode in folder with double episode token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S01E01E02"))
.toBe("Show.S01E01E02.720p-4sf");
});
});
describe("sourceHasRpToken", () => {
it("detects .rp. in filename", () => {
expect(sourceHasRpToken("show.s01e01.rp.720p")).toBe(true);
});
it("detects -rp- in filename", () => {
expect(sourceHasRpToken("show-s01e01-rp-720p")).toBe(true);
});
it("detects _rp_ in filename", () => {
expect(sourceHasRpToken("show_s01e01_rp_720p")).toBe(true);
});
it("detects rp at end of string", () => {
expect(sourceHasRpToken("show.s01e01.rp")).toBe(true);
});
it("does not match rp inside a word", () => {
expect(sourceHasRpToken("enterprise.s01e01")).toBe(false);
});
it("returns false for empty string", () => {
expect(sourceHasRpToken("")).toBe(false);
});
it("is case-insensitive", () => {
expect(sourceHasRpToken("show.RP.720p")).toBe(true);
});
});
describe("ensureRepackToken", () => {
it("inserts REPACK before quality token", () => {
expect(ensureRepackToken("Show.S01E01.1080p-4sf")).toBe("Show.S01E01.REPACK.1080p-4sf");
});
it("inserts REPACK before 720p", () => {
expect(ensureRepackToken("Show.S01E01.720p-4sf")).toBe("Show.S01E01.REPACK.720p-4sf");
});
it("inserts REPACK before 2160p", () => {
expect(ensureRepackToken("Show.S01E01.2160p-4sf")).toBe("Show.S01E01.REPACK.2160p-4sf");
});
it("inserts REPACK before -4sf when no quality token", () => {
expect(ensureRepackToken("Show.S01E01-4sf")).toBe("Show.S01E01.REPACK-4sf");
});
it("inserts REPACK before -4sj when no quality token", () => {
expect(ensureRepackToken("Show.S01E01-4sj")).toBe("Show.S01E01.REPACK-4sj");
});
it("appends REPACK when no recognized insertion point", () => {
expect(ensureRepackToken("Show.S01E01")).toBe("Show.S01E01.REPACK");
});
it("does not double-add REPACK if already present", () => {
expect(ensureRepackToken("Show.S01E01.REPACK.1080p-4sf")).toBe("Show.S01E01.REPACK.1080p-4sf");
});
it("does not double-add repack (case-insensitive)", () => {
expect(ensureRepackToken("Show.s01e01.repack.720p-4sf")).toBe("Show.s01e01.repack.720p-4sf");
});
});
describe("buildAutoRenameBaseName", () => {
it("renames with episode token from source file", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "show.s01e05.720p.mkv");
expect(result).toBe("Show.S01E05.720p-4sf");
});
it("works with -4sj suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sj", "show.s01e03.720p.mkv");
expect(result).toBe("Show.S01E03.720p-4sj");
});
  it("keeps folder base name for generic scene group suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-GROUP", "show.s01e05.720p.mkv");
expect(result).toBe("Show.S01.720p-GROUP");
});
it("returns null when source has no episode token", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "random.file.720p.mkv");
expect(result).toBeNull();
});
it("adds REPACK when source has rp token", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4sf", "show.s01e05.rp.720p.mkv");
expect(result).toBe("Show.S01E05.REPACK.720p-4sf");
});
it("handles folder with existing episode that gets replaced", () => {
const result = buildAutoRenameBaseName("Show.S01E01.720p-4sf", "show.s01e10.720p.mkv");
expect(result).toBe("Show.S01E10.720p-4sf");
});
it("inserts episode before -4sf when folder has no season/episode", () => {
const result = buildAutoRenameBaseName("Show.720p-4sf", "show.s01e05.720p.mkv");
expect(result).toBe("Show.720p.S01E05-4sf");
});
it("handles case-insensitive 4SF suffix", () => {
const result = buildAutoRenameBaseName("Show.S01.720p-4SF", "show.s01e02.720p.mkv");
expect(result).toBe("Show.S01E02.720p-4SF");
});
it("handles rp + no quality token in folder", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e05.rp.mkv");
expect(result).toBe("Show.S01E05.REPACK-4sf");
});
it("returns null for empty folder name", () => {
const result = buildAutoRenameBaseName("", "show.s01e01.mkv");
expect(result).toBeNull();
});
it("returns null for empty source file name", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "");
expect(result).toBeNull();
});
// Edge cases
it("handles 2160p quality token", () => {
const result = buildAutoRenameBaseName("Show.S01.2160p-4sf", "show.s01e01.rp.2160p.mkv");
expect(result).toBe("Show.S01E01.REPACK.2160p-4sf");
});
it("handles 480p quality token", () => {
const result = buildAutoRenameBaseName("Show.S01.480p-4sf", "show.s01e07.480p.mkv");
expect(result).toBe("Show.S01E07.480p-4sf");
});
it("does not trigger on folders ending with similar but wrong suffix", () => {
expect(buildAutoRenameBaseName("Show.S01-4sfx", "show.s01e01.mkv")).toBeNull();
expect(buildAutoRenameBaseName("Show.S01-x4sf", "show.s01e01.mkv")).toBeNull();
});
it("handles high season and episode numbers", () => {
const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
// SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
expect(result).not.toBeNull();
expect(result!).toContain("S99E999");
});
// Real-world scene release patterns
it("real-world: German series with dots", () => {
const result = buildAutoRenameBaseName(
"Der.Bergdoktor.S18.German.720p.WEB.x264-4SJ",
"der.bergdoktor.s18e01.german.720p.web.x264"
);
expect(result).toBe("Der.Bergdoktor.S18E01.German.720p.WEB.x264-4SJ");
});
it("real-world: English series with rp token", () => {
const result = buildAutoRenameBaseName(
"The.Last.of.Us.S02.1080p.WEB-4SF",
"the.last.of.us.s02e03.rp.1080p.web"
);
expect(result).toBe("The.Last.of.Us.S02E03.REPACK.1080p.WEB-4SF");
});
it("real-world: multiple dots in name", () => {
const result = buildAutoRenameBaseName(
"Grey.s.Anatomy.S21.German.DL.720p.WEB.x264-4SJ",
"grey.s.anatomy.s21e08.german.dl.720p.web.x264"
);
expect(result).toBe("Grey.s.Anatomy.S21E08.German.DL.720p.WEB.x264-4SJ");
});
it("real-world: 4K content", () => {
const result = buildAutoRenameBaseName(
"Severance.S02.2160p.ATVP.WEB-DL.DDP5.1.DV.H.265-4SF",
"severance.s02e07.2160p.atvp.web-dl.ddp5.1.dv.h.265"
);
expect(result).toBe("Severance.S02E07.2160p.ATVP.WEB-DL.DDP5.1.DV.H.265-4SF");
});
it("real-world: Britannia release keeps folder base name", () => {
const result = buildAutoRenameBaseName(
"Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW",
"law-britannia.s02e01.720p.webrip"
);
expect(result).toBe("Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW");
});
it("real-world: Britannia repack injects REPACK", () => {
const result = buildAutoRenameBaseName(
"Britannia.S02.GERMAN.720p.WEBRiP.x264-LAW",
"law-britannia.s02e09.720p.webrip.repack"
);
expect(result).toBe("Britannia.S02.GERMAN.REPACK.720p.WEBRiP.x264-LAW");
});
it("adds REPACK when folder name carries RP hint", () => {
const result = buildAutoRenameBaseName(
"Banshee.S02E01.German.RP.720p.BluRay.x264-RIPLEY",
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.REPACK.720p.BluRay.x264-RIPLEY");
});
it("real-world: folder already has wrong episode", () => {
const result = buildAutoRenameBaseName(
"Cobra.Kai.S06E01.720p.NF.WEB-DL.DDP5.1.x264-4SF",
"cobra.kai.s06e14.720p.nf.web-dl.ddp5.1.x264"
);
expect(result).toBe("Cobra.Kai.S06E14.720p.NF.WEB-DL.DDP5.1.x264-4SF");
});
// Bug-hunting edge cases
it("source filename extension is not included in episode detection", () => {
// The sourceFileName passed to buildAutoRenameBaseName is the basename without extension
// so .mkv should not interfere, but let's verify with an actual extension
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
// "mkv" should not be treated as part of the filename match
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
});
it("does not match episode-like patterns in codec strings", () => {
// h.265 has digits but should not be confused with episode tokens
const token = extractEpisodeToken("show.s01e01.h.265");
expect(token).toBe("S01E01");
});
it("handles folder with dash separators throughout", () => {
const result = buildAutoRenameBaseName(
"Show-Name-S01-720p-4sf",
"show-name-s01e05-720p"
);
expect(result).toBe("Show-Name-S01E05-720p-4sf");
});
it("does not duplicate episode when folder already has the same episode", () => {
const result = buildAutoRenameBaseName(
"Show.S01E05.720p-4sf",
"show.s01e05.720p"
);
// Must NOT produce "Show.S01E05.720p.S01E05-4sf" (double episode bug)
expect(result).toBe("Show.S01E05.720p-4sf");
});
it("handles folder with only -4sf suffix (edge case)", () => {
const result = buildAutoRenameBaseName("-4sf", "show.s01e01.mkv");
// Extreme edge case - sanitizeFilename trims leading dots
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
expect(result!).toContain("-4sf");
expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
});
it("sanitizes special characters from result", () => {
// sanitizeFilename should strip dangerous chars
const result = buildAutoRenameBaseName("Show:Name.S01-4sf", "show.s01e01.mkv");
// The colon should be sanitized away
expect(result).not.toBeNull();
expect(result!).not.toContain(":");
});
});
describe("buildAutoRenameBaseNameFromFolders", () => {
it("uses parent folder when current folder is not a scene template", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Episode 01",
"Banshee.S02.German.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02.German.720p.BluRay.x264-RIPLEY");
});
it("uses nested scene subfolder directly", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Banshee.S02E01.German.720p.BluRay.x264-RIPLEY",
"Banshee.S02.German.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.720p.BluRay.x264-RIPLEY");
});
it("injects REPACK when parent folder carries repack hint", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Banshee.S02E01.German.720p.BluRay.x264-RIPLEY",
"Banshee.S02.German.RP.720p.BluRay.x264-RIPLEY"
],
"r-banshee.s02e01-720p"
);
expect(result).toBe("Banshee.S02E01.German.REPACK.720p.BluRay.x264-RIPLEY");
});
it("uses nested Arrow episode folder with title", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG",
"Arrow.S04.German.DL.720p.BluRay.x264-RSG"
],
"rsg-arrow-s04e01-720p"
);
expect(result).toBe("Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG");
});
it("adds REPACK for Arrow when source contains rp token", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Arrow.S04E01.Green.Arrow.German.DL.720p.BluRay.x264-RSG",
"Arrow.S04.German.DL.720p.BluRay.x264-RSG"
],
"rsg-arrow-s04e01.rp.720p"
);
expect(result).toBe("Arrow.S04E01.Green.Arrow.German.DL.REPACK.720p.BluRay.x264-RSG");
});
it("converts Teil token to episode using parent season", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Last.Impact.Der.Einschlag.Teil1.GERMAN.DL.720p.WEB.H264-SunDry",
"Last.Impact.Der.Einschlag.S01.GERMAN.DL.720p.WEB.H264-SunDry"
],
"sundry-last.impact.der.einschlag.teil1.720p.web.h264"
);
expect(result).toBe("Last.Impact.Der.Einschlag.S01E01.GERMAN.DL.720p.WEB.H264-SunDry");
});
it("converts Teil token to episode with REPACK", () => {
const result = buildAutoRenameBaseNameFromFolders(
[
"Last.Impact.Der.Einschlag.Teil1.GERMAN.DL.720p.WEB.H264-SunDry",
"Last.Impact.Der.Einschlag.S01.GERMAN.DL.720p.WEB.H264-SunDry"
],
"sundry-last.impact.der.einschlag.teil1.rp.720p.web.h264"
);
expect(result).toBe("Last.Impact.Der.Einschlag.S01E01.GERMAN.DL.REPACK.720p.WEB.H264-SunDry");
});
it("forces episode insertion for flat season folder when many files share directory", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Arrow.S08.GERMAN.DUBBED.DL.720p.BluRay.x264-TMSF"
],
"tmsf-arrow-s08e03-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Arrow.S08E03.GERMAN.DUBBED.DL.720p.BluRay.x264-TMSF");
});
it("forces episode insertion plus REPACK for flat season folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Gotham.S05.GERMAN.DUBBED.720p.BLURAY.x264-ZZGtv"
],
"zzgtv-gotham-s05e02.rp",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Gotham.S05E02.GERMAN.DUBBED.REPACK.720p.BLURAY.x264-ZZGtv");
});
it("uses nested episode title folder for Gotham TvR style", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Gotham.S04E01.Pax.Penguina.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR",
"Gotham.S04.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR"
],
"tvr-gotham-s04e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Gotham.S04E01.Pax.Penguina.GERMAN.5.1.DL.AC3.720p.BDRiP.x264-TvR");
});
it("uses nested title folder for Britannia TV4A style", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Britannia.S01E01.Die.Landung.German.DL.720p.BluRay.x264-TV4A",
"Britannia.S01.German.DL.720p.BluRay.x264-TV4A"
],
"tv4a-britannia.s01e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Britannia.S01E01.Die.Landung.German.DL.720p.BluRay.x264-TV4A");
});
it("handles odd source token style 101 by using nested Agent X folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Agent.X.S01E01.Pilot.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS",
"Agent.X.S01.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS"
],
"tvs-agent-x-dd51-ded-dl-7p-ithd-x264-101",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Agent.X.S01E01.Pilot.German.DD51.Dubbed.DL.720p.iTunesHD.x264-TVS");
});
it("maps compact code 301 to S03E01 for nested Legion folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Legion.S03E01.Kapitel.20.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS",
"Legion.S03.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS"
],
"tvs-legion-dd51-ded-dl-7p-azhd-avc-301",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Legion.S03E01.Kapitel.20.German.DD51.Dubbed.DL.720p.AmazonHD.AVC-TVS");
});
it("maps compact code 211 in flat season folder", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Lethal.Weapon.S02.German.DD51.Dubbed.DL.720p.AmazonHD.x264-TVS"
],
"tvs-lethal-weapon-dd51-ded-dl-7p-azhd-x264-211",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Lethal.Weapon.S02E11.German.DD51.Dubbed.DL.720p.AmazonHD.x264-TVS");
});
it("maps episode-only token e01 via season folder hint and keeps REPACK", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e01-720p-repack",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.REPACK.720p.WEB.h264-TMSF");
});
it("maps episode-only token e02 via season folder hint", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"tmsf-cheatderbetrug-e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E02.GERMAN.720p.WEB.h264-TMSF");
});
it("keeps renaming for odd source order like 4sf-bs-720p-s01e05", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-TMSF"
],
"4sf-bs-720p-s01e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E05.GERMAN.720p.WEB.h264-TMSF");
});
it("accepts lowercase scene group suffixes", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Cheat.der.Betrug.S01.GERMAN.720p.WEB.h264-tmsf"
],
"tmsf-cheatderbetrug-e01-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.720p.WEB.h264-tmsf");
});
it("renames double episode file into season folder (Mammon style)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e01e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("renames second double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e03e04-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E03E04.German.1080P.Bluray.x264-SMAHD");
});
it("renames third double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e05e06-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E05E06.German.1080P.Bluray.x264-SMAHD");
});
// Last-resort fallback: folder has season but no scene group suffix (user-renamed packages)
it("renames when folder has season but no scene group suffix (Mystery Road case)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mystery Road S02E05");
});
it("renames with season-only folder and custom name without dots", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Meine Serie S03"],
"meine-serie-s03e10-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Meine Serie S03E10");
});
it("prefers scene-group folder over season-only fallback", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mystery Road S02",
"Mystery.Road.S02.GERMAN.DL.AC3.720p.HDTV.x264-hrs"
],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
// Should use the scene-group folder (hrs), not the custom one
expect(result).toBe("Mystery.Road.S02E05.GERMAN.DL.AC3.720p.HDTV.x264-hrs");
});
it("does not use season-only fallback when forceEpisodeForSeasonFolder is false", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: false }
);
expect(result).toBeNull();
});
it("renames Riviera S02 with single-digit episode s02e2", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
"tvp-riviera-s02e2-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
});
it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"r104.de.dl.web.7p-s04e02",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
it("renames Room 104 wayne source with episode", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"room.104.s04e01.german.dl.720p.web.h264-wayne",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
});
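The rename tests above imply an episode-token grammar: a two-digit season, a two-to-three-digit episode, an optional double-episode part, and a season-only fallback. A minimal sketch of such a regex and replacement, purely hypothetical and not the project's actual `SCENE_EPISODE_RE` or `applyEpisodeTokenToFolderName`:

```typescript
// Hypothetical sketch: an episode-token regex consistent with the test
// expectations above (2-digit season, 2-3 digit episode, optional double
// episode, case-insensitive). Not the project's actual implementation.
const EPISODE_TOKEN_RE = /S(\d{2})E(\d{2,3})(?:E(\d{2,3}))?/i;

// Replace an existing episode token, or upgrade a season-only token
// like "S01" to the full "S01E03" form.
function applyToken(folder: string, token: string): string {
  if (EPISODE_TOKEN_RE.test(folder)) {
    return folder.replace(EPISODE_TOKEN_RE, token);
  }
  // Fallback: season-only token followed by a separator.
  return folder.replace(/S(\d{2})(?=[.\s-])/i, token);
}
```

Under these assumptions, `applyToken("Show.S01E99.720p-4sf", "S01E05")` yields `"Show.S01E05.720p-4sf"`, matching the expectations in the tests above.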


@@ -1,109 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { cleanupCancelledPackageArtifacts, removeDownloadLinkArtifacts, removeSampleArtifacts } from "../src/main/cleanup";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("cleanup", () => {
it("removes archive artifacts but keeps media", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.writeFileSync(path.join(dir, "release.part1.rar"), "x");
fs.writeFileSync(path.join(dir, "movie.mkv"), "x");
const removed = cleanupCancelledPackageArtifacts(dir);
expect(removed).toBeGreaterThan(0);
expect(fs.existsSync(path.join(dir, "release.part1.rar"))).toBe(false);
expect(fs.existsSync(path.join(dir, "movie.mkv"))).toBe(true);
});
it("removes sample artifacts and link files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.mkdirSync(path.join(dir, "Samples"), { recursive: true });
fs.writeFileSync(path.join(dir, "Samples", "demo-sample.mkv"), "x");
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://example.com/a\n");
const links = await removeDownloadLinkArtifacts(dir);
const samples = await removeSampleArtifacts(dir);
expect(links).toBeGreaterThan(0);
expect(samples.files + samples.dirs).toBeGreaterThan(0);
});
it("cleans up archive files in nested directories", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
// Create nested directory structure with archive files
const sub1 = path.join(dir, "season1");
const sub2 = path.join(dir, "season1", "extras");
fs.mkdirSync(sub2, { recursive: true });
fs.writeFileSync(path.join(sub1, "episode.part1.rar"), "x");
fs.writeFileSync(path.join(sub1, "episode.part2.rar"), "x");
fs.writeFileSync(path.join(sub2, "bonus.zip"), "x");
fs.writeFileSync(path.join(sub2, "bonus.7z"), "x");
// Non-archive files should be kept
fs.writeFileSync(path.join(sub1, "video.mkv"), "real content");
fs.writeFileSync(path.join(sub2, "subtitle.srt"), "subtitle content");
const removed = cleanupCancelledPackageArtifacts(dir);
expect(removed).toBe(4); // 2 rar parts + zip + 7z
expect(fs.existsSync(path.join(sub1, "episode.part1.rar"))).toBe(false);
expect(fs.existsSync(path.join(sub1, "episode.part2.rar"))).toBe(false);
expect(fs.existsSync(path.join(sub2, "bonus.zip"))).toBe(false);
expect(fs.existsSync(path.join(sub2, "bonus.7z"))).toBe(false);
// Non-archives kept
expect(fs.existsSync(path.join(sub1, "video.mkv"))).toBe(true);
expect(fs.existsSync(path.join(sub2, "subtitle.srt"))).toBe(true);
});
it("detects link artifacts by URL content in text files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
// File with link-like name containing URLs should be removed
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://rapidgator.net/file/abc123\nhttps://uploaded.net/file/def456\n");
// File with link-like name but no URLs should be kept
fs.writeFileSync(path.join(dir, "my_downloads.txt"), "Just some random text without URLs");
// Regular text file that doesn't match the link pattern should be kept
fs.writeFileSync(path.join(dir, "readme.txt"), "https://example.com");
// .url files should always be removed
fs.writeFileSync(path.join(dir, "bookmark.url"), "[InternetShortcut]\nURL=https://example.com");
// .dlc files should always be removed
fs.writeFileSync(path.join(dir, "container.dlc"), "encrypted-data");
const removed = await removeDownloadLinkArtifacts(dir);
expect(removed).toBeGreaterThanOrEqual(3); // download_links.txt + bookmark.url + container.dlc
expect(fs.existsSync(path.join(dir, "download_links.txt"))).toBe(false);
expect(fs.existsSync(path.join(dir, "bookmark.url"))).toBe(false);
expect(fs.existsSync(path.join(dir, "container.dlc"))).toBe(false);
// Non-matching files should be kept
expect(fs.existsSync(path.join(dir, "readme.txt"))).toBe(true);
});
it("does not recurse into sample symlink or junction targets", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
const external = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-ext-"));
tempDirs.push(dir, external);
const outsideFile = path.join(external, "outside-sample.mkv");
fs.writeFileSync(outsideFile, "keep", "utf8");
const linkedSampleDir = path.join(dir, "sample");
const linkType: fs.symlink.Type = process.platform === "win32" ? "junction" : "dir";
fs.symlinkSync(external, linkedSampleDir, linkType);
const result = await removeSampleArtifacts(dir);
expect(result.files).toBe(0);
expect(fs.existsSync(outsideFile)).toBe(true);
});
});


@@ -1,210 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { importDlcContainers } from "../src/main/container";
const tempDirs: string[] = [];
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("container", () => {
  it("skips oversized DLC files without throwing or blocking other files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const oversizedFilePath = path.join(dir, "oversized.dlc");
fs.writeFileSync(oversizedFilePath, Buffer.alloc((8 * 1024 * 1024) + 1, 1));
    // Create a valid mock DLC that would be skipped if an error were thrown
const validFilePath = path.join(dir, "valid.dlc");
// Just needs to be short enough to pass file limits but fail parsing, triggering dcrypt fallback
fs.writeFileSync(validFilePath, Buffer.from("Valid but not real DLC content..."));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("http://example.com/file1.rar\nhttp://example.com/file2.rar", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([oversizedFilePath, validFilePath]);
    // Expect the oversized file to be silently skipped and the valid one to be parsed into 1 package named after the DLC file
expect(result).toHaveLength(1);
expect(result[0].name).toBe("valid");
expect(result[0].links).toEqual(["http://example.com/file1.rar", "http://example.com/file2.rar"]);
expect(fetchSpy).toHaveBeenCalledTimes(1);
});
it("skips non-dlc files completely", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-non-"));
tempDirs.push(dir);
const txtPath = path.join(dir, "links.txt");
fs.writeFileSync(txtPath, "http://link.com/1");
const result = await importDlcContainers([txtPath]);
expect(result).toEqual([]);
});
it("falls back to dcrypt if local decryption returns empty", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "fallback.dlc");
    // A file large enough to trigger a local decryption attempt (needs > 89 bytes to pass the slice check)
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
// Mock local RC service failure (returning 404)
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
// Mock dcrypt fallback success
return new Response("http://fallback.com/1", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("fallback");
expect(result[0].links).toEqual(["http://fallback.com/1"]);
// Should have tried both!
expect(fetchSpy).toHaveBeenCalledTimes(2);
});
it("falls back to dcrypt when local decryption throws invalid padding", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "invalid-local.dlc");
fs.writeFileSync(filePath, "X".repeat(120));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response(`<rc>${Buffer.alloc(16).toString("base64")}</rc>`, { status: 200 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("http://example.com/fallback1", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("invalid-local");
expect(result[0].links).toEqual(["http://example.com/fallback1"]);
expect(fetchSpy).toHaveBeenCalledTimes(2);
});
it("falls back to paste endpoint when upload returns 413", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "big-dlc.dlc");
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("http://paste-fallback.com/file1.rar\nhttp://paste-fallback.com/file2.rar", { status: 200 });
}
return new Response("", { status: 404 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const result = await importDlcContainers([filePath]);
expect(result).toHaveLength(1);
expect(result[0].name).toBe("big-dlc");
expect(result[0].links).toEqual(["http://paste-fallback.com/file1.rar", "http://paste-fallback.com/file2.rar"]);
// local RC + upload + paste = 3 calls
expect(fetchSpy).toHaveBeenCalledTimes(3);
});
it("throws when both dcrypt endpoints return 413", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "huge.dlc");
fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("Request Entity Too Large", { status: 413 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/zu groß für dcrypt/i);
});
it("throws when upload returns 413 and paste returns 500", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "doomed.dlc");
fs.writeFileSync(filePath, Buffer.from("not a valid dlc payload at all"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("Request Entity Too Large", { status: 413 });
}
if (urlStr.includes("dcrypt.it/decrypt/paste")) {
return new Response("paste failure", { status: 500 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/DLC konnte nicht importiert werden/i);
});
it("throws clear error when all dlc imports fail", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
tempDirs.push(dir);
const filePath = path.join(dir, "broken.dlc");
fs.writeFileSync(filePath, Buffer.from("not a valid dlc payload at all"));
const fetchSpy = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("service.jdownloader.org")) {
return new Response("", { status: 404 });
}
if (urlStr.includes("dcrypt.it/decrypt/upload")) {
return new Response("upstream failure", { status: 500 });
}
return new Response("", { status: 500 });
});
globalThis.fetch = fetchSpy as unknown as typeof fetch;
await expect(importDlcContainers([filePath])).rejects.toThrow(/DLC konnte nicht importiert werden/i);
});
});
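The container tests above describe a fallback chain: when the dcrypt upload endpoint answers 413, retry via the paste endpoint, and surface the two distinct German error messages the tests assert on. A minimal sketch of the upload-to-paste leg, hypothetical and not the project's actual `importDlcContainers`; the endpoint URLs and error texts are taken from the test mocks and assertions above:

```typescript
// Hypothetical sketch of the dcrypt fallback order exercised above:
// upload first, then the paste endpoint when upload returns 413.
async function decryptWithFallback(payload: string): Promise<string[]> {
  const post = (url: string) => fetch(url, { method: "POST", body: payload });

  const upload = await post("https://dcrypt.it/decrypt/upload");
  if (upload.ok) return (await upload.text()).split("\n").filter(Boolean);

  if (upload.status === 413) {
    const paste = await post("https://dcrypt.it/decrypt/paste");
    if (paste.ok) return (await paste.text()).split("\n").filter(Boolean);
    // Both endpoints rejected the payload as too large.
    if (paste.status === 413) throw new Error("Container zu groß für dcrypt");
  }
  throw new Error("DLC konnte nicht importiert werden");
}
```

The two error strings match the `/zu groß für dcrypt/i` and `/DLC konnte nicht importiert werden/i` assertions in the tests above.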


@@ -1,792 +0,0 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { defaultSettings, REQUEST_RETRIES } from "../src/main/constants";
import { DebridService, extractRapidgatorFilenameFromHtml, filenameFromRapidgatorUrlPath, normalizeResolvedFilename } from "../src/main/debrid";
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("debrid service", () => {
it("falls back to Mega web when Real-Debrid fails", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
bestToken: "",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "bestdebrid" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({ error: "traffic_limit" }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "file.bin",
directUrl: "https://mega-web.example/file.bin",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part1.rar.html");
expect(result.provider).toBe("megadebrid");
expect(result.directUrl).toBe("https://mega-web.example/file.bin");
expect(megaWeb).toHaveBeenCalledTimes(1);
});
it("does not fallback when auto fallback is disabled", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "bestdebrid" as const,
autoProviderFallback: false
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response("traffic exhausted", { status: 429 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "unused.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part2.rar.html")).rejects.toThrow();
expect(megaWeb).toHaveBeenCalledTimes(0);
});
it("uses BestDebrid auth header without token query fallback", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "realdebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
const calledUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
calledUrls.push(url);
if (url.includes("/api/v1/generateLink?link=")) {
return new Response(JSON.stringify({ download: "https://best.example/file.bin", filename: "file.bin", filesize: 2048 }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part3.rar.html");
expect(result.provider).toBe("bestdebrid");
expect(result.fileSize).toBe(2048);
expect(calledUrls.some((url) => url.includes("auth="))).toBe(false);
});
it("sends Bearer auth header to BestDebrid", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let authHeader = "";
globalThis.fetch = (async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/api/v1/generateLink?link=")) {
const headers = init?.headers;
if (headers instanceof Headers) {
authHeader = headers.get("Authorization") || "";
} else if (Array.isArray(headers)) {
const tuple = headers.find(([key]) => key.toLowerCase() === "authorization");
authHeader = tuple?.[1] || "";
} else {
authHeader = String((headers as Record<string, unknown> | undefined)?.Authorization || "");
}
return new Response(JSON.stringify({ download: "https://best.example/file.bin", filename: "file.bin", filesize: 42 }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://hoster.example/file/abc");
expect(result.provider).toBe("bestdebrid");
expect(authHeader).toBe("Bearer best-token");
});
it("does not retry BestDebrid auth failures (401)", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "best-token",
providerPrimary: "bestdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let calls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/api/v1/generateLink?link=")) {
calls += 1;
return new Response(JSON.stringify({ message: "Unauthorized" }), {
status: 401,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://hoster.example/file/no-retry")).rejects.toThrow();
expect(calls).toBe(1);
});
it("does not retry AllDebrid auth failures (403)", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token",
providerPrimary: "alldebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
let calls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
calls += 1;
return new Response(JSON.stringify({ status: "error", error: { message: "forbidden" } }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://hoster.example/file/no-retry-ad")).rejects.toThrow();
expect(calls).toBe(1);
});
it("supports AllDebrid unlock", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
providerPrimary: "alldebrid" as const,
providerSecondary: "realdebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
return new Response(JSON.stringify({
status: "success",
data: {
link: "https://alldebrid.example/file.bin",
filename: "file.bin",
filesize: 4096
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/example.part4.rar.html");
expect(result.provider).toBe("alldebrid");
expect(result.directUrl).toBe("https://alldebrid.example/file.bin");
expect(result.fileSize).toBe(4096);
});
it("treats MegaDebrid as not configured when web fallback callback is unavailable", async () => {
const settings = {
...defaultSettings(),
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const service = new DebridService(settings);
await expect(service.unrestrictLink("https://rapidgator.net/file/missing-mega-web")).rejects.toThrow(/nicht konfiguriert/i);
});
it("uses Mega web path exclusively", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
const fetchSpy = vi.fn(async () => new Response("not-found", { status: 404 }));
globalThis.fetch = fetchSpy as unknown as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "from-web.rar",
directUrl: "https://www11.unrestrict.link/download/file/abc/from-web.rar",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const result = await service.unrestrictLink("https://rapidgator.net/file/abc/from-web.rar.html");
expect(result.provider).toBe("megadebrid");
expect(result.directUrl).toContain("unrestrict.link/download/file/");
expect(megaWeb).toHaveBeenCalledTimes(1);
expect(fetchSpy).toHaveBeenCalledTimes(0);
});
it("aborts Mega web unrestrict when caller signal is cancelled", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const megaWeb = vi.fn((_link: string, signal?: AbortSignal): Promise<never> => new Promise((_, reject) => {
const onAbort = (): void => reject(new Error("aborted:mega-web-test"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
const controller = new AbortController();
const abortTimer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);
expect(megaWeb).toHaveBeenCalledTimes(1);
expect(megaWeb.mock.calls[0]?.[1]).toBe(controller.signal);
} finally {
clearTimeout(abortTimer);
}
});
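The abort test above hand-rolls the same signal wiring (check `aborted`, otherwise listen once for `"abort"`). A sketch of that wiring as a reusable helper — the name `rejectOnAbort` is hypothetical:

```typescript
// Hypothetical helper: a promise that never resolves and rejects as soon
// as the given signal aborts (immediately if it is already aborted).
function rejectOnAbort(signal: AbortSignal | undefined, reason: string): Promise<never> {
  return new Promise((_, reject) => {
    const onAbort = (): void => reject(new Error(reason));
    if (signal?.aborted) {
      onAbort();
      return;
    }
    signal?.addEventListener("abort", onAbort, { once: true });
  });
}
```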
it("respects provider selection and does not append hidden providers", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "megadebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "megadebrid" as const,
autoProviderFallback: true
};
let allDebridCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/unlock")) {
allDebridCalls += 1;
return new Response(JSON.stringify({ status: "success", data: { link: "https://alldebrid.example/file.bin" } }), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => null);
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part5.rar.html")).rejects.toThrow();
expect(allDebridCalls).toBe(0);
});
it("does not use secondary provider when fallback is disabled and primary is missing", async () => {
const settings = {
...defaultSettings(),
token: "",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "megadebrid" as const,
providerTertiary: "none" as const,
autoProviderFallback: false
};
const megaWeb = vi.fn(async () => ({
fileName: "should-not-run.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part5.rar.html")).rejects.toThrow(/nicht konfiguriert/i);
expect(megaWeb).toHaveBeenCalledTimes(0);
});
it("allows disabling secondary and tertiary providers", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
megaLogin: "user",
megaPassword: "pass",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({ error: "traffic_limit" }), {
status: 403,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const megaWeb = vi.fn(async () => ({
fileName: "unused.bin",
directUrl: "https://unused",
fileSize: null,
retriesUsed: 0
}));
const service = new DebridService(settings, { megaWebUnrestrict: megaWeb });
await expect(service.unrestrictLink("https://rapidgator.net/file/example.part6.rar.html")).rejects.toThrow();
expect(megaWeb).toHaveBeenCalledTimes(0);
});
it("resolves rapidgator filename from page when provider returns hash", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
return new Response(JSON.stringify({
download: "https://cdn.example/file.bin",
filename: "6f09df2984fe01378537c7cd8d7fa7ce",
filesize: 2048
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce")) {
return new Response("<html><head><title>download file Banshee.S04E01.German.DL.720p.part01.rar - Rapidgator</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const result = await service.unrestrictLink("https://rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce");
expect(result.provider).toBe("realdebrid");
expect(result.fileName).toBe("Banshee.S04E01.German.DL.720p.part01.rar");
});
it("resolves filenames for rg.to links", async () => {
const settings = {
...defaultSettings(),
allDebridToken: ""
};
const link = "https://rg.to/file/685cec6dcc1837dc725755fc9c726dd9";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url === link) {
return new Response("<html><head><title>Download file Bulletproof.S01E01.German.DL.DD20.Synced.720p.AmazonHD.h264-GDR.part01.rar</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.get(link)).toBe("Bulletproof.S01E01.German.DL.DD20.Synced.720p.AmazonHD.h264-GDR.part01.rar");
});
it("does not unrestrict non-rapidgator links during filename scan", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true,
allDebridToken: ""
};
const linkFromPage = "https://rapidgator.net/file/11111111111111111111111111111111";
const linkFromProvider = "https://hoster.example/file/22222222222222222222222222222222";
let unrestrictCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url === linkFromPage) {
return new Response("<html><head><title>Download file from-page.part1.rar</title></head></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
unrestrictCalls += 1;
const body = init?.body;
const bodyText = body instanceof URLSearchParams ? body.toString() : String(body || "");
const linkValue = new URLSearchParams(bodyText).get("link") || "";
if (linkValue === linkFromProvider) {
return new Response(JSON.stringify({
download: "https://cdn.example/from-provider",
filename: "from-provider.part2.rar",
filesize: 1024
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const events: Array<{ link: string; fileName: string }> = [];
const resolved = await service.resolveFilenames([linkFromPage, linkFromProvider], (link, fileName) => {
events.push({ link, fileName });
});
expect(resolved.get(linkFromPage)).toBe("from-page.part1.rar");
expect(resolved.has(linkFromProvider)).toBe(false);
expect(unrestrictCalls).toBe(0);
expect(events).toEqual(expect.arrayContaining([
{ link: linkFromPage, fileName: "from-page.part1.rar" }
]));
});
it("does not unrestrict rapidgator links during filename scan after page lookup miss", async () => {
const settings = {
...defaultSettings(),
token: "rd-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
allDebridToken: ""
};
const link = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
let unrestrictCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.real-debrid.com/rest/1.0/unrestrict/link")) {
unrestrictCalls += 1;
return new Response(JSON.stringify({ error: "should-not-be-called" }), {
status: 500,
headers: { "Content-Type": "application/json" }
});
}
if (url === link) {
return new Response("not found", { status: 404 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.size).toBe(0);
expect(unrestrictCalls).toBe(0);
});
it("maps AllDebrid filename infos by index when response link is missing", async () => {
const settings = {
...defaultSettings(),
token: "",
bestToken: "",
allDebridToken: "ad-token",
providerPrimary: "realdebrid" as const,
providerSecondary: "none" as const,
providerTertiary: "none" as const,
autoProviderFallback: true
};
const linkA = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
const linkB = "https://rapidgator.net/file/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
return new Response(JSON.stringify({
status: "success",
data: {
infos: [
{ filename: "wrong-a.mkv" },
{ filename: "wrong-b.mkv" }
]
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url === linkA || url === linkB) {
return new Response("no title", { status: 404 });
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([linkA, linkB]);
expect(resolved.get(linkA)).toBe("wrong-a.mkv");
expect(resolved.get(linkB)).toBe("wrong-b.mkv");
expect(resolved.size).toBe(2);
});
it("retries AllDebrid filename infos after transient server error", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token"
};
const link = "https://rapidgator.net/file/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
let infoCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
infoCalls += 1;
if (infoCalls === 1) {
return new Response("temporary error", { status: 500 });
}
return new Response(JSON.stringify({
status: "success",
data: {
infos: [
{ link, filename: "resolved-from-infos.mkv" }
]
}
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.get(link)).toBe("resolved-from-infos.mkv");
expect(infoCalls).toBe(2);
});
it("retries AllDebrid filename infos when HTML challenge is returned", async () => {
const settings = {
...defaultSettings(),
allDebridToken: "ad-token"
};
const link = "https://rapidgator.net/file/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb";
let infoCalls = 0;
let pageCalls = 0;
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("api.alldebrid.com/v4/link/infos")) {
infoCalls += 1;
return new Response("<html><title>cf challenge</title></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}
if (url === link) {
pageCalls += 1;
}
return new Response("not-found", { status: 404 });
}) as typeof fetch;
const service = new DebridService(settings);
const resolved = await service.resolveFilenames([link]);
expect(resolved.size).toBe(0);
expect(infoCalls).toBe(REQUEST_RETRIES);
expect(pageCalls).toBe(1);
});
});
describe("normalizeResolvedFilename", () => {
it("strips HTML entities", () => {
expect(normalizeResolvedFilename("Show.S01E01.German.DL.720p.part01.rar")).toBe("Show.S01E01.German.DL.720p.part01.rar");
expect(normalizeResolvedFilename("File&amp;Name.part1.rar")).toBe("File&Name.part1.rar");
expect(normalizeResolvedFilename("File&quot;Name&quot;.part1.rar")).toBe('File"Name".part1.rar');
});
it("strips HTML tags and collapses whitespace", () => {
// Tags are replaced by spaces, then multiple spaces collapsed
const result = normalizeResolvedFilename("<b>Show.S01E01</b>.part01.rar");
expect(result).toBe("Show.S01E01 .part01.rar");
// Entity decoding happens before tag removal, so &lt;...&gt; becomes <...> then gets stripped
const entityTagResult = normalizeResolvedFilename("File&lt;Tag&gt;.part1.rar");
expect(entityTagResult).toBe("File .part1.rar");
});
it("strips 'download file' prefix", () => {
expect(normalizeResolvedFilename("Download file Show.S01E01.part01.rar")).toBe("Show.S01E01.part01.rar");
expect(normalizeResolvedFilename("download file Movie.2024.mkv")).toBe("Movie.2024.mkv");
});
it("strips Rapidgator suffix", () => {
expect(normalizeResolvedFilename("Show.S01E01.part01.rar - Rapidgator")).toBe("Show.S01E01.part01.rar");
expect(normalizeResolvedFilename("Movie.mkv | Rapidgator.net")).toBe("Movie.mkv");
});
it("returns empty for opaque or non-filename values", () => {
expect(normalizeResolvedFilename("")).toBe("");
expect(normalizeResolvedFilename("just some text")).toBe("");
expect(normalizeResolvedFilename("e51f6809bb6ca615601f5ac5db433737")).toBe("");
expect(normalizeResolvedFilename("download.bin")).toBe("");
});
it("handles combined transforms", () => {
// "Download file" prefix stripped, &amp; decoded to &, "- Rapidgator" suffix stripped
expect(normalizeResolvedFilename("Download file Show.S01E01.part01.rar - Rapidgator"))
.toBe("Show.S01E01.part01.rar");
});
});
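The expectations above pin down the observable behavior of `normalizeResolvedFilename`: decode entities, strip tags into spaces, drop the "Download file" prefix and Rapidgator suffixes, and reject opaque values. A minimal sketch that satisfies exactly these cases — the regexes and the reject rules are assumptions, the real implementation may differ:

```typescript
// Hypothetical sketch matching the test expectations above, in order:
// decode entities, strip tags to spaces, collapse whitespace,
// strip prefix/suffix, then reject non-filename values.
function normalizeResolvedFilenameSketch(raw: string): string {
  let text = raw
    .replace(/&amp;/g, "&")
    .replace(/&quot;/g, '"')
    .replace(/&lt;/g, "<")
    .replace(/&gt;/g, ">");
  text = text.replace(/<[^>]*>/g, " ").replace(/\s+/g, " ").trim();
  text = text.replace(/^download file\s+/i, "");
  text = text.replace(/\s*[-|]\s*Rapidgator(\.net)?\s*$/i, "");
  if (!/\.[A-Za-z0-9]{2,4}$/.test(text)) return "";       // no extension → opaque
  if (/^[0-9a-f]{32}$/i.test(text)) return "";            // bare hash id
  if (/^download\.(bin|zip|rar)$/i.test(text)) return ""; // generic placeholder
  return text;
}
```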
describe("filenameFromRapidgatorUrlPath", () => {
it("extracts filename from standard rapidgator URL", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/abc123/Show.S01E01.part01.rar.html"))
.toBe("Show.S01E01.part01.rar");
});
it("extracts filename without .html suffix", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/abc123/Movie.2024.mkv"))
.toBe("Movie.2024.mkv");
});
it("returns empty for hash-only URL paths", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/e51f6809bb6ca615601f5ac5db433737"))
.toBe("");
});
it("returns empty for invalid URLs", () => {
expect(filenameFromRapidgatorUrlPath("not-a-url")).toBe("");
expect(filenameFromRapidgatorUrlPath("")).toBe("");
});
it("handles URL-encoded path segments", () => {
expect(filenameFromRapidgatorUrlPath("https://rapidgator.net/file/id/Show%20Name.S01E01.part01.rar.html"))
.toBe("Show Name.S01E01.part01.rar");
});
});
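The URL-path cases above can be covered by taking the last path segment, decoding it, stripping a trailing `.html`, and rejecting bare hash ids. A sketch under those assumptions (name and heuristics hypothetical):

```typescript
// Hypothetical sketch: last path segment, URL-decoded, minus ".html",
// rejected when it is a 32-hex hash or carries no extension.
function filenameFromRapidgatorUrlPathSketch(link: string): string {
  let pathname: string;
  try {
    pathname = new URL(link).pathname;
  } catch {
    return ""; // "not-a-url" and "" land here
  }
  const segments = pathname.split("/").filter(Boolean);
  const last = decodeURIComponent(segments[segments.length - 1] ?? "");
  const name = last.replace(/\.html?$/i, "");
  if (/^[0-9a-f]{32}$/i.test(name)) return "";
  if (!name.includes(".")) return "";
  return name;
}
```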
describe("extractRapidgatorFilenameFromHtml", () => {
it("extracts filename from title tag", () => {
const html = "<html><head><title>Download file Show.S01E01.German.DL.720p.part01.rar - Rapidgator</title></head></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Show.S01E01.German.DL.720p.part01.rar");
});
it("extracts filename from og:title meta tag", () => {
const html = '<html><head><meta property="og:title" content="Movie.2024.German.DL.1080p.mkv"></head></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Movie.2024.German.DL.1080p.mkv");
});
it("extracts filename from reversed og:title attribute order", () => {
const html = '<html><head><meta content="Movie.2024.German.DL.1080p.mkv" property="og:title"></head></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Movie.2024.German.DL.1080p.mkv");
});
it("returns empty for HTML without recognizable filenames", () => {
const html = "<html><head><title>Rapidgator: Fast, Fair and Unlimited</title></head><body>No file here</body></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("");
});
it("returns empty for empty HTML", () => {
expect(extractRapidgatorFilenameFromHtml("")).toBe("");
});
it("ignores broad body text that is not a labeled filename", () => {
const html = "<html><body>Please download file now from mirror.mkv</body></html>";
expect(extractRapidgatorFilenameFromHtml(html)).toBe("");
});
it("extracts from File name label in page body", () => {
const html = '<html><body>File name: <b>Show.S02E03.720p.part01.rar</b></body></html>';
expect(extractRapidgatorFilenameFromHtml(html)).toBe("Show.S02E03.720p.part01.rar");
});
});
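The HTML-extraction tests above imply a lookup order: `<title>` first, then `og:title` in either attribute order, then a labeled "File name:" value, while broad body text is ignored. A sketch of just the candidate collection (each candidate would still have to pass filename normalization before being accepted; function name and regexes are assumptions):

```typescript
// Hypothetical candidate collector mirroring the lookup order the tests imply.
function extractTitleCandidates(html: string): string[] {
  const out: string[] = [];
  const title = /<title>([^<]*)<\/title>/i.exec(html);
  if (title) out.push(title[1]);
  // og:title may appear with property before content or the reverse.
  const og =
    /<meta[^>]*property=["']og:title["'][^>]*content=["']([^"']*)["']/i.exec(html) ??
    /<meta[^>]*content=["']([^"']*)["'][^>]*property=["']og:title["']/i.exec(html);
  if (og) out.push(og[1]);
  const label = /File name:\s*(?:<[^>]+>)?\s*([^<\n]+)/i.exec(html);
  if (label) out.push(label[1].trim());
  return out;
}
```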

File diff suppressed because it is too large


@@ -1,204 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { spawnSync } from "node:child_process";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalBackend = process.env.RD_EXTRACT_BACKEND;
function hasJavaRuntime(): boolean {
const result = spawnSync("java", ["-version"], { stdio: "ignore" });
return result.status === 0;
}
function hasJvmExtractorRuntime(): boolean {
const root = path.join(process.cwd(), "resources", "extractor-jvm");
const classesMain = path.join(root, "classes", "com", "sucukdeluxe", "extractor", "JBindExtractorMain.class");
const requiredLibs = [
path.join(root, "lib", "sevenzipjbinding.jar"),
path.join(root, "lib", "sevenzipjbinding-all-platforms.jar"),
path.join(root, "lib", "zip4j.jar")
];
return fs.existsSync(classesMain) && requiredLibs.every((libPath) => fs.existsSync(libPath));
}
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalBackend;
}
});
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zipPath = path.join(packageDir, "release.zip");
const zip = new AdmZip();
zip.addFile("episode.txt", Buffer.from("ok"));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("emits progress callbacks with archiveName and percent", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create a ZIP with some content to trigger progress
const zipPath = path.join(packageDir, "progress-test.zip");
const zip = new AdmZip();
zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
zip.writeZip(zipPath);
const progressUpdates: Array<{
archiveName: string;
percent: number;
phase: string;
archivePercent?: number;
}> = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
progressUpdates.push({
archiveName: update.archiveName,
percent: update.percent,
phase: update.phase,
archivePercent: update.archivePercent,
});
},
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
// Should have at least preparing, extracting, and done phases
const phases = new Set(progressUpdates.map((u) => u.phase));
expect(phases.has("preparing")).toBe(true);
expect(phases.has("extracting")).toBe(true);
// Extracting phase should include the archive name
const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
expect(extracting.length).toBeGreaterThan(0);
// Should end at 100%
const lastExtracting = extracting[extracting.length - 1];
expect(lastExtracting.archivePercent).toBe(100);
// Files should exist
expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
});
it("extracts multiple archives sequentially with progress for each", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create two separate ZIP archives
const zip1 = new AdmZip();
zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
zip1.writeZip(path.join(packageDir, "archive1.zip"));
const zip2 = new AdmZip();
zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
zip2.writeZip(path.join(packageDir, "archive2.zip"));
const archiveNames = new Set<string>();
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
if (update.phase === "extracting" && update.archiveName) {
archiveNames.add(update.archiveName);
}
},
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
// Both archive names should have appeared in progress
expect(archiveNames.has("archive1.zip")).toBe(true);
expect(archiveNames.has("archive2.zip")).toBe(true);
// Both files extracted
expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zipPath = path.join(packageDir, "conflict.zip");
const zip = new AdmZip();
zip.addFile("same.txt", Buffer.from("new"));
zip.writeZip(zipPath);
const existingPath = path.join(targetDir, "same.txt");
fs.writeFileSync(existingPath, "old", "utf8");
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "ask",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.readFileSync(existingPath, "utf8")).toBe("old");
});
});

File diff suppressed because it is too large


@@ -1,84 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { parseHashLine, readHashManifest, validateFileAgainstManifest } from "../src/main/integrity";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("integrity", () => {
it("parses md5 and sfv lines", () => {
const md = parseHashLine("d41d8cd98f00b204e9800998ecf8427e sample.bin");
expect(md?.algorithm).toBe("md5");
const sfv = parseHashLine("sample.bin 1A2B3C4D");
expect(sfv?.algorithm).toBe("crc32");
});
it("validates file against md5 manifest", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
const filePath = path.join(dir, "movie.bin");
fs.writeFileSync(filePath, Buffer.from("hello"));
fs.writeFileSync(path.join(dir, "hash.md5"), "5d41402abc4b2a76b9719d911017c592 movie.bin\n");
const result = await validateFileAgainstManifest(filePath, dir);
expect(result.ok).toBe(true);
});
it("skips manifest files larger than 5MB", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
// Create a .md5 manifest that exceeds the 5MB limit
const largeContent = "d41d8cd98f00b204e9800998ecf8427e sample.bin\n".repeat(200000);
const manifestPath = path.join(dir, "hashes.md5");
fs.writeFileSync(manifestPath, largeContent, "utf8");
// Verify the file is actually > 5MB
const stat = fs.statSync(manifestPath);
expect(stat.size).toBeGreaterThan(5 * 1024 * 1024);
// readHashManifest should skip the oversized file
const manifest = readHashManifest(dir);
expect(manifest.size).toBe(0);
});
it("does not parse SHA256 (64-char hex) as valid hash", () => {
// SHA256 is 64 chars - parseHashLine only supports 32 (MD5) and 40 (SHA1)
const sha256Line = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 emptyfile.bin";
const result = parseHashLine(sha256Line);
// 64-char hex should not match the MD5 (32) or SHA1 (40) pattern
expect(result).toBeNull();
});
it("parses SHA1 hash lines correctly", () => {
const sha1Line = "da39a3ee5e6b4b0d3255bfef95601890afd80709 emptyfile.bin";
const result = parseHashLine(sha1Line);
expect(result).not.toBeNull();
expect(result?.algorithm).toBe("sha1");
expect(result?.digest).toBe("da39a3ee5e6b4b0d3255bfef95601890afd80709");
expect(result?.fileName).toBe("emptyfile.bin");
});
it("ignores comment lines in hash manifests", () => {
expect(parseHashLine("; This is a comment")).toBeNull();
expect(parseHashLine("")).toBeNull();
expect(parseHashLine(" ")).toBeNull();
});
it("keeps first hash entry when duplicate filename appears across manifests", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-int-"));
tempDirs.push(dir);
fs.writeFileSync(path.join(dir, "disc1.md5"), "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa movie.mkv\n", "utf8");
fs.writeFileSync(path.join(dir, "disc2.md5"), "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb movie.mkv\n", "utf8");
const manifest = readHashManifest(dir);
expect(manifest.get("movie.mkv")?.digest).toBe("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa");
});
});


@@ -1,74 +0,0 @@
import { describe, expect, it } from "vitest";
import { mergePackageInputs, parseCollectorInput } from "../src/main/link-parser";
describe("link-parser", () => {
describe("mergePackageInputs", () => {
it("merges packages with the same name and preserves order", () => {
const input = [
{ name: "Package A", links: ["http://link1", "http://link2"] },
{ name: "Package B", links: ["http://link3"] },
{ name: "Package A", links: ["http://link4", "http://link1"] },
{ name: "", links: ["http://link5"] } // empty name will be inferred
];
const result = mergePackageInputs(input);
expect(result).toHaveLength(3); // Package A, Package B, and inferred 'Paket'
const pkgA = result.find(p => p.name === "Package A");
expect(pkgA?.links).toEqual(["http://link1", "http://link2", "http://link4"]); // link1 deduplicated
const pkgB = result.find(p => p.name === "Package B");
expect(pkgB?.links).toEqual(["http://link3"]);
});
it("sanitizes names during merge", () => {
const input = [
{ name: "Valid_Name", links: ["http://link1"] },
{ name: "Valid?Name*", links: ["http://link2"] }
];
const result = mergePackageInputs(input);
// "Valid?Name*" becomes "Valid Name " -> trimmed to "Valid Name"
expect(result.map(p => p.name).sort()).toEqual(["Valid Name", "Valid_Name"]);
});
});
describe("parseCollectorInput", () => {
it("returns empty array for empty or invalid input", () => {
expect(parseCollectorInput("")).toEqual([]);
expect(parseCollectorInput("just some text without links")).toEqual([]);
expect(parseCollectorInput("ftp://notsupported")).toEqual([]);
});
it("parses and merges links from raw text", () => {
const rawText = `
Here are some links:
http://example.com/part1.rar
http://example.com/part2.rar
# package: Custom_Name
http://other.com/file1
http://other.com/file2
`;
const result = parseCollectorInput(rawText, "DefaultFallback");
// Should have 2 packages: "DefaultFallback" and "Custom_Name"
expect(result).toHaveLength(2);
const defaultPkg = result.find(p => p.name === "DefaultFallback");
expect(defaultPkg?.links).toEqual([
"http://example.com/part1.rar",
"http://example.com/part2.rar"
]);
    const customPkg = result.find(p => p.name === "Custom_Name"); // passes sanitization unchanged
expect(customPkg?.links).toEqual([
"http://other.com/file1",
"http://other.com/file2"
]);
});
});
});


@@ -1,178 +0,0 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { MegaWebFallback } from "../src/main/mega-web-fallback";
const originalFetch = globalThis.fetch;
describe("mega-web-fallback", () => {
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("MegaWebFallback class", () => {
it("returns null when credentials are empty", async () => {
const fallback = new MegaWebFallback(() => ({ login: "", password: "" }));
const result = await fallback.unrestrict("https://mega.debrid/test");
expect(result).toBeNull();
});
it("logs in, fetches HTML, parses code, and polls AJAX for direct url", async () => {
let fetchCallCount = 0;
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
fetchCallCount += 1;
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
return new Response('<form id="debridForm"></form>', { status: 200 });
}
if (urlStr.includes("form=debrid")) {
// The POST to generate the code
return new Response(`
<div class="acp-box">
<h3>Link: https://mega.debrid/link1</h3>
<a href="javascript:processDebrid(1,'secretcode123',0)">Download</a>
</div>
`, { status: 200 });
}
if (urlStr.includes("ajax=debrid")) {
// Polling endpoint
return new Response(JSON.stringify({ link: "https://mega.direct/123" }), { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));
const result = await fallback.unrestrict("https://mega.debrid/link1");
expect(result).not.toBeNull();
expect(result?.directUrl).toBe("https://mega.direct/123");
expect(result?.fileName).toBe("link1");
// Calls: 1. Login POST, 2. Verify GET, 3. Generate POST, 4. Polling POST
expect(fetchCallCount).toBe(4);
});
it("throws if login fails to set cookie", async () => {
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers(); // No cookie
return new Response("", { headers, status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "bad", password: "bad" }));
await expect(fallback.unrestrict("http://mega.debrid/file"))
.rejects.toThrow("Mega-Web Login liefert kein Session-Cookie");
});
it("throws if login verify check fails (no form found)", async () => {
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
// Missing form!
return new Response('<html><body>Nothing here</body></html>', { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));
await expect(fallback.unrestrict("http://mega.debrid/file"))
.rejects.toThrow("Mega-Web Login ungültig oder Session blockiert");
});
it("returns null if generation fails to find a code", async () => {
let callCount = 0;
globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
const urlStr = String(url);
callCount++;
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return new Response("", { headers, status: 200 });
}
if (urlStr.includes("page=debrideur")) {
return new Response('<form id="debridForm"></form>', { status: 200 });
}
if (urlStr.includes("form=debrid")) {
// The generate POST returns HTML without any codes
return new Response(`<div>No links here</div>`, { status: 200 });
}
return new Response("Not found", { status: 404 });
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));
const result = await fallback.unrestrict("http://mega.debrid/file");
// Generation fails -> resets cookie -> tries again -> fails again -> returns null
expect(result).toBeNull();
});
it("aborts pending Mega-Web polling when signal is cancelled", async () => {
globalThis.fetch = vi.fn((url: string | URL | Request, init?: RequestInit): Promise<Response> => {
const urlStr = String(url);
if (urlStr.includes("form=login")) {
const headers = new Headers();
headers.append("set-cookie", "session=goodcookie; path=/");
return Promise.resolve(new Response("", { headers, status: 200 }));
}
if (urlStr.includes("page=debrideur")) {
return Promise.resolve(new Response('<form id="debridForm"></form>', { status: 200 }));
}
if (urlStr.includes("form=debrid")) {
return Promise.resolve(new Response(`
<div class="acp-box">
<h3>Link: https://mega.debrid/link2</h3>
<a href="javascript:processDebrid(1,'secretcode456',0)">Download</a>
</div>
`, { status: 200 }));
}
if (urlStr.includes("ajax=debrid")) {
return new Promise<Response>((_resolve, reject) => {
const signal = init?.signal;
const onAbort = (): void => reject(new Error("aborted:ajax"));
if (signal?.aborted) {
onAbort();
return;
}
signal?.addEventListener("abort", onAbort, { once: true });
});
}
return Promise.resolve(new Response("Not found", { status: 404 }));
}) as unknown as typeof fetch;
const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));
const controller = new AbortController();
const timer = setTimeout(() => {
controller.abort("test");
}, 200);
try {
await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);
} finally {
clearTimeout(timer);
}
});
});
});


@@ -1,42 +0,0 @@
import { afterEach, describe, expect, it } from "vitest";
import { RealDebridClient } from "../src/main/realdebrid";
const originalFetch = globalThis.fetch;
afterEach(() => {
globalThis.fetch = originalFetch;
});
describe("realdebrid client", () => {
it("returns a clear error when HTML is returned instead of JSON", async () => {
globalThis.fetch = (async (): Promise<Response> => {
return new Response("<html><title>Cloudflare</title></html>", {
status: 200,
headers: { "Content-Type": "text/html" }
});
}) as typeof fetch;
const client = new RealDebridClient("rd-token");
await expect(client.unrestrictLink("https://hoster.example/file/html")).rejects.toThrow(/html/i);
});
it("does not leak raw response body on JSON parse errors", async () => {
globalThis.fetch = (async (): Promise<Response> => {
return new Response("<html>token=secret-should-not-leak</html>", {
status: 200,
headers: { "Content-Type": "application/json" }
});
}) as typeof fetch;
const client = new RealDebridClient("rd-token");
try {
await client.unrestrictLink("https://hoster.example/file/invalid-json");
throw new Error("expected unrestrict to fail");
} catch (error) {
const text = String(error || "");
expect(text.toLowerCase()).toContain("json");
expect(text.toLowerCase()).not.toContain("secret-should-not-leak");
expect(text.toLowerCase()).not.toContain("<html>");
}
});
});


@@ -1,188 +0,0 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";
type MinimalItem = {
targetPath?: string;
fileName?: string;
[key: string]: unknown;
};
function makeItems(names: string[]): MinimalItem[] {
return names.map((name) => ({
targetPath: `C:\\Downloads\\Package\\${name}`,
fileName: name,
id: name,
status: "completed",
}));
}
describe("resolveArchiveItemsFromList", () => {
// ── Multipart RAR (.partN.rar) ──
it("matches multipart .part1.rar archives", () => {
const items = makeItems([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
"Other.rar",
]);
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(3);
expect(result.map((i: any) => i.fileName)).toEqual([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
]);
});
it("matches multipart .part01.rar archives (zero-padded)", () => {
const items = makeItems([
"Film.part01.rar",
"Film.part02.rar",
"Film.part10.rar",
"Unrelated.zip",
]);
const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
expect(result).toHaveLength(3);
});
// ── Old-style RAR (.rar + .r00, .r01, etc.) ──
it("matches old-style .rar + .rNN volumes", () => {
const items = makeItems([
"Archive.rar",
"Archive.r00",
"Archive.r01",
"Archive.r02",
"Other.zip",
]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(4);
});
// ── Single RAR ──
it("matches a single .rar file", () => {
const items = makeItems(["SingleFile.rar", "Other.mkv"]);
const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("SingleFile.rar");
});
// ── Split ZIP ──
it("matches split .zip.NNN files", () => {
const items = makeItems([
"Data.zip",
"Data.zip.001",
"Data.zip.002",
"Data.zip.003",
]);
const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
expect(result).toHaveLength(4);
});
// ── Split 7z ──
it("matches split .7z.NNN files", () => {
const items = makeItems([
"Backup.7z.001",
"Backup.7z.002",
]);
const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
expect(result).toHaveLength(2);
});
// ── Generic .NNN splits ──
it("matches generic .NNN split files", () => {
const items = makeItems([
"video.001",
"video.002",
"video.003",
]);
const result = resolveArchiveItemsFromList("video.001", items as any);
expect(result).toHaveLength(3);
});
// ── Exact filename match ──
it("matches a single .zip by exact name", () => {
const items = makeItems(["myarchive.zip", "other.rar"]);
const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("myarchive.zip");
});
// ── Case insensitivity ──
it("matches case-insensitively", () => {
const items = makeItems([
"MOVIE.PART1.RAR",
"MOVIE.PART2.RAR",
]);
const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Stem-based fallback ──
it("uses stem-based fallback when exact patterns fail", () => {
    // Simulate a debrid service that renamed the volume: the archive on disk is
    // "Movie.part1.rar", but the item list only contains the bare "Movie.rar"
const items = makeItems([
"Movie.rar",
]);
// The archive on disk is "Movie.part1.rar" but there's no item matching the
// .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
// stem fallback: "movie" starts with "movie" and ends with .rar
expect(result).toHaveLength(1);
});
// ── Single item fallback ──
it("returns single archive item when no pattern matches", () => {
const items = makeItems(["totally-different-name.rar"]);
const result = resolveArchiveItemsFromList("Original.rar", items as any);
// Single item in list with archive extension → return it
expect(result).toHaveLength(1);
});
// ── Empty when no match ──
it("returns empty when items have no archive extensions", () => {
const items = makeItems(["video.mkv", "subtitle.srt"]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(0);
});
// ── Items without targetPath ──
it("falls back to fileName when targetPath is missing", () => {
const items = [
{ fileName: "Movie.part1.rar", id: "1", status: "completed" },
{ fileName: "Movie.part2.rar", id: "2", status: "completed" },
];
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Multiple archives, should not cross-match ──
it("does not cross-match different archive groups", () => {
const items = makeItems([
"Episode.S01E01.part1.rar",
"Episode.S01E01.part2.rar",
"Episode.S01E02.part1.rar",
"Episode.S01E02.part2.rar",
]);
const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
expect(result1).toHaveLength(2);
expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);
const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
expect(result2).toHaveLength(2);
expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
});
});


@@ -1,208 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import http from "node:http";
import { once } from "node:events";
import { DownloadManager } from "../src/main/download-manager";
import { defaultSettings } from "../src/main/constants";
import { createStoragePaths, emptySession } from "../src/main/storage";
function assert(condition: unknown, message: string): void {
if (!condition) {
throw new Error(`Self-check fehlgeschlagen: ${message}`);
}
}
async function waitFor(predicate: () => boolean, timeoutMs = 20000): Promise<void> {
const start = Date.now();
while (!predicate()) {
if (Date.now() - start > timeoutMs) {
throw new Error("Timeout während Self-check");
}
await new Promise((resolve) => setTimeout(resolve, 100));
}
}
async function runDownloadCase(baseDir: string, baseUrl: string, url: string, options?: Partial<ReturnType<typeof defaultSettings>>): Promise<DownloadManager> {
const settings = {
...defaultSettings(),
token: "demo-token",
outputDir: path.join(baseDir, "downloads"),
extractDir: path.join(baseDir, "extract"),
autoExtract: false,
autoReconnect: true,
reconnectWaitSeconds: 1,
...options
};
const manager = new DownloadManager(settings, emptySession(), createStoragePaths(path.join(baseDir, "state")));
manager.addPackages([
{
name: "test-package",
links: [url]
}
]);
manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
return manager;
}
async function main(): Promise<void> {
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "rd-node-self-"));
const binary = Buffer.alloc(512 * 1024, 7);
let flakyFailures = 1;
const server = http.createServer((req, res) => {
const url = req.url || "/";
if (url.startsWith("/file.bin") || url.startsWith("/slow.bin") || url.startsWith("/rarcancel.bin") || url.startsWith("/flaky.bin")) {
if (url.startsWith("/flaky.bin") && flakyFailures > 0) {
flakyFailures -= 1;
res.statusCode = 503;
res.end("retry");
return;
}
const range = req.headers.range;
let start = 0;
if (range) {
const match = String(range).match(/bytes=(\d+)-/i);
if (match) {
start = Number(match[1]);
}
}
const chunk = binary.subarray(start);
if (start > 0) {
res.statusCode = 206;
res.setHeader("Content-Range", `bytes ${start}-${binary.length - 1}/${binary.length}`);
}
res.setHeader("Accept-Ranges", "bytes");
res.setHeader("Content-Length", chunk.length);
res.statusCode = res.statusCode || 200;
if (url.startsWith("/slow.bin") || url.startsWith("/rarcancel.bin")) {
const mid = Math.floor(chunk.length / 2);
res.write(chunk.subarray(0, mid));
setTimeout(() => {
res.end(chunk.subarray(mid));
}, 400);
return;
}
res.end(chunk);
return;
}
res.statusCode = 404;
res.end("not-found");
});
server.listen(0, "127.0.0.1");
await once(server, "listening");
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Server konnte nicht gestartet werden");
}
const baseUrl = `http://127.0.0.1:${address.port}`;
const originalFetch = globalThis.fetch;
globalThis.fetch = async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("/unrestrict/link")) {
const body = init?.body;
const params = body instanceof URLSearchParams ? body : new URLSearchParams(String(body || ""));
const link = params.get("link") || "";
const filename = link.includes("rarcancel") ? "release.part1.rar" : "file.bin";
const direct = link.includes("slow")
? `${baseUrl}/slow.bin`
: link.includes("rarcancel")
? `${baseUrl}/rarcancel.bin`
: link.includes("flaky")
? `${baseUrl}/flaky.bin`
: `${baseUrl}/file.bin`;
return new Response(
JSON.stringify({
download: direct,
filename,
filesize: binary.length
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
);
}
return originalFetch(input, init);
};
try {
const manager1 = await runDownloadCase(tempRoot, baseUrl, "https://dummy/file");
const snapshot1 = manager1.getSnapshot();
const item1 = Object.values(snapshot1.session.items)[0];
assert(item1?.status === "completed", "normaler Download wurde nicht abgeschlossen");
assert(fs.existsSync(item1.targetPath), "Datei fehlt nach Download");
const manager2 = new DownloadManager(
{
...defaultSettings(),
token: "demo-token",
outputDir: path.join(tempRoot, "downloads-pause"),
extractDir: path.join(tempRoot, "extract-pause"),
autoExtract: false,
autoReconnect: false
},
emptySession(),
createStoragePaths(path.join(tempRoot, "state-pause"))
);
manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
await manager2.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const paused = manager2.togglePause();
assert(paused, "Pause konnte nicht aktiviert werden");
await new Promise((resolve) => setTimeout(resolve, 150));
manager2.togglePause();
await waitFor(() => !manager2.getSnapshot().session.running, 30000);
const item2 = Object.values(manager2.getSnapshot().session.items)[0];
assert(item2?.status === "completed", "Pause/Resume Download nicht abgeschlossen");
const manager3 = await runDownloadCase(tempRoot, baseUrl, "https://dummy/flaky", { autoReconnect: true, reconnectWaitSeconds: 1 });
const item3 = Object.values(manager3.getSnapshot().session.items)[0];
assert(item3?.status === "completed", "Reconnect-Fall nicht abgeschlossen");
const manager4 = new DownloadManager(
{
...defaultSettings(),
token: "demo-token",
outputDir: path.join(tempRoot, "downloads-cancel"),
extractDir: path.join(tempRoot, "extract-cancel"),
autoExtract: false
},
emptySession(),
createStoragePaths(path.join(tempRoot, "state-cancel"))
);
manager4.addPackages([{ name: "cancel", links: ["https://dummy/rarcancel"] }]);
manager4.start();
await new Promise((resolve) => setTimeout(resolve, 150));
const pkgId = manager4.getSnapshot().session.packageOrder[0];
manager4.cancelPackage(pkgId);
await waitFor(() => !manager4.getSnapshot().session.running || Object.values(manager4.getSnapshot().session.items).every((item) => item.status !== "downloading"), 15000);
const cancelSnapshot = manager4.getSnapshot();
const remainingItems = Object.values(cancelSnapshot.session.items);
if (remainingItems.length === 0) {
assert(cancelSnapshot.session.packageOrder.length === 0, "Abgebrochenes Paket wurde nicht entfernt");
} else {
const cancelItem = remainingItems[0];
assert(cancelItem?.status === "cancelled" || cancelItem?.status === "queued", "Paketabbruch nicht wirksam");
}
const packageDir = path.join(path.join(tempRoot, "downloads-cancel"), "cancel");
const cancelArtifact = path.join(packageDir, "release.part1.rar");
await waitFor(() => !fs.existsSync(cancelArtifact), 10000);
assert(!fs.existsSync(cancelArtifact), "RAR-Artefakt wurde nicht gelöscht");
console.log("Node self-check erfolgreich");
} finally {
globalThis.fetch = originalFetch;
server.close();
fs.rmSync(tempRoot, { recursive: true, force: true });
}
}
void main();


@@ -1,163 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "../src/main/session-log";
import { setLogListener } from "../src/main/logger";
const tempDirs: string[] = [];
afterEach(() => {
// Ensure session log is shut down between tests
shutdownSessionLog();
// Ensure listener is cleared between tests
setLogListener(null);
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("session-log", () => {
it("initSessionLog creates directory and file", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath();
expect(logPath).not.toBeNull();
expect(fs.existsSync(logPath!)).toBe(true);
expect(fs.existsSync(path.join(baseDir, "session-logs"))).toBe(true);
expect(path.basename(logPath!)).toMatch(/^session_\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}\.txt$/);
const content = fs.readFileSync(logPath!, "utf8");
expect(content).toContain("=== Session gestartet:");
shutdownSessionLog();
});
it("logger listener writes to session log", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
// Simulate a log line via the listener
const { logger } = await import("../src/main/logger");
logger.info("Test-Nachricht für Session-Log");
// Wait for flush (200ms interval + margin)
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("Test-Nachricht für Session-Log");
shutdownSessionLog();
});
it("shutdownSessionLog writes closing line", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("=== Session beendet:");
});
it("shutdownSessionLog removes listener", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
// Log after shutdown - should NOT appear in session log
const { logger } = await import("../src/main/logger");
logger.info("Nach-Shutdown-Nachricht");
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).not.toContain("Nach-Shutdown-Nachricht");
});
it("cleanupOldSessionLogs deletes old files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a fake old session log
const oldFile = path.join(logsDir, "session_2020-01-01_00-00-00.txt");
fs.writeFileSync(oldFile, "old session");
// Set mtime to 30 days ago
const oldTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
fs.utimesSync(oldFile, oldTime, oldTime);
// Create a recent file
const newFile = path.join(logsDir, "session_2099-01-01_00-00-00.txt");
fs.writeFileSync(newFile, "new session");
// initSessionLog triggers cleanup
initSessionLog(baseDir);
// Wait for async cleanup
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(oldFile)).toBe(false);
expect(fs.existsSync(newFile)).toBe(true);
shutdownSessionLog();
});
it("cleanupOldSessionLogs keeps recent files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a file from 2 days ago (should be kept)
const recentFile = path.join(logsDir, "session_2025-12-01_00-00-00.txt");
fs.writeFileSync(recentFile, "recent session");
const recentTime = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000);
fs.utimesSync(recentFile, recentTime, recentTime);
initSessionLog(baseDir);
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(recentFile)).toBe(true);
shutdownSessionLog();
});
it("multiple sessions create different files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const path1 = getSessionLogPath();
shutdownSessionLog();
// Small delay to ensure different timestamp
await new Promise((resolve) => setTimeout(resolve, 1100));
initSessionLog(baseDir);
const path2 = getSessionLogPath();
shutdownSessionLog();
expect(path1).not.toBeNull();
expect(path2).not.toBeNull();
expect(path1).not.toBe(path2);
expect(fs.existsSync(path1!)).toBe(true);
expect(fs.existsSync(path2!)).toBe(true);
});
});


@@ -1,513 +0,0 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { AppSettings } from "../src/shared/types";
import { defaultSettings } from "../src/main/constants";
import { createStoragePaths, emptySession, loadSession, loadSettings, normalizeSettings, saveSession, saveSessionAsync, saveSettings } from "../src/main/storage";
const tempDirs: string[] = [];
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("settings storage", () => {
it("does not persist provider credentials when rememberToken is disabled", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
saveSettings(paths, {
...defaultSettings(),
rememberToken: false,
token: "rd-token",
megaLogin: "mega-user",
megaPassword: "mega-pass",
bestToken: "best-token",
allDebridToken: "all-token"
});
const raw = JSON.parse(fs.readFileSync(paths.configFile, "utf8")) as Record<string, unknown>;
expect(raw.token).toBe("");
expect(raw.megaLogin).toBe("");
expect(raw.megaPassword).toBe("");
expect(raw.bestToken).toBe("");
expect(raw.allDebridToken).toBe("");
const loaded = loadSettings(paths);
expect(loaded.rememberToken).toBe(false);
expect(loaded.token).toBe("");
expect(loaded.megaLogin).toBe("");
expect(loaded.megaPassword).toBe("");
expect(loaded.bestToken).toBe("");
expect(loaded.allDebridToken).toBe("");
});
it("persists provider credentials when rememberToken is enabled", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
saveSettings(paths, {
...defaultSettings(),
rememberToken: true,
token: "rd-token",
megaLogin: "mega-user",
megaPassword: "mega-pass",
bestToken: "best-token",
allDebridToken: "all-token"
});
const loaded = loadSettings(paths);
expect(loaded.token).toBe("rd-token");
expect(loaded.megaLogin).toBe("mega-user");
expect(loaded.megaPassword).toBe("mega-pass");
expect(loaded.bestToken).toBe("best-token");
expect(loaded.allDebridToken).toBe("all-token");
});
it("normalizes invalid enum and numeric values", () => {
const normalized = normalizeSettings({
...defaultSettings(),
providerPrimary: "invalid-provider" as unknown as AppSettings["providerPrimary"],
providerSecondary: "invalid-provider" as unknown as AppSettings["providerSecondary"],
providerTertiary: "invalid-provider" as unknown as AppSettings["providerTertiary"],
cleanupMode: "broken" as unknown as AppSettings["cleanupMode"],
extractConflictMode: "broken" as unknown as AppSettings["extractConflictMode"],
completedCleanupPolicy: "broken" as unknown as AppSettings["completedCleanupPolicy"],
speedLimitMode: "broken" as unknown as AppSettings["speedLimitMode"],
maxParallel: 0,
retryLimit: 999,
reconnectWaitSeconds: 9999,
speedLimitKbps: -1,
outputDir: " ",
extractDir: " ",
mkvLibraryDir: " ",
updateRepo: " "
});
expect(normalized.providerPrimary).toBe("realdebrid");
expect(normalized.providerSecondary).toBe("none");
expect(normalized.providerTertiary).toBe("none");
expect(normalized.cleanupMode).toBe("none");
expect(normalized.extractConflictMode).toBe("overwrite");
expect(normalized.completedCleanupPolicy).toBe("never");
expect(normalized.speedLimitMode).toBe("global");
expect(normalized.maxParallel).toBe(1);
expect(normalized.retryLimit).toBe(99);
expect(normalized.reconnectWaitSeconds).toBe(600);
expect(normalized.speedLimitKbps).toBe(0);
expect(normalized.outputDir).toBe(defaultSettings().outputDir);
expect(normalized.extractDir).toBe(defaultSettings().extractDir);
expect(normalized.mkvLibraryDir).toBe(defaultSettings().mkvLibraryDir);
expect(normalized.updateRepo).toBe(defaultSettings().updateRepo);
});
it("normalizes malformed persisted config on load", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(
paths.configFile,
JSON.stringify({
providerPrimary: "not-valid",
completedCleanupPolicy: "not-valid",
maxParallel: "999",
retryLimit: "-3",
reconnectWaitSeconds: "1",
speedLimitMode: "not-valid",
updateRepo: ""
}),
"utf8"
);
const loaded = loadSettings(paths);
expect(loaded.providerPrimary).toBe("realdebrid");
expect(loaded.completedCleanupPolicy).toBe("never");
expect(loaded.maxParallel).toBe(50);
expect(loaded.retryLimit).toBe(0);
expect(loaded.reconnectWaitSeconds).toBe(10);
expect(loaded.speedLimitMode).toBe("global");
expect(loaded.updateRepo).toBe(defaultSettings().updateRepo);
});
it("keeps explicit none as fallback provider choice", () => {
const normalized = normalizeSettings({
...defaultSettings(),
providerSecondary: "none",
providerTertiary: "none"
});
expect(normalized.providerSecondary).toBe("none");
expect(normalized.providerTertiary).toBe("none");
});
it("normalizes archive password list line endings", () => {
const normalized = normalizeSettings({
...defaultSettings(),
archivePasswordList: "one\r\ntwo\r\nthree"
});
expect(normalized.archivePasswordList).toBe("one\ntwo\nthree");
});
it("assigns and preserves bandwidth schedule ids", () => {
const normalized = normalizeSettings({
...defaultSettings(),
bandwidthSchedules: [{ id: "", startHour: 1, endHour: 6, speedLimitKbps: 1024, enabled: true }]
});
const generatedId = normalized.bandwidthSchedules[0]?.id;
expect(typeof generatedId).toBe("string");
expect(generatedId?.length).toBeGreaterThan(0);
const normalizedAgain = normalizeSettings({
...defaultSettings(),
bandwidthSchedules: normalized.bandwidthSchedules
});
expect(normalizedAgain.bandwidthSchedules[0]?.id).toBe(generatedId);
});
it("resets stale active statuses to queued on session load", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const session = emptySession();
session.packages["pkg1"] = {
id: "pkg1",
name: "Test Package",
outputDir: "/tmp/out",
extractDir: "/tmp/extract",
status: "downloading",
itemIds: ["item1", "item2", "item3", "item4"],
cancelled: false,
enabled: true,
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item1"] = {
id: "item1",
packageId: "pkg1",
url: "https://example.com/file1.rar",
provider: null,
status: "downloading",
retries: 0,
speedBps: 1024,
downloadedBytes: 5000,
totalBytes: 10000,
progressPercent: 50,
fileName: "file1.rar",
targetPath: "/tmp/out/file1.rar",
resumable: true,
attempts: 1,
lastError: "some error",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item2"] = {
id: "item2",
packageId: "pkg1",
url: "https://example.com/file2.rar",
provider: null,
status: "paused",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "file2.rar",
targetPath: "/tmp/out/file2.rar",
resumable: false,
attempts: 0,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item3"] = {
id: "item3",
packageId: "pkg1",
url: "https://example.com/file3.rar",
provider: null,
status: "completed",
retries: 0,
speedBps: 0,
downloadedBytes: 10000,
totalBytes: 10000,
progressPercent: 100,
fileName: "file3.rar",
targetPath: "/tmp/out/file3.rar",
resumable: false,
attempts: 1,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
session.items["item4"] = {
id: "item4",
packageId: "pkg1",
url: "https://example.com/file4.rar",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "file4.rar",
targetPath: "/tmp/out/file4.rar",
resumable: false,
attempts: 0,
lastError: "",
fullStatus: "",
createdAt: Date.now(),
updatedAt: Date.now()
};
saveSession(paths, session);
const loaded = loadSession(paths);
// Active statuses (downloading, paused) should be reset to "queued"
expect(loaded.items["item1"].status).toBe("queued");
expect(loaded.items["item2"].status).toBe("queued");
// Speed should be cleared
expect(loaded.items["item1"].speedBps).toBe(0);
// lastError should be cleared for reset items
expect(loaded.items["item1"].lastError).toBe("");
// Completed and queued statuses should be preserved
expect(loaded.items["item3"].status).toBe("completed");
expect(loaded.items["item4"].status).toBe("queued");
// Downloaded bytes should be preserved
expect(loaded.items["item1"].downloadedBytes).toBe(5000);
// Package data should be preserved
expect(loaded.packages["pkg1"].name).toBe("Test Package");
});
it("returns empty session when session file contains invalid JSON", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(paths.sessionFile, "{{{corrupted json!!!", "utf8");
const loaded = loadSession(paths);
const empty = emptySession();
expect(loaded.packages).toEqual(empty.packages);
expect(loaded.items).toEqual(empty.items);
expect(loaded.packageOrder).toEqual(empty.packageOrder);
});
it("loads backup session when primary session is corrupted", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const backupSession = emptySession();
backupSession.packageOrder = ["pkg-backup"];
backupSession.packages["pkg-backup"] = {
id: "pkg-backup",
name: "Backup Package",
outputDir: path.join(dir, "out"),
extractDir: path.join(dir, "extract"),
status: "queued",
itemIds: ["item-backup"],
cancelled: false,
enabled: true,
createdAt: Date.now(),
updatedAt: Date.now()
};
backupSession.items["item-backup"] = {
id: "item-backup",
packageId: "pkg-backup",
url: "https://example.com/backup-file",
provider: null,
status: "queued",
retries: 0,
speedBps: 0,
downloadedBytes: 0,
totalBytes: null,
progressPercent: 0,
fileName: "backup-file.rar",
targetPath: path.join(dir, "out", "backup-file.rar"),
resumable: true,
attempts: 0,
lastError: "",
fullStatus: "Wartet",
createdAt: Date.now(),
updatedAt: Date.now()
};
fs.writeFileSync(`${paths.sessionFile}.bak`, JSON.stringify(backupSession), "utf8");
fs.writeFileSync(paths.sessionFile, "{broken-session-json", "utf8");
const loaded = loadSession(paths);
expect(loaded.packageOrder).toEqual(["pkg-backup"]);
expect(loaded.packages["pkg-backup"]?.name).toBe("Backup Package");
expect(loaded.items["item-backup"]?.fileName).toBe("backup-file.rar");
const restoredPrimary = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { packages?: Record<string, unknown> };
expect(restoredPrimary.packages && "pkg-backup" in restoredPrimary.packages).toBe(true);
});
it("returns defaults when config file contains invalid JSON", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
// Write invalid JSON to the config file
fs.writeFileSync(paths.configFile, "{{{{not valid json!!!}", "utf8");
const loaded = loadSettings(paths);
const defaults = defaultSettings();
expect(loaded.providerPrimary).toBe(defaults.providerPrimary);
expect(loaded.maxParallel).toBe(defaults.maxParallel);
expect(loaded.retryLimit).toBe(defaults.retryLimit);
expect(loaded.outputDir).toBe(defaults.outputDir);
expect(loaded.cleanupMode).toBe(defaults.cleanupMode);
});
it("loads backup config when primary config is corrupted", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const backupSettings = {
...defaultSettings(),
outputDir: path.join(dir, "backup-output"),
packageName: "from-backup"
};
fs.writeFileSync(`${paths.configFile}.bak`, JSON.stringify(backupSettings, null, 2), "utf8");
fs.writeFileSync(paths.configFile, "{broken-json", "utf8");
const loaded = loadSettings(paths);
expect(loaded.outputDir).toBe(backupSettings.outputDir);
expect(loaded.packageName).toBe("from-backup");
});
it("sanitizes malformed persisted session structures", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
fs.writeFileSync(paths.sessionFile, JSON.stringify({
version: "invalid",
packageOrder: [123, "pkg-valid"],
packages: {
"1": "bad-entry",
"pkg-valid": {
id: "pkg-valid",
name: "Valid Package",
outputDir: "C:/tmp/out",
extractDir: "C:/tmp/extract",
status: "downloading",
itemIds: ["item-valid", 123],
cancelled: false,
enabled: true
}
},
items: {
"item-valid": {
id: "item-valid",
packageId: "pkg-valid",
url: "https://example.com/file",
status: "queued",
fileName: "file.bin",
targetPath: "C:/tmp/out/file.bin"
},
"item-bad": "broken"
}
}), "utf8");
const loaded = loadSession(paths);
expect(Object.keys(loaded.packages)).toEqual(["pkg-valid"]);
expect(Object.keys(loaded.items)).toEqual(["item-valid"]);
expect(loaded.packageOrder).toEqual(["pkg-valid"]);
});
it("captures async session save payload before later mutations", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const session = emptySession();
session.summaryText = "before-mutation";
const pending = saveSessionAsync(paths, session);
session.summaryText = "after-mutation";
await pending;
const persisted = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { summaryText: string };
expect(persisted.summaryText).toBe("before-mutation");
});
it("creates session backup before sync and async session overwrites", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
const first = emptySession();
first.summaryText = "first";
saveSession(paths, first);
const second = emptySession();
second.summaryText = "second";
saveSession(paths, second);
const backupAfterSync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
expect(backupAfterSync.summaryText).toBe("first");
const third = emptySession();
third.summaryText = "third";
await saveSessionAsync(paths, third);
const backupAfterAsync = JSON.parse(fs.readFileSync(`${paths.sessionFile}.bak`, "utf8")) as { summaryText?: string };
const primaryAfterAsync = JSON.parse(fs.readFileSync(paths.sessionFile, "utf8")) as { summaryText?: string };
expect(backupAfterAsync.summaryText).toBe("second");
expect(primaryAfterAsync.summaryText).toBe("third");
});
it("applies defaults for missing fields when loading old config", () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-store-"));
tempDirs.push(dir);
const paths = createStoragePaths(dir);
// Write a minimal config that simulates an old version missing newer fields
fs.writeFileSync(
paths.configFile,
JSON.stringify({
token: "my-token",
rememberToken: true,
outputDir: "/custom/output"
}),
"utf8"
);
const loaded = loadSettings(paths);
const defaults = defaultSettings();
// Old fields should be preserved
expect(loaded.token).toBe("my-token");
expect(loaded.outputDir).toBe(path.resolve("/custom/output"));
// Missing new fields should get default values
expect(loaded.autoProviderFallback).toBe(defaults.autoProviderFallback);
expect(loaded.hybridExtract).toBe(defaults.hybridExtract);
expect(loaded.completedCleanupPolicy).toBe(defaults.completedCleanupPolicy);
expect(loaded.speedLimitMode).toBe(defaults.speedLimitMode);
expect(loaded.clipboardWatch).toBe(defaults.clipboardWatch);
expect(loaded.minimizeToTray).toBe(defaults.minimizeToTray);
expect(loaded.retryLimit).toBe(defaults.retryLimit);
expect(loaded.collectMkvToLibrary).toBe(defaults.collectMkvToLibrary);
expect(loaded.mkvLibraryDir).toBe(defaults.mkvLibraryDir);
expect(loaded.theme).toBe(defaults.theme);
expect(loaded.bandwidthSchedules).toEqual(defaults.bandwidthSchedules);
expect(loaded.updateRepo).toBe(defaults.updateRepo);
});
});


@@ -1,567 +0,0 @@
import fs from "node:fs";
import crypto from "node:crypto";
import { afterEach, describe, expect, it, vi } from "vitest";
import { checkGitHubUpdate, installLatestUpdate, isRemoteNewer, normalizeUpdateRepo, parseVersionParts } from "../src/main/update";
import { APP_VERSION } from "../src/main/constants";
import { UpdateCheckResult, UpdateInstallProgress } from "../src/shared/types";
const originalFetch = globalThis.fetch;
function sha256Hex(buffer: Buffer): string {
return crypto.createHash("sha256").update(buffer).digest("hex");
}
function sha512Hex(buffer: Buffer): string {
return crypto.createHash("sha512").update(buffer).digest("hex");
}
afterEach(() => {
globalThis.fetch = originalFetch;
vi.restoreAllMocks();
});
describe("update", () => {
it("normalizes update repo input", () => {
expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo/releases/tag/v1.2.3")).toBe("owner/repo");
expect(normalizeUpdateRepo("codeberg.org/owner/repo.git")).toBe("owner/repo");
expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
});
it("uses normalized repo slug for API requests", async () => {
let requestedUrl = "";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
return new Response(
JSON.stringify({
tag_name: `v${APP_VERSION}`,
html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
assets: []
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
);
}) as typeof fetch;
const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
expect(result.currentVersion).toBe(APP_VERSION);
expect(result.latestVersion).toBe(APP_VERSION);
expect(result.updateAvailable).toBe(false);
});
it("picks setup executable asset from release list", async () => {
globalThis.fetch = (async (): Promise<Response> => new Response(
JSON.stringify({
tag_name: "v9.9.9",
html_url: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
assets: [
{
name: "Real-Debrid-Downloader 9.9.9.exe",
browser_download_url: "https://example.invalid/portable.exe"
},
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup.exe",
digest: "sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
}
]
}),
{
status: 200,
headers: { "Content-Type": "application/json" }
}
)) as typeof fetch;
const result = await checkGitHubUpdate("owner/repo");
expect(result.updateAvailable).toBe(true);
expect(result.setupAssetUrl).toBe("https://example.invalid/setup.exe");
expect(result.setupAssetName).toBe("Real-Debrid-Downloader Setup 9.9.9.exe");
});
it("falls back to alternate download URL when setup asset URL returns 404", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const executableDigest = sha256Hex(executablePayload);
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.includes("stale-setup.exe")) {
return new Response("missing", { status: 404 });
}
if (url.includes("/releases/download/v9.9.9/")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/stale-setup.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: `sha256:${executableDigest}`
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.includes("/releases/download/v9.9.9/"))).toBe(true);
expect(requestedUrls.filter((url) => url.includes("stale-setup.exe"))).toHaveLength(1);
});
it("skips draft tag payload and resolves setup asset from stable latest release", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: true,
prerelease: false,
assets: [
{
name: "Draft Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/draft-setup.exe"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.endsWith("/releases/latest")) {
const stableDigest = sha256Hex(executablePayload);
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Stable Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/stable-setup.exe",
digest: `sha256:${stableDigest}`
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("stable-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "",
setupAssetName: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/tags/v9.9.9"))).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/latest"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("stable-setup.exe"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("draft-setup.exe"))).toBe(false);
});
it("times out hanging release JSON body reads", async () => {
vi.useFakeTimers();
try {
const cancelSpy = vi.fn(async () => undefined);
globalThis.fetch = (async (): Promise<Response> => ({
ok: true,
status: 200,
headers: new Headers({ "Content-Type": "application/json" }),
json: () => new Promise(() => undefined),
body: {
cancel: cancelSpy
}
} as unknown as Response)) as typeof fetch;
const pending = checkGitHubUpdate("owner/repo");
await vi.advanceTimersByTimeAsync(13000);
const result = await pending;
expect(result.updateAvailable).toBe(false);
expect(String(result.error || "")).toMatch(/timeout/i);
expect(cancelSpy).toHaveBeenCalledTimes(1);
} finally {
vi.useRealTimers();
}
});
it("aborts hanging update body downloads on idle timeout", async () => {
const previousTimeout = process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS;
process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS = "1000";
try {
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("hang-setup.exe")) {
const body = new ReadableStream<Uint8Array>({
start(controller) {
controller.enqueue(new Uint8Array([1, 2, 3]));
}
});
return new Response(body, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/hang-setup.exe",
setupAssetName: "",
setupAssetDigest: "sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/timeout/i);
} finally {
if (previousTimeout === undefined) {
delete process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS;
} else {
process.env.RD_UPDATE_BODY_IDLE_TIMEOUT_MS = previousTimeout;
}
}
}, 20000);
it("blocks installer start when SHA256 digest mismatches", async () => {
const executablePayload = fs.readFileSync(process.execPath);
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("mismatch-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: { "Content-Type": "application/octet-stream" }
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/mismatch-setup.exe",
setupAssetName: "setup.exe",
setupAssetDigest: "sha256:1111111111111111111111111111111111111111111111111111111111111111"
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/integrit|sha256|mismatch/i);
});
it("uses latest.yml SHA512 digest when API asset digest is missing", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const digestSha512Hex = sha512Hex(executablePayload);
const digestSha512Base64 = Buffer.from(digestSha512Hex, "hex").toString("base64");
const requestedUrls: string[] = [];
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
requestedUrls.push(url);
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup-no-digest.exe"
},
{
name: "latest.yml",
browser_download_url: "https://example.invalid/latest.yml"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("latest.yml")) {
return new Response(
`version: 9.9.9\npath: Real-Debrid-Downloader-Setup-9.9.9.exe\nsha512: ${digestSha512Base64}\n`,
{
status: 200,
headers: { "Content-Type": "text/yaml" }
}
);
}
if (url.includes("setup-no-digest.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/setup-no-digest.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(true);
expect(requestedUrls.some((url) => url.endsWith("/releases/tags/v9.9.9"))).toBe(true);
expect(requestedUrls.some((url) => url.includes("latest.yml"))).toBe(true);
});
it("rejects installer when latest.yml SHA512 digest does not match", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const wrongDigestBase64 = Buffer.alloc(64, 0x13).toString("base64");
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.endsWith("/releases/tags/v9.9.9")) {
return new Response(JSON.stringify({
tag_name: "v9.9.9",
draft: false,
prerelease: false,
assets: [
{
name: "Real-Debrid-Downloader Setup 9.9.9.exe",
browser_download_url: "https://example.invalid/setup-no-digest.exe"
},
{
name: "latest.yml",
browser_download_url: "https://example.invalid/latest.yml"
}
]
}), {
status: 200,
headers: { "Content-Type": "application/json" }
});
}
if (url.includes("latest.yml")) {
return new Response(
`version: 9.9.9\npath: Real-Debrid-Downloader Setup 9.9.9.exe\nsha512: ${wrongDigestBase64}\n`,
{
status: 200,
headers: { "Content-Type": "text/yaml" }
}
);
}
if (url.includes("setup-no-digest.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/setup-no-digest.exe",
setupAssetName: "Real-Debrid-Downloader Setup 9.9.9.exe",
setupAssetDigest: ""
};
const result = await installLatestUpdate("owner/repo", prechecked);
expect(result.started).toBe(false);
expect(result.message).toMatch(/sha512|integrit|mismatch/i);
});
it("emits install progress events while downloading and launching update", async () => {
const executablePayload = fs.readFileSync(process.execPath);
const digest = sha256Hex(executablePayload);
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
const url = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
if (url.includes("progress-setup.exe")) {
return new Response(executablePayload, {
status: 200,
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": String(executablePayload.length)
}
});
}
return new Response("missing", { status: 404 });
}) as typeof fetch;
const prechecked: UpdateCheckResult = {
updateAvailable: true,
currentVersion: APP_VERSION,
latestVersion: "9.9.9",
latestTag: "v9.9.9",
releaseUrl: "https://codeberg.org/owner/repo/releases/tag/v9.9.9",
setupAssetUrl: "https://example.invalid/progress-setup.exe",
setupAssetName: "setup.exe",
setupAssetDigest: `sha256:${digest}`
};
const progressEvents: UpdateInstallProgress[] = [];
const result = await installLatestUpdate("owner/repo", prechecked, (progress) => {
progressEvents.push(progress);
});
expect(result.started).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "starting")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "downloading")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "verifying")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "launching")).toBe(true);
expect(progressEvents.some((entry) => entry.stage === "done")).toBe(true);
});
});
describe("normalizeUpdateRepo extended", () => {
it("handles trailing slashes and extra path segments", () => {
expect(normalizeUpdateRepo("owner/repo/")).toBe("owner/repo");
expect(normalizeUpdateRepo("/owner/repo/")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo/tree/main/src")).toBe("owner/repo");
});
it("handles ssh-style git URLs", () => {
expect(normalizeUpdateRepo("git@codeberg.org:user/project.git")).toBe("user/project");
});
it("returns default for malformed inputs", () => {
expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
});
it("rejects traversal-like owner or repo segments", () => {
expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
});
it("handles www prefix", () => {
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("www.codeberg.org/owner/repo")).toBe("owner/repo");
});
});
describe("isRemoteNewer", () => {
it("detects newer major version", () => {
expect(isRemoteNewer("1.0.0", "2.0.0")).toBe(true);
});
it("detects newer minor version", () => {
expect(isRemoteNewer("1.2.0", "1.3.0")).toBe(true);
});
it("detects newer patch version", () => {
expect(isRemoteNewer("1.2.3", "1.2.4")).toBe(true);
});
it("returns false for same version", () => {
expect(isRemoteNewer("1.2.3", "1.2.3")).toBe(false);
});
it("returns false for older version", () => {
expect(isRemoteNewer("2.0.0", "1.0.0")).toBe(false);
expect(isRemoteNewer("1.3.0", "1.2.0")).toBe(false);
expect(isRemoteNewer("1.2.4", "1.2.3")).toBe(false);
});
it("handles versions with different segment counts", () => {
expect(isRemoteNewer("1.2", "1.2.1")).toBe(true);
expect(isRemoteNewer("1.2.1", "1.2")).toBe(false);
expect(isRemoteNewer("1", "1.0.1")).toBe(true);
});
it("handles v-prefix in version strings", () => {
expect(isRemoteNewer("v1.0.0", "v2.0.0")).toBe(true);
expect(isRemoteNewer("v1.0.0", "v1.0.0")).toBe(false);
});
});
describe("parseVersionParts", () => {
it("parses standard version strings", () => {
expect(parseVersionParts("1.2.3")).toEqual([1, 2, 3]);
expect(parseVersionParts("10.20.30")).toEqual([10, 20, 30]);
});
it("strips v prefix", () => {
expect(parseVersionParts("v1.2.3")).toEqual([1, 2, 3]);
expect(parseVersionParts("V1.2.3")).toEqual([1, 2, 3]);
});
it("handles single segment", () => {
expect(parseVersionParts("5")).toEqual([5]);
});
it("handles version with pre-release suffix", () => {
// Non-numeric suffixes are stripped per part
expect(parseVersionParts("1.2.3-beta")).toEqual([1, 2, 3]);
expect(parseVersionParts("1.2.3rc1")).toEqual([1, 2, 3]);
});
it("handles empty and whitespace", () => {
expect(parseVersionParts("")).toEqual([0]);
expect(parseVersionParts(" ")).toEqual([0]);
});
it("handles versions with extra dots", () => {
expect(parseVersionParts("1.2.3.4")).toEqual([1, 2, 3, 4]);
});
});
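
The removed assertions above fully pin down the version-comparison semantics. A minimal sketch of `parseVersionParts` and `isRemoteNewer` consistent with them (hypothetical — the shipped implementation lives elsewhere in the app and is not shown in this diff):

```typescript
// Parse "v1.2.3-beta" into [1, 2, 3]: trim, strip a leading v/V,
// split on dots, and keep only the leading digits of each part.
function parseVersionParts(version: string): number[] {
  const cleaned = version.trim().replace(/^[vV]/, "");
  if (cleaned === "") return [0];
  return cleaned.split(".").map((part) => {
    const digits = part.match(/^\d+/); // "3-beta" and "3rc1" both yield 3
    return digits ? parseInt(digits[0], 10) : 0;
  });
}

// Compare segment-wise, padding the shorter version with zeros,
// so "1.2" < "1.2.1" and "1.2.1" is not newer than anything equal.
function isRemoteNewer(local: string, remote: string): boolean {
  const a = parseVersionParts(local);
  const b = parseVersionParts(remote);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] ?? 0;
    const y = b[i] ?? 0;
    if (y !== x) return y > x;
  }
  return false; // identical versions are not "newer"
}
```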


@@ -1,118 +0,0 @@
import { describe, expect, it } from "vitest";
import { extractHttpLinksFromText, parsePackagesFromLinksText, isHttpLink, sanitizeFilename, formatEta, filenameFromUrl, looksLikeOpaqueFilename } from "../src/main/utils";
describe("utils", () => {
it("validates http links", () => {
expect(isHttpLink("https://example.com/file")).toBe(true);
expect(isHttpLink("http://example.com/file")).toBe(true);
expect(isHttpLink("ftp://example.com")).toBe(false);
expect(isHttpLink("foo bar")).toBe(false);
});
it("extracts links from text and trims trailing punctuation", () => {
const links = extractHttpLinksFromText("See (https://example.com/test) and https://rapidgator.net/file/abc123, plus https://example.com/a.b.");
expect(links).toEqual([
"https://example.com/test",
"https://rapidgator.net/file/abc123",
"https://example.com/a.b"
]);
});
it("sanitizes filenames", () => {
expect(sanitizeFilename("foo/bar:baz*")).toBe("foo bar baz");
expect(sanitizeFilename(" ")).toBe("Paket");
expect(sanitizeFilename("test\0file.txt")).toBe("testfile.txt");
expect(sanitizeFilename("\0\0\0")).toBe("Paket");
expect(sanitizeFilename("..")).toBe("Paket");
expect(sanitizeFilename(".")).toBe("Paket");
expect(sanitizeFilename("release... ")).toBe("release");
expect(sanitizeFilename(" con ")).toBe("con_");
});
it("parses package markers", () => {
const parsed = parsePackagesFromLinksText(
"# package: A\nhttps://a.com/1\nhttps://a.com/2\n# package: B\nhttps://b.com/1\n",
"Default"
);
expect(parsed).toHaveLength(2);
expect(parsed[0].name).toBe("A");
expect(parsed[0].links).toHaveLength(2);
expect(parsed[1].name).toBe("B");
});
it("formats eta", () => {
expect(formatEta(-1)).toBe("--");
expect(formatEta(65)).toBe("01:05");
expect(formatEta(3661)).toBe("01:01:01");
});
it("normalizes filenames from links", () => {
expect(filenameFromUrl("https://rapidgator.net/file/id/show.part1.rar.html")).toBe("show.part1.rar");
expect(filenameFromUrl("https://debrid.example/dl/abc?filename=Movie.S01E01.mkv")).toBe("Movie.S01E01.mkv");
expect(filenameFromUrl("https://debrid.example/dl/%E0%A4%A")).toBe("%E0%A4%A");
expect(filenameFromUrl("https://debrid.example/dl/e51f6809bb6ca615601f5ac5db433737")).toBe("e51f6809bb6ca615601f5ac5db433737");
expect(filenameFromUrl("data:text/plain;base64,SGVsbG8=")).toBe("download.bin");
expect(filenameFromUrl("blob:https://example.com/12345678-1234-1234-1234-1234567890ab")).toBe("download.bin");
expect(looksLikeOpaqueFilename("download.bin")).toBe(true);
expect(looksLikeOpaqueFilename("e51f6809bb6ca615601f5ac5db433737")).toBe(true);
expect(looksLikeOpaqueFilename("movie.part1.rar")).toBe(false);
});
it("preserves unicode filenames", () => {
expect(sanitizeFilename("日本語ファイル.txt")).toBe("日本語ファイル.txt");
expect(sanitizeFilename("Ünïcödé Tëst.mkv")).toBe("Ünïcödé Tëst.mkv");
expect(sanitizeFilename("파일이름.rar")).toBe("파일이름.rar");
expect(sanitizeFilename("файл.zip")).toBe("файл.zip");
});
it("handles very long filenames", () => {
const longName = "a".repeat(300);
const result = sanitizeFilename(longName);
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
// The function should return a non-empty string and not crash
expect(result).toBe(longName);
});
it("formats eta with very large values without crashing", () => {
const result = formatEta(999999);
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
// 999999 seconds = 277h 46m 39s
expect(result).toBe("277:46:39");
});
it("formats eta with edge cases", () => {
expect(formatEta(0)).toBe("00:00");
expect(formatEta(NaN)).toBe("--");
expect(formatEta(Infinity)).toBe("--");
expect(formatEta(Number.MAX_SAFE_INTEGER)).toMatch(/^\d+:\d{2}:\d{2}$/);
});
it("extracts filenames from URLs with encoded characters", () => {
expect(filenameFromUrl("https://example.com/file%20with%20spaces.rar")).toBe("file with spaces.rar");
// %C3%A9 decodes to e-acute (UTF-8), which is preserved
expect(filenameFromUrl("https://example.com/t%C3%A9st%20file.zip")).toBe("t\u00e9st file.zip");
expect(filenameFromUrl("https://example.com/dl?filename=Movie%20Name%20S01E01.mkv")).toBe("Movie Name S01E01.mkv");
// Malformed percent-encoding should not crash
const result = filenameFromUrl("https://example.com/%ZZ%invalid");
expect(typeof result).toBe("string");
expect(result.length).toBeGreaterThan(0);
});
it("handles looksLikeOpaqueFilename edge cases", () => {
// Empty string -> sanitizeFilename returns "Paket" which is not opaque
expect(looksLikeOpaqueFilename("")).toBe(false);
expect(looksLikeOpaqueFilename("a")).toBe(false);
expect(looksLikeOpaqueFilename("ab")).toBe(false);
expect(looksLikeOpaqueFilename("abc")).toBe(false);
expect(looksLikeOpaqueFilename("download.bin")).toBe(true);
// 24-char hex string is opaque (matches /^[a-f0-9]{24,}$/)
expect(looksLikeOpaqueFilename("abcdef123456789012345678")).toBe(true);
expect(looksLikeOpaqueFilename("abcdef1234567890abcdef12")).toBe(true);
// Short hex strings (< 24 chars) are NOT considered opaque
expect(looksLikeOpaqueFilename("abcdef12345")).toBe(false);
// Real filename with extension
expect(looksLikeOpaqueFilename("Show.S01E01.720p.mkv")).toBe(false);
});
});
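
The `formatEta` cases above determine the formatting rules exactly. One sketch that satisfies them (an assumption about the real code in `src/main/utils`, which this hunk only deletes):

```typescript
// "--" for negative or non-finite input; MM:SS under an hour;
// HH:MM:SS otherwise, with hours zero-padded but not capped at two digits.
function formatEta(seconds: number): string {
  if (!Number.isFinite(seconds) || seconds < 0) return "--";
  const total = Math.floor(seconds);
  const h = Math.floor(total / 3600);
  const m = String(Math.floor((total % 3600) / 60)).padStart(2, "0");
  const s = String(total % 60).padStart(2, "0");
  return h > 0 ? `${String(h).padStart(2, "0")}:${m}:${s}` : `${m}:${s}`;
}
```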


@@ -1,16 +0,0 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "ESNext",
"moduleResolution": "Bundler",
"jsx": "react-jsx",
"lib": ["ES2022", "DOM", "DOM.Iterable"],
"strict": true,
"skipLibCheck": true,
"esModuleInterop": true,
"resolveJsonModule": true,
"isolatedModules": true,
"types": ["node", "vite/client"]
},
"include": ["src", "tests", "vite.config.mts"]
}


@@ -1,14 +0,0 @@
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";
import path from "node:path";
export default defineConfig({
plugins: [react()],
base: "./",
root: path.resolve(__dirname, "src/renderer"),
publicDir: path.resolve(__dirname, "assets"),
build: {
outDir: path.resolve(__dirname, "build/renderer"),
emptyOutDir: true
}
});


@@ -1,9 +0,0 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
environment: "node",
include: ["tests/**/*.test.ts"],
globals: true
}
});