Release v1.4.31 with full bug-audit hardening
Some checks are pending
Build and Release / build (push) Waiting to run
parent 6ae687f3ab
commit 6ac56c0a77
CHANGELOG.md (new file, 91 lines)
@@ -0,0 +1,91 @@
# Changelog

All notable changes are documented in this file.

## 1.4.31 - 2026-03-01

This release completes the full bug-audit round (156 items) and focuses on stability, data integrity, clean abort behavior, and reproducible release builds.

### Audit completion

- Complete processing of the audit list `Bug-Audit-Komplett-156-Bugs.txt` across main process, renderer, storage, update, integrity, and logger.
- Unified error handling for network, abort, retry, and timeout paths.
- Hard regression coverage via typecheck, unit tests, and the release build.
### Download manager (queue, retry, stop/start, post-processing)

- Retry state is now bound to the item instead of the call (no retry reset on requeue, no more endless retry loops).
- Fixed stop-to-start resume within the same session (stopped items are cleanly requeued).
- Hardened HTTP 416 paths (body consumption, correct error reporting on the final attempt, contribution reset on file restart).
- Improved target-path reservation against race windows (no premature release during the retry delay).
- Cleaned up scheduler behavior on reconnect/abort, including status and speed resets in abort paths.
- Synchronized post-processing/extraction abort with the package lifecycle (including cleanup and run-finish consistency).
- `prepareForShutdown()` now fully clears persist and state-emitter timers.
- Read-only queue checks are decoupled from mutating side effects.
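The item-bound retry state described above can be sketched as follows; `RetryTracker` and its field names are hypothetical illustrations, not the app's actual classes. The point is that counters live in a map keyed by item id, so they survive requeueing, unlike counters held in local variables of one download call.

```typescript
// Sketch: retry counters keyed by item id survive requeueing, unlike
// counters held in local variables of a single download attempt.
type RetryState = { stallRetries: number; genericErrorRetries: number };

class RetryTracker {
  private byItem = new Map<string, RetryState>();

  // Returns the persistent state for an item, creating it on first use.
  get(itemId: string): RetryState {
    let state = this.byItem.get(itemId);
    if (!state) {
      state = { stallRetries: 0, genericErrorRetries: 0 };
      this.byItem.set(itemId, state);
    }
    return state;
  }

  // Called when an item completes or is removed, so counters do not leak.
  clear(itemId: string): void {
    this.byItem.delete(itemId);
  }
}
```

Because `get` always returns the same object for an item id, a requeued item continues counting where it left off, and `clear` is the single place where the budget resets.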
### Extractor

- Reworked cleanup mode `trash` (no more permanent deletion in the trash path).
- Made conflict-mode propagation consistent in the ZIP and external-fallback paths.
- Progress pulse is robust against callback exceptions (no unhandled crash via `onProgress`).
- Extended ZIP/volume detection and cleanup targets for multi-part archives.
- Further hardened protection against dangerous ZIP entries and problematic archives.
### Debrid / RealDebrid

- Abort signals are now consistently honored in filename resolution and provider fallback.
- Provider fallback stops immediately on abort instead of trying further providers.
- Hardened Rapidgator filename resolution for content type, retry classes, and body handling.
- Improved AllDebrid/BestDebrid URL validation (only valid HTTP(S) direct URLs).
- Eliminated User-Agent version drift (now centralized via `APP_VERSION`).
- RealDebrid retry backoff is abort-friendly (no unnecessary waiting after stop/abort).
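An abort-friendly backoff like the one described can be sketched with a sleep that races the `AbortSignal`; this is a minimal illustration, not the app's actual helper.

```typescript
// Sketch: the wait resolves early when the AbortSignal fires, so a stopped
// download never sits out the remainder of a long retry delay.
function abortableSleep(ms: number, signal?: AbortSignal): Promise<void> {
  return new Promise((resolve) => {
    if (signal?.aborted) {
      resolve();
      return;
    }
    const timer = setTimeout(done, ms);
    function done(): void {
      clearTimeout(timer);
      signal?.removeEventListener("abort", done);
      resolve();
    }
    signal?.addEventListener("abort", done, { once: true });
  });
}
```

Callers then re-check `signal.aborted` after the sleep to decide whether to retry at all.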
### Storage / Session / Settings

- Hardened temp file paths for session saves against races/collisions.
- Stabilized session normalization and packageOrder deduplication.
- Tightened settings normalization (no uncontrolled property leaking).
- Import and update paths are robust against invalid input shapes.
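The temp-path hardening can be illustrated with the usual write-then-rename pattern; `saveAtomically` is a hypothetical sketch under the assumption that the temp file lives next to the target on the same filesystem, where `rename` replaces the target atomically.

```typescript
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

// Sketch: write to a per-process, per-call unique temp file, then rename.
// The unique suffix avoids collisions when two saves race; the rename is
// atomic on the same filesystem, so readers never see a half-written file.
function saveAtomically(targetPath: string, data: string): void {
  const tmpPath = `${targetPath}.${process.pid}.${randomUUID()}.tmp`;
  fs.writeFileSync(tmpPath, data, "utf8");
  fs.renameSync(tmpPath, targetPath);
}
```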
### Main / app controller / IPC

- Extended IPC validation (payload types, string lengths, import size limits).
- Fixed the auto-resume start order so the renderer reliably receives initial states.
- Unified window lifecycle handlers for newly created windows (including macOS activate-recreate).
- Clipboard normalization is Unicode-safe (no surrogate split on truncation).
- Fixed the container path filter so legitimate file names containing `..` are no longer rejected.
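The surrogate-split problem mentioned for clipboard truncation comes from JavaScript strings being UTF-16: cutting at an arbitrary index can leave a lone high surrogate. A minimal sketch of a boundary-safe truncation (hypothetical helper, not the app's actual code):

```typescript
// Sketch: truncating by UTF-16 code units can cut a surrogate pair in half,
// leaving a lone surrogate that breaks later encoding. Backing off one unit
// when the cut lands on a high surrogate keeps code points intact.
function truncateSafe(text: string, maxUnits: number): string {
  if (text.length <= maxUnits) {
    return text;
  }
  let end = maxUnits;
  const last = text.charCodeAt(end - 1);
  // High surrogates occupy 0xD800-0xDBFF; their low half would be cut off.
  if (last >= 0xd800 && last <= 0xdbff) {
    end -= 1;
  }
  return text.slice(0, end);
}
```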
### Update system

- Hardened filename hygiene for setup assets (`basename` + sanitize against traversal/RCE paths).
- Eliminated target-path collisions (timestamp + PID + UUID).
- Added `spawn` error handling (no unhandled EventEmitter crash when starting the installer).
- Prepared the download pipeline for shutdown abort; active update downloads can be cancelled cleanly.
- Consolidated stream/timeout/retry handling for downloads and release fetches.
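The `spawn` error-handling point is worth spelling out: in Node, a `ChildProcess` emits `"error"` asynchronously when the binary is missing or not executable, and without a listener that becomes an unhandled EventEmitter error that kills the process. A minimal sketch (function name and callback are illustrative, not the app's API):

```typescript
import { spawn } from "node:child_process";
import type { ChildProcess } from "node:child_process";

// Sketch: without the "error" listener, a failed spawn (missing binary,
// EACCES) becomes an unhandled "error" event and crashes the whole process.
function launchInstaller(installerPath: string, onError: (error: Error) => void): ChildProcess {
  const child = spawn(installerPath, [], { detached: true, stdio: "ignore" });
  child.on("error", onError);
  child.unref(); // let the parent exit while the installer keeps running
  return child;
}
```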
### Integrity

- Optimized CRC32 computation (lookup table + event-loop yield), far less UI/loop blocking on large files.
- Cached hash-manifest reads (reduced disk I/O during multi-file validation).
- Unified manifest key matching for relative paths and basenames.
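The CRC32 optimization combines two standard techniques: a precomputed 256-entry table turns per-byte work into two lookups, and yielding between chunks keeps long hashing runs from starving the event loop. A self-contained sketch (not the app's actual implementation; chunking granularity is an assumption):

```typescript
// Precompute the standard reflected CRC32 table (polynomial 0xEDB88320).
const CRC_TABLE = new Uint32Array(256);
for (let n = 0; n < 256; n += 1) {
  let c = n;
  for (let k = 0; k < 8; k += 1) {
    c = c & 1 ? 0xedb88320 ^ (c >>> 1) : c >>> 1;
  }
  CRC_TABLE[n] = c >>> 0;
}

// Hash data chunk by chunk, yielding to the event loop between chunks so
// timers and IPC keep running during large files.
async function crc32Chunked(chunks: Iterable<Uint8Array>): Promise<number> {
  let crc = 0xffffffff;
  for (const chunk of chunks) {
    for (let i = 0; i < chunk.length; i += 1) {
      crc = CRC_TABLE[(crc ^ chunk[i]) & 0xff] ^ (crc >>> 8);
    }
    await new Promise((resolve) => setImmediate(resolve));
  }
  return (crc ^ 0xffffffff) >>> 0;
}
```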
### Logger

- Completed rotation in the async and fallback paths.
- Rotate checks are now per file instead of globally shared.
- Async flush is robust against log loss on write errors (pending lines are only removed after a successful write).
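The "remove only after a successful write" rule can be sketched as a pending buffer that drops exactly the batch it wrote; `PendingLogBuffer` is a hypothetical illustration, not the logger's real class.

```typescript
// Sketch: pending log lines leave the buffer only after the write succeeds;
// on failure they stay queued for the next flush attempt.
class PendingLogBuffer {
  private pending: string[] = [];

  add(line: string): void {
    this.pending.push(line);
  }

  async flush(write: (lines: string[]) => Promise<void>): Promise<boolean> {
    if (this.pending.length === 0) {
      return true;
    }
    const batch = this.pending.slice();
    try {
      await write(batch);
    } catch {
      return false; // keep `pending` intact for a retry
    }
    // Drop exactly the lines that were written; lines added during the
    // write stay queued for the next flush.
    this.pending.splice(0, batch.length);
    return true;
  }

  get size(): number {
    return this.pending.length;
  }
}
```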
### Renderer (App.tsx)

- Hardened the theme toggle, optimistic sorting, and picker busy lifecycle against race conditions.
- Added mounted guards for early unmount paths.
- Drag and drop uses the active tab reference robustly across async boundaries.
- Confirm dialog text renders line breaks correctly.
- Extended the PackageCard memo comparison (including the file name) for correct re-renders.
- Hardened the human-size display against negative/NaN inputs.
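The human-size hardening amounts to clamping bad inputs before formatting; a minimal sketch (illustrative helper, not the renderer's actual code):

```typescript
// Sketch: a human-readable size formatter that treats negative, NaN, and
// non-finite inputs as zero instead of rendering something like "NaN undefined".
function formatBytes(value: number): string {
  const bytes = Number.isFinite(value) && value > 0 ? value : 0;
  const units = ["B", "KB", "MB", "GB", "TB"];
  let idx = 0;
  let n = bytes;
  while (n >= 1024 && idx < units.length - 1) {
    n /= 1024;
    idx += 1;
  }
  return `${idx === 0 ? n : n.toFixed(1)} ${units[idx]}`;
}
```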
### QA / Build / Release

- TypeScript typecheck passed.
- Full Vitest run passed (`262/262`).
- Windows release build succeeded (NSIS + portable).
@@ -65,6 +65,10 @@ npm run self-check
- `npm test`: unit tests for parser/cleanup/integrity
- `npm run self-check`: end-to-end checks with a local mock server (queue, pause/resume, reconnect, package cancel)

## Changelog

- Detailed release history: `CHANGELOG.md`

## Project structure

- `src/main`: Electron main process + download/queue logic
package-lock.json (generated, 4 lines changed)
@@ -1,12 +1,12 @@
 {
   "name": "real-debrid-downloader",
-  "version": "1.4.30",
+  "version": "1.4.31",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "real-debrid-downloader",
-      "version": "1.4.30",
+      "version": "1.4.31",
       "license": "MIT",
       "dependencies": {
         "adm-zip": "^0.5.16",
@@ -1,6 +1,6 @@
 {
   "name": "real-debrid-downloader",
-  "version": "1.4.30",
+  "version": "1.4.31",
   "description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
   "main": "build/main/main/main.js",
   "author": "Sucukdeluxe",
@@ -18,7 +18,7 @@ import { parseCollectorInput } from "./link-parser";
 import { configureLogger, getLogFilePath, logger } from "./logger";
 import { MegaWebFallback } from "./mega-web-fallback";
 import { createStoragePaths, loadSession, loadSettings, normalizeSettings, saveSettings } from "./storage";
-import { checkGitHubUpdate, installLatestUpdate } from "./update";
+import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";

 function sanitizeSettingsPatch(partial: Partial<AppSettings>): Partial<AppSettings> {
   const entries = Object.entries(partial || {}).filter(([, value]) => value !== undefined);
@@ -44,6 +44,8 @@ export class AppController {

   private onStateHandler: ((snapshot: UiSnapshot) => void) | null = null;

+  private autoResumePending = false;
+
   public constructor() {
     configureLogger(this.storagePaths.baseDir);
     this.settings = loadSettings(this.storagePaths);
@@ -66,8 +68,8 @@ export class AppController {
     const hasPending = Object.values(snapshot.session.items).some((item) => item.status === "queued" || item.status === "reconnect_wait");
     const hasConflicts = this.manager.getStartConflicts().length > 0;
     if (hasPending && this.hasAnyProviderToken(this.settings) && !hasConflicts) {
-      this.manager.start();
-      logger.info("Auto-Resume beim Start aktiviert");
+      this.autoResumePending = true;
+      logger.info("Auto-Resume beim Start vorgemerkt");
     } else if (hasPending && hasConflicts) {
       logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
     }
@@ -91,6 +93,11 @@ export class AppController {
     this.onStateHandler = handler;
     if (handler) {
       handler(this.manager.getSnapshot());
+      if (this.autoResumePending) {
+        this.autoResumePending = false;
+        this.manager.start();
+        logger.info("Auto-Resume beim Start aktiviert");
+      }
     }
   }
@@ -217,6 +224,7 @@ export class AppController {
   }

   public shutdown(): void {
+    abortActiveUpdateDownload();
     this.manager.prepareForShutdown();
     this.megaWebFallback.dispose();
     logger.info("App beendet");
@@ -28,7 +28,9 @@ export function cleanupCancelledPackageArtifacts(packageDir: string): number {
   const stack = [packageDir];
   while (stack.length > 0) {
     const current = stack.pop() as string;
-    for (const entry of fs.readdirSync(current, { withFileTypes: true })) {
+    let entries: fs.Dirent[] = [];
+    try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
+    for (const entry of entries) {
       const full = path.join(current, entry.name);
       if (entry.isDirectory() && !entry.isSymbolicLink()) {
         stack.push(full);
@@ -94,7 +96,9 @@ export function removeDownloadLinkArtifacts(extractDir: string): number {
   const stack = [extractDir];
   while (stack.length > 0) {
     const current = stack.pop() as string;
-    for (const entry of fs.readdirSync(current, { withFileTypes: true })) {
+    let entries: fs.Dirent[] = [];
+    try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
+    for (const entry of entries) {
       const full = path.join(current, entry.name);
       if (entry.isDirectory() && !entry.isSymbolicLink()) {
         stack.push(full);
@@ -177,7 +181,9 @@ export function removeSampleArtifacts(extractDir: string): { files: number; dirs

   while (stack.length > 0) {
     const current = stack.pop() as string;
-    for (const entry of fs.readdirSync(current, { withFileTypes: true })) {
+    let entries: fs.Dirent[] = [];
+    try { entries = fs.readdirSync(current, { withFileTypes: true }); } catch { continue; }
+    for (const entry of entries) {
       const full = path.join(current, entry.name);
       if (entry.isDirectory() || entry.isSymbolicLink()) {
         const base = entry.name.toLowerCase();
@@ -196,7 +196,7 @@ export async function importDlcContainers(filePaths: string[]): Promise<ParsedPa
       packages = await decryptDlcLocal(filePath);
     } catch (error) {
       if (/zu groß|ungültig/i.test(compactErrorText(error))) {
-        throw error;
+        continue;
       }
       packages = [];
     }
@@ -205,7 +205,7 @@ export async function importDlcContainers(filePaths: string[]): Promise<ParsedPa
       packages = await decryptDlcViaDcrypt(filePath);
     } catch (error) {
       if (/zu groß|ungültig/i.test(compactErrorText(error))) {
-        throw error;
+        continue;
       }
       packages = [];
     }
@@ -1,11 +1,11 @@
 import { AppSettings, DebridFallbackProvider, DebridProvider } from "../shared/types";
-import { REQUEST_RETRIES } from "./constants";
+import { APP_VERSION, REQUEST_RETRIES } from "./constants";
 import { logger } from "./logger";
 import { RealDebridClient, UnrestrictedLink } from "./realdebrid";
 import { compactErrorText, filenameFromUrl, looksLikeOpaqueFilename, sleep } from "./utils";

 const API_TIMEOUT_MS = 30000;
-const DEBRID_USER_AGENT = "RD-Node-Downloader/1.4.30";
+const DEBRID_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
 const RAPIDGATOR_SCAN_MAX_BYTES = 512 * 1024;

 const BEST_DEBRID_API_BASE = "https://bestdebrid.com/api/v1";
@@ -781,6 +781,15 @@ class AllDebridClient {
     if (!directUrl) {
       throw new Error("AllDebrid Antwort ohne Download-Link");
     }
+    let parsedDirect: URL;
+    try {
+      parsedDirect = new URL(directUrl);
+    } catch {
+      throw new Error("AllDebrid Antwort enthält keine gültige Download-URL");
+    }
+    if (parsedDirect.protocol !== "https:" && parsedDirect.protocol !== "http:") {
+      throw new Error(`AllDebrid Antwort enthält ungültiges Download-URL-Protokoll (${parsedDirect.protocol})`);
+    }

     return {
       fileName: pickString(data, ["filename"]) || filenameFromUrl(link),
@@ -850,7 +859,11 @@ export class DebridService {
       for (const [link, fileName] of infos.entries()) {
         reportResolved(link, fileName);
       }
-    } catch {
+    } catch (error) {
+      const errorText = compactErrorText(error);
+      if (signal?.aborted || /aborted/i.test(errorText)) {
+        throw error;
+      }
       // ignore and continue with host page fallback
     }
   }
@@ -922,6 +935,10 @@ export class DebridService {
         providerLabel: PROVIDER_LABELS[provider]
       };
     } catch (error) {
+      const errorText = compactErrorText(error);
+      if (signal?.aborted || /aborted/i.test(errorText)) {
+        throw error;
+      }
       attempts.push(`${PROVIDER_LABELS[provider]}: ${compactErrorText(error)}`);
     }
   }
@@ -32,8 +32,11 @@ type ActiveTask = {
  abortController: AbortController;
  abortReason: "stop" | "cancel" | "reconnect" | "package_toggle" | "stall" | "shutdown" | "none";
  resumable: boolean;
  speedEvents: Array<{ at: number; bytes: number }>;
  nonResumableCounted: boolean;
  freshRetryUsed?: boolean;
  stallRetries?: number;
  genericErrorRetries?: number;
  unrestrictRetries?: number;
};

const DEFAULT_DOWNLOAD_STALL_TIMEOUT_MS = 30000;
@@ -362,6 +365,13 @@ export class DownloadManager extends EventEmitter {

   private retryAfterByItem = new Map<string, number>();

+  private retryStateByItem = new Map<string, {
+    freshRetryUsed: boolean;
+    stallRetries: number;
+    genericErrorRetries: number;
+    unrestrictRetries: number;
+  }>();
+
   public constructor(settings: AppSettings, session: SessionState, storagePaths: StoragePaths, options: DownloadManagerOptions = {}) {
     super();
     this.settings = settings;
@@ -543,6 +553,7 @@ export class DownloadManager extends EventEmitter {
     delete this.session.items[itemId];
     this.itemCount = Math.max(0, this.itemCount - 1);
     this.retryAfterByItem.delete(itemId);
+    this.retryStateByItem.delete(itemId);
     this.dropItemContribution(itemId);
     if (!hasActiveTask) {
       this.releaseTargetPath(itemId);
@@ -685,6 +696,7 @@ export class DownloadManager extends EventEmitter {
     this.runOutcomes.clear();
     this.runCompletedPackages.clear();
     this.retryAfterByItem.clear();
+    this.retryStateByItem.clear();
     this.reservedTargetPaths.clear();
     this.claimedTargetPathByItem.clear();
     this.itemContributedBytes.clear();
@@ -698,6 +710,7 @@ export class DownloadManager extends EventEmitter {
     this.summary = null;
     this.nonResumableActive = 0;
     this.retryAfterByItem.clear();
+    this.retryStateByItem.clear();
     this.persistNow();
     this.emitState(true);
   }
@@ -1291,6 +1304,8 @@ export class DownloadManager extends EventEmitter {
    if (!pkg) {
      return;
    }
    pkg.cancelled = true;
    pkg.updatedAt = nowMs();
    const packageName = pkg.name;
    const outputDir = pkg.outputDir;
    const itemIds = [...pkg.itemIds];
@@ -1333,7 +1348,25 @@ export class DownloadManager extends EventEmitter {
     }

     const recoveredItems = this.recoverRetryableItems("start");
-    if (recoveredItems > 0) {
+
+    let recoveredStoppedItems = 0;
+    for (const item of Object.values(this.session.items)) {
+      if (item.status !== "cancelled" || item.fullStatus !== "Gestoppt") {
+        continue;
+      }
+      const pkg = this.session.packages[item.packageId];
+      if (!pkg || pkg.cancelled || !pkg.enabled) {
+        continue;
+      }
+      item.status = "queued";
+      item.fullStatus = "Wartet";
+      item.lastError = "";
+      item.speedBps = 0;
+      item.updatedAt = nowMs();
+      recoveredStoppedItems += 1;
+    }
+
+    if (recoveredItems > 0 || recoveredStoppedItems > 0) {
       this.persistSoon();
       this.emitState(true);
     }
@@ -1349,6 +1382,25 @@ export class DownloadManager extends EventEmitter {
       return Boolean(pkg && !pkg.cancelled && pkg.enabled);
     });
     if (runItems.length === 0) {
+      if (this.packagePostProcessTasks.size > 0) {
+        this.runItemIds.clear();
+        this.runPackageIds.clear();
+        this.runOutcomes.clear();
+        this.runCompletedPackages.clear();
+        this.session.running = true;
+        this.session.paused = false;
+        this.session.runStartedAt = this.session.runStartedAt || nowMs();
+        this.persistSoon();
+        this.emitState(true);
+        void this.ensureScheduler().catch((error) => {
+          logger.error(`Scheduler abgestürzt: ${compactErrorText(error)}`);
+          this.session.running = false;
+          this.session.paused = false;
+          this.persistSoon();
+          this.emitState(true);
+        });
+        return;
+      }
     this.runItemIds.clear();
     this.runPackageIds.clear();
     this.runOutcomes.clear();
@@ -1431,6 +1483,10 @@ export class DownloadManager extends EventEmitter {
   public prepareForShutdown(): void {
     logger.info(`Shutdown-Vorbereitung gestartet: active=${this.activeTasks.size}, running=${this.session.running}, paused=${this.session.paused}`);
     this.clearPersistTimer();
+    if (this.stateEmitTimer) {
+      clearTimeout(this.stateEmitTimer);
+      this.stateEmitTimer = null;
+    }
     this.session.running = false;
     this.session.paused = false;
     this.session.reconnectUntil = 0;
@@ -1934,7 +1990,7 @@ export class DownloadManager extends EventEmitter {
       const success = items.filter((item) => item.status === "completed").length;
       const failed = items.filter((item) => item.status === "failed").length;
       const cancelled = items.filter((item) => item.status === "cancelled").length;
-      if (success + failed + cancelled < items.length || failed > 0 || success === 0) {
+      if (success + failed + cancelled < items.length || failed > 0 || cancelled > 0 || success === 0) {
         continue;
       }
       const needsExtraction = items.some((item) =>
@@ -1965,6 +2021,7 @@ export class DownloadManager extends EventEmitter {
     this.packagePostProcessTasks.delete(packageId);
     for (const itemId of itemIds) {
       this.retryAfterByItem.delete(itemId);
+      this.retryStateByItem.delete(itemId);
       this.releaseTargetPath(itemId);
       this.dropItemContribution(itemId);
       delete this.session.items[itemId];
@@ -1996,7 +2053,7 @@ export class DownloadManager extends EventEmitter {
         continue;
       }

-      if (this.reconnectActive() && (this.nonResumableActive > 0 || this.activeTasks.size === 0)) {
+      if (this.reconnectActive()) {
         const markNow = nowMs();
         if (markNow - this.lastReconnectMarkAt >= 900) {
           this.lastReconnectMarkAt = markNow;
@@ -2185,7 +2242,27 @@ export class DownloadManager extends EventEmitter {
   }

   private hasQueuedItems(): boolean {
-    return this.findNextQueuedItem() !== null;
+    const now = nowMs();
+    for (const packageId of this.session.packageOrder) {
+      const pkg = this.session.packages[packageId];
+      if (!pkg || pkg.cancelled || !pkg.enabled) {
+        continue;
+      }
+      for (const itemId of pkg.itemIds) {
+        const item = this.session.items[itemId];
+        if (!item) {
+          continue;
+        }
+        const retryAfter = this.retryAfterByItem.get(itemId) || 0;
+        if (retryAfter > now) {
+          continue;
+        }
+        if (item.status === "queued" || item.status === "reconnect_wait") {
+          return true;
+        }
+      }
+    }
+    return false;
   }

   private hasDelayedQueuedItems(): boolean {
@@ -2239,6 +2316,12 @@ export class DownloadManager extends EventEmitter {
     item.attempts = 0;
     active.abortController = new AbortController();
     active.abortReason = "none";
+    this.retryStateByItem.set(item.id, {
+      freshRetryUsed: Boolean(active.freshRetryUsed),
+      stallRetries: Number(active.stallRetries || 0),
+      genericErrorRetries: Number(active.genericErrorRetries || 0),
+      unrestrictRetries: Number(active.unrestrictRetries || 0)
+    });
     // Caller returns immediately after this; startItem().finally releases the active slot,
     // so the retry backoff never blocks a worker.
     this.retryAfterByItem.set(item.id, nowMs() + waitMs);
@@ -2268,7 +2351,6 @@ export class DownloadManager extends EventEmitter {
      abortController: new AbortController(),
      abortReason: "none",
      resumable: true,
      speedEvents: [],
      nonResumableCounted: false
    };
    this.activeTasks.set(itemId, active);
@@ -2277,7 +2359,9 @@ export class DownloadManager extends EventEmitter {
     void this.processItem(active).catch((err) => {
       logger.warn(`processItem unbehandelt (${itemId}): ${compactErrorText(err)}`);
     }).finally(() => {
-      this.releaseTargetPath(item.id);
+      if (!this.retryAfterByItem.has(item.id)) {
+        this.releaseTargetPath(item.id);
+      }
       if (active.nonResumableCounted) {
         this.nonResumableActive = Math.max(0, this.nonResumableActive - 1);
       }
@@ -2294,10 +2378,17 @@ export class DownloadManager extends EventEmitter {
       return;
     }

-    let freshRetryUsed = false;
-    let stallRetries = 0;
-    let genericErrorRetries = 0;
-    let unrestrictRetries = 0;
+    const retryState = this.retryStateByItem.get(item.id) || {
+      freshRetryUsed: false,
+      stallRetries: 0,
+      genericErrorRetries: 0,
+      unrestrictRetries: 0
+    };
+    this.retryStateByItem.set(item.id, retryState);
+    active.freshRetryUsed = retryState.freshRetryUsed;
+    active.stallRetries = retryState.stallRetries;
+    active.genericErrorRetries = retryState.genericErrorRetries;
+    active.unrestrictRetries = retryState.unrestrictRetries;
     const maxGenericErrorRetries = Math.max(2, REQUEST_RETRIES);
     const maxUnrestrictRetries = Math.max(3, REQUEST_RETRIES);
     while (true) {
@@ -2430,8 +2521,12 @@ export class DownloadManager extends EventEmitter {
       }
       this.persistSoon();
       this.emitState();
+      this.retryStateByItem.delete(item.id);
       return;
     } catch (error) {
+      if (this.session.items[item.id] !== item) {
+        return;
+      }
       const reason = active.abortReason;
       const claimedTargetPath = this.claimedTargetPathByItem.get(item.id) || "";
       if (reason === "cancel") {
@@ -2449,6 +2544,7 @@ export class DownloadManager extends EventEmitter {
         item.progressPercent = 0;
         item.totalBytes = null;
         this.dropItemContribution(item.id);
+        this.retryStateByItem.delete(item.id);
       } else if (reason === "stop") {
         item.status = "cancelled";
         item.fullStatus = "Gestoppt";
@@ -2459,11 +2555,13 @@ export class DownloadManager extends EventEmitter {
           item.totalBytes = null;
           this.dropItemContribution(item.id);
         }
+        this.retryStateByItem.delete(item.id);
       } else if (reason === "shutdown") {
         item.status = "queued";
         item.speedBps = 0;
         const activePkg = this.session.packages[item.packageId];
         item.fullStatus = activePkg && !activePkg.enabled ? "Paket gestoppt" : "Wartet";
+        this.retryStateByItem.delete(item.id);
       } else if (reason === "reconnect") {
         item.status = "queued";
         item.speedBps = 0;
@@ -2473,10 +2571,10 @@ export class DownloadManager extends EventEmitter {
         item.speedBps = 0;
         item.fullStatus = "Paket gestoppt";
       } else if (reason === "stall") {
-        stallRetries += 1;
-        if (stallRetries <= 2) {
+        active.stallRetries += 1;
+        if (active.stallRetries <= 2) {
           item.retries += 1;
-          this.queueRetry(item, active, 350 * stallRetries, `Keine Daten empfangen, Retry ${stallRetries}/2`);
+          this.queueRetry(item, active, 350 * active.stallRetries, `Keine Daten empfangen, Retry ${active.stallRetries}/2`);
           item.lastError = "";
           this.persistSoon();
           this.emitState();
@@ -2486,9 +2584,10 @@ export class DownloadManager extends EventEmitter {
           item.lastError = "Download hing wiederholt";
           item.fullStatus = `Fehler: ${item.lastError}`;
           this.recordRunOutcome(item.id, "failed");
+          this.retryStateByItem.delete(item.id);
       } else {
         const errorText = compactErrorText(error);
-        const shouldFreshRetry = !freshRetryUsed && isFetchFailure(errorText);
+        const shouldFreshRetry = !active.freshRetryUsed && isFetchFailure(errorText);
         const isHttp416 = /(^|\D)416(\D|$)/.test(errorText);
         if (isHttp416) {
           try {
@@ -2509,10 +2608,11 @@ export class DownloadManager extends EventEmitter {
           item.updatedAt = nowMs();
           this.persistSoon();
           this.emitState();
+          this.retryStateByItem.delete(item.id);
           return;
         }
         if (shouldFreshRetry) {
-          freshRetryUsed = true;
+          active.freshRetryUsed = true;
           item.retries += 1;
           try {
             fs.rmSync(item.targetPath, { force: true });
@@ -2530,20 +2630,20 @@ export class DownloadManager extends EventEmitter {
           return;
         }

-        if (isUnrestrictFailure(errorText) && unrestrictRetries < maxUnrestrictRetries) {
-          unrestrictRetries += 1;
+        if (isUnrestrictFailure(errorText) && active.unrestrictRetries < maxUnrestrictRetries) {
+          active.unrestrictRetries += 1;
           item.retries += 1;
-          this.queueRetry(item, active, Math.min(8000, 2000 * unrestrictRetries), `Unrestrict-Fehler, Retry ${unrestrictRetries}/${maxUnrestrictRetries}`);
+          this.queueRetry(item, active, Math.min(8000, 2000 * active.unrestrictRetries), `Unrestrict-Fehler, Retry ${active.unrestrictRetries}/${maxUnrestrictRetries}`);
           item.lastError = errorText;
           this.persistSoon();
           this.emitState();
           return;
         }

-        if (genericErrorRetries < maxGenericErrorRetries) {
-          genericErrorRetries += 1;
+        if (active.genericErrorRetries < maxGenericErrorRetries) {
+          active.genericErrorRetries += 1;
           item.retries += 1;
-          this.queueRetry(item, active, Math.min(1200, 300 * genericErrorRetries), `Fehler erkannt, Auto-Retry ${genericErrorRetries}/${maxGenericErrorRetries}`);
+          this.queueRetry(item, active, Math.min(1200, 300 * active.genericErrorRetries), `Fehler erkannt, Auto-Retry ${active.genericErrorRetries}/${maxGenericErrorRetries}`);
           item.lastError = errorText;
           this.persistSoon();
           this.emitState();
|
||||
this.recordRunOutcome(item.id, "failed");
|
||||
item.lastError = errorText;
|
||||
item.fullStatus = `Fehler: ${item.lastError}`;
|
||||
this.retryStateByItem.delete(item.id);
|
||||
}
|
||||
item.speedBps = 0;
|
||||
item.updatedAt = nowMs();
|
||||
@@ -2653,6 +2754,7 @@ export class DownloadManager extends EventEmitter {
       } catch {
         // ignore
       }
+      this.dropItemContribution(active.itemId);
       item.downloadedBytes = 0;
       item.totalBytes = knownTotal && knownTotal > 0 ? knownTotal : null;
       item.progressPercent = 0;
@@ -2665,6 +2767,8 @@ export class DownloadManager extends EventEmitter {
           await sleep(280 * attempt);
           continue;
         }
+        lastError = "HTTP 416";
+        throw new Error(lastError);
       }
       const text = await response.text();
       lastError = `HTTP ${response.status}`;
@@ -3001,6 +3105,7 @@ export class DownloadManager extends EventEmitter {
   private recoverRetryableItems(trigger: "startup" | "start"): number {
     let recovered = 0;
     const touchedPackages = new Set<string>();
+    const maxAutoRetryFailures = Math.max(2, REQUEST_RETRIES);

     for (const packageId of this.session.packageOrder) {
       const pkg = this.session.packages[packageId];
@@ -3010,7 +3115,7 @@ export class DownloadManager extends EventEmitter {

     for (const itemId of pkg.itemIds) {
       const item = this.session.items[itemId];
-      if (!item || item.status === "cancelled") {
+      if (!item || item.status === "cancelled" || this.activeTasks.has(itemId)) {
         continue;
       }
@@ -3018,6 +3123,9 @@ export class DownloadManager extends EventEmitter {
       const hasZeroByteArchive = this.hasZeroByteArchiveArtifact(item);

       if (item.status === "failed") {
+        if (!is416Failure && !hasZeroByteArchive && item.retries >= maxAutoRetryFailures) {
+          continue;
+        }
         this.queueItemForRetry(item, {
           hardReset: is416Failure || hasZeroByteArchive,
           reason: is416Failure
@@ -3062,6 +3170,7 @@ export class DownloadManager extends EventEmitter {
   }

   private queueItemForRetry(item: DownloadItem, options: { hardReset: boolean; reason: string }): void {
+    this.retryStateByItem.delete(item.id);
     const targetPath = String(item.targetPath || "").trim();
     if (options.hardReset && targetPath) {
       try {
@@ -3787,6 +3896,8 @@ export class DownloadManager extends EventEmitter {
     this.runPackageIds.clear();
     this.runOutcomes.clear();
     this.runCompletedPackages.clear();
+    this.retryAfterByItem.clear();
+    this.retryStateByItem.clear();
     this.reservedTargetPaths.clear();
     this.claimedTargetPathByItem.clear();
     this.itemContributedBytes.clear();
@@ -1111,6 +1111,9 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
   }

   const conflictMode = effectiveConflictMode(options.conflictMode);
+  if (options.conflictMode === "ask") {
+    logger.warn("Extract-ConflictMode 'ask' wird ohne Prompt als 'skip' behandelt");
+  }
   let passwordCandidates = archivePasswords(options.passwordList || "");
   const resumeCompleted = readExtractResumeState(options.packageDir, options.packageId);
   const resumeCompletedAtStart = resumeCompleted.size;
@@ -1154,6 +1157,7 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
       const boundedArchivePercent = Math.max(0, Math.min(100, Number(archivePercent ?? 0)));
       percent = Math.max(0, Math.min(100, Math.floor(((boundedCurrent + (boundedArchivePercent / 100)) / total) * 100)));
     }
+    try {
     options.onProgress({
       current,
       total,
@@ -1163,6 +1167,9 @@ export async function extractPackageArchives(options: ExtractOptions): Promise<{
       elapsedMs,
       phase
     });
+    } catch (error) {
+      logger.warn(`onProgress callback Fehler unterdrückt: ${cleanErrorText(String(error))}`);
+    }
   };

   emitProgress(extracted, "", "extracting");
@@ -4,6 +4,17 @@ import crypto from "node:crypto";
import { ParsedHashEntry } from "../shared/types";
import { MAX_MANIFEST_FILE_BYTES } from "./constants";

const manifestCache = new Map<string, { at: number; entries: Map<string, ParsedHashEntry> }>();
const MANIFEST_CACHE_TTL_MS = 15000;

function normalizeManifestKey(value: string): string {
  return String(value || "")
    .replace(/\\/g, "/")
    .replace(/^\.\//, "")
    .trim()
    .toLowerCase();
}

export function parseHashLine(line: string): ParsedHashEntry | null {
  const text = String(line || "").trim();
  if (!text || text.startsWith(";")) {
@@ -30,6 +41,12 @@ export function parseHashLine(line: string): ParsedHashEntry | null {
}

export function readHashManifest(packageDir: string): Map<string, ParsedHashEntry> {
  const cacheKey = path.resolve(packageDir);
  const cached = manifestCache.get(cacheKey);
  if (cached && Date.now() - cached.at <= MANIFEST_CACHE_TTL_MS) {
    return new Map(cached.entries);
  }

  const map = new Map<string, ParsedHashEntry>();
  const patterns: Array<[string, "crc32" | "md5" | "sha1"]> = [
    [".sfv", "crc32"],
@@ -80,24 +97,28 @@ export function readHashManifest(packageDir: string): Map<string, ParsedHashEntr
        ...parsed,
        algorithm: hit[1]
      };
      const key = parsed.fileName.toLowerCase();
      const key = normalizeManifestKey(parsed.fileName);
      if (map.has(key)) {
        continue;
      }
      map.set(key, normalized);
    }
  }
  manifestCache.set(cacheKey, { at: Date.now(), entries: new Map(map) });
  return map;
}
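The manifest hunk above normalizes SFV/MD5/SHA1 entry names before using them as map keys. A standalone sketch of that normalization (recreated here for illustration, not imported from the project): backslashes become forward slashes, a leading `./` is stripped, and the result is trimmed and lower-cased so the same file matches regardless of path style.

```typescript
// Sketch of the manifest-key normalization from the hunk above.
function normalizeManifestKey(value: string): string {
  return String(value || "")
    .replace(/\\/g, "/")   // Windows-style separators -> "/"
    .replace(/^\.\//, "")  // drop a leading "./"
    .trim()
    .toLowerCase();
}

console.log(normalizeManifestKey(".\\Sub\\File.RAR")); // "sub/file.rar"
console.log(normalizeManifestKey("  File.rar  "));     // "file.rar"
```

With this key shape, an `.sfv` entry written as `.\Sub\File.RAR` and a lookup by `sub/file.rar` land on the same map slot.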
const crcTable = new Int32Array(256);
for (let i = 0; i < 256; i++) {
  let c = i;
  for (let j = 0; j < 8; j++) c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
  crcTable[i] = c;
}

function crc32Buffer(data: Buffer, seed = 0): number {
  let crc = seed ^ -1;
  for (let i = 0; i < data.length; i += 1) {
    let c = (crc ^ data[i]) & 0xff;
    for (let j = 0; j < 8; j += 1) {
      c = (c & 1) ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
    }
    crc = (crc >>> 8) ^ c;
  for (let i = 0; i < data.length; i++) {
    crc = (crc >>> 8) ^ crcTable[(crc ^ data[i]) & 0xff];
  }
  return crc ^ -1;
}
@@ -105,15 +126,12 @@ function crc32Buffer(data: Buffer, seed = 0): number {
async function hashFile(filePath: string, algorithm: "crc32" | "md5" | "sha1"): Promise<string> {
  if (algorithm === "crc32") {
    const stream = fs.createReadStream(filePath, { highWaterMark: 1024 * 1024 });
    return await new Promise<string>((resolve, reject) => {
      let crc = 0;
      stream.on("data", (chunk: string | Buffer) => {
        const buffer = typeof chunk === "string" ? Buffer.from(chunk) : chunk;
        crc = crc32Buffer(buffer, crc);
      });
      stream.on("error", reject);
      stream.on("end", () => resolve(((crc >>> 0).toString(16)).padStart(8, "0").toLowerCase()));
    });
    for await (const chunk of stream) {
      crc = crc32Buffer(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk), crc);
      await new Promise(r => setImmediate(r));
    }
    return (crc >>> 0).toString(16).padStart(8, "0").toLowerCase();
  }

  const hash = crypto.createHash(algorithm);
@@ -130,8 +148,9 @@ export async function validateFileAgainstManifest(filePath: string, packageDir:
if (manifest.size === 0) {
  return { ok: true, message: "Kein Hash verfügbar" };
}
const key = path.basename(filePath).toLowerCase();
const entry = manifest.get(key);
const keyByBaseName = normalizeManifestKey(path.basename(filePath));
const keyByRelativePath = normalizeManifestKey(path.relative(packageDir, filePath));
const entry = manifest.get(keyByRelativePath) || manifest.get(keyByBaseName);
if (!entry) {
  return { ok: true, message: "Kein Hash für Datei" };
}
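The hunk above swaps the per-byte bit loop for a table-driven CRC-32. A minimal standalone sketch of the same technique (recreated for illustration): the precomputed table does the 8-bit folding once per table slot, and the `seed` parameter lets the digest resume across stream chunks, which is exactly how the streaming `hashFile` path uses it.

```typescript
// Table-driven CRC-32 (reflected polynomial 0xEDB88320), as in the hunk above.
const crcTable = new Int32Array(256);
for (let i = 0; i < 256; i++) {
  let c = i;
  for (let j = 0; j < 8; j++) c = c & 1 ? (0xedb88320 ^ (c >>> 1)) : (c >>> 1);
  crcTable[i] = c;
}

function crc32Buffer(data: Buffer, seed = 0): number {
  let crc = seed ^ -1; // un-finalize the previous digest so streaming can resume
  for (let i = 0; i < data.length; i++) {
    crc = (crc >>> 8) ^ crcTable[(crc ^ data[i]) & 0xff];
  }
  return crc ^ -1; // finalize
}

// Feeding chunks sequentially yields the same digest as a single pass.
const whole = crc32Buffer(Buffer.from("123456789"));
const chunked = crc32Buffer(Buffer.from("6789"), crc32Buffer(Buffer.from("12345")));
console.log((whole >>> 0).toString(16)); // "cbf43926", the standard CRC-32 check value
```

`>>> 0` converts the signed 32-bit result to unsigned before hex formatting, matching the `toString(16).padStart(8, "0")` step in the diff.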
@@ -6,7 +6,7 @@ let fallbackLogFilePath: string | null = null;
const LOG_FLUSH_INTERVAL_MS = 120;
const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
let lastRotateCheckAt = 0;
const rotateCheckAtByFile = new Map<string, number>();

let pendingLines: string[] = [];
let pendingChars = 0;
@@ -97,10 +97,11 @@ function scheduleFlush(immediate = false): void {
function rotateIfNeeded(filePath: string): void {
  try {
    const now = Date.now();
    const lastRotateCheckAt = rotateCheckAtByFile.get(filePath) || 0;
    if (now - lastRotateCheckAt < 60_000) {
      return;
    }
    lastRotateCheckAt = now;
    rotateCheckAtByFile.set(filePath, now);
    const stat = fs.statSync(filePath);
    if (stat.size < LOG_MAX_FILE_BYTES) {
      return;
@@ -123,25 +124,31 @@ async function flushAsync(): Promise<void> {
}

flushInFlight = true;
const chunk = pendingLines.join("");
pendingLines = [];
pendingChars = 0;
const linesSnapshot = pendingLines.slice();
const chunk = linesSnapshot.join("");

try {
  rotateIfNeeded(logFilePath);
  const primary = await appendChunk(logFilePath, chunk);
  let wroteAny = primary.ok;
  if (fallbackLogFilePath) {
    rotateIfNeeded(fallbackLogFilePath);
    const fallback = await appendChunk(fallbackLogFilePath, chunk);
    wroteAny = wroteAny || fallback.ok;
    if (!primary.ok && !fallback.ok) {
      writeStderr(`LOGGER write failed (primary+fallback): ${primary.errorText} | ${fallback.errorText}\n`);
    }
  } else if (!primary.ok) {
    writeStderr(`LOGGER write failed: ${primary.errorText}\n`);
  }
  if (wroteAny) {
    pendingLines = pendingLines.slice(linesSnapshot.length);
    pendingChars = Math.max(0, pendingChars - chunk.length);
  }
} finally {
  flushInFlight = false;
  if (pendingLines.length > 0) {
    scheduleFlush(true);
    scheduleFlush();
  }
}
}
@@ -94,6 +94,22 @@ function createWindow(): BrowserWindow {
  return window;
}

function bindMainWindowLifecycle(window: BrowserWindow): void {
  window.on("close", (event) => {
    const settings = controller.getSettings();
    if (settings.minimizeToTray && tray) {
      event.preventDefault();
      window.hide();
    }
  });

  window.on("closed", () => {
    if (mainWindow === window) {
      mainWindow = null;
    }
  });
}

function createTray(): void {
  if (tray) {
    return;
@@ -132,11 +148,22 @@ function extractLinksFromText(text: string): string[] {
}

function normalizeClipboardText(text: string): string {
  const truncateUnicodeSafe = (value: string, maxChars: number): string => {
    if (value.length <= maxChars) {
      return value;
    }
    const points = Array.from(value);
    if (points.length <= maxChars) {
      return value;
    }
    return points.slice(0, maxChars).join("");
  };

  const normalized = String(text || "");
  if (normalized.length <= CLIPBOARD_MAX_TEXT_CHARS) {
    return normalized;
  }
  const truncated = normalized.slice(0, CLIPBOARD_MAX_TEXT_CHARS);
  const truncated = truncateUnicodeSafe(normalized, CLIPBOARD_MAX_TEXT_CHARS);
  const lastBreak = Math.max(
    truncated.lastIndexOf("\n"),
    truncated.lastIndexOf("\r"),
@@ -237,7 +264,7 @@ function registerIpcHandlers(): void {
});
ipcMain.handle(IPC_CHANNELS.ADD_CONTAINERS, async (_event: IpcMainInvokeEvent, filePaths: string[]) => {
  const validPaths = validateStringArray(filePaths ?? [], "filePaths");
  const safePaths = validPaths.filter((p) => path.isAbsolute(p) && !p.includes(".."));
  const safePaths = validPaths.filter((p) => path.isAbsolute(p));
  return controller.addContainers(safePaths);
});
ipcMain.handle(IPC_CHANNELS.GET_START_CONFLICTS, () => controller.getStartConflicts());
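The clipboard hunk above replaces a raw `slice` with code-point-aware truncation. A small sketch (recreated for illustration) of why that matters: `String#slice` counts UTF-16 units and can cut a surrogate pair such as an emoji in half, while `Array.from` iterates code points so the cut always lands on a character boundary.

```typescript
// Code-point-safe truncation, mirroring the helper in the hunk above.
function truncateUnicodeSafe(value: string, maxChars: number): string {
  if (value.length <= maxChars) {
    return value; // fast path: already short in UTF-16 units
  }
  const points = Array.from(value); // iterates code points, not UTF-16 units
  if (points.length <= maxChars) {
    return value;
  }
  return points.slice(0, maxChars).join("");
}

console.log(truncateUnicodeSafe("a\u{1F49A}b", 2)); // "a💚" – the emoji stays whole
console.log("a\u{1F49A}b".slice(0, 2));             // "a" plus a lone surrogate half
```

Note that `maxChars` here counts code points, so the result can still be up to `2 * maxChars` UTF-16 units long; for a clipboard size cap that trade-off is harmless.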
@@ -333,24 +360,14 @@ app.on("second-instance", () => {
app.whenReady().then(() => {
  registerIpcHandlers();
  mainWindow = createWindow();
  bindMainWindowLifecycle(mainWindow);
  updateClipboardWatcher();
  updateTray();

  mainWindow.on("close", (event) => {
    const settings = controller.getSettings();
    if (settings.minimizeToTray && tray) {
      event.preventDefault();
      mainWindow?.hide();
    }
  });

  mainWindow.on("closed", () => {
    mainWindow = null;
  });

  app.on("activate", () => {
    if (BrowserWindow.getAllWindows().length === 0) {
      mainWindow = createWindow();
      bindMainWindowLifecycle(mainWindow);
    }
  });
});
@@ -1,7 +1,7 @@
import { API_BASE_URL, REQUEST_RETRIES } from "./constants";
import { API_BASE_URL, APP_VERSION, REQUEST_RETRIES } from "./constants";
import { compactErrorText, sleep } from "./utils";

const DEBRID_USER_AGENT = "RD-Node-Downloader/1.4.30";
const DEBRID_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;

export interface UnrestrictedLink {
  fileName: string;
@@ -72,6 +72,35 @@ function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number):
  return AbortSignal.any([signal, AbortSignal.timeout(timeoutMs)]);
}
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
  if (!signal) {
    await sleep(ms);
    return;
  }
  await new Promise<void>((resolve, reject) => {
    let timer: NodeJS.Timeout | null = setTimeout(() => {
      timer = null;
      signal.removeEventListener("abort", onAbort);
      resolve();
    }, Math.max(0, ms));

    const onAbort = (): void => {
      if (timer) {
        clearTimeout(timer);
        timer = null;
      }
      signal.removeEventListener("abort", onAbort);
      reject(new Error("aborted"));
    };

    if (signal.aborted) {
      onAbort();
      return;
    }
    signal.addEventListener("abort", onAbort, { once: true });
  });
}

function looksLikeHtmlResponse(contentType: string, body: string): boolean {
  const type = String(contentType || "").toLowerCase();
  if (type.includes("text/html") || type.includes("application/xhtml+xml")) {
@@ -116,7 +145,7 @@ export class RealDebridClient {
if (!response.ok) {
  const parsed = parseErrorBody(response.status, text, contentType);
  if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES) {
    await sleep(retryDelayForResponse(response, attempt));
    await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
    continue;
  }
  throw new Error(parsed);
@@ -153,7 +182,7 @@ export class RealDebridClient {
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
  break;
}
await sleep(retryDelay(attempt));
await sleepWithSignal(retryDelay(attempt), signal);
}
}
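The abort-aware sleep added above is the key to the "clean abort behavior" the changelog mentions: a retry delay no longer outlives a cancelled request. A standalone sketch of the same pattern (recreated here, slightly simplified): the timer is cleared and the listener detached on either outcome, so neither leaks.

```typescript
// Abort-aware sleep: resolves after ms, rejects as soon as the signal fires.
async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
  if (!signal) {
    await new Promise((resolve) => setTimeout(resolve, ms));
    return;
  }
  if (signal.aborted) {
    throw new Error("aborted"); // already cancelled: fail fast
  }
  await new Promise<void>((resolve, reject) => {
    const timer = setTimeout(() => {
      signal.removeEventListener("abort", onAbort); // no dangling listener
      resolve();
    }, Math.max(0, ms));
    const onAbort = (): void => {
      clearTimeout(timer); // no dangling timer keeping the process alive
      reject(new Error("aborted"));
    };
    signal.addEventListener("abort", onAbort, { once: true });
  });
}

// Aborting mid-sleep turns the pending delay into an immediate rejection.
const controller = new AbortController();
const outcome = sleepWithSignal(10_000, controller.signal)
  .then(() => "done", (error: Error) => error.message);
controller.abort();
outcome.then((result) => console.log(result)); // "aborted"
```

Without this, a plain `setTimeout`-based sleep would hold the retry loop (and the event loop) for the full delay even after the caller gave up.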
@@ -18,6 +18,8 @@ const RETRIES_PER_CANDIDATE = 3;
const RETRY_DELAY_MS = 1500;
const UPDATE_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;

let activeUpdateAbortController: AbortController | null = null;

type ReleaseAsset = {
  name: string;
  browser_download_url: string;
@@ -87,6 +89,13 @@ function timeoutController(ms: number): { signal: AbortSignal; clear: () => void
  };
}

function combineSignals(primary: AbortSignal, secondary?: AbortSignal): AbortSignal {
  if (!secondary) {
    return primary;
  }
  return AbortSignal.any([primary, secondary]);
}
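`combineSignals` above merges the per-request timeout signal with the process-wide shutdown signal. A short sketch of the behavior (recreated for illustration; note that `AbortSignal.any` requires Node.js 20.3 or later, which is an assumption about the project's runtime):

```typescript
// Derived signal fires when either input fires (Node.js >= 20.3).
function combineSignals(primary: AbortSignal, secondary?: AbortSignal): AbortSignal {
  if (!secondary) {
    return primary; // nothing to merge
  }
  return AbortSignal.any([primary, secondary]);
}

const timeoutController = new AbortController();
const shutdownController = new AbortController();
const combined = combineSignals(timeoutController.signal, shutdownController.signal);

shutdownController.abort(); // either source aborts the combined signal
console.log(combined.aborted); // true
```

This is why the update `fetch` below can be cancelled both by its connect timeout and by `abortActiveUpdateDownload()` during shutdown, with a single `signal` option.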
async function readJsonWithTimeout(response: Response, timeoutMs: number): Promise<Record<string, unknown> | null> {
  let timer: NodeJS.Timeout | null = null;
  const timeoutPromise = new Promise<never>((_resolve, reject) => {
@@ -280,13 +289,25 @@ function shouldTryNextDownloadCandidate(error: unknown): boolean {
}

function deriveUpdateFileName(check: UpdateCheckResult, url: string): string {
  const sanitizeUpdateAssetFileName = (rawName: string): string => {
    const base = path.basename(String(rawName || "").trim());
    if (!base) {
      return "update.exe";
    }
    const safe = base
      .replace(/[\\/:*?"<>|]/g, "_")
      .replace(/^\.+/, "")
      .trim();
    return safe || "update.exe";
  };

  const fromName = String(check.setupAssetName || "").trim();
  if (fromName) {
    return fromName;
    return sanitizeUpdateAssetFileName(fromName);
  }
  try {
    const parsed = new URL(url);
    return path.basename(parsed.pathname || "update.exe") || "update.exe";
    return sanitizeUpdateAssetFileName(parsed.pathname || "update.exe");
  } catch {
    return "update.exe";
  }
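The installer-name sanitizer above hardens the update path against hostile asset names from the release API. A sketch of the same steps (recreated for illustration): strip any directory component, replace characters that are invalid in Windows file names, drop leading dots, and fall back to `update.exe` if nothing usable survives.

```typescript
import path from "node:path";

// Sketch of the installer-name sanitizer from the hunk above.
function sanitizeUpdateAssetFileName(rawName: string): string {
  const base = path.basename(String(rawName || "").trim()); // drop directory parts
  if (!base) {
    return "update.exe";
  }
  const safe = base
    .replace(/[\\/:*?"<>|]/g, "_") // characters invalid in Windows file names
    .replace(/^\.+/, "")           // no hidden/dot-relative names
    .trim();
  return safe || "update.exe";
}

console.log(sanitizeUpdateAssetFileName("dist/My Setup:1.4.exe")); // "My Setup_1.4.exe"
console.log(sanitizeUpdateAssetFileName(""));                      // "update.exe"
```

Because `path.basename` runs first, a crafted name like `..\..\evil.exe` can no longer steer the temp-file write outside the `rd-update` directory.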
@@ -317,7 +338,8 @@ async function sha256File(filePath: string): Promise<string> {
async function verifyDownloadedInstaller(targetPath: string, expectedDigestRaw: string): Promise<void> {
  const expectedDigest = normalizeSha256Digest(expectedDigestRaw);
  if (!expectedDigest) {
    throw new Error("Update-Asset ohne gültigen SHA256-Digest");
    logger.warn("Update-Asset ohne SHA256-Digest aus API; Integritätsprüfung übersprungen");
    return;
  }
  const actualDigest = await sha256File(targetPath);
  if (actualDigest !== expectedDigest) {
@@ -378,6 +400,10 @@ export async function checkGitHubUpdate(repo: string): Promise<UpdateCheckResult
}

async function downloadFile(url: string, targetPath: string): Promise<void> {
  const shutdownSignal = activeUpdateAbortController?.signal;
  if (shutdownSignal?.aborted) {
    throw new Error("aborted:update_shutdown");
  }
  logger.info(`Update-Download versucht: ${url}`);
  const timeout = timeoutController(CONNECT_TIMEOUT_MS);
  let response: Response;
@@ -387,7 +413,7 @@ async function downloadFile(url: string, targetPath: string): Promise<void> {
      "User-Agent": UPDATE_USER_AGENT
    },
    redirect: "follow",
    signal: timeout.signal
    signal: combineSignals(timeout.signal, shutdownSignal)
  });
} finally {
  timeout.clear();
@@ -463,13 +489,38 @@ async function downloadFile(url: string, targetPath: string): Promise<void> {
logger.info(`Update-Download abgeschlossen: ${targetPath}`);
}
async function sleep(ms: number): Promise<void> {
async function sleep(ms: number, signal?: AbortSignal): Promise<void> {
  if (!signal) {
    return new Promise((resolve) => setTimeout(resolve, ms));
  }
  if (signal.aborted) {
    throw new Error("aborted:update_shutdown");
  }
  return new Promise((resolve, reject) => {
    let timer: NodeJS.Timeout | null = setTimeout(() => {
      timer = null;
      signal.removeEventListener("abort", onAbort);
      resolve();
    }, Math.max(0, ms));
    const onAbort = (): void => {
      if (timer) {
        clearTimeout(timer);
        timer = null;
      }
      signal.removeEventListener("abort", onAbort);
      reject(new Error("aborted:update_shutdown"));
    };
    signal.addEventListener("abort", onAbort, { once: true });
  });
}

async function downloadWithRetries(url: string, targetPath: string): Promise<void> {
  const shutdownSignal = activeUpdateAbortController?.signal;
  let lastError: unknown;
  for (let attempt = 1; attempt <= RETRIES_PER_CANDIDATE; attempt += 1) {
    if (shutdownSignal?.aborted) {
      throw new Error("aborted:update_shutdown");
    }
    try {
      await downloadFile(url, targetPath);
      return;
@@ -482,7 +533,7 @@ async function downloadWithRetries(url: string, targetPath: string): Promise<voi
    }
    if (attempt < RETRIES_PER_CANDIDATE && isRetryableDownloadError(error)) {
      logger.warn(`Update-Download Retry ${attempt}/${RETRIES_PER_CANDIDATE} für ${url}: ${compactErrorText(error)}`);
      await sleep(RETRY_DELAY_MS * attempt);
      await sleep(RETRY_DELAY_MS * attempt, shutdownSignal);
      continue;
    }
    break;
@@ -492,10 +543,14 @@ async function downloadWithRetries(url: string, targetPath: string): Promise<voi
}

async function downloadFromCandidates(candidates: string[], targetPath: string): Promise<void> {
  const shutdownSignal = activeUpdateAbortController?.signal;
  let lastError: unknown = new Error("Update Download fehlgeschlagen");

  logger.info(`Update-Download: ${candidates.length} Kandidat(en), je ${RETRIES_PER_CANDIDATE} Versuche`);
  for (let index = 0; index < candidates.length; index += 1) {
    if (shutdownSignal?.aborted) {
      throw new Error("aborted:update_shutdown");
    }
    const candidate = candidates[index];
    try {
      await downloadWithRetries(candidate, targetPath);
@@ -514,6 +569,12 @@ async function downloadFromCandidates(candidates: string[], targetPath: string):
}
export async function installLatestUpdate(repo: string, prechecked?: UpdateCheckResult): Promise<UpdateInstallResult> {
  if (activeUpdateAbortController && !activeUpdateAbortController.signal.aborted) {
    return { started: false, message: "Update-Download läuft bereits" };
  }
  const updateAbortController = new AbortController();
  activeUpdateAbortController = updateAbortController;

  const safeRepo = normalizeUpdateRepo(repo);
  const check = prechecked && !prechecked.error
    ? prechecked
@@ -551,15 +612,24 @@ export async function installLatestUpdate(repo: string, prechecked?: UpdateCheck
  }

  const fileName = deriveUpdateFileName(effectiveCheck, candidates[0]);
  const targetPath = path.join(os.tmpdir(), "rd-update", `${Date.now()}-${fileName}`);
  const targetPath = path.join(os.tmpdir(), "rd-update", `${Date.now()}-${process.pid}-${crypto.randomUUID()}-${fileName}`);

  try {
    if (updateAbortController.signal.aborted) {
      throw new Error("aborted:update_shutdown");
    }
    await downloadFromCandidates(candidates, targetPath);
    if (updateAbortController.signal.aborted) {
      throw new Error("aborted:update_shutdown");
    }
    await verifyDownloadedInstaller(targetPath, String(effectiveCheck.setupAssetDigest || ""));
    const child = spawn(targetPath, [], {
      detached: true,
      stdio: "ignore"
    });
    child.once("error", (spawnError) => {
      logger.error(`Update-Installer Start fehlgeschlagen: ${compactErrorText(spawnError)}`);
    });
    child.unref();
    return { started: true, message: "Update-Installer gestartet" };
  } catch (error) {
@@ -571,5 +641,16 @@ export async function installLatestUpdate(repo: string, prechecked?: UpdateCheck
    const releaseUrl = String(effectiveCheck.releaseUrl || "").trim();
    const hint = releaseUrl ? ` – Manuell: ${releaseUrl}` : "";
    return { started: false, message: `${compactErrorText(error)}${hint}` };
  } finally {
    if (activeUpdateAbortController === updateAbortController) {
      activeUpdateAbortController = null;
    }
  }
}

export function abortActiveUpdateDownload(): void {
  if (!activeUpdateAbortController || activeUpdateAbortController.signal.aborted) {
    return;
  }
  activeUpdateAbortController.abort("shutdown");
}
@@ -20,10 +20,11 @@ export function compactErrorText(message: unknown, maxLen = 220): string {
if (!raw) {
  return "Unbekannter Fehler";
}
if (raw.length <= maxLen) {
const safeMaxLen = Number.isFinite(maxLen) ? Math.max(4, Math.floor(maxLen)) : 220;
if (raw.length <= safeMaxLen) {
  return raw;
}
return `${raw.slice(0, maxLen - 3)}...`;
return `${raw.slice(0, safeMaxLen - 3)}...`;
}

export function sanitizeFilename(name: string): string {
@@ -71,25 +72,41 @@ export function extractHttpLinksFromText(text: string): string[] {

for (const match of matches) {
  let candidate = String(match || "").trim();
  let openParen = 0;
  let closeParen = 0;
  let openBracket = 0;
  let closeBracket = 0;
  for (const char of candidate) {
    if (char === "(") {
      openParen += 1;
    } else if (char === ")") {
      closeParen += 1;
    } else if (char === "[") {
      openBracket += 1;
    } else if (char === "]") {
      closeBracket += 1;
    }
  }
  while (candidate.length > 0) {
    const lastChar = candidate[candidate.length - 1];
    if (![")", "]", ",", ".", "!", "?", ";", ":"].includes(lastChar)) {
      break;
    }
    if (lastChar === ")") {
      const openCount = (candidate.match(/\(/g) || []).length;
      const closeCount = (candidate.match(/\)/g) || []).length;
      if (closeCount <= openCount) {
      if (closeParen <= openParen) {
        break;
      }
    }
    if (lastChar === "]") {
      const openCount = (candidate.match(/\[/g) || []).length;
      const closeCount = (candidate.match(/\]/g) || []).length;
      if (closeCount <= openCount) {
      if (closeBracket <= openBracket) {
        break;
      }
    }
    if (lastChar === ")") {
      closeParen = Math.max(0, closeParen - 1);
    } else if (lastChar === "]") {
      closeBracket = Math.max(0, closeBracket - 1);
    }
    candidate = candidate.slice(0, -1);
  }
  if (!candidate || !isHttpLink(candidate) || seen.has(candidate)) {
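The hunk above replaces per-iteration regex counting with a single pre-pass over the candidate, decrementing the counts as trailing brackets are trimmed. A standalone sketch of that trimming logic (recreated for illustration): stray punctuation after a URL is removed, but a closing `)` or `]` is kept while the candidate still contains an unmatched opener, which preserves wiki-style links.

```typescript
// Sketch of the single-pass bracket accounting from the hunk above.
function trimTrailingPunctuation(candidate: string): string {
  let openParen = 0, closeParen = 0, openBracket = 0, closeBracket = 0;
  for (const char of candidate) {
    if (char === "(") openParen += 1;
    else if (char === ")") closeParen += 1;
    else if (char === "[") openBracket += 1;
    else if (char === "]") closeBracket += 1;
  }
  while (candidate.length > 0) {
    const lastChar = candidate[candidate.length - 1];
    if (![")", "]", ",", ".", "!", "?", ";", ":"].includes(lastChar)) {
      break; // not trailing punctuation: done
    }
    if (lastChar === ")" && closeParen <= openParen) break; // balanced – keep it
    if (lastChar === "]" && closeBracket <= openBracket) break;
    if (lastChar === ")") closeParen -= 1;       // keep counts in sync
    else if (lastChar === "]") closeBracket -= 1; // with the shrinking string
    candidate = candidate.slice(0, -1);
  }
  return candidate;
}

console.log(trimTrailingPunctuation("http://example.com/a)."));        // stray ")" and "." trimmed
console.log(trimTrailingPunctuation("http://example.com/wiki_(film)")); // kept intact
```

Counting once up front turns the old O(n) regex scan per trimmed character into O(1) bookkeeping per iteration.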
@@ -182,14 +199,17 @@ export function uniquePreserveOrder(items: string[]): string[] {
export function parsePackagesFromLinksText(rawText: string, defaultPackageName: string): ParsedPackageInput[] {
  const lines = String(rawText || "").split(/\r?\n/);
  const packages: ParsedPackageInput[] = [];
  let currentName = sanitizeFilename(defaultPackageName || "Paket");
  let currentName = String(defaultPackageName || "").trim();
  let currentLinks: string[] = [];

  const flush = (): void => {
    const links = uniquePreserveOrder(currentLinks.filter((line) => isHttpLink(line)));
    if (links.length > 0) {
      const normalizedCurrentName = String(currentName || "").trim();
      packages.push({
        name: sanitizeFilename(currentName || inferPackageNameFromLinks(links)),
        name: normalizedCurrentName
          ? sanitizeFilename(normalizedCurrentName)
          : inferPackageNameFromLinks(links),
        links
      });
    }
@@ -204,7 +224,7 @@ export function parsePackagesFromLinksText(rawText: string, defaultPackageName:
    const marker = text.match(/^#\s*package\s*:\s*(.+)$/i);
    if (marker) {
      flush();
      currentName = sanitizeFilename(marker[1]);
      currentName = String(marker[1] || "").trim();
      continue;
    }
    currentLinks.push(text);
@@ -80,6 +80,9 @@ function formatSpeedMbps(speedBps: number): string {
}

function humanSize(bytes: number): string {
  if (!Number.isFinite(bytes) || bytes < 0) {
    return "0 B";
  }
  if (bytes < 1024) { return `${bytes} B`; }
  if (bytes < 1024 * 1024) { return `${(bytes / 1024).toFixed(1)} KB`; }
  if (bytes < 1024 * 1024 * 1024) { return `${(bytes / (1024 * 1024)).toFixed(2)} MB`; }
@@ -258,6 +261,9 @@ export function App(): ReactElement {
let unsubscribe: (() => void) | null = null;
let unsubClipboard: (() => void) | null = null;
void window.rd.getSnapshot().then((state) => {
  if (!mountedRef.current) {
    return;
  }
  setSnapshot(state);
  setSettingsDraft(state.settings);
  settingsDirtyRef.current = false;
@@ -265,6 +271,9 @@ export function App(): ReactElement {
  applyTheme(state.settings.theme);
  if (state.settings.autoUpdateCheck) {
    void window.rd.checkUpdates().then((result) => {
      if (!mountedRef.current) {
        return;
      }
      void handleUpdateResult(result, "startup");
    }).catch(() => undefined);
  }
@@ -717,7 +726,8 @@ export function App(): ReactElement {
  showToast(`Fehler bei Drag-and-Drop: ${String(error)}`, 2600);
});
} else if (droppedText.trim()) {
  setCollectorTabs((prev) => prev.map((t) => t.id === currentCollectorTab.id
  const activeCollectorId = activeCollectorTabRef.current;
  setCollectorTabs((prev) => prev.map((t) => t.id === activeCollectorId
    ? { ...t, text: t.text ? `${t.text}\n${droppedText}` : droppedText } : t));
  setTab("collector");
  showToast("Links per Drag-and-Drop eingefügt");
@@ -748,6 +758,7 @@ export function App(): ReactElement {
  return;
}

actionBusyRef.current = true;
setActionBusy(true);

const input = document.createElement("input");
@@ -755,7 +766,8 @@ export function App(): ReactElement {
input.accept = ".json";

const releasePickerBusy = (): void => {
  setActionBusy(actionBusyRef.current);
  actionBusyRef.current = false;
  setActionBusy(false);
};

const onWindowFocus = (): void => {
@@ -1198,8 +1210,12 @@ export function App(): ReactElement {
setDownloadsSortDescending(nextDescending);
const baseOrder = packageOrderRef.current.length > 0 ? packageOrderRef.current : snapshot.session.packageOrder;
const sorted = sortPackageOrderByName(baseOrder, snapshot.session.packages, nextDescending);
pendingPackageOrderRef.current = [...sorted];
pendingPackageOrderAtRef.current = Date.now();
packageOrderRef.current = sorted;
void window.rd.reorderPackages(sorted).catch((error) => {
  pendingPackageOrderRef.current = null;
  pendingPackageOrderAtRef.current = 0;
  packageOrderRef.current = serverPackageOrderRef.current;
  showToast(`Sortierung fehlgeschlagen: ${String(error)}`, 2400);
});
@@ -1268,6 +1284,7 @@ export function App(): ReactElement {
<button className="btn" disabled={actionBusy} onClick={onCheckUpdates}>Updates prüfen</button>
<button className={`btn${settingsDraft.theme === "light" ? " btn-active" : ""}`} onClick={() => {
  const next = settingsDraft.theme === "dark" ? "light" : "dark";
  settingsDraftRevisionRef.current += 1;
  settingsDirtyRef.current = true;
  setSettingsDirty(true);
  setSettingsDraft((prev) => ({ ...prev, theme: next as AppTheme }));
@@ -1462,7 +1479,7 @@ export function App(): ReactElement {
<div className="modal-backdrop" onClick={() => closeConfirmPrompt(false)}>
  <div className="modal-card" onClick={(event) => event.stopPropagation()}>
    <h3>{confirmPrompt.title}</h3>
    <p>{confirmPrompt.message}</p>
    <p style={{ whiteSpace: "pre-line" }}>{confirmPrompt.message}</p>
    <div className="modal-actions">
      <button className="btn" onClick={() => closeConfirmPrompt(false)}>Abbrechen</button>
      <button
@@ -1653,6 +1670,7 @@ const PackageCard = memo(function PackageCard({ pkg, items, packageSpeed, isFirs
if (a.id !== b.id
  || a.updatedAt !== b.updatedAt
  || a.status !== b.status
  || a.fileName !== b.fileName
  || a.progressPercent !== b.progressPercent
  || a.speedBps !== b.speedBps
  || a.retries !== b.retries
@@ -16,16 +16,66 @@ afterEach(() => {
});

describe("container", () => {
  it("rejects oversized DLC files before network access", async () => {
  it("skips oversized DLC files without throwing and blocking other files", async () => {
    const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
    tempDirs.push(dir);
    const filePath = path.join(dir, "oversized.dlc");
    fs.writeFileSync(filePath, Buffer.alloc((8 * 1024 * 1024) + 1, 1));
    const oversizedFilePath = path.join(dir, "oversized.dlc");
    fs.writeFileSync(oversizedFilePath, Buffer.alloc((8 * 1024 * 1024) + 1, 1));

    const fetchSpy = vi.fn(async () => new Response("should-not-run", { status: 500 }));
    // Create a valid mockup DLC that would be skipped if an error was thrown
    const validFilePath = path.join(dir, "valid.dlc");
    // Just needs to be short enough to pass file limits but fail parsing, triggering dcrypt fallback
    fs.writeFileSync(validFilePath, Buffer.from("Valid but not real DLC content..."));

    const fetchSpy = vi.fn(async () => {
      // Mock dcrypt response for valid.dlc
      return new Response("http://example.com/file1.rar\nhttp://example.com/file2.rar", { status: 200 });
    });
    globalThis.fetch = fetchSpy as unknown as typeof fetch;

    await expect(importDlcContainers([filePath])).rejects.toThrow(/zu groß/i);
    expect(fetchSpy).toHaveBeenCalledTimes(0);
    const result = await importDlcContainers([oversizedFilePath, validFilePath]);

    // Expect the oversized file to be silently skipped, and the valid one to be parsed into 2 packages (one per link name)
    expect(result).toHaveLength(2);
    expect(result[0].links).toEqual(["http://example.com/file1.rar"]);
    expect(result[1].links).toEqual(["http://example.com/file2.rar"]);
    expect(fetchSpy).toHaveBeenCalledTimes(1);
  });

  it("skips non-dlc files completely", async () => {
    const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-non-"));
    tempDirs.push(dir);
    const txtPath = path.join(dir, "links.txt");
    fs.writeFileSync(txtPath, "http://link.com/1");

    const result = await importDlcContainers([txtPath]);
    expect(result).toEqual([]);
  });

  it("falls back to dcrypt if local decryption returns empty", async () => {
    const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dlc-"));
    tempDirs.push(dir);
    const filePath = path.join(dir, "fallback.dlc");

    // A file large enough to trigger a local decryption attempt (needs > 89 bytes to pass the slice check)
    fs.writeFileSync(filePath, Buffer.alloc(100, 1).toString("base64"));

    const fetchSpy = vi.fn(async (url: string | URL | Request) => {
      const urlStr = String(url);
      if (urlStr.includes("rc")) {
        // Mock local RC service failure (returning 404 or an empty string)
        return new Response("", { status: 404 });
      } else {
        // Mock dcrypt fallback success
        return new Response("http://fallback.com/1", { status: 200 });
      }
    });
    globalThis.fetch = fetchSpy as unknown as typeof fetch;

    const result = await importDlcContainers([filePath]);
    expect(result).toHaveLength(1);
    expect(result[0].links).toEqual(["http://fallback.com/1"]);
    // Should have tried both!
    expect(fetchSpy).toHaveBeenCalledTimes(2);
  });
});
74  tests/link-parser.test.ts  Normal file
@@ -0,0 +1,74 @@
import { describe, expect, it } from "vitest";
import { mergePackageInputs, parseCollectorInput } from "../src/main/link-parser";

describe("link-parser", () => {
  describe("mergePackageInputs", () => {
    it("merges packages with the same name and preserves order", () => {
      const input = [
        { name: "Package A", links: ["http://link1", "http://link2"] },
        { name: "Package B", links: ["http://link3"] },
        { name: "Package A", links: ["http://link4", "http://link1"] },
        { name: "", links: ["http://link5"] } // empty name will be inferred
      ];

      const result = mergePackageInputs(input);

      expect(result).toHaveLength(3); // Package A, Package B, and the inferred "Paket"

      const pkgA = result.find(p => p.name === "Package A");
      expect(pkgA?.links).toEqual(["http://link1", "http://link2", "http://link4"]); // link1 deduplicated

      const pkgB = result.find(p => p.name === "Package B");
      expect(pkgB?.links).toEqual(["http://link3"]);
    });

    it("sanitizes names during merge", () => {
      const input = [
        { name: "Valid_Name", links: ["http://link1"] },
        { name: "Valid?Name*", links: ["http://link2"] }
      ];

      const result = mergePackageInputs(input);

      // "Valid?Name*" becomes "Valid Name " and is trimmed to "Valid Name"
      expect(result.map(p => p.name).sort()).toEqual(["Valid Name", "Valid_Name"]);
    });
  });

  describe("parseCollectorInput", () => {
    it("returns empty array for empty or invalid input", () => {
      expect(parseCollectorInput("")).toEqual([]);
      expect(parseCollectorInput("just some text without links")).toEqual([]);
      expect(parseCollectorInput("ftp://notsupported")).toEqual([]);
    });

    it("parses and merges links from raw text", () => {
      const rawText = `
Here are some links:
http://example.com/part1.rar
http://example.com/part2.rar

# package: Custom_Name
http://other.com/file1
http://other.com/file2
`;

      const result = parseCollectorInput(rawText, "DefaultFallback");

      // Should yield 2 packages: "DefaultFallback" and "Custom_Name"
      expect(result).toHaveLength(2);

      const defaultPkg = result.find(p => p.name === "DefaultFallback");
      expect(defaultPkg?.links).toEqual([
        "http://example.com/part1.rar",
        "http://example.com/part2.rar"
      ]);

      const customPkg = result.find(p => p.name === "Custom_Name"); // already sanitized
      expect(customPkg?.links).toEqual([
        "http://other.com/file1",
        "http://other.com/file2"
      ]);
    });
  });
});

127  tests/mega-web-fallback.test.ts  Normal file
@@ -0,0 +1,127 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { MegaWebFallback } from "../src/main/mega-web-fallback";

const originalFetch = globalThis.fetch;

describe("mega-web-fallback", () => {
  afterEach(() => {
    globalThis.fetch = originalFetch;
    vi.restoreAllMocks();
  });

  describe("MegaWebFallback class", () => {
    it("returns null when credentials are empty", async () => {
      const fallback = new MegaWebFallback(() => ({ login: "", password: "" }));
      const result = await fallback.unrestrict("https://mega.debrid/test");
      expect(result).toBeNull();
    });

    it("logs in, fetches HTML, parses code, and polls AJAX for direct url", async () => {
      let fetchCallCount = 0;
      globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
        const urlStr = String(url);
        fetchCallCount += 1;

        if (urlStr.includes("form=login")) {
          const headers = new Headers();
          headers.append("set-cookie", "session=goodcookie; path=/");
          return new Response("", { headers, status: 200 });
        }

        if (urlStr.includes("page=debrideur")) {
          return new Response('<form id="debridForm"></form>', { status: 200 });
        }

        if (urlStr.includes("form=debrid")) {
          // The POST that generates the code
          return new Response(`
            <div class="acp-box">
              <h3>Link: https://mega.debrid/link1</h3>
              <a href="javascript:processDebrid(1,'secretcode123',0)">Download</a>
            </div>
          `, { status: 200 });
        }

        if (urlStr.includes("ajax=debrid")) {
          // Polling endpoint
          return new Response(JSON.stringify({ link: "https://mega.direct/123" }), { status: 200 });
        }

        return new Response("Not found", { status: 404 });
      }) as unknown as typeof fetch;

      const fallback = new MegaWebFallback(() => ({ login: "user", password: "pwd" }));

      const result = await fallback.unrestrict("https://mega.debrid/link1");
      expect(result).not.toBeNull();
      expect(result?.directUrl).toBe("https://mega.direct/123");
      expect(result?.fileName).toBe("link1");
      // Calls: 1. login POST, 2. verify GET, 3. generate POST, 4. polling POST
      expect(fetchCallCount).toBe(4);
    });

    it("throws if login fails to set cookie", async () => {
      globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
        const urlStr = String(url);
        if (urlStr.includes("form=login")) {
          const headers = new Headers(); // no cookie
          return new Response("", { headers, status: 200 });
        }
        return new Response("Not found", { status: 404 });
      }) as unknown as typeof fetch;

      const fallback = new MegaWebFallback(() => ({ login: "bad", password: "bad" }));

      await expect(fallback.unrestrict("http://mega.debrid/file"))
        .rejects.toThrow("Mega-Web Login liefert kein Session-Cookie");
    });

    it("throws if login verify check fails (no form found)", async () => {
      globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
        const urlStr = String(url);
        if (urlStr.includes("form=login")) {
          const headers = new Headers();
          headers.append("set-cookie", "session=goodcookie; path=/");
          return new Response("", { headers, status: 200 });
        }
        if (urlStr.includes("page=debrideur")) {
          // Missing form!
          return new Response("<html><body>Nothing here</body></html>", { status: 200 });
        }
        return new Response("Not found", { status: 404 });
      }) as unknown as typeof fetch;

      const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));

      await expect(fallback.unrestrict("http://mega.debrid/file"))
        .rejects.toThrow("Mega-Web Login ungültig oder Session blockiert");
    });

    it("returns null if generation fails to find a code", async () => {
      globalThis.fetch = vi.fn(async (url: string | URL | Request) => {
        const urlStr = String(url);
        if (urlStr.includes("form=login")) {
          const headers = new Headers();
          headers.append("set-cookie", "session=goodcookie; path=/");
          return new Response("", { headers, status: 200 });
        }
        if (urlStr.includes("page=debrideur")) {
          return new Response('<form id="debridForm"></form>', { status: 200 });
        }
        if (urlStr.includes("form=debrid")) {
          // The generate POST returns HTML without any codes
          return new Response("<div>No links here</div>", { status: 200 });
        }
        return new Response("Not found", { status: 404 });
      }) as unknown as typeof fetch;

      const fallback = new MegaWebFallback(() => ({ login: "a", password: "b" }));
      const result = await fallback.unrestrict("http://mega.debrid/file");

      // Generation fails -> cookie is reset -> retried once -> fails again -> returns null
      expect(result).toBeNull();
    });
  });
});