Compare commits


No commits in common. "main" and "v1.6.0" have entirely different histories.
main ... v1.6.0

53 changed files with 1006 additions and 4030 deletions

.gitignore

@@ -28,12 +28,3 @@ coverage/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Forgejo deployment runtime files
deploy/forgejo/.env
deploy/forgejo/forgejo/
deploy/forgejo/postgres/
deploy/forgejo/caddy/data/
deploy/forgejo/caddy/config/
deploy/forgejo/caddy/logs/
deploy/forgejo/backups/


@@ -1,23 +0,0 @@
## Release + update source (important)
- The primary platform is `https://git.24-music.de`
- Default repo: `Administrator/real-debrid-downloader`
- No longer release primarily via Codeberg/GitHub
## Releasing
1. Set the token:
- PowerShell: `$env:GITEA_TOKEN="<token>"`
2. Run the release:
- `npm run release:gitea -- <version> [notes]`
The script:
- bumps `package.json`
- builds the Windows artifacts
- pushes `main` + the tag
- creates the release on `git.24-music.de`
- uploads the assets
## Auto-update
- The updater currently uses `git.24-music.de` as its default source


@@ -1,6 +1,6 @@
# Multi Debrid Downloader
Desktop downloader with fast queue management, automatic extraction, and robust error handling.
Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** with fast queue management, automatic extraction, and robust error handling.
![Platform](https://img.shields.io/badge/platform-Windows%2010%2F11-0078D6)
![Electron](https://img.shields.io/badge/Electron-31.x-47848F)
@@ -65,18 +65,18 @@ Desktop downloader with fast queue management, automatic extraction, and robust
- Minimize-to-tray with tray menu controls.
- Speed limits globally or per download.
- Bandwidth schedules for time-based speed profiles.
- Built-in auto-updater via `git.24-music.de` Releases.
- Built-in auto-updater via Codeberg Releases.
- Long path support (>260 characters) on Windows.
## Installation
### Option A: prebuilt releases (recommended)
1. Download a release from the `git.24-music.de` Releases page.
1. Download a release from the Codeberg Releases page.
2. Run the installer or portable build.
3. Add your debrid tokens in Settings.
Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
Releases: `https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases`
### Option B: build from source
@@ -103,34 +103,21 @@ npm run dev
| `npm test` | Runs Vitest unit tests |
| `npm run self-check` | Runs integrated end-to-end self-checks |
| `npm run release:win` | Creates Windows installer and portable build |
| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
| `npm run release:codeberg -- <version> [notes]` | One-command version bump + build + tag + Codeberg release upload |
### One-command git.24-music release
### One-command Codeberg release
```bash
npm run release:gitea -- 1.6.31 "- Maintenance update"
npm run release:codeberg -- 1.4.42 "- Maintenance update"
```
This command will:
1. Bump `package.json` version.
2. Build setup/portable artifacts (`npm run release:win`).
3. Commit and push `main` to your `git.24-music.de` remote.
3. Commit and push `main` to your Codeberg remote.
4. Create and push tag `v<version>`.
5. Create/update the Gitea release and upload required assets.
Required once before release:
```bash
git remote add gitea https://git.24-music.de/<user>/<repo>.git
```
PowerShell token setup:
```powershell
$env:GITEA_TOKEN="<your-token>"
```
5. Create/update the Codeberg release and upload required assets.
## Typical workflow
@@ -160,43 +147,14 @@ The app stores runtime files in Electron's `userData` directory, including:
## Troubleshooting
- Download does not start: verify token and selected provider in Settings.
- Extraction fails: check archive passwords and native extractor installation (7-Zip/WinRAR). Optional JVM extractor can be forced with `RD_EXTRACT_BACKEND=jvm`.
- Extraction fails: check archive passwords, JVM runtime (`resources/extractor-jvm`), or force legacy mode with `RD_EXTRACT_BACKEND=legacy`.
- Very slow downloads: check active speed limit and bandwidth schedules.
- Unexpected interruptions: enable reconnect and fallback providers.
- Stalled downloads: the app auto-detects stalls within 10 seconds and retries automatically.
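The stall handling above can be sketched as a small progress watchdog. This is an illustrative sketch, not the app's actual implementation; `StallDetector` and its API are assumptions, with only the 10-second threshold taken from the note above:

```javascript
// Hypothetical stall watchdog: if the byte counter has not advanced
// within `thresholdMs`, the download is considered stalled and a
// retry can be triggered. The clock is injectable for testing.
class StallDetector {
  constructor(thresholdMs = 10_000, now = () => Date.now()) {
    this.thresholdMs = thresholdMs;
    this.now = now;
    this.lastBytes = 0;
    this.lastProgressAt = this.now();
  }

  // Call on every progress event with the total bytes received so far.
  report(bytes) {
    if (bytes > this.lastBytes) {
      this.lastBytes = bytes;
      this.lastProgressAt = this.now();
    }
  }

  // True once no forward progress has been seen for thresholdMs.
  isStalled() {
    return this.now() - this.lastProgressAt >= this.thresholdMs;
  }
}
```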
## Changelog
Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
### v1.6.61 (2026-03-05)
- Fixed leftover empty package folders in `Downloader Unfertig` after successful extraction.
- Resume marker files (`.rd_extract_progress*.json`) are now treated as ignorable for empty-folder cleanup.
- Deferred post-processing now clears resume markers before running empty-directory removal.
### v1.6.60 (2026-03-05)
- Added package-scoped password cache for extraction: once the first archive in a package is solved, following archives in the same package reuse that password first.
- Kept fallback behavior intact (`""` and other candidates are still tested), but moved empty-password probing behind the learned password to reduce per-archive delays.
- Added cache invalidation on real `wrong_password` failures so stale passwords are automatically discarded.
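The learned-password ordering described in this entry can be sketched as follows. `PasswordCache` and all names are assumptions for illustration, not the app's code; only the behavior (learned password first, empty-password probe and fallbacks kept, invalidation on a real wrong-password failure) follows the notes above.

```javascript
// Per-package password cache: once an archive in a package extracts
// successfully, its password is tried first for the remaining archives.
class PasswordCache {
  constructor() {
    this.learned = new Map(); // packageId -> last working password
  }

  // Candidate order: learned password first, then the empty-password
  // probe and the remaining configured candidates (fallbacks intact).
  candidatesFor(packageId, configured) {
    const learned = this.learned.get(packageId);
    const rest = ["", ...configured].filter((p) => p !== learned);
    return learned !== undefined ? [learned, ...rest] : rest;
  }

  remember(packageId, password) {
    this.learned.set(packageId, password);
  }

  // Drop the cached entry on a real wrong_password failure so a stale
  // password is discarded automatically.
  invalidate(packageId, password) {
    if (this.learned.get(packageId) === password) {
      this.learned.delete(packageId);
    }
  }
}
```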
### v1.6.59 (2026-03-05)
- Switched default extraction backend to native tools (`legacy`) for more stable archive-to-archive flow.
- Prioritized 7-Zip as primary native extractor, with WinRAR/UnRAR as fallback.
- JVM extractor remains available as opt-in via `RD_EXTRACT_BACKEND=jvm`.
### v1.6.58 (2026-03-05)
- Fixed extraction progress oscillation (`1% -> 100% -> 1%` loops) during password retries.
- Kept strict archive completion logic, but normalized in-progress archive percent to avoid false visual done states before real completion.
### v1.6.57 (2026-03-05)
- Fixed extraction flow so archives are marked done only on real completion, not on temporary `100%` progress spikes.
- Improved password handling: after the first successful archive, the discovered password is prioritized for subsequent archives.
- Fixed progress parsing for password retries (reset/restart handling), reducing visible and real gaps between archive extractions.
Release history is available on [Codeberg Releases](https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases).
## License

_upload_release.mjs

@@ -0,0 +1,75 @@
import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const credResult = spawnSync("git", ["credential", "fill"], {
input: "protocol=https\nhost=codeberg.org\n\n",
encoding: "utf8",
stdio: ["pipe", "pipe", "pipe"]
});
const creds = new Map();
for (const line of credResult.stdout.split(/\r?\n/)) {
if (line.includes("=")) {
const [k, v] = line.split("=", 2);
creds.set(k, v);
}
}
const auth = "Basic " + Buffer.from(creds.get("username") + ":" + creds.get("password")).toString("base64");
const owner = "Sucukdeluxe";
const repo = "real-debrid-downloader";
const tag = "v1.5.35";
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
async function main() {
await fetch(baseApi, {
method: "PATCH",
headers: { Authorization: auth, "Content-Type": "application/json" },
body: JSON.stringify({ has_releases: true })
});
const createRes = await fetch(`${baseApi}/releases`, {
method: "POST",
headers: { Authorization: auth, "Content-Type": "application/json", Accept: "application/json" },
body: JSON.stringify({
tag_name: tag,
target_commitish: "main",
name: tag,
body: "- Fix: Fortschritt zeigt jetzt kombinierten Wert (Download + Entpacken)\n- Fix: Pausieren zeigt nicht mehr 'Warte auf Daten'\n- Pixel-perfekte Dual-Layer Progress-Bar Texte (clip-path)",
draft: false,
prerelease: false
})
});
const release = await createRes.json();
if (!createRes.ok) {
console.error("Create failed:", JSON.stringify(release));
process.exit(1);
}
console.log("Release created:", release.id);
const files = [
"Real-Debrid-Downloader Setup 1.5.35.exe",
"Real-Debrid-Downloader 1.5.35.exe",
"latest.yml",
"Real-Debrid-Downloader Setup 1.5.35.exe.blockmap"
];
for (const f of files) {
const filePath = path.join("release", f);
const data = fs.readFileSync(filePath);
const uploadUrl = `${baseApi}/releases/${release.id}/assets?name=${encodeURIComponent(f)}`;
const res = await fetch(uploadUrl, {
method: "POST",
headers: { Authorization: auth, "Content-Type": "application/octet-stream" },
body: data
});
if (res.ok) {
console.log("Uploaded:", f);
} else if (res.status === 409 || res.status === 422) {
console.log("Skipped existing:", f);
} else {
console.error("Upload failed for", f, ":", res.status);
}
}
console.log(`Done! https://codeberg.org/${owner}/${repo}/releases/tag/${tag}`);
}
main().catch(e => { console.error(e.message); process.exit(1); });


@@ -25,11 +25,11 @@ AppPublisher=Sucukdeluxe
DefaultDirName={autopf}\{#MyAppName}
DefaultGroupName={#MyAppName}
OutputDir={#MyOutputDir}
OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
Compression=lzma
SolidCompression=yes
WizardStyle=modern
PrivilegesRequired=lowest
PrivilegesRequired=admin
ArchitecturesInstallIn64BitMode=x64compatible
UninstallDisplayIcon={app}\{#MyAppExeName}
SetupIconFile={#MyIconFile}
@@ -39,8 +39,8 @@ Name: "german"; MessagesFile: "compiler:Languages\German.isl"
Name: "english"; MessagesFile: "compiler:Default.isl"
[Files]
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"; Flags: ignoreversion
[Icons]
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"


@@ -1,7 +1,7 @@
{
"name": "real-debrid-downloader",
"version": "1.6.66",
"description": "Desktop downloader",
"version": "1.6.0",
"description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
"main": "build/main/main/main.js",
"author": "Sucukdeluxe",
"license": "MIT",
@@ -17,8 +17,7 @@
"test": "vitest run",
"self-check": "tsx tests/self-check.ts",
"release:win": "npm run build && electron-builder --publish never --win nsis portable",
"release:gitea": "node scripts/release_gitea.mjs",
"release:forgejo": "node scripts/release_gitea.mjs"
"release:codeberg": "node scripts/release_codeberg.mjs"
},
"dependencies": {
"adm-zip": "^0.5.16",


@@ -3,9 +3,7 @@ package com.sucukdeluxe.extractor;
import net.lingala.zip4j.ZipFile;
import net.lingala.zip4j.exception.ZipException;
import net.lingala.zip4j.model.FileHeader;
import net.sf.sevenzipjbinding.ExtractAskMode;
import net.sf.sevenzipjbinding.ExtractOperationResult;
import net.sf.sevenzipjbinding.IArchiveExtractCallback;
import net.sf.sevenzipjbinding.IArchiveOpenCallback;
import net.sf.sevenzipjbinding.IArchiveOpenVolumeCallback;
import net.sf.sevenzipjbinding.IInArchive;
@@ -28,7 +26,6 @@ import java.io.InputStream;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Base64;
import java.util.HashMap;
@@ -45,18 +42,12 @@ public final class JBindExtractorMain {
private static final Pattern NUMBERED_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.zip\\.\\d{3}$");
private static final Pattern OLD_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.z\\d{2,3}$");
private static final Pattern SEVEN_ZIP_SPLIT_RE = Pattern.compile("(?i).*\\.7z\\.001$");
private static final Pattern DIGIT_SUFFIX_RE = Pattern.compile("\\d{2,3}");
private static final Pattern WINDOWS_SPECIAL_CHARS_RE = Pattern.compile("[:<>*?\"\\|]");
private static volatile boolean sevenZipInitialized = false;
private JBindExtractorMain() {
}
public static void main(String[] args) {
if (args.length == 1 && "--daemon".equals(args[0])) {
runDaemon();
return;
}
int exit = 1;
try {
ExtractionRequest request = parseArgs(args);
@@ -71,127 +62,6 @@ public final class JBindExtractorMain {
System.exit(exit);
}
private static void runDaemon() {
System.out.println("RD_DAEMON_READY");
System.out.flush();
java.io.BufferedReader reader = new java.io.BufferedReader(
new java.io.InputStreamReader(System.in, StandardCharsets.UTF_8));
try {
String line;
while ((line = reader.readLine()) != null) {
line = line.trim();
if (line.isEmpty()) {
continue;
}
int exitCode = 1;
try {
ExtractionRequest request = parseDaemonRequest(line);
exitCode = runExtraction(request);
} catch (IllegalArgumentException error) {
emitError("Argumentfehler: " + safeMessage(error));
exitCode = 2;
} catch (Throwable error) {
emitError(safeMessage(error));
exitCode = 1;
}
System.out.println("RD_REQUEST_DONE " + exitCode);
System.out.flush();
}
} catch (IOException ignored) {
// stdin closed; parent process exited
}
}
private static ExtractionRequest parseDaemonRequest(String jsonLine) {
// Minimal JSON parsing without external dependencies.
// Expected format: {"archive":"...","target":"...","conflict":"...","backend":"...","passwords":["...","..."]}
ExtractionRequest request = new ExtractionRequest();
request.archiveFile = new File(extractJsonString(jsonLine, "archive"));
request.targetDir = new File(extractJsonString(jsonLine, "target"));
String conflict = extractJsonString(jsonLine, "conflict");
if (conflict.length() > 0) {
request.conflictMode = ConflictMode.fromValue(conflict);
}
String backend = extractJsonString(jsonLine, "backend");
if (backend.length() > 0) {
request.backend = Backend.fromValue(backend);
}
// Parse passwords array
int pwStart = jsonLine.indexOf("\"passwords\"");
if (pwStart >= 0) {
int arrStart = jsonLine.indexOf('[', pwStart);
int arrEnd = jsonLine.indexOf(']', arrStart);
if (arrStart >= 0 && arrEnd > arrStart) {
String arrContent = jsonLine.substring(arrStart + 1, arrEnd);
int idx = 0;
while (idx < arrContent.length()) {
int qStart = arrContent.indexOf('"', idx);
if (qStart < 0) break;
int qEnd = findClosingQuote(arrContent, qStart + 1);
if (qEnd < 0) break;
request.passwords.add(unescapeJsonString(arrContent.substring(qStart + 1, qEnd)));
idx = qEnd + 1;
}
}
}
if (request.archiveFile == null || !request.archiveFile.exists() || !request.archiveFile.isFile()) {
throw new IllegalArgumentException("Archiv nicht gefunden: " +
(request.archiveFile == null ? "null" : request.archiveFile.getAbsolutePath()));
}
if (request.targetDir == null) {
throw new IllegalArgumentException("--target fehlt");
}
return request;
}
private static String extractJsonString(String json, String key) {
String search = "\"" + key + "\"";
int keyIdx = json.indexOf(search);
if (keyIdx < 0) return "";
int colonIdx = json.indexOf(':', keyIdx + search.length());
if (colonIdx < 0) return "";
int qStart = json.indexOf('"', colonIdx + 1);
if (qStart < 0) return "";
int qEnd = findClosingQuote(json, qStart + 1);
if (qEnd < 0) return "";
return unescapeJsonString(json.substring(qStart + 1, qEnd));
}
private static int findClosingQuote(String s, int from) {
for (int i = from; i < s.length(); i++) {
char c = s.charAt(i);
if (c == '\\') {
i++; // skip escaped character
continue;
}
if (c == '"') return i;
}
return -1;
}
private static String unescapeJsonString(String s) {
if (s.indexOf('\\') < 0) return s;
StringBuilder sb = new StringBuilder(s.length());
for (int i = 0; i < s.length(); i++) {
char c = s.charAt(i);
if (c == '\\' && i + 1 < s.length()) {
char next = s.charAt(i + 1);
switch (next) {
case '"': sb.append('"'); i++; break;
case '\\': sb.append('\\'); i++; break;
case '/': sb.append('/'); i++; break;
case 'n': sb.append('\n'); i++; break;
case 'r': sb.append('\r'); i++; break;
case 't': sb.append('\t'); i++; break;
default: sb.append(c); break;
}
} else {
sb.append(c);
}
}
return sb.toString();
}
private static int runExtraction(ExtractionRequest request) throws Exception {
List<String> passwords = normalizePasswords(request.passwords);
Exception lastError = null;
@@ -282,35 +152,30 @@
}
ensureDirectory(output.getParentFile());
rejectSymlink(output);
long[] remaining = new long[] { itemUnits };
boolean extractionSuccess = false;
try {
InputStream in = zipFile.getInputStream(header);
OutputStream out = new FileOutputStream(output);
try {
OutputStream out = new FileOutputStream(output);
try {
byte[] buffer = new byte[BUFFER_SIZE];
while (true) {
int read = in.read(buffer);
if (read < 0) {
break;
}
if (read == 0) {
continue;
}
out.write(buffer, 0, read);
long accounted = Math.min(remaining[0], (long) read);
remaining[0] -= accounted;
progress.advance(accounted);
byte[] buffer = new byte[BUFFER_SIZE];
while (true) {
int read = in.read(buffer);
if (read < 0) {
break;
}
} finally {
try {
out.close();
} catch (Throwable ignored) {
if (read == 0) {
continue;
}
out.write(buffer, 0, read);
long accounted = Math.min(remaining[0], (long) read);
remaining[0] -= accounted;
progress.advance(accounted);
}
} finally {
try {
out.close();
} catch (Throwable ignored) {
}
try {
in.close();
} catch (Throwable ignored) {
@@ -323,19 +188,11 @@
if (modified > 0) {
output.setLastModified(modified);
}
extractionSuccess = true;
} catch (ZipException error) {
if (isWrongPassword(error, encrypted)) {
throw new WrongPasswordException(error);
}
throw error;
} finally {
if (!extractionSuccess && output.exists()) {
try {
output.delete();
} catch (Throwable ignored) {
}
}
}
}
@@ -362,99 +219,98 @@
try {
context = openSevenZipArchive(request.archiveFile, password);
IInArchive archive = context.archive;
int itemCount = archive.getNumberOfItems();
if (itemCount <= 0) {
throw new IOException("Archiv enthält keine Einträge oder konnte nicht gelesen werden: " + request.archiveFile.getAbsolutePath());
}
ISimpleInArchive simple = archive.getSimpleInterface();
ISimpleInArchiveItem[] items = simple.getArchiveItems();
// Pre-scan: collect file indices, sizes, output paths, and detect encryption
long totalUnits = 0;
boolean encrypted = false;
List<Integer> fileIndices = new ArrayList<Integer>();
List<File> outputFiles = new ArrayList<File>();
List<Long> fileSizes = new ArrayList<Long>();
for (ISimpleInArchiveItem item : items) {
if (item == null || item.isFolder()) {
continue;
}
try {
encrypted = encrypted || item.isEncrypted();
} catch (Throwable ignored) {
// ignore encrypted flag read issues
}
totalUnits += safeSize(item.getSize());
}
ProgressTracker progress = new ProgressTracker(totalUnits);
progress.emitStart();
Set<String> reserved = new HashSet<String>();
for (ISimpleInArchiveItem item : items) {
if (item == null) {
continue;
}
for (int i = 0; i < itemCount; i++) {
Boolean isFolder = (Boolean) archive.getProperty(i, PropID.IS_FOLDER);
String entryPath = (String) archive.getProperty(i, PropID.PATH);
String entryName = normalizeEntryName(entryPath, "item-" + i);
if (Boolean.TRUE.equals(isFolder)) {
String entryName = normalizeEntryName(item.getPath(), "item-" + item.getItemIndex());
if (item.isFolder()) {
File dir = resolveDirectory(request.targetDir, entryName);
ensureDirectory(dir);
reserved.add(pathKey(dir));
continue;
}
try {
Boolean isEncrypted = (Boolean) archive.getProperty(i, PropID.ENCRYPTED);
encrypted = encrypted || Boolean.TRUE.equals(isEncrypted);
} catch (Throwable ignored) {
// ignore encrypted flag read issues
}
Long rawSize = (Long) archive.getProperty(i, PropID.SIZE);
long itemSize = safeSize(rawSize);
totalUnits += itemSize;
long itemUnits = safeSize(item.getSize());
File output = resolveOutputFile(request.targetDir, entryName, request.conflictMode, reserved);
fileIndices.add(i);
outputFiles.add(output); // null if skipped
fileSizes.add(itemSize);
}
if (fileIndices.isEmpty()) {
// All items are folders or skipped
ProgressTracker progress = new ProgressTracker(1);
progress.emitStart();
progress.emitDone();
return;
}
ProgressTracker progress = new ProgressTracker(totalUnits);
progress.emitStart();
// Build index array for bulk extract
int[] indices = new int[fileIndices.size()];
for (int i = 0; i < fileIndices.size(); i++) {
indices[i] = fileIndices.get(i);
}
// Map from archive index to our position in fileIndices/outputFiles
Map<Integer, Integer> indexToPos = new HashMap<Integer, Integer>();
for (int i = 0; i < fileIndices.size(); i++) {
indexToPos.put(fileIndices.get(i), i);
}
// Bulk extraction state
final boolean encryptedFinal = encrypted;
final String effectivePassword = password == null ? "" : password;
final File[] currentOutput = new File[1];
final FileOutputStream[] currentStream = new FileOutputStream[1];
final boolean[] currentSuccess = new boolean[1];
final long[] currentRemaining = new long[1];
final Throwable[] firstError = new Throwable[1];
final int[] currentPos = new int[] { -1 };
try {
archive.extract(indices, false, new BulkExtractCallback(
archive, indexToPos, fileIndices, outputFiles, fileSizes,
progress, encryptedFinal, effectivePassword, currentOutput,
currentStream, currentSuccess, currentRemaining, currentPos, firstError
));
} catch (SevenZipException error) {
if (looksLikeWrongPassword(error, encryptedFinal)) {
throw new WrongPasswordException(error);
if (output == null) {
progress.advance(itemUnits);
continue;
}
throw error;
}
if (firstError[0] != null) {
if (firstError[0] instanceof WrongPasswordException) {
throw (WrongPasswordException) firstError[0];
ensureDirectory(output.getParentFile());
final FileOutputStream out = new FileOutputStream(output);
final long[] remaining = new long[] { itemUnits };
try {
ExtractOperationResult result = item.extractSlow(new ISequentialOutStream() {
@Override
public int write(byte[] data) throws SevenZipException {
if (data == null || data.length == 0) {
return 0;
}
try {
out.write(data);
} catch (IOException error) {
throw new SevenZipException("Fehler beim Schreiben: " + error.getMessage(), error);
}
long accounted = Math.min(remaining[0], (long) data.length);
remaining[0] -= accounted;
progress.advance(accounted);
return data.length;
}
}, password == null ? "" : password);
if (remaining[0] > 0) {
progress.advance(remaining[0]);
}
if (result != ExtractOperationResult.OK) {
if (isPasswordFailure(result, encrypted)) {
throw new WrongPasswordException(new IOException("Falsches Passwort"));
}
throw new IOException("7z-Fehler: " + result.name());
}
} catch (SevenZipException error) {
if (looksLikeWrongPassword(error, encrypted)) {
throw new WrongPasswordException(error);
}
throw error;
} finally {
try {
out.close();
} catch (Throwable ignored) {
}
}
try {
java.util.Date modified = item.getLastWriteTime();
if (modified != null) {
output.setLastModified(modified.getTime());
}
} catch (Throwable ignored) {
// best effort
}
throw (Exception) firstError[0];
}
progress.emitDone();
@@ -472,31 +328,14 @@
if (SEVEN_ZIP_SPLIT_RE.matcher(nameLower).matches()) {
VolumedArchiveInStream volumed = new VolumedArchiveInStream(archiveFile.getName(), callback);
try {
IInArchive archive = SevenZip.openInArchive(null, volumed, callback);
return new SevenZipArchiveContext(archive, null, volumed, callback);
} catch (Exception error) {
callback.close();
throw error;
}
IInArchive archive = SevenZip.openInArchive(null, volumed, callback);
return new SevenZipArchiveContext(archive, null, volumed, callback);
}
RandomAccessFile raf = new RandomAccessFile(archiveFile, "r");
RandomAccessFileInStream stream = new RandomAccessFileInStream(raf);
try {
IInArchive archive = SevenZip.openInArchive(null, stream, callback);
return new SevenZipArchiveContext(archive, stream, null, callback);
} catch (Exception error) {
try {
stream.close();
} catch (Throwable ignored) {
}
try {
raf.close();
} catch (Throwable ignored) {
}
throw error;
}
IInArchive archive = SevenZip.openInArchive(null, stream, callback);
return new SevenZipArchiveContext(archive, stream, null, callback);
}
private static boolean isWrongPassword(ZipException error, boolean encrypted) {
@@ -557,7 +396,7 @@
}
if (siblingName.startsWith(prefix) && siblingName.length() >= prefix.length() + 2) {
String suffix = siblingName.substring(prefix.length());
if (DIGIT_SUFFIX_RE.matcher(suffix).matches()) {
if (suffix.matches("\\d{2,3}")) {
return true;
}
}
@@ -641,12 +480,6 @@
}
if (normalized.matches("^[a-zA-Z]:.*")) {
normalized = normalized.substring(2);
while (normalized.startsWith("/")) {
normalized = normalized.substring(1);
}
while (normalized.startsWith("\\")) {
normalized = normalized.substring(1);
}
}
File targetCanonical = targetDir.getCanonicalFile();
File output = new File(targetCanonical, normalized);
@@ -655,8 +488,7 @@
String outputPath = outputCanonical.getPath();
String targetPathNorm = isWindows() ? targetPath.toLowerCase(Locale.ROOT) : targetPath;
String outputPathNorm = isWindows() ? outputPath.toLowerCase(Locale.ROOT) : outputPath;
String targetPrefix = targetPathNorm.endsWith(File.separator) ? targetPathNorm : targetPathNorm + File.separator;
if (!outputPathNorm.equals(targetPathNorm) && !outputPathNorm.startsWith(targetPrefix)) {
if (!outputPathNorm.equals(targetPathNorm) && !outputPathNorm.startsWith(targetPathNorm + File.separator)) {
throw new IOException("Path Traversal blockiert: " + entryName);
}
return outputCanonical;
@@ -674,50 +506,20 @@
if (entry.length() == 0) {
return fallback;
}
// Sanitize Windows special characters from each path segment
String[] segments = entry.split("/", -1);
StringBuilder sanitized = new StringBuilder();
for (int i = 0; i < segments.length; i++) {
if (i > 0) {
sanitized.append('/');
}
sanitized.append(WINDOWS_SPECIAL_CHARS_RE.matcher(segments[i]).replaceAll("_"));
}
entry = sanitized.toString();
if (entry.length() == 0) {
return fallback;
}
return entry;
}
private static long safeSize(Long value) {
if (value == null) {
return 0;
return 1;
}
long size = value.longValue();
if (size <= 0) {
return 0;
return 1;
}
return size;
}
private static void rejectSymlink(File file) throws IOException {
if (file == null) {
return;
}
if (Files.isSymbolicLink(file.toPath())) {
throw new IOException("Zieldatei ist ein Symlink, Schreiben verweigert: " + file.getAbsolutePath());
}
// Also check parent directories for symlinks
File parent = file.getParentFile();
while (parent != null) {
if (Files.isSymbolicLink(parent.toPath())) {
throw new IOException("Elternverzeichnis ist ein Symlink, Schreiben verweigert: " + parent.getAbsolutePath());
}
parent = parent.getParentFile();
}
}
private static void ensureDirectory(File dir) throws IOException {
if (dir == null) {
return;
@@ -879,176 +681,6 @@
private final List<String> passwords = new ArrayList<String>();
}
/**
* Bulk extraction callback that implements both IArchiveExtractCallback and
* ICryptoGetTextPassword. Using the bulk IInArchive.extract() API instead of
* per-item extractSlow() is critical for performance: solid RAR archives
* otherwise re-decode from the beginning for every single item.
*/
private static final class BulkExtractCallback implements IArchiveExtractCallback, ICryptoGetTextPassword {
private final IInArchive archive;
private final Map<Integer, Integer> indexToPos;
private final List<Integer> fileIndices;
private final List<File> outputFiles;
private final List<Long> fileSizes;
private final ProgressTracker progress;
private final boolean encrypted;
private final String password;
private final File[] currentOutput;
private final FileOutputStream[] currentStream;
private final boolean[] currentSuccess;
private final long[] currentRemaining;
private final int[] currentPos;
private final Throwable[] firstError;
BulkExtractCallback(IInArchive archive, Map<Integer, Integer> indexToPos,
List<Integer> fileIndices, List<File> outputFiles, List<Long> fileSizes,
ProgressTracker progress, boolean encrypted, String password,
File[] currentOutput, FileOutputStream[] currentStream,
boolean[] currentSuccess, long[] currentRemaining, int[] currentPos,
Throwable[] firstError) {
this.archive = archive;
this.indexToPos = indexToPos;
this.fileIndices = fileIndices;
this.outputFiles = outputFiles;
this.fileSizes = fileSizes;
this.progress = progress;
this.encrypted = encrypted;
this.password = password;
this.currentOutput = currentOutput;
this.currentStream = currentStream;
this.currentSuccess = currentSuccess;
this.currentRemaining = currentRemaining;
this.currentPos = currentPos;
this.firstError = firstError;
}
@Override
public String cryptoGetTextPassword() {
return password;
}
@Override
public void setTotal(long total) {
// 7z reports total compressed bytes; we track uncompressed via ProgressTracker
}
@Override
public void setCompleted(long complete) {
// Not used; we track per-write progress
}
@Override
public ISequentialOutStream getStream(int index, ExtractAskMode extractAskMode) throws SevenZipException {
closeCurrentStream();
Integer pos = indexToPos.get(index);
if (pos == null) {
return null;
}
currentPos[0] = pos;
currentOutput[0] = outputFiles.get(pos);
currentSuccess[0] = false;
currentRemaining[0] = fileSizes.get(pos);
if (extractAskMode != ExtractAskMode.EXTRACT) {
currentOutput[0] = null;
return null;
}
if (currentOutput[0] == null) {
progress.advance(currentRemaining[0]);
return null;
}
try {
ensureDirectory(currentOutput[0].getParentFile());
rejectSymlink(currentOutput[0]);
currentStream[0] = new FileOutputStream(currentOutput[0]);
} catch (IOException error) {
throw new SevenZipException("Fehler beim Erstellen: " + error.getMessage(), error);
}
return new ISequentialOutStream() {
@Override
public int write(byte[] data) throws SevenZipException {
if (data == null || data.length == 0) {
return 0;
}
try {
currentStream[0].write(data);
} catch (IOException error) {
throw new SevenZipException("Fehler beim Schreiben: " + error.getMessage(), error);
}
long accounted = Math.min(currentRemaining[0], (long) data.length);
currentRemaining[0] -= accounted;
progress.advance(accounted);
return data.length;
}
};
}
@Override
public void prepareOperation(ExtractAskMode extractAskMode) {
// no-op
}
@Override
public void setOperationResult(ExtractOperationResult result) throws SevenZipException {
if (currentRemaining[0] > 0) {
progress.advance(currentRemaining[0]);
currentRemaining[0] = 0;
}
if (result == ExtractOperationResult.OK) {
currentSuccess[0] = true;
closeCurrentStream();
if (currentPos[0] >= 0 && currentOutput[0] != null) {
try {
int archiveIndex = fileIndices.get(currentPos[0]);
java.util.Date modified = (java.util.Date) archive.getProperty(archiveIndex, PropID.LAST_MODIFICATION_TIME);
if (modified != null) {
currentOutput[0].setLastModified(modified.getTime());
}
} catch (Throwable ignored) {
// best effort
}
}
} else {
closeCurrentStream();
if (currentOutput[0] != null && currentOutput[0].exists()) {
try {
currentOutput[0].delete();
} catch (Throwable ignored) {
}
}
if (firstError[0] == null) {
if (isPasswordFailure(result, encrypted)) {
firstError[0] = new WrongPasswordException(new IOException("Falsches Passwort"));
} else {
firstError[0] = new IOException("7z-Fehler: " + result.name());
}
}
}
}
private void closeCurrentStream() {
if (currentStream[0] != null) {
try {
currentStream[0].close();
} catch (Throwable ignored) {
}
currentStream[0] = null;
}
if (!currentSuccess[0] && currentOutput[0] != null && currentOutput[0].exists()) {
try {
currentOutput[0].delete();
} catch (Throwable ignored) {
}
}
}
}
private static final class WrongPasswordException extends Exception {
private static final long serialVersionUID = 1L;
@@ -1196,11 +828,12 @@
if (filename == null || filename.trim().length() == 0) {
return null;
}
// Always resolve relative to the archive's parent directory.
// Never accept absolute paths to prevent path traversal.
String baseName = new File(filename).getName();
File direct = new File(filename);
if (direct.isAbsolute() && direct.exists()) {
return direct;
}
if (archiveDir != null) {
File relative = new File(archiveDir, baseName);
File relative = new File(archiveDir, filename);
if (relative.exists()) {
return relative;
}
@@ -1210,13 +843,13 @@ public final class JBindExtractorMain {
if (!sibling.isFile()) {
continue;
}
if (sibling.getName().equalsIgnoreCase(baseName)) {
if (sibling.getName().equalsIgnoreCase(filename)) {
return sibling;
}
}
}
}
return null;
return direct.exists() ? direct : null;
}
@Override

View File

@@ -2,17 +2,8 @@ const path = require("path");
const { rcedit } = require("rcedit");
module.exports = async function afterPack(context) {
const productFilename = context.packager?.appInfo?.productFilename;
if (!productFilename) {
console.warn(" • rcedit: skipped — productFilename not available");
return;
}
const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
const exePath = path.join(context.appOutDir, `${context.packager.appInfo.productFilename}.exe`);
const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
console.log(` • rcedit: patching icon → ${exePath}`);
try {
await rcedit(exePath, { icon: iconPath });
} catch (error) {
console.warn(` • rcedit: failed — ${String(error)}`);
}
await rcedit(exePath, { icon: iconPath });
};

View File

@@ -31,21 +31,18 @@ async function main(): Promise<void> {
login: settings.megaLogin,
password: settings.megaPassword
}));
try {
const service = new DebridService(settings, {
megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
});
for (const link of links) {
try {
const result = await service.unrestrictLink(link);
console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
} catch (error) {
console.log(`[FAIL] ${String(error)}`);
}
const service = new DebridService(settings, {
megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
});
for (const link of links) {
try {
const result = await service.unrestrictLink(link);
console.log(`[OK] ${result.providerLabel} -> ${result.fileName}`);
} catch (error) {
console.log(`[FAIL] ${String(error)}`);
}
} finally {
megaWeb.dispose();
}
megaWeb.dispose();
}
main().catch(e => { console.error(e); process.exit(1); });
void main();

View File

@@ -16,8 +16,8 @@ function sleep(ms) {
}
function cookieFrom(headers) {
const cookies = headers.getSetCookie();
return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
const raw = headers.get("set-cookie") || "";
return raw.split(",").map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
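Note on this hunk: main reads cookies via `headers.getSetCookie()`, while v1.6.0 splits the merged `set-cookie` header value on commas. A minimal sketch (with made-up cookie values, not from the repo) of why the comma split is fragile — an `Expires` attribute itself contains a comma:

```javascript
// Hypothetical merged "set-cookie" value as older fetch implementations
// exposed it: two cookies joined by ", ", one carrying an Expires date.
const merged = "sid=abc123; Path=/; Expires=Wed, 21 Oct 2026 07:28:00 GMT, lang=de; Path=/";

// v1.6.0-style parsing: split the merged header value on commas.
// The comma inside the Expires date produces a bogus middle fragment.
const naive = merged
  .split(",")
  .map((x) => x.split(";")[0].trim())
  .filter(Boolean);

// main-style parsing starts from the already-separated cookie list,
// as Headers.getSetCookie() returns it, so no comma ambiguity exists.
const separated = [
  "sid=abc123; Path=/; Expires=Wed, 21 Oct 2026 07:28:00 GMT",
  "lang=de; Path=/"
];
const safe = separated.map((x) => x.split(";")[0].trim()).filter(Boolean);

console.log(naive); // ["sid=abc123", "21 Oct 2026 07:28:00 GMT", "lang=de"]
console.log(safe);  // ["sid=abc123", "lang=de"]
```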
function parseDebridCodes(html) {
@@ -47,9 +47,6 @@ async function resolveCode(cookie, code) {
});
const text = (await res.text()).trim();
if (text === "reload") {
if (attempt % 5 === 0) {
console.log(` [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
}
await sleep(800);
continue;
}
@@ -101,13 +98,7 @@ async function main() {
redirect: "manual"
});
if (loginRes.status >= 400) {
throw new Error(`Login failed with HTTP ${loginRes.status}`);
}
const cookie = cookieFrom(loginRes.headers);
if (!cookie) {
throw new Error("Login returned no session cookie");
}
console.log("login", loginRes.status, loginRes.headers.get("location") || "");
const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
@@ -145,4 +136,4 @@ async function main() {
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
await main();

View File

@@ -66,8 +66,6 @@ async function callRealDebrid(link) {
};
}
// megaCookie is intentionally cached at module scope so that multiple
// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
if (!megaCookie) {
const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
@@ -79,15 +77,13 @@ async function callMegaDebrid(link) {
body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
redirect: "manual"
});
if (loginRes.status >= 400) {
return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
}
megaCookie = loginRes.headers.getSetCookie()
megaCookie = (loginRes.headers.get("set-cookie") || "")
.split(",")
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
if (!megaCookie) {
return { ok: false, error: "Mega-Web login returned no session cookie" };
return { ok: false, error: "Mega-Web login failed" };
}
}
@@ -294,4 +290,4 @@ async function main() {
}
}
await main().catch((e) => { console.error(e); process.exit(1); });
await main();

View File

@@ -2,15 +2,7 @@ import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const NPM_RELEASE_WIN = process.platform === "win32"
? {
command: process.env.ComSpec || "cmd.exe",
args: ["/d", "/s", "/c", "npm run release:win"]
}
: {
command: "npm",
args: ["run", "release:win"]
};
const NPM_EXECUTABLE = process.platform === "win32" ? "npm.cmd" : "npm";
function run(command, args, options = {}) {
const result = spawnSync(command, args, {
@@ -45,8 +37,7 @@ function runWithInput(command, args, input) {
cwd: process.cwd(),
encoding: "utf8",
input,
stdio: ["pipe", "pipe", "pipe"],
timeout: 10000
stdio: ["pipe", "pipe", "pipe"]
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
@@ -68,74 +59,37 @@ function parseArgs(argv) {
return { help: false, dryRun, version, notes };
}
function parseRemoteUrl(url) {
function parseCodebergRemote(url) {
const raw = String(url || "").trim();
const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
const httpsMatch = raw.match(/^https?:\/\/(?:www\.)?codeberg\.org\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (httpsMatch) {
return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
return { owner: httpsMatch[1], repo: httpsMatch[2] };
}
const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
const sshMatch = raw.match(/^git@codeberg\.org:([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshMatch) {
return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
return { owner: sshMatch[1], repo: sshMatch[2] };
}
const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshAltMatch) {
return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
}
throw new Error(`Cannot parse remote URL: ${raw}`);
throw new Error(`Cannot parse Codeberg remote URL: ${raw}`);
}
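For reference, the host-agnostic parser that main uses (regexes copied from the hunk above) accepts https, scp-style ssh, and `ssh://` remotes; the example URLs below are illustrative, not taken from the repo:

```javascript
// Sketch of main's parseRemoteUrl, exercised on hypothetical remotes.
function parseRemoteUrl(url) {
  const raw = String(url || "").trim();
  const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (httpsMatch) {
    return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
  }
  const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (sshMatch) {
    return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
  }
  const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
  if (sshAltMatch) {
    return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
  }
  throw new Error(`Cannot parse remote URL: ${raw}`);
}

console.log(parseRemoteUrl("https://git.example.org/alice/tool.git"));
// → host "git.example.org", owner "alice", repo "tool"
console.log(parseRemoteUrl("git@git.example.org:alice/tool.git"));
// → same result for the scp-style form
```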
function normalizeBaseUrl(url) {
const raw = String(url || "").trim().replace(/\/+$/, "");
if (!raw) {
return "";
}
if (!/^https?:\/\//i.test(raw)) {
throw new Error("GITEA_BASE_URL must start with http:// or https://");
}
return raw;
}
function getGiteaRepo() {
const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
const remotes = forcedRemote
? [forcedRemote]
: ["gitea", "forgejo", "origin", "github-new", "codeberg"];
const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
function getCodebergRepo() {
const remotes = ["codeberg", "origin"];
for (const remote of remotes) {
try {
const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
const parsed = parseRemoteUrl(remoteUrl);
const remoteBase = `https://${parsed.host}`.toLowerCase();
if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
continue;
if (/codeberg\.org/i.test(remoteUrl)) {
const parsed = parseCodebergRemote(remoteUrl);
return { remote, ...parsed };
}
return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
} catch {
// try next remote
}
}
if (preferredBase) {
throw new Error(
`No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
);
}
throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
throw new Error("No Codeberg remote found. Add one with: git remote add codeberg https://codeberg.org/<owner>/<repo>.git");
}
function getAuthHeader(host) {
const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
if (explicitToken) {
return `token ${explicitToken}`;
}
const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
function getCodebergAuthHeader() {
const credentialText = runWithInput("git", ["credential", "fill"], "protocol=https\nhost=codeberg.org\n\n");
const map = new Map();
for (const line of credentialText.split(/\r?\n/)) {
if (!line.includes("=")) {
@@ -147,9 +101,7 @@ function getAuthHeader(host) {
const username = map.get("username") || "";
const password = map.get("password") || "";
if (!username || !password) {
throw new Error(
`Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
);
throw new Error("Missing Codeberg credentials in git credential helper");
}
const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
return `Basic ${token}`;
@@ -190,8 +142,7 @@ function updatePackageVersion(rootDir, version) {
const packagePath = path.join(rootDir, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
if (String(packageJson.version || "") === version) {
process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
return;
throw new Error(`package.json is already at version ${version}`);
}
packageJson.version = version;
fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
@@ -246,7 +197,8 @@ function ensureTagMissing(tag) {
}
}
async function createOrGetRelease(baseApi, tag, authHeader, notes) {
async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
if (byTag.ok) {
return byTag.body;
@@ -266,34 +218,13 @@ async function createOrGetRelease(baseApi, tag, authHeader, notes) {
return created.body;
}
async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDir, files) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
for (const fileName of files) {
const filePath = path.join(releaseDir, fileName);
const fileSize = fs.statSync(filePath).size;
const fileData = fs.readFileSync(filePath);
const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
// Stream large files instead of loading them entirely into memory
const fileStream = fs.createReadStream(filePath);
const response = await fetch(uploadUrl, {
method: "POST",
headers: {
Accept: "application/json",
Authorization: authHeader,
"Content-Type": "application/octet-stream",
"Content-Length": String(fileSize)
},
body: fileStream,
duplex: "half"
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
const response = await apiRequest("POST", uploadUrl, authHeader, fileData, "application/octet-stream");
if (response.ok) {
process.stdout.write(`Uploaded: ${fileName}\n`);
continue;
@@ -302,7 +233,7 @@ async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, f
process.stdout.write(`Skipped existing asset: ${fileName}\n`);
continue;
}
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(response.body)}`);
}
}
@@ -310,44 +241,46 @@ async function main() {
const rootDir = process.cwd();
const args = parseArgs(process.argv);
if (args.help) {
process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
process.stdout.write("Usage: npm run release:codeberg -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Example: npm run release:codeberg -- 1.4.42 \"- Small fixes\"\n");
return;
}
const version = ensureVersionString(args.version);
const tag = `v${version}`;
const releaseNotes = args.notes || `- Release ${tag}`;
const repo = getGiteaRepo();
const { remote, owner, repo } = getCodebergRepo();
ensureNoTrackedChanges();
ensureTagMissing(tag);
if (args.dryRun) {
process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
return;
}
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_RELEASE_WIN.command, NPM_RELEASE_WIN.args);
run(NPM_EXECUTABLE, ["run", "release:win"]);
const assets = ensureAssetsExist(rootDir, version);
if (args.dryRun) {
process.stdout.write(`Dry run complete. Assets exist for ${tag}.\n`);
return;
}
run("git", ["add", "package.json"]);
run("git", ["commit", "-m", `Release ${tag}`]);
run("git", ["push", repo.remote, "main"]);
run("git", ["push", remote, "main"]);
run("git", ["tag", tag]);
run("git", ["push", repo.remote, tag]);
run("git", ["push", remote, tag]);
const authHeader = getAuthHeader(repo.host);
const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);
const authHeader = getCodebergAuthHeader();
const baseRepoApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
const patchReleaseEnabled = await apiRequest("PATCH", baseRepoApi, authHeader, JSON.stringify({ has_releases: true }));
if (!patchReleaseEnabled.ok) {
throw new Error(`Failed to enable releases (${patchReleaseEnabled.status}): ${JSON.stringify(patchReleaseEnabled.body)}`);
}
process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
const release = await createOrGetRelease(owner, repo, tag, authHeader, releaseNotes);
await uploadReleaseAssets(owner, repo, release.id, authHeader, assets.releaseDir, assets.files);
process.stdout.write(`Release published: ${release.html_url}\n`);
}
main().catch((error) => {

View File

@@ -0,0 +1,24 @@
import fs from "node:fs";
import path from "node:path";
const version = process.argv[2];
if (!version) {
console.error("Usage: node scripts/set_version_node.mjs <version>");
process.exit(1);
}
const root = process.cwd();
const packageJsonPath = path.join(root, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, "utf8"));
packageJson.version = version;
fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
const constantsPath = path.join(root, "src", "main", "constants.ts");
const constants = fs.readFileSync(constantsPath, "utf8").replace(
/APP_VERSION = "[^"]+"/,
`APP_VERSION = "${version}"`
);
fs.writeFileSync(constantsPath, constants, "utf8");
console.log(`Set version to ${version}`);
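The constants rewrite in this script is a plain regex replace; a tiny sketch run against an inline sample string (the version numbers are illustrative):

```javascript
// Same replace that set_version_node.mjs applies to src/main/constants.ts,
// demonstrated on a sample line instead of the real file.
const sample = 'export const APP_VERSION = "1.5.9";\n';
const bumped = sample.replace(/APP_VERSION = "[^"]+"/, 'APP_VERSION = "1.6.0"');
console.log(bumped); // export const APP_VERSION = "1.6.0";
```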

View File

@@ -5,7 +5,6 @@ import {
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
ParsedPackageInput,
SessionStats,
StartConflictEntry,
@@ -22,7 +21,7 @@ import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { addHistoryEntry, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";
@@ -82,15 +81,8 @@ export class AppController {
void this.manager.getStartConflicts().then((conflicts) => {
const hasConflicts = conflicts.length > 0;
if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
// If the onState handler is already set (renderer connected), start immediately.
// Otherwise mark as pending so the onState setter triggers the start.
if (this.onStateHandler) {
logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
} else {
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
}
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
} else if (hasConflicts) {
logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
}
@@ -105,8 +97,6 @@ export class AppController {
|| (settings.megaLogin.trim() && settings.megaPassword.trim())
|| settings.bestToken.trim()
|| settings.allDebridToken.trim()
|| (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
|| settings.oneFichierApiKey.trim()
);
}
@@ -122,9 +112,6 @@ export class AppController {
this.autoResumePending = false;
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
logger.info("Auto-Resume beim Start aktiviert");
} else {
// Trigger pending extractions without starting the session
this.manager.triggerIdleExtractions();
}
}
}
@@ -171,12 +158,6 @@ export class AppController {
}
public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
// Stop active downloads before installing. Extractions may continue briefly
// until prepareForShutdown() is called during app quit.
if (this.manager.isSessionRunning()) {
this.manager.stop();
}
const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
? this.lastUpdateCheck
@@ -286,14 +267,7 @@ export class AppController {
}
public exportBackup(): string {
const settings = { ...this.settings };
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = settings[key];
if (typeof val === "string" && val.length > 0) {
(settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
}
}
const settings = this.settings;
const session = this.manager.getSession();
return JSON.stringify({ version: 1, settings, session }, null, 2);
}
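The masking scheme visible in this hunk — main exports secrets as `***` plus the last four characters, and the import path keeps the stored secret whenever it sees such a placeholder — can be sketched as follows (assumed helper shapes, not the repo's exact code):

```javascript
// Hypothetical helpers mirroring main's exportBackup/restore logic.
function maskSecret(value) {
  // Export: replace a non-empty secret with "***" + its last 4 chars.
  return value.length > 0 ? `***${value.slice(-4)}` : value;
}
function restoreSecret(imported, current) {
  // Import: a masked placeholder means "keep the currently stored secret".
  return typeof imported === "string" && imported.startsWith("***") ? current : imported;
}

const token = "rd_live_0123456789abcd"; // made-up example token
const masked = maskSecret(token);
console.log(masked); // ***abcd
console.log(restoreSecret(masked, token) === token); // true
console.log(restoreSecret("new-token", token)); // new-token
```

This keeps exported backups shareable without leaking credentials, while a round-trip export/import leaves the stored secrets intact.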
@@ -308,36 +282,12 @@ export class AppController {
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
}
const importedSettings = parsed.settings as AppSettings;
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = (importedSettings as Record<string, unknown>)[key];
if (typeof val === "string" && val.startsWith("***")) {
(importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
}
}
const restoredSettings = normalizeSettings(importedSettings);
const restoredSettings = normalizeSettings(parsed.settings as AppSettings);
this.settings = restoredSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
// Full stop including extraction abort — the old session is being replaced,
// so no extraction tasks from it should keep running.
this.manager.stop();
this.manager.abortAllPostProcessing();
// Cancel any deferred persist timer and queued async writes so the old
// in-memory session does not overwrite the restored session file on disk.
this.manager.clearPersistTimer();
cancelPendingAsyncSaves();
const restoredSession = normalizeLoadedSessionTransientFields(
normalizeLoadedSession(parsed.session)
);
const restoredSession = parsed.session as ReturnType<typeof loadSession>;
saveSession(this.storagePaths, restoredSession);
// Prevent prepareForShutdown from overwriting the restored session file
// with the old in-memory session when the app quits after backup restore.
this.manager.skipShutdownPersist = true;
// Block all persistence (including persistSoon from any IPC operations
// the user might trigger before restarting) to protect the restored backup.
this.manager.blockAllPersistence = true;
return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
}
@@ -362,8 +312,8 @@ export class AppController {
clearHistory(this.storagePaths);
}
public setPackagePriority(packageId: string, priority: PackagePriority): void {
this.manager.setPackagePriority(packageId, priority);
public setPackagePriority(packageId: string, priority: string): void {
this.manager.setPackagePriority(packageId, priority as any);
}
public skipItems(itemIds: string[]): void {

View File

@@ -19,8 +19,7 @@ export const CHUNK_SIZE = 512 * 1024;
export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long
export const STREAM_HIGH_WATER_MARK = 2 * 1024 * 1024; // 2 MB stream buffer (JDownloader: Java NIO FileChannel default ~8 MB)
export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
@@ -35,7 +34,7 @@ export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 1;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";
export const DEFAULT_UPDATE_REPO = "Sucukdeluxe/real-debrid-downloader";
export function defaultSettings(): AppSettings {
const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
@@ -45,9 +44,6 @@ export function defaultSettings(): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: "",
archivePasswordList: "",
rememberToken: true,
providerPrimary: "realdebrid",
@@ -89,7 +85,6 @@ export function defaultSettings(): AppSettings {
totalDownloadedAllTime: 0,
bandwidthSchedules: [],
columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
extractCpuPriority: "high",
autoExtractWhenStopped: true
extractCpuPriority: "high"
};
}

View File

@@ -164,7 +164,7 @@ async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]>
const dlcData = content.slice(0, -88);
const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
const rcResponse = await fetch(rcUrl, { method: "GET" });
if (!rcResponse.ok) {
return [];
}
@@ -217,8 +217,7 @@ async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<s
const response = await fetch(DCRYPT_UPLOAD_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
body: form
});
if (response.status === 413) {
return null;
@@ -236,8 +235,7 @@ async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {
const response = await fetch(DCRYPT_PASTE_URL, {
method: "POST",
body: form,
signal: AbortSignal.timeout(30000)
body: form
});
if (response.status === 413) {
return null;

View File

@@ -11,16 +11,11 @@ const RAPIDGATOR_SCAN_MAX_BYTES = 512 * 1024;
const BEST_DEBRID_API_BASE = "https://bestdebrid.com/api/v1";
const ALL_DEBRID_API_BASE = "https://api.alldebrid.com/v4";
const ONEFICHIER_API_BASE = "https://api.1fichier.com/v1";
const ONEFICHIER_URL_RE = /^https?:\/\/(?:www\.)?(?:1fichier\.com|alterupload\.com|cjoint\.net|desfichiers\.com|dfichiers\.com|megadl\.fr|mesfichiers\.org|piecejointe\.net|pjointe\.com|tenvoi\.com|dl4free\.com)\/\?([a-z0-9]{5,20})$/i;
const PROVIDER_LABELS: Record<DebridProvider, string> = {
realdebrid: "Real-Debrid",
megadebrid: "Mega-Debrid",
bestdebrid: "BestDebrid",
alldebrid: "AllDebrid",
ddownload: "DDownload",
onefichier: "1Fichier"
alldebrid: "AllDebrid"
};
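The `ONEFICHIER_URL_RE` matcher removed in this hunk accepts 1fichier.com and its mirror domains with a 5-20 character alphanumeric id; exercised here on made-up example URLs:

```javascript
// Regex copied from the diff above; test URLs are illustrative.
const ONEFICHIER_URL_RE = /^https?:\/\/(?:www\.)?(?:1fichier\.com|alterupload\.com|cjoint\.net|desfichiers\.com|dfichiers\.com|megadl\.fr|mesfichiers\.org|piecejointe\.net|pjointe\.com|tenvoi\.com|dl4free\.com)\/\?([a-z0-9]{5,20})$/i;

console.log(ONEFICHIER_URL_RE.test("https://1fichier.com/?abc123def"));      // true
console.log("https://1fichier.com/?abc123def".match(ONEFICHIER_URL_RE)[1]);  // abc123def
console.log(ONEFICHIER_URL_RE.test("https://example.com/?abc123def"));       // false
```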
interface ProviderUnrestrictedLink extends UnrestrictedLink {
@@ -322,7 +317,7 @@ async function runWithConcurrency<T>(items: T[], concurrency: number, worker: (i
let index = 0;
let firstError: unknown = null;
const next = (): T | undefined => {
if (firstError || index >= items.length) {
if (index >= items.length) {
return undefined;
}
const item = items[index];
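The one-line change in this hunk (`firstError || index >= items.length`) makes the pool fail fast: once any worker records an error, `next()` stops handing out items. A minimal sketch of that behavior (assumed shape — the full function body is not shown in the diff):

```javascript
// Hypothetical fail-fast worker pool mirroring main's runWithConcurrency.
async function runWithConcurrency(items, concurrency, worker) {
  let index = 0;
  let firstError = null;
  const results = [];
  // After the first error, next() returns undefined and lanes drain.
  const next = () => (firstError || index >= items.length) ? undefined : items[index++];
  const lanes = Array.from({ length: concurrency }, async () => {
    for (let item = next(); item !== undefined; item = next()) {
      try {
        results.push(await worker(item));
      } catch (error) {
        firstError = firstError ?? error;
      }
    }
  });
  await Promise.all(lanes);
  if (firstError) throw firstError;
  return results;
}

// With concurrency 1, the failure on item 3 stops items 4 and 5
// from ever being scheduled.
const seen = [];
runWithConcurrency([1, 2, 3, 4, 5], 1, async (n) => {
  seen.push(n);
  if (n === 3) throw new Error("boom");
  return n;
}).catch(() => console.log(seen)); // seen is [1, 2, 3]
```

Without the fail-fast check, remaining items would still be processed after a failure and the error would only surface at the end.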
@@ -422,7 +417,6 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES + 2) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
@@ -438,11 +432,9 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
&& !contentType.includes("text/plain")
&& !contentType.includes("text/xml")
&& !contentType.includes("application/xml")) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
if (!contentType && Number.isFinite(contentLength) && contentLength > RAPIDGATOR_SCAN_MAX_BYTES) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
@@ -454,7 +446,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
return "";
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (/aborted/i.test(errorText)) {
throw error;
}
if (attempt >= REQUEST_RETRIES + 2 || !isRetryableErrorText(errorText)) {
@@ -535,7 +527,7 @@ export async function checkRapidgatorOnline(
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (/aborted/i.test(errorText)) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
break; // fall through to GET
}
@@ -556,12 +548,10 @@
});
if (response.status === 404) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
@@ -571,7 +561,6 @@
const finalUrl = response.url || link;
if (!finalUrl.includes(fileId)) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
@@ -588,7 +577,7 @@
return { online: true, fileName, fileSize };
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (/aborted/i.test(errorText)) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
return null;
}
@@ -645,7 +634,7 @@ class MegaDebridClient {
throw new Error("Mega-Web Antwort ohne Download-Link");
}
if (!lastError) {
lastError = "Mega-Web Antwort leer";
lastError = web ? "Mega-Web Antwort ohne Download-Link" : "Mega-Web Antwort leer";
}
// Don't retry permanent hoster errors (dead link, file removed, etc.)
if (/permanent ungültig|hosternotavailable|file.?not.?found|file.?unavailable|link.?is.?dead/i.test(lastError)) {
@@ -655,7 +644,7 @@
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "Mega-Web Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "Mega-Web Unrestrict fehlgeschlagen");
}
}
@@ -674,11 +663,7 @@ class BestDebridClient {
try {
return await this.tryRequest(request, link, signal);
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
lastError = errorText;
lastError = compactErrorText(error);
}
}
@@ -743,7 +728,7 @@
throw new Error("BestDebrid Antwort ohne Download-Link");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -837,7 +822,7 @@ class AllDebridClient {
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
@@ -949,7 +934,7 @@
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -959,257 +944,7 @@
}
}
throw new Error(String(lastError || "AllDebrid Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
// ── 1Fichier Client ──
class OneFichierClient {
private apiKey: string;
public constructor(apiKey: string) {
this.apiKey = apiKey;
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
if (!ONEFICHIER_URL_RE.test(link)) {
throw new Error("Kein 1Fichier-Link");
}
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
if (signal?.aborted) throw new Error("aborted:debrid");
try {
const res = await fetch(`${ONEFICHIER_API_BASE}/download/get_token.cgi`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${this.apiKey}`
},
body: JSON.stringify({ url: link, pretty: 1 }),
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const json = await res.json() as Record<string, unknown>;
if (json.status === "KO" || json.error) {
const msg = String(json.message || json.error || "Unbekannter 1Fichier-Fehler");
throw new Error(msg);
}
const directUrl = String(json.url || "");
if (!directUrl) {
throw new Error("1Fichier: Keine Download-URL in Antwort");
}
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
throw error;
}
if (attempt < REQUEST_RETRIES) {
await sleep(retryDelay(attempt), signal);
}
}
}
throw new Error(`1Fichier-Unrestrict fehlgeschlagen: ${lastError}`);
}
}
const DDOWNLOAD_URL_RE = /^https?:\/\/(?:www\.)?(?:ddownload\.com|ddl\.to)\/([a-z0-9]+)/i;
const DDOWNLOAD_WEB_BASE = "https://ddownload.com";
const DDOWNLOAD_WEB_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36";
class DdownloadClient {
private login: string;
private password: string;
private cookies: string = "";
public constructor(login: string, password: string) {
this.login = login;
this.password = password;
}
private async webLogin(signal?: AbortSignal): Promise<void> {
// Step 1: GET login page to extract form token
const loginPageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/login.html`, {
headers: { "User-Agent": DDOWNLOAD_WEB_UA },
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const loginPageHtml = await loginPageRes.text();
const tokenMatch = loginPageHtml.match(/name="token" value="([^"]+)"/);
const pageCookies = (loginPageRes.headers.getSetCookie?.() || []).map((c: string) => c.split(";")[0]).join("; ");
// Step 2: POST login
const body = new URLSearchParams({
op: "login",
token: tokenMatch?.[1] || "",
rand: "",
redirect: "",
login: this.login,
password: this.password
});
const loginRes = await fetch(`${DDOWNLOAD_WEB_BASE}/`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
...(pageCookies ? { Cookie: pageCookies } : {})
},
body: body.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Drain body
try { await loginRes.text(); } catch { /* ignore */ }
const setCookies = loginRes.headers.getSetCookie?.() || [];
const xfss = setCookies.find((c: string) => c.startsWith("xfss="));
const loginCookie = setCookies.find((c: string) => c.startsWith("login="));
if (!xfss) {
throw new Error("DDownload Login fehlgeschlagen (kein Session-Cookie)");
}
this.cookies = [loginCookie, xfss].filter(Boolean).map((c: string) => c.split(";")[0]).join("; ");
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
const match = link.match(DDOWNLOAD_URL_RE);
if (!match) {
throw new Error("Kein DDownload-Link");
}
const fileCode = match[1];
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
// Login if no session yet
if (!this.cookies) {
await this.webLogin(signal);
}
// Step 1: GET file page to extract form fields
const filePageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
Cookie: this.cookies
},
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Premium with direct downloads enabled → redirect immediately
if (filePageRes.status >= 300 && filePageRes.status < 400) {
const directUrl = filePageRes.headers.get("location") || "";
try { await filePageRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const html = await filePageRes.text();
// Check for file not found
if (/File Not Found|file was removed|file was banned/i.test(html)) {
throw new Error("DDownload: Datei nicht gefunden");
}
// Extract form fields
const idVal = html.match(/name="id" value="([^"]+)"/)?.[1] || fileCode;
const randVal = html.match(/name="rand" value="([^"]+)"/)?.[1] || "";
const fileNameMatch = html.match(/class="file-info-name"[^>]*>([^<]+)</);
const fileName = fileNameMatch?.[1]?.trim() || filenameFromUrl(link);
// Step 2: POST download2 for premium download
const dlBody = new URLSearchParams({
op: "download2",
id: idVal,
rand: randVal,
referer: "",
method_premium: "1",
adblock_detected: "0"
});
const dlRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
Cookie: this.cookies,
Referer: `${DDOWNLOAD_WEB_BASE}/${fileCode}`
},
body: dlBody.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (dlRes.status >= 300 && dlRes.status < 400) {
const directUrl = dlRes.headers.get("location") || "";
try { await dlRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: fileName || filenameFromUrl(directUrl),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const dlHtml = await dlRes.text();
// Try to find direct URL in response HTML
const directMatch = dlHtml.match(/https?:\/\/[a-z0-9]+\.(?:dstorage\.org|ddownload\.com|ddl\.to|ucdn\.to)[^\s"'<>]+/i);
if (directMatch) {
return {
fileName,
directUrl: directMatch[0],
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
// Check for error messages
const errMatch = dlHtml.match(/class="err"[^>]*>([^<]+)</i);
if (errMatch) {
throw new Error(`DDownload: ${errMatch[1].trim()}`);
}
throw new Error("DDownload: Kein Download-Link erhalten");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
// Re-login on auth errors
if (/login|session|cookie/i.test(lastError)) {
this.cookies = "";
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
break;
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "DDownload Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "AllDebrid Unrestrict fehlgeschlagen");
}
}
@@ -1218,9 +953,6 @@ export class DebridService {
private options: DebridServiceOptions;
private cachedDdownloadClient: DdownloadClient | null = null;
private cachedDdownloadKey = "";
public constructor(settings: AppSettings, options: DebridServiceOptions = {}) {
this.settings = cloneSettings(settings);
this.options = options;
@@ -1230,16 +962,6 @@ export class DebridService {
this.settings = cloneSettings(next);
}
private getDdownloadClient(login: string, password: string): DdownloadClient {
const key = `${login}\0${password}`;
if (this.cachedDdownloadClient && this.cachedDdownloadKey === key) {
return this.cachedDdownloadClient;
}
this.cachedDdownloadClient = new DdownloadClient(login, password);
this.cachedDdownloadKey = key;
return this.cachedDdownloadClient;
}
public async resolveFilenames(
links: string[],
onResolved?: (link: string, fileName: string) => void,
@@ -1274,7 +996,7 @@ export class DebridService {
}
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
// ignore and continue with host page fallback
@@ -1292,46 +1014,6 @@ export class DebridService {
public async unrestrictLink(link: string, signal?: AbortSignal, settingsSnapshot?: AppSettings): Promise<ProviderUnrestrictedLink> {
const settings = settingsSnapshot ? cloneSettings(settingsSnapshot) : cloneSettings(this.settings);
// 1Fichier is a direct file hoster. If the link is a 1fichier.com URL
// and the API key is configured, use 1Fichier directly before debrid providers.
if (ONEFICHIER_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "onefichier")) {
try {
const result = await this.unrestrictViaProvider(settings, "onefichier", link, signal);
return {
...result,
provider: "onefichier",
providerLabel: PROVIDER_LABELS["onefichier"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain
}
}
// DDownload is a direct file hoster, not a debrid service.
// If the link is a ddownload.com/ddl.to URL and the account is configured,
// use DDownload directly before trying any debrid providers.
if (DDOWNLOAD_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "ddownload")) {
try {
const result = await this.unrestrictViaProvider(settings, "ddownload", link, signal);
return {
...result,
provider: "ddownload",
providerLabel: PROVIDER_LABELS["ddownload"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain (debrid services may also support ddownload links)
}
}
const order = toProviderOrder(
settings.providerPrimary,
settings.providerSecondary,
@@ -1359,11 +1041,7 @@ export class DebridService {
providerLabel: PROVIDER_LABELS[primary]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${errorText}`);
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${compactErrorText(error)}`);
}
}
@@ -1393,7 +1071,7 @@ export class DebridService {
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
if (signal?.aborted || /aborted/i.test(errorText)) {
throw error;
}
attempts.push(`${PROVIDER_LABELS[provider]}: ${compactErrorText(error)}`);
@@ -1417,12 +1095,6 @@ export class DebridService {
if (provider === "alldebrid") {
return Boolean(settings.allDebridToken.trim());
}
if (provider === "ddownload") {
return Boolean(settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim());
}
if (provider === "onefichier") {
return Boolean(settings.oneFichierApiKey.trim());
}
return Boolean(settings.bestToken.trim());
}
@@ -1436,12 +1108,6 @@ export class DebridService {
if (provider === "alldebrid") {
return new AllDebridClient(settings.allDebridToken).unrestrictLink(link, signal);
}
if (provider === "ddownload") {
return this.getDdownloadClient(settings.ddownloadLogin, settings.ddownloadPassword).unrestrictLink(link, signal);
}
if (provider === "onefichier") {
return new OneFichierClient(settings.oneFichierApiKey).unrestrictLink(link, signal);
}
return new BestDebridClient(settings.bestToken).unrestrictLink(link, signal);
}
}
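The hunks above remove the provider fallback chain: each configured provider is tried in order, each failure reason is collected, and the aggregated attempts only surface once the whole chain is exhausted. A rough standalone sketch of that pattern (the `Resolver` type and `tryProvidersInOrder` name are illustrative, not from this codebase):

```typescript
// Illustrative sketch of an ordered provider-fallback chain; not the project's actual API.
type Resolver<T> = { label: string; run: () => Promise<T> };

async function tryProvidersInOrder<T>(resolvers: Resolver<T>[]): Promise<T> {
  const attempts: string[] = [];
  for (const resolver of resolvers) {
    try {
      // First resolver that succeeds wins; the rest are never called.
      return await resolver.run();
    } catch (error) {
      // Record this provider's failure and continue down the chain.
      attempts.push(`${resolver.label}: ${String(error)}`);
    }
  }
  // Only after every provider failed do we surface the collected reasons.
  throw new Error(`All providers failed: ${attempts.join(" | ")}`);
}
```

Collecting per-provider messages (rather than rethrowing the first error) is what lets the final error name every provider that was tried, as the removed `attempts.push(...)` lines do.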

View File

@@ -261,7 +261,7 @@ export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
const port = getPort(baseDir);
server = http.createServer(handleRequest);
server.listen(port, "127.0.0.1", () => {
server.listen(port, "0.0.0.0", () => {
logger.info(`Debug-Server gestartet auf Port ${port}`);
});
server.on("error", (err) => {

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -7,7 +7,6 @@ import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";
/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
@@ -51,7 +50,6 @@ process.on("unhandledRejection", (reason) => {
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
@@ -82,7 +80,7 @@ function createWindow(): BrowserWindow {
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu"
]
}
});
@@ -131,7 +129,7 @@ function createTray(): void {
const contextMenu = Menu.buildFromTemplate([
{ label: "Anzeigen", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray Start Fehler: ${String(err)}`)); } },
{ label: "Start", click: () => { controller.start(); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Beenden", click: () => { app.quit(); } }
@@ -189,12 +187,7 @@ function startClipboardWatcher(): void {
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
const text = normalizeClipboardText(clipboard.readText());
if (text === lastClipboardText || !text.trim()) {
return;
}
@@ -243,9 +236,9 @@ function registerIpcHandlers(): void {
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
updateQuitTimer = setTimeout(() => {
setTimeout(() => {
app.quit();
}, 2500);
}, 800);
}
return result;
});
@@ -296,12 +289,12 @@ function registerIpcHandlers(): void {
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
if (!Array.isArray(packageIds)) throw new Error("packageIds muss ein Array sein");
return controller.startPackages(packageIds);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
return controller.startItems(itemIds);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
@@ -344,18 +337,15 @@ function registerIpcHandlers(): void {
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority muss 'high', 'normal' oder 'low' sein");
}
return controller.setPackagePriority(packageId, priority);
return controller.setPackagePriority(packageId, priority as any);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
if (!Array.isArray(itemIds)) throw new Error("itemIds must be an array");
return controller.skipItems(itemIds);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
if (!Array.isArray(itemIds)) throw new Error("itemIds must be an array");
return controller.resetItems(itemIds);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
@@ -459,11 +449,6 @@ function registerIpcHandlers(): void {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
@@ -487,7 +472,6 @@ app.on("second-instance", () => {
});
app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
@@ -500,9 +484,6 @@ app.whenReady().then(() => {
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});
app.on("window-all-closed", () => {
@@ -512,10 +493,8 @@ app.on("window-all-closed", () => {
});
app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {

View File

@@ -228,23 +228,22 @@ export class MegaWebFallback {
}
public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
return this.runExclusive(async () => {
throwIfAborted(overallSignal);
throwIfAborted(signal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}
if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, overallSignal);
await this.login(creds.login, creds.password, signal);
}
const generated = await this.generate(link, overallSignal);
const generated = await this.generate(link, signal);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
await this.login(creds.login, creds.password, signal);
const retry = await this.generate(link, signal);
if (!retry) {
return null;
}
@@ -262,7 +261,7 @@
fileSize: null,
retriesUsed: 0
};
}, overallSignal);
}, signal);
}
public invalidateSession(): void {

View File

@@ -8,7 +8,6 @@ export interface UnrestrictedLink {
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}
function shouldRetryStatus(status: number): boolean {
@@ -63,8 +62,7 @@ function isRetryableErrorText(text: string): boolean {
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout")
|| lower.includes("html statt json");
|| lower.includes("etimedout");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
@@ -79,11 +77,6 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
@@ -100,6 +93,10 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
reject(new Error("aborted"));
};
if (signal.aborted) {
onAbort();
return;
}
signal.addEventListener("abort", onAbort, { once: true });
});
}
@@ -168,15 +165,6 @@ export class RealDebridClient {
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}
const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
@@ -188,7 +176,7 @@
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
if (signal?.aborted || /aborted/i.test(lastError)) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -198,6 +186,6 @@
}
}
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
throw new Error(lastError || "Unrestrict fehlgeschlagen");
}
}
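The comment removed in the `sleepWithSignal` hunks above explains why the abort check must run before entering the Promise constructor: with `ms=0`, the timer could otherwise resolve before an in-executor abort check ever runs. A self-contained sketch of an abort-safe sleep along those lines (an illustration, not the project's exact helper):

```typescript
// Abort-aware sleep sketch. Checking `signal.aborted` before the Promise
// constructor closes the race where a 0ms timer fires before any abort
// check placed inside the executor.
function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void> {
  if (signal?.aborted) {
    return Promise.reject(new Error("aborted"));
  }
  return new Promise<void>((resolve, reject) => {
    const timer = setTimeout(() => {
      // Detach the listener so a later abort of a shared signal is a no-op.
      signal?.removeEventListener("abort", onAbort);
      resolve();
    }, ms);
    const onAbort = (): void => {
      clearTimeout(timer);
      reject(new Error("aborted"));
    };
    signal?.addEventListener("abort", onAbort, { once: true });
  });
}
```

Note that `addEventListener("abort", ...)` never fires for a signal that was already aborted before registration, which is exactly why the up-front `signal.aborted` check is load-bearing.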

View File

@@ -76,12 +71,7 @@ async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<v
export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}
fs.mkdirSync(sessionLogsDir, { recursive: true });
const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);

View File

@@ -5,8 +5,8 @@ import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, Down
import { defaultSettings } from "./constants";
import { logger } from "./logger";
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
@@ -17,7 +17,7 @@ const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
function asText(value: unknown): string {
@@ -91,18 +91,6 @@ function normalizeColumnOrder(raw: unknown): string[] {
return result;
}
const DEPRECATED_UPDATE_REPOS = new Set([
"sucukdeluxe/real-debrid-downloader"
]);
function migrateUpdateRepo(raw: string, fallback: string): string {
const trimmed = raw.trim();
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
return fallback;
}
return trimmed;
}
export function normalizeSettings(settings: AppSettings): AppSettings {
const defaults = defaultSettings();
const normalized: AppSettings = {
@@ -111,10 +99,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
megaPassword: asText(settings.megaPassword),
bestToken: asText(settings.bestToken),
allDebridToken: asText(settings.allDebridToken),
ddownloadLogin: asText(settings.ddownloadLogin),
ddownloadPassword: asText(settings.ddownloadPassword),
oneFichierApiKey: asText(settings.oneFichierApiKey),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n/g, "\n"),
rememberToken: Boolean(settings.rememberToken),
providerPrimary: settings.providerPrimary,
providerSecondary: settings.providerSecondary,
@@ -145,7 +130,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
speedLimitMode: settings.speedLimitMode,
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
updateRepo: asText(settings.updateRepo) || defaults.updateRepo,
clipboardWatch: Boolean(settings.clipboardWatch),
minimizeToTray: Boolean(settings.minimizeToTray),
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
@@ -155,8 +140,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
columnOrder: normalizeColumnOrder(settings.columnOrder),
extractCpuPriority: settings.extractCpuPriority,
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
extractCpuPriority: settings.extractCpuPriority
};
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
@@ -204,9 +188,7 @@ function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: ""
archivePasswordList: ""
};
}
@@ -250,7 +232,7 @@ function readSettingsFile(filePath: string): AppSettings | null {
}
}
export function normalizeLoadedSession(raw: unknown): SessionState {
function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
@@ -357,8 +339,7 @@ export function normalizeLoadedSession(raw: unknown): SessionState {
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
if (!packageOrder.includes(packageId)) {
packageOrder.push(packageId);
}
}
@@ -430,7 +411,7 @@ function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
@@ -442,19 +423,6 @@ export function normalizeLoadedSessionTransientFields(session: SessionState): Se
item.speedBps = 0;
}
// Reset package-level active statuses to queued (mirrors item reset above)
const ACTIVE_PKG_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}
// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;
return session;
}
@@ -480,17 +448,12 @@ export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
}
let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;
let asyncSettingsSaveQueued: { paths: StoragePaths; payload: string } | null = null;
async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
@@ -504,7 +467,6 @@ async function writeSettingsPayload(paths: StoragePaths, payload: string): Promi
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -514,7 +476,7 @@ export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettin
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, settings };
asyncSettingsSaveQueued = { paths, payload };
return;
}
asyncSettingsSaveRunning = true;
@@ -527,7 +489,7 @@
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void saveSettingsAsync(queued.paths, queued.settings);
void writeSettingsPayload(queued.paths, queued.payload).catch((err) => logger.error(`Async Settings-Save (queued) fehlgeschlagen: ${String(err)}`));
}
}
}
@@ -580,7 +542,6 @@ export function loadSession(paths: StoragePaths): SessionState {
}
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
@@ -591,41 +552,25 @@ export function saveSession(paths: StoragePaths, session: SessionState): void {
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
}
let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
async function writeSessionPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -637,9 +582,8 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload, gen);
await writeSessionPayload(paths, payload);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
@@ -652,12 +596,6 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
}
}
export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
@@ -683,8 +621,7 @@ function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
outputDir: asText(entry.outputDir)
};
}
@@ -714,13 +651,8 @@ export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
}
export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {
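The session-save hunks above combine three safeguards: write the payload to a temp file first, discard the write if a newer synchronous save bumped a generation counter in the meantime, and fall back to copy-plus-delete when `rename` fails with `EXDEV` because temp and target sit on different volumes. A minimal standalone sketch of that pattern (file names and function names here are illustrative, not the app's actual API):

```typescript
import { promises as fsp } from "fs";
import * as os from "os";
import * as path from "path";

// Generation counter: synchronous saves bump it so that an in-flight
// async save can detect that it has become stale.
let saveGeneration = 0;

export function bumpGeneration(): void {
  saveGeneration += 1;
}

// Atomically replace targetFile with payload via a temp file, unless a
// newer save started while the temp file was being written.
export async function writeGuarded(targetFile: string, payload: string, generation: number): Promise<void> {
  const tempPath = `${targetFile}.tmp`;
  await fsp.writeFile(tempPath, payload, "utf8");
  if (generation < saveGeneration) {
    // A newer save won the race: drop the stale temp file.
    await fsp.rm(tempPath, { force: true }).catch(() => {});
    return;
  }
  try {
    await fsp.rename(tempPath, targetFile); // atomic on the same volume
  } catch (error) {
    if ((error as { code?: string }).code === "EXDEV") {
      // Temp file and target are on different volumes: copy, then clean up.
      await fsp.copyFile(tempPath, targetFile);
      await fsp.rm(tempPath, { force: true }).catch(() => {});
    } else {
      await fsp.rm(tempPath, { force: true }).catch(() => {});
      throw error;
    }
  }
}
```

A synchronous save would call `bumpGeneration()` first, so any slower async write that started earlier silently discards itself instead of clobbering the newer file.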

View File

@@ -14,32 +14,8 @@ const DOWNLOAD_BODY_IDLE_TIMEOUT_MS = 45000;
const RETRIES_PER_CANDIDATE = 3;
const RETRY_DELAY_MS = 1500;
const UPDATE_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
type UpdateSource = {
name: string;
webBase: string;
apiBase: string;
};
const UPDATE_SOURCES: UpdateSource[] = [
{
name: "git24",
webBase: "https://git.24-music.de",
apiBase: "https://git.24-music.de/api/v1"
},
{
name: "codeberg",
webBase: "https://codeberg.org",
apiBase: "https://codeberg.org/api/v1"
},
{
name: "github",
webBase: "https://github.com",
apiBase: "https://api.github.com"
}
];
const PRIMARY_UPDATE_SOURCE = UPDATE_SOURCES[0];
const UPDATE_WEB_BASE = PRIMARY_UPDATE_SOURCE.webBase;
const UPDATE_API_BASE = PRIMARY_UPDATE_SOURCE.apiBase;
const UPDATE_WEB_BASE = "https://codeberg.org";
const UPDATE_API_BASE = "https://codeberg.org/api/v1";
let activeUpdateAbortController: AbortController | null = null;
@@ -81,9 +57,9 @@ export function normalizeUpdateRepo(repo: string): string {
const normalizeParts = (input: string): string => {
const cleaned = input
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com|git\.24-music\.de):/i, "")
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com):/i, "")
.replace(/\.git$/i, "")
.replace(/^\/+|\/+$/g, "");
const parts = cleaned.split("/").filter(Boolean);
@@ -100,13 +76,7 @@ export function normalizeUpdateRepo(repo: string): string {
try {
const url = new URL(raw);
const host = url.hostname.toLowerCase();
if (
host === "codeberg.org"
|| host === "www.codeberg.org"
|| host === "github.com"
|| host === "www.github.com"
|| host === "git.24-music.de"
) {
if (host === "codeberg.org" || host === "www.codeberg.org" || host === "github.com" || host === "www.github.com") {
const normalized = normalizeParts(url.pathname);
if (normalized) {
return normalized;
@@ -336,8 +306,6 @@ function parseReleasePayload(payload: Record<string, unknown>, fallback: UpdateC
const releaseUrl = String(payload.html_url || fallback.releaseUrl);
const setup = pickSetupAsset(readReleaseAssets(payload));
const body = typeof payload.body === "string" ? payload.body.trim() : "";
return {
updateAvailable: isRemoteNewer(APP_VERSION, latestVersion),
currentVersion: APP_VERSION,
@@ -346,8 +314,7 @@
releaseUrl,
setupAssetUrl: setup?.browser_download_url || "",
setupAssetName: setup?.name || "",
setupAssetDigest: setup?.digest || "",
releaseNotes: body || undefined
setupAssetDigest: setup?.digest || ""
};
}
@@ -794,8 +761,7 @@ async function downloadFile(url: string, targetPath: string, onProgress?: Update
};
const reader = response.body.getReader();
const tempPath = targetPath + ".tmp";
const writeStream = fs.createWriteStream(tempPath);
const chunks: Buffer[] = [];
try {
resetIdleTimer();
@@ -809,39 +775,27 @@
break;
}
const buf = Buffer.from(value.buffer, value.byteOffset, value.byteLength);
if (!writeStream.write(buf)) {
await new Promise<void>((resolve) => writeStream.once("drain", resolve));
}
chunks.push(buf);
downloadedBytes += buf.byteLength;
resetIdleTimer();
emitDownloadProgress(false);
}
} catch (error) {
writeStream.destroy();
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw error;
} finally {
clearIdleTimer();
}
await new Promise<void>((resolve, reject) => {
writeStream.end(() => resolve());
writeStream.on("error", reject);
});
if (idleTimedOut) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download Body Timeout nach ${Math.ceil(idleTimeoutMs / 1000)}s`);
}
if (totalBytes && downloadedBytes !== totalBytes) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download unvollständig (${downloadedBytes} / ${totalBytes} Bytes)`);
const fileBuffer = Buffer.concat(chunks);
if (totalBytes && fileBuffer.byteLength !== totalBytes) {
throw new Error(`Update Download unvollständig (${fileBuffer.byteLength} / ${totalBytes} Bytes)`);
}
await fs.promises.rename(tempPath, targetPath);
await fs.promises.writeFile(targetPath, fileBuffer);
emitDownloadProgress(true);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${downloadedBytes} Bytes)`);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${fileBuffer.byteLength} Bytes)`);
return { expectedBytes: totalBytes };
}
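The main-branch side of this hunk streams each chunk to a write stream with backpressure instead of accumulating everything in memory and calling `Buffer.concat` at the end, which keeps peak memory flat for large setup downloads. A self-contained sketch of that write loop (the temp-file suffix and the chunk source are illustrative, not the app's actual code):

```typescript
import * as fs from "fs";

// Stream chunks to a temp file, honoring backpressure, then atomically
// rename into place. Returns the number of bytes written.
export async function writeChunks(chunks: AsyncIterable<Uint8Array>, targetPath: string): Promise<number> {
  const tempPath = `${targetPath}.tmp`;
  const out = fs.createWriteStream(tempPath);
  let written = 0;
  try {
    for await (const value of chunks) {
      const buf = Buffer.from(value.buffer, value.byteOffset, value.byteLength);
      if (!out.write(buf)) {
        // write() returned false: wait for the internal buffer to drain.
        await new Promise<void>((resolve) => out.once("drain", resolve));
      }
      written += buf.byteLength;
    }
    await new Promise<void>((resolve, reject) => {
      out.end(() => resolve());
      out.on("error", reject);
    });
    await fs.promises.rename(tempPath, targetPath);
    return written;
  } catch (error) {
    out.destroy();
    await fs.promises.rm(tempPath, { force: true }).catch(() => {});
    throw error;
  }
}
```

The `drain` wait is the piece the buffered variant gives up: without it, a slow disk lets unflushed chunks pile up in the stream's internal buffer.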

View File

@@ -4,7 +4,6 @@ import {
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
@@ -57,7 +56,7 @@ const api: ElectronApi = {
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
setPackagePriority: (packageId: string, priority: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),

File diff suppressed because it is too large

View File

@@ -1,25 +0,0 @@
import type { PackageEntry } from "../shared/types";
export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}
export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}
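The drop semantics of the deleted helper above are easy to misread: the target index is clamped after the dragged entry has been removed, so dragging onto a later item lands the entry in that item's post-removal slot. A runnable check (the function body is copied verbatim from the deleted module so the sketch is self-contained):

```typescript
// Copied verbatim from the deleted renderer module shown above.
function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
  const fromIndex = order.indexOf(draggedPackageId);
  const toIndex = order.indexOf(targetPackageId);
  if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
    return order;
  }
  const next = [...order];
  const [dragged] = next.splice(fromIndex, 1);
  const insertIndex = Math.max(0, Math.min(next.length, toIndex));
  next.splice(insertIndex, 0, dragged);
  return next;
}

// Dragging "a" onto "c": after removing "a", index 2 points past "c",
// so "a" ends up last. Dragging "c" onto "a" inserts it first.
console.log(reorderPackageOrderByDrop(["a", "b", "c"], "a", "c")); // ["b", "c", "a"]
console.log(reorderPackageOrderByDrop(["a", "b", "c"], "c", "a")); // ["c", "a", "b"]
```

Unknown IDs fall through the early-return guard and leave the order untouched.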

View File

@@ -344,15 +344,6 @@ body,
background: rgba(244, 63, 94, 0.1);
}
.ctrl-icon-btn.ctrl-move:not(:disabled) {
color: var(--accent);
}
.ctrl-icon-btn.ctrl-move:hover:not(:disabled) {
border-color: var(--accent);
background: color-mix(in srgb, var(--accent) 10%, transparent);
}
.ctrl-icon-btn.ctrl-speed.active {
color: #f59e0b;
border-color: rgba(245, 158, 11, 0.5);
@@ -586,7 +577,7 @@ body,
.pkg-column-header {
display: grid;
/* grid-template-columns set via inline style from columnOrder */
grid-template-columns: 1fr 160px 80px 110px 110px 70px 160px 90px;
gap: 8px;
padding: 5px 12px;
background: var(--card);
@@ -604,8 +595,7 @@ body,
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
.pkg-column-header .pkg-col-speed {
text-align: center;
}
@@ -623,7 +613,7 @@ body,
.pkg-columns {
display: grid;
/* grid-template-columns set via inline style from columnOrder */
grid-template-columns: 1fr 160px 80px 110px 110px 70px 160px 90px;
gap: 8px;
align-items: center;
min-width: 0;
@@ -649,8 +639,7 @@ body,
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
.pkg-columns .pkg-col-speed {
font-size: 13px;
color: var(--muted);
overflow: hidden;
@@ -873,7 +862,7 @@ body,
.status-bar {
display: flex;
flex-wrap: wrap;
gap: 8px 16px;
gap: 16px;
align-items: center;
color: var(--muted);
font-size: 12px;
@@ -884,16 +873,6 @@ body,
margin: 0 -14px -10px;
}
.footer-spacer {
flex: 1;
}
.footer-btn {
font-size: 11px;
padding: 2px 8px;
min-height: 0;
}
.settings-shell {
display: grid;
grid-template-rows: auto 1fr;
@@ -1307,7 +1286,7 @@ td {
.item-row {
display: grid;
/* grid-template-columns set via inline style from columnOrder */
grid-template-columns: 1fr 160px 80px 110px 110px 70px 160px 90px;
gap: 8px;
align-items: center;
margin: 0 -12px;
@@ -1369,22 +1348,6 @@ td {
color: #64748b !important;
}
.pkg-col-dragging {
opacity: 0.4;
}
.pkg-col-drop-target {
box-shadow: -2px 0 0 0 var(--accent);
}
.pkg-column-header .pkg-col {
cursor: grab;
}
.pkg-column-header .pkg-col.sortable {
cursor: pointer;
}
.ctx-menu-sub {
position: relative;
}
@@ -1415,12 +1378,6 @@ td {
color: var(--accent) !important;
}
.ctx-menu-disabled {
opacity: 0.4;
cursor: not-allowed !important;
pointer-events: none;
}
.item-remove {
background: none;
border: none;
@@ -1639,7 +1596,6 @@ td {
border-radius: 12px;
padding: 10px 14px;
box-shadow: 0 16px 30px rgba(0, 0, 0, 0.35);
z-index: 50;
}
.ctx-menu {
@@ -1749,7 +1705,6 @@ td {
font-weight: 600;
pointer-events: none;
backdrop-filter: blur(2px);
z-index: 200;
}
.modal-backdrop {
@@ -1764,8 +1719,6 @@ td {
.modal-card {
width: min(560px, 100%);
max-height: calc(100vh - 40px);
overflow-y: auto;
border: 1px solid var(--border);
border-radius: 14px;
background: linear-gradient(180deg, color-mix(in srgb, var(--card) 98%, transparent), color-mix(in srgb, var(--surface) 98%, transparent));
@@ -1784,34 +1737,6 @@ td {
color: var(--muted);
}
.modal-details {
border: 1px solid var(--border);
border-radius: 6px;
padding: 0;
}
.modal-details summary {
padding: 6px 10px;
cursor: pointer;
font-size: 13px;
color: var(--muted);
user-select: none;
}
.modal-details summary:hover {
color: var(--text);
}
.modal-details pre {
margin: 0;
padding: 8px 10px;
border-top: 1px solid var(--border);
font-size: 12px;
line-height: 1.5;
white-space: pre-wrap;
word-break: break-word;
max-height: 260px;
overflow-y: auto;
color: var(--muted);
}
.modal-path {
font-size: 12px;
word-break: break-all;
@@ -1882,9 +1807,8 @@ td {
}
.pkg-columns,
.pkg-column-header,
.item-row {
grid-template-columns: 1fr !important;
.pkg-column-header {
grid-template-columns: 1fr;
}
.pkg-column-header .pkg-col-progress,
@@ -1893,8 +1817,7 @@ td {
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
.pkg-column-header .pkg-col-speed {
display: none;
}
@@ -1904,8 +1827,7 @@ td {
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
.pkg-columns .pkg-col-speed {
display: none;
}

View File

@@ -14,7 +14,7 @@ export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload" | "onefichier";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
@@ -42,9 +42,6 @@ export interface AppSettings {
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
oneFichierApiKey: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
@@ -87,7 +84,6 @@ export interface AppSettings {
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}
export interface DownloadItem {
@@ -122,7 +118,6 @@ export interface PackageEntry {
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
@@ -223,7 +218,6 @@ export interface UpdateCheckResult {
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}

View File

@@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/App";
describe("reorderPackageOrderByDrop", () => {
it("moves adjacent package down by one on drop", () => {
@@ -25,9 +25,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg3", "pkg1", "pkg2"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
},
false
);
@@ -38,9 +38,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg1", "pkg2", "pkg3"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
},
true
);

View File

@@ -269,7 +269,6 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
// SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
expect(result).not.toBeNull();
expect(result!).toContain("S99E999");
});
// Real-world scene release patterns
@@ -344,7 +343,6 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
// "mkv" should not be treated as part of the filename match
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
});
it("does not match episode-like patterns in codec strings", () => {
@@ -375,7 +373,6 @@ describe("buildAutoRenameBaseName", () => {
// Extreme edge case - sanitizeFilename trims leading dots
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
expect(result!).toContain("-4sf");
expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
});
@@ -664,31 +661,4 @@ describe("buildAutoRenameBaseNameFromFolders", () => {
);
expect(result).toBeNull();
});
it("renames Riviera S02 with single-digit episode s02e2", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
"tvp-riviera-s02e2-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
});
it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"r104.de.dl.web.7p-s04e02",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
it("renames Room 104 wayne source with episode", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"room.104.s04e01.german.dl.720p.web.h264-wayne",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
});

View File

@@ -317,7 +317,7 @@ describe("debrid service", () => {
const controller = new AbortController();
const abortTimer = setTimeout(() => {
controller.abort("test");
}, 200);
}, 25);
try {
await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);

View File

@@ -36,8 +36,12 @@ afterEach(() => {
}
});
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
describe("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
return;
}
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
@@ -65,112 +69,11 @@ describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm b
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("emits progress callbacks with archiveName and percent", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create a ZIP with some content to trigger progress
const zipPath = path.join(packageDir, "progress-test.zip");
const zip = new AdmZip();
zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
zip.writeZip(zipPath);
const progressUpdates: Array<{
archiveName: string;
percent: number;
phase: string;
archivePercent?: number;
}> = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
progressUpdates.push({
archiveName: update.archiveName,
percent: update.percent,
phase: update.phase,
archivePercent: update.archivePercent,
});
},
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
// Should have at least preparing, extracting, and done phases
const phases = new Set(progressUpdates.map((u) => u.phase));
expect(phases.has("preparing")).toBe(true);
expect(phases.has("extracting")).toBe(true);
// Extracting phase should include the archive name
const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
expect(extracting.length).toBeGreaterThan(0);
// Should end at 100%
const lastExtracting = extracting[extracting.length - 1];
expect(lastExtracting.archivePercent).toBe(100);
// Files should exist
expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
});
it("extracts multiple archives sequentially with progress for each", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create two separate ZIP archives
const zip1 = new AdmZip();
zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
zip1.writeZip(path.join(packageDir, "archive1.zip"));
const zip2 = new AdmZip();
zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
zip2.writeZip(path.join(packageDir, "archive2.zip"));
const archiveNames = new Set<string>();
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
if (update.phase === "extracting" && update.archiveName) {
archiveNames.add(update.archiveName);
}
},
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
// Both archive names should have appeared in progress
expect(archiveNames.has("archive1.zip")).toBe(true);
expect(archiveNames.has("archive2.zip")).toBe(true);
// Both files extracted
expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
if (!hasJavaRuntime() || !hasJvmExtractorRuntime()) {
return;
}
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));

View File

@@ -15,8 +15,6 @@ import {
const tempDirs: string[] = [];
const originalExtractBackend = process.env.RD_EXTRACT_BACKEND;
const originalStatfs = fs.promises.statfs;
const originalZipEntryMemoryLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
beforeEach(() => {
process.env.RD_EXTRACT_BACKEND = "legacy";
@@ -31,12 +29,6 @@ afterEach(() => {
} else {
process.env.RD_EXTRACT_BACKEND = originalExtractBackend;
}
(fs.promises as any).statfs = originalStatfs;
if (originalZipEntryMemoryLimit === undefined) {
delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
} else {
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = originalZipEntryMemoryLimit;
}
});
describe("extractor", () => {
@@ -582,6 +574,7 @@ describe("extractor", () => {
});
it("keeps original ZIP size guard error when external fallback is unavailable", async () => {
const previousLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = "8";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
@@ -595,20 +588,32 @@
zip.addFile("large.bin", Buffer.alloc(9 * 1024 * 1024, 7));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(0);
expect(result.failed).toBe(1);
expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
try {
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(0);
expect(result.failed).toBe(1);
expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
} finally {
if (previousLimit === undefined) {
delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
} else {
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = previousLimit;
}
}
});
it.skipIf(process.platform !== "win32")("matches resume-state archive names case-insensitively on Windows", async () => {
it("matches resume-state archive names case-insensitively on Windows", async () => {
if (process.platform !== "win32") {
return;
}
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
@@ -645,18 +650,23 @@
zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
zip.writeZip(path.join(packageDir, "test.zip"));
const originalStatfs = fs.promises.statfs;
(fs.promises as any).statfs = async () => ({ bfree: 1, bsize: 1 });
await expect(
extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
})
).rejects.toThrow(/Nicht genug Speicherplatz/);
try {
await expect(
extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
})
).rejects.toThrow(/Nicht genug Speicherplatz/);
} finally {
(fs.promises as any).statfs = originalStatfs;
}
});
it("proceeds when disk space is sufficient", async () => {

View File

@@ -166,7 +166,7 @@ describe("mega-web-fallback", () => {
const controller = new AbortController();
const timer = setTimeout(() => {
controller.abort("test");
}, 200);
}, 30);
try {
await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);

View File

@@ -1,188 +0,0 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";
type MinimalItem = {
targetPath?: string;
fileName?: string;
[key: string]: unknown;
};
function makeItems(names: string[]): MinimalItem[] {
return names.map((name) => ({
targetPath: `C:\\Downloads\\Package\\${name}`,
fileName: name,
id: name,
status: "completed",
}));
}
describe("resolveArchiveItemsFromList", () => {
// ── Multipart RAR (.partN.rar) ──
it("matches multipart .part1.rar archives", () => {
const items = makeItems([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
"Other.rar",
]);
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(3);
expect(result.map((i: any) => i.fileName)).toEqual([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
]);
});
it("matches multipart .part01.rar archives (zero-padded)", () => {
const items = makeItems([
"Film.part01.rar",
"Film.part02.rar",
"Film.part10.rar",
"Unrelated.zip",
]);
const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
expect(result).toHaveLength(3);
});
// ── Old-style RAR (.rar + .r00, .r01, etc.) ──
it("matches old-style .rar + .rNN volumes", () => {
const items = makeItems([
"Archive.rar",
"Archive.r00",
"Archive.r01",
"Archive.r02",
"Other.zip",
]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(4);
});
// ── Single RAR ──
it("matches a single .rar file", () => {
const items = makeItems(["SingleFile.rar", "Other.mkv"]);
const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("SingleFile.rar");
});
// ── Split ZIP ──
it("matches split .zip.NNN files", () => {
const items = makeItems([
"Data.zip",
"Data.zip.001",
"Data.zip.002",
"Data.zip.003",
]);
const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
expect(result).toHaveLength(4);
});
// ── Split 7z ──
it("matches split .7z.NNN files", () => {
const items = makeItems([
"Backup.7z.001",
"Backup.7z.002",
]);
const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
expect(result).toHaveLength(2);
});
// ── Generic .NNN splits ──
it("matches generic .NNN split files", () => {
const items = makeItems([
"video.001",
"video.002",
"video.003",
]);
const result = resolveArchiveItemsFromList("video.001", items as any);
expect(result).toHaveLength(3);
});
// ── Exact filename match ──
it("matches a single .zip by exact name", () => {
const items = makeItems(["myarchive.zip", "other.rar"]);
const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("myarchive.zip");
});
// ── Case insensitivity ──
it("matches case-insensitively", () => {
const items = makeItems([
"MOVIE.PART1.RAR",
"MOVIE.PART2.RAR",
]);
const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Stem-based fallback ──
it("uses stem-based fallback when exact patterns fail", () => {
// Simulate a debrid service that renames "Movie.part1.rar" to "Movie.part1_dl.rar"
// but the disk file is "Movie.part1.rar"
const items = makeItems([
"Movie.rar",
]);
// The archive on disk is "Movie.part1.rar" but there's no item matching the
// .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
// stem fallback: "movie" starts with "movie" and ends with .rar
expect(result).toHaveLength(1);
});
// ── Single item fallback ──
it("returns single archive item when no pattern matches", () => {
const items = makeItems(["totally-different-name.rar"]);
const result = resolveArchiveItemsFromList("Original.rar", items as any);
// Single item in list with archive extension → return it
expect(result).toHaveLength(1);
});
// ── Empty when no match ──
it("returns empty when items have no archive extensions", () => {
const items = makeItems(["video.mkv", "subtitle.srt"]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(0);
});
// ── Items without targetPath ──
it("falls back to fileName when targetPath is missing", () => {
const items = [
{ fileName: "Movie.part1.rar", id: "1", status: "completed" },
{ fileName: "Movie.part2.rar", id: "2", status: "completed" },
];
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Multiple archives, should not cross-match ──
it("does not cross-match different archive groups", () => {
const items = makeItems([
"Episode.S01E01.part1.rar",
"Episode.S01E01.part2.rar",
"Episode.S01E02.part1.rar",
"Episode.S01E02.part2.rar",
]);
const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
expect(result1).toHaveLength(2);
expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);
const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
expect(result2).toHaveLength(2);
expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
});
});
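The deleted test file above exercises the grouping of multipart-archive volumes by a lead file name. The sketch below re-creates the matching rules those tests imply: it is an illustrative reimplementation, not the actual `resolveArchiveItemsFromList` from `download-manager`, and it omits the stem-based and single-item fallbacks the tests also cover.

```typescript
// Collect sibling volumes of a multipart archive from a file list.
export function siblingVolumes(lead: string, files: string[]): string[] {
  const lower = lead.toLowerCase();
  let stem: string;
  let suffix: RegExp;
  let m: RegExpMatchArray | null;
  if ((m = lower.match(/^(.*)\.part\d+\.rar$/))) {
    // New-style multipart RAR: Movie.part1.rar, Movie.part2.rar, ...
    stem = m[1];
    suffix = /^\.part\d+\.rar$/;
  } else if ((m = lower.match(/^(.*)\.(?:rar|r\d{2})$/))) {
    // Old-style RAR set: Archive.rar plus Archive.r00, Archive.r01, ...
    stem = m[1];
    suffix = /^\.(?:rar|r\d{2})$/;
  } else if ((m = lower.match(/^(.*?)(\.(?:zip|7z))?\.\d{3}$/))) {
    // Split archives: Data.zip.001, Backup.7z.001, or bare video.001.
    stem = m[1] + (m[2] ?? "");
    suffix = /^(?:\.\d{3})?$/;
  } else {
    // Not a known multipart pattern: exact (case-insensitive) match only.
    return files.filter((f) => f.toLowerCase() === lower);
  }
  return files
    .filter((f) => {
      const fl = f.toLowerCase();
      return fl.startsWith(stem) && suffix.test(fl.slice(stem.length));
    })
    .sort();
}
```

Matching stem plus suffix, rather than a bare prefix, is what keeps `Episode.S01E01.part1.rar` from pulling in the `S01E02` volumes in the cross-match test above.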

View File

@@ -153,7 +153,7 @@ async function main(): Promise<void> {
createStoragePaths(path.join(tempRoot, "state-pause"))
);
manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
await manager2.start();
manager2.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const paused = manager2.togglePause();
assert(paused, "Pause konnte nicht aktiviert werden");

View File

@@ -8,8 +8,6 @@ import { setLogListener } from "../src/main/logger";
const tempDirs: string[] = [];
afterEach(() => {
// Ensure session log is shut down between tests
shutdownSessionLog();
// Ensure listener is cleared between tests
setLogListener(null);
for (const dir of tempDirs.splice(0)) {
@@ -47,7 +45,7 @@ describe("session-log", () => {
     logger.info("Test-Nachricht für Session-Log");
     // Wait for flush (200ms interval + margin)
-    await new Promise((resolve) => setTimeout(resolve, 500));
+    await new Promise((resolve) => setTimeout(resolve, 350));
     const content = fs.readFileSync(logPath, "utf8");
     expect(content).toContain("Test-Nachricht für Session-Log");
@@ -81,7 +79,7 @@ describe("session-log", () => {
     const { logger } = await import("../src/main/logger");
     logger.info("Nach-Shutdown-Nachricht");
-    await new Promise((resolve) => setTimeout(resolve, 500));
+    await new Promise((resolve) => setTimeout(resolve, 350));
     const content = fs.readFileSync(logPath, "utf8");
     expect(content).not.toContain("Nach-Shutdown-Nachricht");
@@ -139,7 +137,7 @@ describe("session-log", () => {
     shutdownSessionLog();
   });
-  it("multiple sessions create different files", async () => {
+  it("multiple sessions create different files", () => {
     const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
     tempDirs.push(baseDir);
@@ -148,7 +146,10 @@ describe("session-log", () => {
     shutdownSessionLog();
     // Small delay to ensure different timestamp
-    await new Promise((resolve) => setTimeout(resolve, 1100));
+    const start = Date.now();
+    while (Date.now() - start < 1100) {
+      // busy-wait for 1.1 seconds to get different second in filename
+    }
     initSessionLog(baseDir);
     const path2 = getSessionLogPath();

View File

@@ -22,7 +22,7 @@ afterEach(() => {
 describe("update", () => {
   it("normalizes update repo input", () => {
-    expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo("")).toBe("Sucukdeluxe/real-debrid-downloader");
     expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
     expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
     expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
@@ -31,14 +31,14 @@ describe("update", () => {
     expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
   });
-  it("uses normalized repo slug for API requests", async () => {
+  it("uses normalized repo slug for Codeberg API requests", async () => {
     let requestedUrl = "";
     globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
       requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
       return new Response(
         JSON.stringify({
           tag_name: `v${APP_VERSION}`,
-          html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
+          html_url: "https://codeberg.org/owner/repo/releases/tag/v1.0.0",
           assets: []
         }),
         {
@@ -48,8 +48,8 @@ describe("update", () => {
       );
     }) as typeof fetch;
-    const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
-    expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
+    const result = await checkGitHubUpdate("https://codeberg.org/owner/repo/releases");
+    expect(requestedUrl).toBe("https://codeberg.org/api/v1/repos/owner/repo/releases/latest");
     expect(result.currentVersion).toBe(APP_VERSION);
     expect(result.latestVersion).toBe(APP_VERSION);
     expect(result.updateAvailable).toBe(false);
@@ -484,14 +484,14 @@ describe("normalizeUpdateRepo extended", () => {
   });
   it("returns default for malformed inputs", () => {
-    expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
-    expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo("just-one-part")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo(" ")).toBe("Sucukdeluxe/real-debrid-downloader");
   });
   it("rejects traversal-like owner or repo segments", () => {
-    expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
-    expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
-    expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
+    expect(normalizeUpdateRepo("../owner/repo")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo("owner/../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
+    expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
   });
   it("handles www prefix", () => {

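Taken together, the assertions on both sides of this file's diff pin down `normalizeUpdateRepo`'s whole contract; only the fallback slug changed between the branches. A minimal sketch of an implementation satisfying the `main`-side expectations follows — hypothetical, since the regexes and the two-segment rule are assumptions rather than the project's code.

```typescript
// Hypothetical sketch only — mirrors the contract in the tests, not the real code.
// On the v1.6.0 side the default would be "Sucukdeluxe/real-debrid-downloader".
const DEFAULT_REPO = "Administrator/real-debrid-downloader";

function normalizeUpdateRepo(input: string): string {
  let s = (input ?? "").trim();
  // SSH remotes: "git@host:owner/repo.git" → "owner/repo.git"
  const ssh = s.match(/^git@[^:]+:(.+)$/);
  if (ssh) s = ssh[1];
  // HTTP(S) URLs: drop scheme and host, including an optional "www." prefix
  s = s.replace(/^https?:\/\/(?:www\.)?[^/]+\//, "").replace(/\.git$/, "");
  const parts = s.split("/").filter(Boolean);
  if (parts.length < 2) return DEFAULT_REPO; // covers "", " ", "just-one-part"
  const [owner, repo] = parts;
  // Reject traversal-like segments such as ".." in owner or repo
  const valid = (seg: string) => /^[\w.-]+$/.test(seg) && !seg.includes("..");
  return valid(owner) && valid(repo) ? `${owner}/${repo}` : DEFAULT_REPO;
}
```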
View File

@@ -12,5 +12,5 @@
     "isolatedModules": true,
     "types": ["node", "vite/client"]
   },
-  "include": ["src", "tests", "vite.config.mts"]
+  "include": ["src", "tests", "vite.config.ts"]
 }