Compare commits

...

193 Commits

Author SHA1 Message Date
Sucukdeluxe
18e4b6cd58 Release v1.6.66 2026-03-05 17:32:42 +01:00
Sucukdeluxe
c380abaee2 Fix deferred post-extraction cleanup skipped after hybrid extraction
When hybrid extraction handled all archives, extractedCount stayed 0
causing all cleanup steps (archive deletion, resume state, link/sample
removal, empty dir pruning, auto-rename, nested extraction) to be
bypassed. Extended conditions to also trigger on alreadyMarkedExtracted.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 17:32:11 +01:00
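The fixed condition described above can be sketched as follows; the interface and function names are illustrative, not the project's actual identifiers:

```typescript
// Hypothetical shape of an extraction round's result.
interface ExtractionResult {
  extractedCount: number;          // archives extracted in this round
  alreadyMarkedExtracted: boolean; // hybrid rounds handled everything earlier
}

// Before the fix, deferred cleanup only ran when this round extracted
// something, so extractedCount === 0 silently skipped the whole pipeline.
function shouldRunDeferredCleanup(r: ExtractionResult): boolean {
  // After the fix: also run when hybrid extraction already handled every
  // archive in an earlier round (extractedCount stays 0 in that case).
  return r.extractedCount > 0 || r.alreadyMarkedExtracted;
}
```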
Sucukdeluxe
44f202d116 Release v1.6.65 2026-03-05 16:59:48 +01:00
Sucukdeluxe
008f16a05d Add 1Fichier as direct file hoster provider with API key auth
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 16:59:15 +01:00
Sucukdeluxe
927013d9a6 Release v1.6.64 2026-03-05 16:39:59 +01:00
Sucukdeluxe
9f62c7c29c Fix hybrid extraction label flicker between Ausstehend and Warten auf Parts
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 16:39:31 +01:00
Sucukdeluxe
6897776460 Release v1.6.63 2026-03-05 16:35:38 +01:00
Sucukdeluxe
3089a45b13 Fix footer hoster count to show number of unique hosters in download list
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 16:35:05 +01:00
Sucukdeluxe
e37545c9c9 Release v1.6.62 2026-03-05 16:32:14 +01:00
Sucukdeluxe
17e7862307 Fix footer showing configured accounts instead of active download hosters
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 16:31:44 +01:00
Sucukdeluxe
d1eadff425 Release v1.6.61 2026-03-05 16:28:12 +01:00
Sucukdeluxe
650988c7ac Cleanup: remove leftover empty package folders after extract 2026-03-05 16:27:29 +01:00
Sucukdeluxe
49e62c1f83 Release v1.6.60 2026-03-05 14:52:38 +01:00
Sucukdeluxe
4c67455c67 Release script: use cmd wrapper for npm on Windows 2026-03-05 14:52:02 +01:00
Sucukdeluxe
5a5e3d2960 Extractor: cache package passwords and document v1.6.60 2026-03-05 14:49:26 +01:00
Sucukdeluxe
11da8b6e9a Release v1.6.59 2026-03-05 14:31:35 +01:00
Sucukdeluxe
265e6a72be Release v1.6.58 2026-03-05 14:22:19 +01:00
Sucukdeluxe
7816dc9488 Fix extraction progress oscillation 2026-03-05 14:20:01 +01:00
Sucukdeluxe
678d642683 Add changelog notes for v1.6.57 2026-03-05 14:17:22 +01:00
Sucukdeluxe
0f4174d153 Release v1.6.57 2026-03-05 14:12:52 +01:00
Sucukdeluxe
babcd8edb7 Fix extraction completion and password prioritization 2026-03-05 14:11:30 +01:00
Sucukdeluxe
6e00bbab53 Fix JVM daemon restart loop causing 25-30s gaps between extractions
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 13:50:32 +01:00
Sucukdeluxe
72642351d0 Release v1.6.55 2026-03-05 06:25:20 +01:00
Sucukdeluxe
51a01ea03f Use bulk IInArchive.extract() for ~8x faster extraction, fix archive item resolution
- Replace extractSlow() per-item extraction with IInArchive.extract() bulk API
  in 7-Zip-JBinding. Solid RAR archives no longer re-decode from the beginning
  for each item, bringing extraction speed close to native WinRAR/7z.exe (~375 MB/s
  instead of ~43 MB/s).

- Add BulkExtractCallback implementing both IArchiveExtractCallback and
  ICryptoGetTextPassword for proper password handling during bulk extraction.

- Fix resolveArchiveItemsFromList with multi-level fallback matching:
  1. Pattern match (multipart RAR, split ZIP/7z, generic splits)
  2. Exact filename match (case-insensitive)
  3. Stem-based fuzzy match (handles debrid service filename modifications)
  4. Single-item archive fallback

- Simplify caching from Set+Array workaround back to simple Map<string, T>
  (the original "caching failure" was caused by resolveArchiveItemsFromList
  returning empty arrays, not by Map/Set/Object data structure bugs).

- Add comprehensive tests for archive item resolution (14 test cases)
  and JVM extraction progress callbacks (2 test cases).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 06:24:12 +01:00
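The four-level fallback matching in resolveArchiveItemsFromList could look roughly like this; the regex, field names, and helper are assumptions for illustration, not the real code:

```typescript
interface Item { fileName: string }

// Illustrative multipart patterns: .partN.rar, .rNN, split .zip/.7z volumes.
const PART_RE = /\.part\d+\.rar$|\.r\d{2}$|\.(zip|7z)\.\d{3}$/i;

// Strip the last extension for fuzzy comparison.
function stem(name: string): string {
  return name.toLowerCase().replace(/\.[^.]+$/, "");
}

function resolveArchiveItems(archiveName: string, items: Item[]): Item[] {
  // 1. Pattern match: multipart volumes sharing the archive's base name.
  const base = archiveName.toLowerCase().replace(PART_RE, "");
  const byPattern = items.filter(
    (i) => PART_RE.test(i.fileName) && i.fileName.toLowerCase().startsWith(base));
  if (byPattern.length > 0) return byPattern;

  // 2. Exact filename match (case-insensitive).
  const exact = items.filter(
    (i) => i.fileName.toLowerCase() === archiveName.toLowerCase());
  if (exact.length > 0) return exact;

  // 3. Stem-based fuzzy match (debrid services sometimes rename files).
  const fuzzy = items.filter((i) => stem(i.fileName).includes(stem(archiveName)));
  if (fuzzy.length > 0) return fuzzy;

  // 4. Single-item archive fallback: a lone item must belong to the archive.
  return items.length === 1 ? items : [];
}
```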
Sucukdeluxe
d9a78ea837 Release v1.6.54 2026-03-05 05:59:50 +01:00
Sucukdeluxe
5b221d5bd5 Add persistent JVM daemon for extraction, fix caching with Set+Array
- JVM extractor now supports --daemon mode: starts once, processes
  multiple archives via stdin JSON protocol, eliminating ~5s JVM boot
  per archive
- TypeScript side: daemon manager starts JVM once, sends requests via
  stdin, falls back to spawning new process if daemon is busy
- Fix extraction progress caching: replaced Object.create(null) + in
  operator with Set<string> + linear Array scan — both Map.has() and
  the in operator mysteriously failed to find keys that were just set
- Daemon auto-shutdown on app quit via shutdownDaemon() in before-quit

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:59:13 +01:00
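The daemon-vs-spawn routing and the stdin JSON framing might be sketched like this; the real manager talks to a JVM child process, which is stubbed out here, and all names are assumed:

```typescript
type Job = { archive: string };

class DaemonManager {
  private busy = false;

  // Decide which route a job takes: the persistent daemon when idle,
  // a freshly spawned one-shot process when the daemon is occupied.
  route(job: Job): "daemon" | "spawn" {
    if (this.busy) return "spawn";
    this.busy = true;
    return "daemon";
  }

  finish(): void {
    this.busy = false;
  }

  // One request per line of newline-delimited JSON over the daemon's stdin.
  encodeRequest(job: Job): string {
    return JSON.stringify({ cmd: "extract", archive: job.archive }) + "\n";
  }
}
```

Starting the JVM once and streaming requests this way is what removes the ~5 s boot cost per archive mentioned above.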
Sucukdeluxe
c36549ca69 Release v1.6.53 2026-03-05 05:48:41 +01:00
Sucukdeluxe
7e79bef8da Increase JVM extractor heap to 8GB max / 512MB initial
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:48:02 +01:00
Sucukdeluxe
e3b4a4ba19 Release v1.6.52 2026-03-05 05:42:55 +01:00
Sucukdeluxe
30d216c7ca Fix extraction progress caching and JVM tuning
- Replace Map-based archive item cache with plain Object.create(null)
  to work around mysterious Map.has() returning false despite set()
  being called with the same key — this caused resolveArchiveItems
  to run on every 1.1s pulse instead of being cached, preventing
  extraction progress (Entpacken X%) from ever showing in the UI
- Apply same fix to both hybrid and full extraction paths
- Increase JVM heap from 512MB to 1GB for better extraction throughput
- Use SerialGC for faster JVM startup on short-lived extract processes
- Add download lifecycle logging (package add + item download start)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:42:23 +01:00
Sucukdeluxe
d80483adc2 Add download lifecycle logging for better diagnostics
- Log when packages are added (count + names)
- Log when individual item downloads start (filename, size, provider)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:32:44 +01:00
Sucukdeluxe
1cda391dfe Fix extraction speed and UI label updates
- Change OS priority from IDLE/BELOW_NORMAL to NORMAL/BELOW_NORMAL so
  extraction runs at full speed (matching manual 7-Zip/WinRAR performance)
- Use "high" priority in both hybrid and full extraction paths
- Increase hybrid extraction threads from hardcoded 2 to dynamic
  calculation (half CPU count, min 2, max 8)
- Fix emitState forced emit being silently dropped when a non-forced
  timer was already pending — forced emits now always replace pending
  timers to ensure immediate UI feedback during extraction transitions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:28:42 +01:00
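The dynamic thread calculation (half the CPU count, clamped to a minimum of 2 and a maximum of 8) is small enough to sketch directly; the function name is illustrative:

```typescript
import * as os from "node:os";

// Half the logical CPU count, clamped to [2, 8] per the commit message.
function hybridExtractionThreads(cpuCount: number = os.cpus().length): number {
  return Math.min(8, Math.max(2, Math.floor(cpuCount / 2)));
}
```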
Sucukdeluxe
375ec36781 Release v1.6.50 2026-03-05 05:09:24 +01:00
Sucukdeluxe
4ad1c05444 Fix extraction UI labels and speed for final extraction pass
- Force immediate emitState when first resolving archive items so UI
  transitions from 'Ausstehend' to 'Entpacken X%' instantly
- Use BELOW_NORMAL priority (instead of IDLE) for final extraction
  when all downloads are complete — matches manual extraction speed
- Add diagnostic logging for resolveArchiveItems matching

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 05:08:09 +01:00
Sucukdeluxe
c88eeb0b12 Release v1.6.49 2026-03-05 04:45:53 +01:00
Sucukdeluxe
c6261aba6a Log when each item download completes
Add "Download fertig: filename (size), pkg=name" log line when an item
finishes downloading, enabling precise timing analysis of when archive
parts become available for extraction.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:45:17 +01:00
Sucukdeluxe
a010b967b9 Release v1.6.48 2026-03-05 04:42:09 +01:00
Sucukdeluxe
af6547f254 Add MKV collection after hybrid extraction + detailed timing logs
- Run collectMkvFilesToLibrary in background after each hybrid extraction
  round so MKVs are moved to the library as episodes are extracted, not
  only after the entire package finishes
- Add timing logs to identify bottlenecks:
  - Post-process slot wait time
  - Per-round duration with requeue status
  - Recovery loop duration
  - Setup time in handlePackagePostProcessing
  - findReadyArchiveSets duration when > 200ms

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:41:01 +01:00
Sucukdeluxe
ba235b0b93 Release v1.6.47 2026-03-05 04:37:42 +01:00
Sucukdeluxe
1bfde96e46 Self-requeue hybrid extraction to avoid missed archive sets
After a hybrid extraction round completes, set the requeue flag so the
do-while loop immediately checks for more ready archive sets. Previously,
if all items completed before the task started processing, the single
requeue flag was consumed and no new completions triggered re-extraction,
causing 25+ second gaps until the next download completion.

Also change runHybridExtraction return type from void to number
(extracted count) to enable conditional self-requeue only when archives
were actually extracted, preventing infinite requeue loops.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:37:03 +01:00
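The conditional self-requeue can be sketched as a loop over the extracted count; runHybridExtraction here is a stand-in callback, not the real implementation:

```typescript
// Keep checking for newly ready archive sets as long as the previous round
// actually extracted something. A round that extracts nothing cannot have
// made new sets ready, so stopping there prevents an infinite requeue loop.
async function hybridExtractionTask(
  runHybridExtraction: () => Promise<number>, // returns extracted count
): Promise<number> {
  let total = 0;
  let requeue = true;
  while (requeue) {
    const extracted = await runHybridExtraction();
    total += extracted;
    requeue = extracted > 0;
  }
  return total;
}
```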
Sucukdeluxe
e1f9b4b6d3 Release v1.6.46 2026-03-05 04:24:22 +01:00
Sucukdeluxe
95cf4fbed8 Eliminate 10-15s pause between package extractions
Release post-process slot immediately after main extraction completes.
All slow post-extraction work (nested extraction, auto-rename, archive
cleanup, link/sample removal, empty directory cleanup, MKV collection)
now runs in background via runDeferredPostExtraction so the next package
can start unpacking without delay.

- Export hasAnyFilesRecursive, removeEmptyDirectoryTree, cleanupArchives
  from extractor.ts for use in deferred handler
- Import removeDownloadLinkArtifacts, removeSampleArtifacts from cleanup
- Expand runDeferredPostExtraction with full post-cleanup pipeline:
  nested extraction, rename, archive cleanup, link/sample removal,
  empty dir tree removal, resume state clearing, MKV collection

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:22:20 +01:00
Sucukdeluxe
9ddc7d31bb Make update changelog collapsible in confirm dialog
Long changelogs made the update dialog unscrollable, preventing users
from reaching the install button. Changelog is now in a collapsed
<details> element. Dialog also has max-height with overflow scroll.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:12:47 +01:00
Sucukdeluxe
83626017b9 Fix btn-danger CSS class mismatch in history tab
CSS defines .btn.danger (two classes) but code used "btn btn-danger"
(one hyphenated class). History danger buttons now get correct red styling.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:07:09 +01:00
Sucukdeluxe
b9372f0ef0 Add ddownload to VALID_PRIMARY_PROVIDERS and VALID_FALLBACK_PROVIDERS
DDownload was missing from provider validation sets, preventing users
from configuring it as primary or fallback provider in settings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 04:04:31 +01:00
Sucukdeluxe
db97a7df14 Fix setPackagePriority type safety and add missing .catch() to IPC calls
- Use PackagePriority type instead of string/any in preload and app-controller
- Add .catch() to start(), extractNow(), setPackagePriority(), updateSettings(columnOrder), openLog(), openSessionLog()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:59:10 +01:00
Sucukdeluxe
575fca3806 Release v1.6.45 2026-03-05 03:54:54 +01:00
Sucukdeluxe
a1c8f42435 Comprehensive bugfix release v1.6.45
Fix ~70 issues across the entire codebase including security fixes,
error handling improvements, test stabilization, and code quality.

- Fix TLS race condition with reference-counted acquire/release
- Bind debug server to 127.0.0.1 instead of 0.0.0.0
- Add overall timeout to MegaWebFallback
- Stream update installer to disk instead of RAM buffering
- Add path traversal protection in JVM extractor
- Cache DdownloadClient with credential-based invalidation
- Add .catch() to all fire-and-forget IPC calls
- Wrap app startup, clipboard, session-log in try/catch
- Add timeouts to container.ts fetch calls
- Fix variable shadowing, tsconfig path, line endings
- Stabilize tests with proper cleanup and timing tolerance
- Fix installer privileges, scripts, and afterPack null checks
- Delete obsolete _upload_release.mjs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:53:28 +01:00
Sucukdeluxe
a3c2680fec Show transitional label between archive extractions
After an archive finishes at 100%, show "Naechstes Archiv..." label
while the next archive initializes, eliminating the "dead" gap where
no activity was visible between consecutive extractions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:13:08 +01:00
Sucukdeluxe
12dade0240 Show compact changelog in update dialog, strip sub-items and long descriptions
Only top-level list items are shown in the updater changelog.
Indented sub-items, headings, and long descriptions are removed
for a clean, compact display. Detailed notes remain on the
Gitea release page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 03:02:41 +01:00
Sucukdeluxe
2a528a126c Add detailed preparation labels and dash separator for post-process status
- Show "Entpacken vorbereiten..." while scanning archives and checking disk space
- Show "Archive scannen..." and "Speicherplatz prüfen..." phases from extractor
- Use dash separator in UI: "[10/10 - Done] - Entpacken 45% (3/6)"
- Handle new "preparing" phase in both hybrid and full extraction progress handlers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:57:23 +01:00
Sucukdeluxe
8839080069 Show live extraction progress in package postProcessLabel
Update postProcessLabel during extraction with detailed progress:
- Overall percentage and archive count (e.g. "Entpacken 45% (3/6)")
- Password cracking progress when testing passwords
- Works for both hybrid and full extraction modes

Previously the label was static "Entpacken..." with no detail about
what was happening during potentially long extraction phases.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:46:10 +01:00
Sucukdeluxe
8f66d75eb3 Show DDownload provider label instead of generic Debrid in status
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:42:19 +01:00
Sucukdeluxe
56ee681aec Strip markdown formatting from changelog in update dialog
The confirm dialog is plain text and cannot render markdown. Strip
headings, bold, italic, code backticks, and normalize list bullets.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:37:18 +01:00
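A rough sketch of the stripping described above — headings, bold, italic, inline code removed, list bullets normalized; the exact rules in the app may differ:

```typescript
function stripMarkdown(md: string): string {
  return md
    .replace(/^#{1,6}\s*/gm, "")        // headings
    .replace(/\*\*([^*]+)\*\*/g, "$1")  // bold
    .replace(/\*([^*]+)\*/g, "$1")      // italic
    .replace(/`([^`]+)`/g, "$1")        // inline code backticks
    .replace(/^\s*[-*+]\s+/gm, "- ");   // normalize list bullets
}
```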
Sucukdeluxe
6db03f05a9 Fix DDownload downloads failing due to SSL certificate verification
DDownload's storage servers (dstorage.org) use certificates that fail
Node.js TLS verification. Add skipTlsVerify flag to UnrestrictedLink
and temporarily disable TLS verification for the download fetch when
the flag is set.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:32:59 +01:00
Sucukdeluxe
068da94e2a Auto-route DDownload URLs to DDownload provider before debrid chain
DDownload is a direct file hoster, not a debrid service. DDownload URLs
are now automatically handled by the DDownload provider when configured,
before trying any debrid providers. Remove DDownload from the
primary/secondary/tertiary provider dropdowns since it only handles its
own URLs and doesn't belong in the debrid fallback chain.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:25:34 +01:00
Sucukdeluxe
4b824b2d9f Fix crash when DDownload settings are missing from persisted config
Guard against undefined ddownloadLogin/ddownloadPassword in renderer
when upgrading from a version without DDownload support.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:17:41 +01:00
Sucukdeluxe
284c5e7aa6 Release v1.6.35
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:07:52 +01:00
Sucukdeluxe
036cd3e066 Add DDownload provider, post-processing status labels, and update changelog
- DDownload (ddownload.com/ddl.to) as new hoster with web login
- Post-processing labels: Entpacken/Renaming/Aufräumen/MKVs
- Release notes shown in update confirmation dialog

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 02:05:16 +01:00
Sucukdeluxe
479c7a3f3f Fix hybrid extraction showing "Ausstehend" instead of "Warten auf Parts"
When hybrid extraction finds no ready archive sets (because remaining parts
are still downloading), completed items were incorrectly labeled as
"Entpacken - Ausstehend" instead of "Entpacken - Warten auf Parts".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:38:51 +01:00
Sucukdeluxe
0404d870ad Release v1.6.33
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:24:26 +01:00
Sucukdeluxe
93a53763e0 Release v1.6.32
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:21:40 +01:00
Sucukdeluxe
c20d743286 Fix updater GetUserByName error, mask backup credentials, clean up old scripts
- Migrate deprecated updateRepo value (Sucukdeluxe/) to new default (Administrator/)
- Mask sensitive fields (tokens, passwords) in backup export with ***
- Preserve current credentials when importing backup with masked values
- Remove 22 obsolete release_v*.mjs scripts, release_codeberg.mjs, set_version_node.mjs
- Remove release:codeberg script from package.json

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 01:20:57 +01:00
Sucukdeluxe
ba938f64c5 Release v1.6.31 2026-03-05 00:45:46 +01:00
Sucukdeluxe
af00d69e5c Switch release/update docs and tooling to Gitea 2026-03-05 00:42:59 +01:00
Sucukdeluxe
bc47da504c Rename app description to Desktop downloader 2026-03-04 23:56:24 +01:00
Sucukdeluxe
5a24c891c0 Migrate project to GitHub and log 2026-03-04 23:55:42 +01:00
Sucukdeluxe
1103df98c1 Analyze program for bugs 2026-03-04 23:03:16 +01:00
Sucukdeluxe
74920e2e2f Round 8 bug fixes (20 fixes)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:14:03 +01:00
Sucukdeluxe
75775f2798 Round 7 bug fixes (13 fixes)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:59:42 +01:00
Sucukdeluxe
fad0f1060b Release v1.6.30
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:36:35 +01:00
Sucukdeluxe
0ca359e509 Round 6 bug fixes (pre-release)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:23:34 +01:00
Sucukdeluxe
1d0ee31001 Release v1.6.29
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 21:11:32 +01:00
Sucukdeluxe
20a0a59670 Release v1.6.28
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:54:49 +01:00
Sucukdeluxe
9a00304a93 Release v1.6.27
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:34:50 +01:00
Sucukdeluxe
55b00bf884 Release v1.6.26
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:18:47 +01:00
Sucukdeluxe
e85f12977f Fix extractedArchives.push -> .add (Set, not Array)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:03:40 +01:00
Sucukdeluxe
940346e2f4 Release v1.6.25
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 20:01:01 +01:00
Sucukdeluxe
1854e6bb17 Release v1.6.24
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:34:13 +01:00
Sucukdeluxe
26b2ef0abb Release v1.6.23
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:26:01 +01:00
Sucukdeluxe
9cceaacd14 Release v1.6.22
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 19:04:53 +01:00
Sucukdeluxe
1ed13f7f88 Release v1.6.21
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 18:57:18 +01:00
Sucukdeluxe
729aa30253 Release v1.6.20
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 18:09:13 +01:00
Sucukdeluxe
b8bbc9c32f Release v1.6.19
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:45:30 +01:00
Sucukdeluxe
a263e3eb2c Release v1.6.18
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:31:20 +01:00
Sucukdeluxe
10bae4f98b Release v1.6.17
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 17:04:32 +01:00
Sucukdeluxe
b02aef2af9 Release v1.6.16
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 16:50:59 +01:00
Sucukdeluxe
56c0b633c8 Release v1.6.15
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:47:48 +01:00
Sucukdeluxe
4e8e8eba66 Release v1.6.14
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:30:49 +01:00
Sucukdeluxe
d5638b922d Release v1.6.13
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:18:15 +01:00
Sucukdeluxe
dc695c9a04 Release v1.6.12
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:07:08 +01:00
Sucukdeluxe
52909258ca Release v1.6.11
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:03:29 +01:00
Sucukdeluxe
e9b9801ac1 Release v1.6.10
Fix post-process slot counter going negative after stop(), allowing multiple
packages to extract simultaneously instead of one at a time.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:46:44 +01:00
Sucukdeluxe
86a358d568 Release v1.6.9
Fix extraction resume state / progress sync, abort labels, and hybrid pkg.status

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:43:31 +01:00
Sucukdeluxe
97c5bfaa7d Release v1.6.8
- Fix "Fertig" status on completed items: session recovery no longer resets
  "Entpacken - Ausstehend" to "Fertig (size)" — respects autoExtract setting
- Extraction continues during pause instead of being aborted
- Hybrid extraction recovery on start/resume: triggerPendingExtractions and
  recoverPostProcessingOnStartup now handle partial packages with hybridExtract
- Move Up/Down buttons: optimistic UI update so packages move instantly

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:23:29 +01:00
Sucukdeluxe
8d0c110415 Release v1.6.7
Add proactive disk-busy detection: lower STREAM_HIGH_WATER_MARK from 2 MB
to 512 KB so backpressure triggers sooner, and monitor stream.writableLength
to show "Warte auf Festplatte" after 300 ms of undrained writes — before
actual backpressure hits.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:13:14 +01:00
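The proactive check can be sketched as a pure predicate over a stream probe; the thresholds follow the commit message (512 KB high-water mark, 300 ms of undrained writes), while the probe shape and names are assumptions:

```typescript
const STREAM_HIGH_WATER_MARK = 512 * 1024; // lowered from 2 MB per the commit
const DISK_BUSY_AFTER_MS = 300;

interface WriteProbe {
  writableLength: number;   // bytes buffered in the stream, not yet flushed
  undrainedSinceMs: number; // how long the buffer has stayed non-empty
}

// Show "Warte auf Festplatte" once writes sit undrained for 300 ms,
// before actual backpressure pauses the download.
function isDiskBusy(probe: WriteProbe): boolean {
  return probe.writableLength > 0 && probe.undrainedSinceMs >= DISK_BUSY_AFTER_MS;
}
```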
Sucukdeluxe
a131f4a11b Release v1.6.6
Fix hybrid extraction stalling: requeue loop now keeps the post-process
slot so the same package re-runs immediately without waiting behind other
packages. Also skip already-extracted archives on requeue rounds.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 14:04:28 +01:00
Sucukdeluxe
335873a7f6 Release v1.6.5
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 13:49:20 +01:00
Sucukdeluxe
7446e07a8c Release v1.6.4
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 09:54:37 +01:00
Sucukdeluxe
693f7b482a Release v1.6.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:54:41 +01:00
Sucukdeluxe
1d4a13466f Release v1.6.2
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:41:40 +01:00
Sucukdeluxe
17844d4c28 Release v1.6.1
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:27:10 +01:00
Sucukdeluxe
55d0e3141c Release v1.6.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 05:13:07 +01:00
Sucukdeluxe
a967eb1080 Release v1.5.99
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:43:46 +01:00
Sucukdeluxe
21ff749cf3 Release v1.5.98
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:31:45 +01:00
Sucukdeluxe
18862bb8e0 Release v1.5.97
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:24:45 +01:00
Sucukdeluxe
27833615b7 Release v1.5.96
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:17:22 +01:00
Sucukdeluxe
00fae5cadd Release v1.5.95
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 04:08:29 +01:00
Sucukdeluxe
4fcbd5c4f7 Release v1.5.94
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:51:10 +01:00
Sucukdeluxe
bb8fd0646a Release v1.5.93
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:35:11 +01:00
Sucukdeluxe
1218adf5f2 Release v1.5.92
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:21:48 +01:00
Sucukdeluxe
818bf40a9c Release v1.5.91
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:15:54 +01:00
Sucukdeluxe
254612a49b Release v1.5.90
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 03:04:49 +01:00
Sucukdeluxe
92101e249a Release v1.5.89
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:57:33 +01:00
Sucukdeluxe
a18ab484cc Release v1.5.88
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:50:16 +01:00
Sucukdeluxe
7af9d67770 Release v1.5.87
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 02:38:05 +01:00
Sucukdeluxe
d63afcce89 Release v1.5.86
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:58:30 +01:00
Sucukdeluxe
15d0969cd9 Release v1.5.85
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:31:13 +01:00
Sucukdeluxe
5574b50d20 Add visual online/offline status dot indicator for Rapidgator links
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:23:20 +01:00
Sucukdeluxe
662c903bf3 Add Rapidgator link online/offline check when links are added
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 00:19:21 +01:00
Sucukdeluxe
545043e1d6 Add multi-select, Ctrl+A, right-click context menu with link viewer to history tab
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 00:06:58 +01:00
Sucukdeluxe
8f10ff8f96 Show TB for sizes >= 1 TiB in humanSize formatter
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 00:00:43 +01:00
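A formatter with the new TB tier might look like this; the unit boundaries and rounding precision below are assumptions, only the ">= 1 TiB shows TB" behavior comes from the commit:

```typescript
function humanSize(bytes: number): string {
  const KiB = 1024, MiB = KiB * 1024, GiB = MiB * 1024, TiB = GiB * 1024;
  if (bytes >= TiB) return (bytes / TiB).toFixed(2) + " TB"; // new tier
  if (bytes >= GiB) return (bytes / GiB).toFixed(2) + " GB";
  if (bytes >= MiB) return (bytes / MiB).toFixed(1) + " MB";
  if (bytes >= KiB) return (bytes / KiB).toFixed(0) + " KB";
  return bytes + " B";
}
```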
Sucukdeluxe
62f3bd94de Remove speed limit toolbar button, fix hoster stats grouping, add Ctrl+A select all, fix context menu clipping
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 23:56:35 +01:00
Sucukdeluxe
253b1868ec Release v1.5.79
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 23:46:06 +01:00
Sucukdeluxe
c4aefb6175 Release v1.5.78
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 23:17:37 +01:00
Sucukdeluxe
956cad0da4 Release v1.5.77
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:58:34 +01:00
Sucukdeluxe
83d8df84bf Release v1.5.76
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:50:51 +01:00
Sucukdeluxe
0c058fa162 Release v1.5.75
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:28:40 +01:00
Sucukdeluxe
21dbf46f81 Release v1.5.74
- Fix hybrid extract not using maxParallelExtract setting (was hardcoded to 1)
- Fix "Warten auf Parts" label shown for items whose downloads are already complete
- Update hybrid extract progress handler to support parallel archive tracking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:13:58 +01:00
Sucukdeluxe
af6eea8253 Release v1.5.73
- Show full passwords (unmasked) in extraction logs for easier debugging

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:55:36 +01:00
Sucukdeluxe
7029271999 Release v1.5.72
- WRONG_PASSWORD JVM error now falls back to legacy UnRAR extractor
- Added masked password logging for JVM and legacy extractors
- Per-attempt password logging shows which passwords are tried and in what order

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:50:35 +01:00
Sucukdeluxe
5dabee332e Parallel archive extraction within packages
maxParallelExtract now controls how many archives extract simultaneously
within a single package (e.g. 4 episodes at once). Packages still
extract sequentially (one package at a time) to focus I/O. Progress
handler updated to track multiple active archives independently.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:43:34 +01:00
Sucukdeluxe
d9fe98231f Extract packages sequentially instead of in parallel
Previously maxParallelExtract allowed multiple packages to extract
simultaneously, splitting I/O across packages. Now packages extract
one at a time in packageOrder so each package finishes faster.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:35:12 +01:00
Sucukdeluxe
6ee98328fb Fix JVM extractor not falling back to legacy UnRAR on codec errors
When SevenZipJBinding reports "Archive file can't be opened with any
of the registered codecs", the extractor now falls back to legacy
UnRAR instead of failing immediately. Previously, backend mode "jvm"
(the production default) only allowed fallback for UNSUPPORTEDMETHOD.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:28:47 +01:00
Sucukdeluxe
3dbb94d298 Release v1.5.68: Extractor optimizations inspired by JDownloader
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 21:25:02 +01:00
Sucukdeluxe
1956be0c71 Release v1.5.67
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:40 +01:00
Sucukdeluxe
e9414853f9 Sync package-lock.json version
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:14 +01:00
Sucukdeluxe
d67cce501b Split Hoster column into Hoster + Account columns
Hoster column now shows only the file hoster (e.g. rapidgator.net),
new Account column shows the debrid service used (e.g. Mega-Debrid).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:54:03 +01:00
Sucukdeluxe
d87b74d359 Release v1.5.66
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:43:08 +01:00
Sucukdeluxe
be7a8fd103 Add progress sorting, extraction priority by packageOrder, auto-expand extracting packages
- Fortschritt column is now clickable/sortable (ascending/descending by package %)
- Extraction queue respects packageOrder: top packages get extracted first
- Packages currently extracting are auto-expanded so user can see progress
- Increased Fortschritt column width for better spacing

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:42:28 +01:00
Sucukdeluxe
d23740eac7 Release v1.5.65
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:31:58 +01:00
Sucukdeluxe
56a507b45d Add configurable parallel extraction count (1-8, default 2)
- New setting maxParallelExtract in AppSettings
- UI input in Entpacken tab: "Parallele Entpackungen"
- Replaces hardcoded maxConcurrent=2 in acquirePostProcessSlot

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:31:24 +01:00
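The slot mechanism behind acquirePostProcessSlot amounts to a counting semaphore; a minimal sketch, assuming names and structure (the real implementation lives in the download manager and differs in detail):

```typescript
class PostProcessSlots {
  private active = 0;
  private waiters: Array<() => void> = [];

  constructor(private maxConcurrent: number) {} // maxParallelExtract, 1-8

  async acquire(): Promise<void> {
    if (this.active < this.maxConcurrent) {
      this.active++;
      return;
    }
    // All slots busy: park until a release hands us a slot.
    await new Promise<void>((resolve) => this.waiters.push(resolve));
    this.active++;
  }

  release(): void {
    // Clamp at zero so stray releases cannot push the counter negative
    // (the bug fixed in v1.6.10 above).
    this.active = Math.max(0, this.active - 1);
    const next = this.waiters.shift();
    if (next) next();
  }
}
```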
Sucukdeluxe
31ce1e6618 Release v1.5.64
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:25:09 +01:00
Sucukdeluxe
b637fb3db8 Fix test suite: await async calls, update status expectations, add timeouts
- cleanup.test.ts: add async/await for removeDownloadLinkArtifacts and removeSampleArtifacts
- download-manager.test.ts: await manager.start() to prevent race conditions
- download-manager.test.ts: update "Entpackt" -> "Entpackt - Done" expectations
- download-manager.test.ts: await async getStartConflicts()
- download-manager.test.ts: add 35s timeout for extraction tests

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:23:57 +01:00
Sucukdeluxe
1d876f8ded Fix parallel JVM extraction: isolate temp dirs to prevent native DLL lock conflicts
Each JVM extractor process now gets its own temp directory via
-Djava.io.tmpdir so parallel SevenZipJBinding instances don't fight
over the same lib7-Zip-JBinding.dll file lock on Windows.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 20:23:57 +01:00
Sucukdeluxe
2f43164732 Fix hybrid-extract item matching: use fileName for robust part detection
The previous targetPath-only matching missed items whose targetPath
differed from the on-disk filename. Now matches by basename and
fileName for reliable archive-part to item association.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:22:00 +01:00
Sucukdeluxe
9747cabb14 Release v1.5.63
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:16:42 +01:00
Sucukdeluxe
9b460758f9 Add parallel extraction (2 concurrent) and better status labels
- Replace serial packagePostProcessQueue with semaphore (max 2 concurrent)
- Hybrid-extract: items waiting for parts show "Entpacken - Warten auf Parts"
- Failed hybrid extraction shows "Entpacken - Error" instead of "Fertig"

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:15:57 +01:00
Sucukdeluxe
8cc1f788ad Release v1.5.62
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:03:50 +01:00
Sucukdeluxe
0888e16aec Fix startPackages: scheduler now respects runPackageIds filter
findNextQueuedItem(), hasQueuedItems(), hasDelayedQueuedItems() and
countQueuedItems() now skip packages not in runPackageIds when the set
is non-empty. This ensures "Ausgewählte Downloads starten" only
processes selected packages instead of all enabled ones.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:03:05 +01:00
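The runPackageIds gate described above can be sketched as follows; the types and the helper shown here are illustrative, not the app's exact code:

```typescript
// Sketch: scheduler helper that skips packages outside the
// runPackageIds filter. An empty set means "no filter" — all enabled
// packages are eligible, matching the behavior described above.
interface QueueItem {
  packageId: string;
  status: "queued" | "running" | "done";
}

export function findNextQueuedItem(
  items: QueueItem[],
  runPackageIds: Set<string>,
): QueueItem | undefined {
  return items.find(
    (it) =>
      it.status === "queued" &&
      (runPackageIds.size === 0 || runPackageIds.has(it.packageId)),
  );
}
```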
Sucukdeluxe
cab550a13c Release v1.5.61 2026-03-03 17:54:15 +01:00
Sucukdeluxe
2ef3983049 Revert to v1.5.49 base + fix "Ausgewählte Downloads starten"
- Restore all source files from v1.5.49 (proven stable on both servers)
- Add startPackages() IPC method that starts only specified packages
- Fix context menu "Ausgewählte Downloads starten" to use startPackages()
  instead of start() which was starting ALL enabled packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 17:53:39 +01:00
Sucukdeluxe
ca4392fa8b Release v1.5.60 2026-03-03 17:33:33 +01:00
Sucukdeluxe
8fad61e329 Restore multipart RAR open strategy for JVM extractor
Bring back callback-first RAR multipart opening with an explicit RAR5/RAR fallback, while keeping 7z.001 volumes on VolumedArchiveInStream, to improve 7z-JBinding compatibility on split RAR sets.
2026-03-03 17:32:22 +01:00
Sucukdeluxe
02bd4a61fb Release v1.5.59 2026-03-03 16:56:06 +01:00
Sucukdeluxe
30ac5bf9db Harden hybrid extract readiness for partial archives
Require near-complete file-size checks in item recovery and hybrid ready-set detection so partially downloaded RAR parts are not marked completed and extracted prematurely.
2026-03-03 16:52:16 +01:00
Sucukdeluxe
87e0a986e6 Release v1.5.58 2026-03-03 16:37:14 +01:00
Sucukdeluxe
353cef7dbd Add JVM hybrid-extract retry and clean up Java extractor
- Add automatic retry with 3s delay when JVM extractor fails with
  "codecs" or "can't be opened" error during hybrid-extract mode
  (handles transient Windows file locks after download completion)
- Log archive file size before JVM extraction in hybrid mode
- Remove unused ArchiveFormat import, RAR_MULTIPART_RE/RAR_OLDSPLIT_RE
  patterns, and hasOldStyleRarSplits() method from Java extractor
- Keep simple openSevenZipArchive with currentVolumeName tracking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:35:43 +01:00
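The delayed retry described above could look roughly like this; `extractWithRetry` and the error-matching pattern are illustrative assumptions:

```typescript
// Sketch: retry once after a delay when the JVM extractor fails with a
// transient "codecs" / "can't be opened" error — Windows may still hold
// a file lock right after download completion. Names are hypothetical.
const TRANSIENT_RE = /codecs|can't be opened/i;

export async function extractWithRetry(
  extract: () => Promise<void>,
  delayMs = 3000,
): Promise<void> {
  try {
    await extract();
  } catch (err) {
    // Non-transient errors propagate immediately.
    if (!TRANSIENT_RE.test(String(err))) throw err;
    await new Promise((r) => setTimeout(r, delayMs));
    await extract(); // a second failure propagates to the caller
  }
}
```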
Sucukdeluxe
f9b0bbe676 v1.5.57
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:15:49 +01:00
Sucukdeluxe
93d54c8f84 Fix multi-part RAR: get first stream via callback, track current volume name
Two bugs in SevenZipVolumeCallback caused multi-part RAR extraction to fail:

1. getProperty(NAME) always returned firstFileName instead of tracking the
   last opened volume name. 7z-JBinding needs this to compute subsequent
   volume filenames.

2. The first IInStream was created separately instead of through the
   callback's getStream() method, so the volume name tracker was not
   properly initialized.

Verified with real multi-part RAR5 test archives (3 parts, WinRAR 7.01).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:15:44 +01:00
Sucukdeluxe
26bf675a41 v1.5.56
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:03:06 +01:00
Sucukdeluxe
35e84e652e Fix multi-part RAR: use explicit ArchiveFormat instead of VolumedArchiveInStream
VolumedArchiveInStream only works for .7z.001 splits - it rejects RAR
filenames. For multi-part RAR (.partN.rar), use RandomAccessFileInStream
with explicit ArchiveFormat.RAR5/RAR format specification. Auto-detection
with null format can fail for multi-volume RAR archives.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:03:01 +01:00
Sucukdeluxe
462fc0397e v1.5.55
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:54:17 +01:00
Sucukdeluxe
d4bf574370 Fix multi-part RAR extraction: use VolumedArchiveInStream for .partN.rar
The JVM extractor used RandomAccessFileInStream for multi-part RAR archives,
which only provides a single file stream. 7z-JBinding requires
VolumedArchiveInStream to access additional volume parts via callback.

Added RAR_MULTIPART_RE and RAR_OLDSPLIT_RE patterns to detect multi-volume
RAR archives and route them through VolumedArchiveInStream, fixing
"Archive file can't be opened with any of the registered codecs" errors.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:54:09 +01:00
Sucukdeluxe
d3ec000da5 v1.5.54
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:59:08 +01:00
Sucukdeluxe
d7d256f716 Fix hybrid-extract: check per-archive prefix instead of whole package
The previous fix blocked ALL multi-part extractions when any item in the
package was pending. Now checks only parts of the SAME archive (by prefix
match on fileName/targetPath), so E01 can extract while E06 downloads.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:59:03 +01:00
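The per-archive prefix check described above can be sketched like this; `archivePrefix` is an illustrative helper, not the app's exact matching code:

```typescript
// Sketch: only parts sharing the same multi-part prefix block each
// other, so "Show.E01.part1.rar" can extract while "Show.E06" parts
// are still downloading. Helper names are hypothetical.
export function archivePrefix(fileName: string): string {
  // "name.part3.rar" -> "name"; "name.rar" / "name.r01" -> "name"
  return fileName
    .toLowerCase()
    .replace(/\.part\d+\.rar$/, "")
    .replace(/\.(rar|r\d+)$/, "");
}

export function samePartsPending(target: string, pending: string[]): boolean {
  const prefix = archivePrefix(target);
  return pending.some((f) => archivePrefix(f) === prefix);
}
```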
Sucukdeluxe
804fbe2bdc v1.5.53
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:52:39 +01:00
Sucukdeluxe
1fde0a9951 Fix hybrid-extract multi-part archive + extractor CRC handling
- findReadyArchiveSets: for .part1.rar, require ALL package items
  to be terminal before allowing extraction (prevents premature
  extraction when later parts have no targetPath/fileName yet)
- JVM extractor: remove CRCERROR from isPasswordFailure() — only
  DATAERROR indicates wrong password. CRCERROR on archives where
  7z-JBinding falsely reports encrypted no longer triggers password
  cycling.
- looksLikeWrongPassword: remove CRC text matching, keep only
  explicit "data error" for encrypted archives.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:52:00 +01:00
Sucukdeluxe
0cf5ebe5e9 Release v1.5.52 2026-03-03 14:37:59 +01:00
Sucukdeluxe
0b7c658c8f Add Account Manager + fix Hybrid-Extract premature extraction
- Account Manager: table UI with add/remove/check for all 4 providers
  (Real-Debrid, Mega-Debrid, BestDebrid, AllDebrid)
- Backend: checkRealDebridAccount, checkAllDebridAccount, checkBestDebridAccount
- Hybrid-Extract fix: check item.fileName for queued items without targetPath,
  disable disk-fallback for multi-part archives, extend disk-fallback to catch
  active downloads by fileName match (prevents CRC errors on incomplete files)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:36:13 +01:00
Sucukdeluxe
e6ec1ed755 Add Mega-Debrid account info check (web scraping)
Scrapes the Mega-Debrid profile page to display username, premium status,
remaining days, and loyalty points. New "Account prüfen" button in Settings > Accounts.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:06:19 +01:00
Sucukdeluxe
ac479bb023 Add backup encryption (AES-256-GCM) and directory existence check
- Encrypt sensitive credentials (tokens, passwords) in backup exports
  using AES-256-GCM with PBKDF2 key derivation from OS username
- Backup format v2 with backwards-compatible v1 import
- Show dialog to create non-existent directories when changing
  outputDir, extractDir, or mkvLibraryDir settings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 13:47:56 +01:00
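A minimal sketch of the encryption scheme named above (AES-256-GCM with a PBKDF2-derived key) in Node's `crypto` API. The salt size, iteration count, and packed payload layout are illustrative assumptions, not the app's actual backup format:

```typescript
// Sketch: encrypt a sensitive string with AES-256-GCM, key derived via
// PBKDF2. Payload layout (salt | iv | auth tag | ciphertext, base64)
// is an assumption for this example only.
import {
  createCipheriv,
  createDecipheriv,
  pbkdf2Sync,
  randomBytes,
} from "node:crypto";

export function encryptBackup(plain: string, secret: string): string {
  const salt = randomBytes(16);
  const key = pbkdf2Sync(secret, salt, 100_000, 32, "sha256");
  const iv = randomBytes(12); // GCM-recommended 96-bit nonce
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const body = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return Buffer.concat([salt, iv, cipher.getAuthTag(), body]).toString("base64");
}

export function decryptBackup(packed: string, secret: string): string {
  const raw = Buffer.from(packed, "base64");
  const salt = raw.subarray(0, 16);
  const iv = raw.subarray(16, 28);
  const tag = raw.subarray(28, 44); // 16-byte GCM auth tag
  const body = raw.subarray(44);
  const key = pbkdf2Sync(secret, salt, 100_000, 32, "sha256");
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(body), decipher.final()]).toString("utf8");
}
```

GCM's auth tag makes tampering or a wrong key fail loudly at `final()` instead of yielding garbage plaintext.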
Sucukdeluxe
9ac557b0a8 Fix app icon: use rcedit afterPack hook to embed custom icon in EXE
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:17:47 +01:00
Sucukdeluxe
4585db0281 Remove CHANGELOG.md from repo, link to Codeberg Releases instead
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:07:25 +01:00
Sucukdeluxe
0d86356f96 Remove internal files from repo: .github/workflows, docs/plans, verify_remote.mjs
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:06:43 +01:00
Sucukdeluxe
140ee488b7 Update README with new features: JVM extraction, auto-rename, progress bars, history, nested extraction
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:03:31 +01:00
Sucukdeluxe
486379183b Remove .claude folder from repo and add to .gitignore
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:02:11 +01:00
Sucukdeluxe
19a588a997 Show "Jetzt entpacken" context menu when any item is completed, re-enable paused packages
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:52:19 +01:00
Sucukdeluxe
fa30e738d9 Fix UNSUPPORTEDMETHOD: init SevenZipJBinding native libs, pass password to extractSlow
Root cause: SevenZip.initSevenZipFromPlatformJAR() was never called, so
native compression codecs (RAR5, LZMA2, etc.) were not loaded. Archives
could be opened (header parsing is pure Java) but all extractSlow() calls
returned UNSUPPORTEDMETHOD because no native decoder was available.

- Add ensureSevenZipInitialized() with lazy init before extraction
- Pass password to extractSlow(outStream, password) for RAR5 compatibility
- Add UNSUPPORTEDMETHOD -> legacy fallback in extractor.ts as safety net

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:35:19 +01:00
Sucukdeluxe
eefb536cb3 Fix path traversal false positive: skip subst drive mapping for JVM backend
Java's getCanonicalFile() resolves subst drives inconsistently,
causing secureResolve() to falsely block valid filenames. JVM handles
long paths natively so subst is only needed for legacy UnRAR/7z.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:25:10 +01:00
Sucukdeluxe
02b136dac7 Fix JVM extractor: asarUnpack for class/jar files, add unpacked path candidate, default to jvm mode
The JVM sidecar class files were packed inside app.asar where Java
cannot access them. asarUnpack extracts them to app.asar.unpacked/.
Default backend changed from auto to jvm (no legacy fallback).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:19:26 +01:00
Sucukdeluxe
b712282f62 Log which extraction backend was used (7zjbinding/zip4j/legacy)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:12:48 +01:00
Sucukdeluxe
de369f3bcd Replace extraction backend with SevenZipJBinding + Zip4j JVM sidecar
- New JVM sidecar (resources/extractor-jvm/) using SevenZipJBinding for
  RAR/7z/TAR and Zip4j for ZIP multipart, matching JDownloader 2 stack
- Auto/JVM/Legacy backend modes via RD_EXTRACT_BACKEND env variable
- Fallback to legacy UnRAR/7z when JVM runtime unavailable
- Fix isJvmRuntimeMissingError false positives on valid extraction errors
- Cache JVM layout resolution to avoid repeated filesystem checks
- Route nested ZIP extraction through JVM backend consistently

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 02:08:42 +01:00
Sucukdeluxe
3ee3af03cf Match JDownloader 2 extraction behavior: normal I/O, -mt2 hybrid
- Remove setWindowsBackgroundIO entirely (JD2 uses normal I/O priority)
- Keep only CPU priority IDLE (os.setPriority)
- Hybrid threads fixed at -mt2 (matches JD2's ~16 MB/s controlled throughput)
- Final extraction uses full thread count (unchanged)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 01:14:47 +01:00
Sucukdeluxe
9a646d516b Fix extraction speed: I/O priority only in hybrid mode, more threads
- setWindowsBackgroundIO (Very Low I/O) now only applied in hybrid mode,
  not for all extractions (was causing massive slowdown)
- Hybrid threads changed from -mt1 to half CPU count (e.g. -mt4 on 8-core)
- Move retry count (R9, R22 etc.) from status text to tooltip only

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 01:08:40 +01:00
Sucukdeluxe
311fb00430 Release v1.5.40
2026-03-03 01:01:46 +01:00
Sucukdeluxe
4008e20278 Reset stale status texts on session load and stop
- normalizeSessionStatuses: reset all queued items to "Wartet" instead of
  only checking a few specific patterns (missed Retry, Unrestrict-Fehler etc.)
- Reset completed items with stale extraction status to "Fertig (size)"
- stop(): reset all non-finished items to queued/"Wartet" and packages to queued

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:58:28 +01:00
Sucukdeluxe
375b9885ee Disk space pre-check, nested extraction, lower I/O priority for hybrid extraction
- Add disk space check before extraction (aborts if insufficient space)
- Add single-level nested archive extraction (archives inside archives)
- Blacklist .iso/.img/.bin/.dmg from nested extraction
- Set real Windows I/O priority (Very Low) on UnRAR via NtSetInformationProcess
- Reduce UnRAR threads to -mt1 during hybrid extraction
- Fix double episode renaming (s01e01e02 pattern)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:54:10 +01:00
Sucukdeluxe
ce01512537 Release v1.5.37
2026-03-03 00:08:06 +01:00
Sucukdeluxe
7dc12aca0c Fix disk-backpressure stalls and improve episode-token parsing 2026-03-03 00:07:12 +01:00
Sucukdeluxe
a6c65acfcb Release v1.5.36
2026-03-02 23:50:31 +01:00
Sucukdeluxe
19342647e5 Fix download freeze spikes and unrestrict slot overshoot handling 2026-03-02 23:47:54 +01:00
Sucukdeluxe
7fe7d93e83 Fix dark text on package header progress bars (CSS specificity override)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:35:22 +01:00
73 changed files with 9093 additions and 1785 deletions

@@ -1,126 +0,0 @@
# Memory Bank - Multi Debrid Downloader
## Project Overview
**Name:** Multi Debrid Downloader (MDD)
**Type:** Electron desktop app for Windows 10/11
**Repository:**
- Codeberg: https://codeberg.org/Sucukdeluxe/real-debrid-downloader.git
- GitHub: https://github.com/Sucukdeluxe/real-debrid-downloader.git
## Technology Stack
- **Runtime:** Electron 31.x
- **Frontend:** React 18.x + TypeScript 5.x
- **Build:** Vite (renderer) + tsup (main/preload)
- **Tests:** Vitest (262+ tests)
- **Installer:** NSIS via electron-builder
## Supported Debrid Providers
| Provider | Auth | Priority |
|----------|------|----------|
| Real-Debrid | API token | Primary |
| Mega-Debrid | Login + password | Fallback 1 |
| BestDebrid | API token | Fallback 2 |
| AllDebrid | API key | Fallback 3 |
## Core Features
- **Queue management:** package-based organization with drag & drop
- **Auto-extract:** RAR, ZIP, 7z with password list
- **Auto-rename:** scene-release patterns (4sf/4sj) → clean names
- **Integrity checks:** CRC32, MD5, SHA1 via SFV files
- **Provider fallback:** automatic switch on errors/fair-use limits
- **Session persistence:** queue survives app restarts
- **Clipboard watcher:** automatic link detection
- **System tray:** minimize to tray
- **Speed limit:** global or per download, plus bandwidth schedules
- **MKV library folder:** automatic move after package completion
- **Update system:** automatic updates via Codeberg releases
## Project Structure
```
src/
├── main/                   # Electron main process
│   ├── main.ts             # entry point, IPC handlers, window management
│   ├── app-controller.ts   # coordinates DownloadManager + settings
│   ├── download-manager.ts # core: queue, downloads, retry logic
│   ├── debrid.ts           # debrid service abstraction
│   ├── realdebrid.ts       # Real-Debrid API client
│   ├── extractor.ts        # archive extraction
│   ├── integrity.ts        # CRC32/hash validation
│   ├── storage.ts          # session/settings persistence
│   ├── update.ts           # update check & installation
│   └── ...
├── renderer/               # React UI
│   ├── App.tsx             # main component with all tabs
│   └── styles.css          # styling
├── preload/                # preload script (IPC bridge)
│   └── preload.ts
└── shared/                 # shared types
    ├── types.ts            # all TypeScript interfaces
    ├── ipc.ts              # IPC channel constants
    └── preload-api.ts      # window.rd API definition
```
## Key Types (src/shared/types.ts)
- `DownloadItem`: a single download with status, progress, speed
- `PackageEntry`: a group of downloads with outputDir and extractDir
- `SessionState`: the full queue state (persisted)
- `AppSettings`: all settings
- `UiSnapshot`: the complete UI state for the renderer
## IPC Channels (src/shared/ipc.ts)
Main channels for renderer ↔ main communication:
- `GET_SNAPSHOT`, `STATE_UPDATE`: state sync
- `ADD_LINKS`, `ADD_CONTAINERS`: fill the queue
- `START`, `STOP`, `TOGGLE_PAUSE`: download control
- `UPDATE_SETTINGS`: change settings
## Current Version
**Version:** 1.5.27
**Last release:** 1.4.68 (2026-03-01)
### Recent Changes (CHANGELOG)
- Session backup for the queue state
- Improved start-conflict handling
- Mega-Web unrestrict now abortable
- Hardened DLC import
- Extended auto-renamer
## Open Plans
1. **Native menu bar** (`.claude/plans/agile-watching-lampson.md`)
   - JDownloader 2 style menu
   - Use the Electron Menu API
   - Replace the existing React menu bar
## Coding Conventions
- TypeScript strict mode
- async/await over raw promises
- German UI strings
- Detailed error logs via `logger`
- Retry logic with exponential backoff
- AbortController for cancellable operations
## Build & Release
```bash
npm run build      # TypeScript + Vite build
npm run dist       # electron-builder (NSIS + portable)
npm test           # Vitest tests
npm run self-check # full check (typecheck + tests)
```
## Important Files
- `CHANGELOG.md` - detailed version history
- `.claude/plans/` - feature plans
- `tests/` - extensive test suite
- `installer/RealDebridDownloader.iss` - Inno Setup script

@@ -1,66 +0,0 @@
# Native Menu Bar (JDownloader 2 Style)
## Context
The app currently has no native menu bar (only a tray context menu). The user wants a menu bar at the top left like JDownloader 2, with a "Datei" (File) menu, shortcuts, and a backup function.
## Features
### "Datei" menu (top left)
| Menu item | Shortcut | Action |
|-----------|----------|--------|
| Text mit Links analysieren | Ctrl+L | Switches to the link collector tab |
| Linkcontainer laden | Ctrl+O | Opens the DLC file picker (already exists) |
| --- separator --- | | |
| Sicherung → Backup erstellen | | Exports the queue as JSON (exists: `exportQueue`) |
| Sicherung → Backup laden | | Imports a queue JSON (exists: `importQueue`) |
| --- separator --- | | |
| Neustart | Ctrl+Shift+R | `app.relaunch()` + `app.quit()` |
| Beenden | Ctrl+Q | `app.quit()` |
## Implementation
### Step 1: New IPC channels
**File:** `src/shared/ipc.ts`
- `NAVIGATE_TAB: "app:navigate-tab"`: renderer switches tabs
- `RESTART: "app:restart"`: restart the app
- `SAVE_BACKUP: "dialog:save-backup"`: save dialog + export
- `LOAD_BACKUP: "dialog:load-backup"`: open dialog + import
### Step 2: Extend the preload API
**Files:** `src/shared/preload-api.ts` + `src/preload/preload.ts`
- `onNavigateTab(callback)`: event listener for tab switches
- `saveBackup()`: save a backup via the native save dialog
- `loadBackup()`: load a backup via the native open dialog
### Step 3: Create the menu bar
**File:** `src/main/main.ts`
New function `createApplicationMenu()` after `createTray()`:
- Uses `Menu.buildFromTemplate()` + `Menu.setApplicationMenu()`
- "Datei" menu with all items from the table
- Accelerators for shortcuts (Electron handles these automatically)
- Menu clicks send IPC events to the renderer or call main-process functions directly
**Create backup:** `dialog.showSaveDialog()` → `controller.exportQueue()` → `fs.writeFile()`
**Load backup:** `dialog.showOpenDialog()` → `fs.readFile()` → `controller.importQueue()`
**Restart:** `app.relaunch()` → `app.quit()`
**Quit:** `app.quit()`
**Link collector/DLC:** send an IPC event to the renderer
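The Step 3 menu can be sketched as a plain array shaped like Electron's `MenuItemConstructorOptions`. The type is mimicked locally here so the structure can be checked without an Electron runtime, and the exact submenu layout is an assumption based on the table above:

```typescript
// Sketch of the "Datei" menu template. In real code this array would be
// passed to Menu.buildFromTemplate([{ label: "Datei", submenu: ... }])
// followed by Menu.setApplicationMenu(); click handlers are omitted.
type MenuTemplateItem = {
  label?: string;
  accelerator?: string; // "CmdOrCtrl+..." is Electron's cross-platform form
  type?: "separator";
  submenu?: MenuTemplateItem[];
};

export const dateiMenuTemplate: MenuTemplateItem[] = [
  { label: "Text mit Links analysieren", accelerator: "CmdOrCtrl+L" },
  { label: "Linkcontainer laden", accelerator: "CmdOrCtrl+O" },
  { type: "separator" },
  {
    label: "Sicherung",
    submenu: [{ label: "Backup erstellen" }, { label: "Backup laden" }],
  },
  { type: "separator" },
  { label: "Neustart", accelerator: "CmdOrCtrl+Shift+R" },
  { label: "Beenden", accelerator: "CmdOrCtrl+Q" },
];
```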
### Step 4: Renderer reacts to menu events
**File:** `src/renderer/App.tsx`
- Register the `onNavigateTab` listener in a `useEffect`
- On `"collector"`: `setTab("collector")`
- DLC import: `pickContainers` + `addContainers` (existing pattern)
## Files
- `src/shared/ipc.ts`: new channels
- `src/shared/preload-api.ts`: new API methods
- `src/preload/preload.ts`: IPC bridge
- `src/main/main.ts`: menu bar + IPC handlers + backup logic
- `src/renderer/App.tsx`: tab-navigation listener
## Verification
1. `npm run build`
2. `npx vitest run` (fast tests)
3. Manual: start the app, check the Datei menu, test the shortcuts

@@ -1,6 +0,0 @@
{
  "enabledPlugins": {
    "frontend-design@claude-plugins-official": true,
    "code-review@claude-plugins-official": true
  }
}

@@ -1,7 +0,0 @@
{
  "permissions": {
    "allow": [
      "WebFetch(domain:github.com)"
    ]
  }
}

@@ -1 +0,0 @@
Subproject commit f1e132b2ed4717667fd7318ecab22e5ef52da0cc

@@ -1,52 +0,0 @@
name: Build and Release
permissions:
  contents: write
on:
  push:
    tags:
      - "v*"
jobs:
  build:
    runs-on: windows-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "22"
          cache: "npm"
      - name: Install dependencies
        run: npm ci
      - name: Apply tag version
        shell: pwsh
        run: |
          $version = "${{ github.ref_name }}".TrimStart('v')
          node scripts/set_version_node.mjs $version
      - name: Build app
        run: npm run build
      - name: Build Windows artifacts
        run: npm run release:win
      - name: Pack portable zip
        shell: pwsh
        run: |
          Compress-Archive -Path "release\win-unpacked\*" -DestinationPath "Real-Debrid-Downloader-win64.zip" -Force
      - name: Publish GitHub Release
        uses: softprops/action-gh-release@v2
        with:
          files: |
            Real-Debrid-Downloader-win64.zip
            release/*.exe
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.gitignore
@@ -17,9 +17,23 @@ rd_download_manifest.json
_update_staging/
apply_update.cmd
.claude/
.github/
docs/plans/
CHANGELOG.md
node_modules/
.vite/
coverage/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Forgejo deployment runtime files
deploy/forgejo/.env
deploy/forgejo/forgejo/
deploy/forgejo/postgres/
deploy/forgejo/caddy/data/
deploy/forgejo/caddy/config/
deploy/forgejo/caddy/logs/
deploy/forgejo/backups/

@@ -1,223 +0,0 @@
# Changelog
All notable changes are documented in this file.
## 1.5.28 - 2026-03-02
UI improvement: visual progress indicators in the download list (JDownloader 2 style).
### Features
- **Visual progress bars:**
  - Package progress is now shown as a graphical progress bar in the "Fortschritt" column.
  - Individual items also have a smaller progress bar.
  - Green gradient (#22c55e → #4ade80) for better text readability.
  - Percentage shown as overlay text on the bar.
## 1.4.68 - 2026-03-01
Stability hotfix for session loss after update/restart: session files now have a robust backup/restore fallback.
### Fixes
- Introduced session backup for the queue state:
  - Before every session save, the previous session is backed up as `.bak` (sync + async paths).
  - Protects against a corrupt/truncated session file at startup.
- Session auto-restore on load:
  - If `rd_session_state.json` is corrupt, `rd_session_state.json.bak` is loaded automatically.
  - The backup is then restored best-effort as the primary session file.
- Clearer error signals in the log:
  - Explicit message stating whether the primary session was corrupt and the backup was used.
### Tests
- New tests in `tests/storage.test.ts`:
  - Loading from the session backup when the primary session is corrupt.
  - Backup creation before sync and async session overwrites.
## 1.4.67 - 2026-03-01
Hotfix for a critical start-conflict data loss, plus additional renamer hardening for real-world scene patterns.
### Fixes
- The start-conflict option `Überspringen` no longer deletes packages/items:
  - Already-extracted files are kept.
  - Pending downloads stay in the queue and can be resumed normally.
  - Running tasks are treated as a package stop instead of a cancel.
- Clarified the start-conflict dialog text in the UI:
  - `Entpacktes überspringen` instead of the ambiguous `Überspringen`.
  - Clear note that only re-extraction is skipped.
- Improved auto-renamer:
  - Now also recognizes episode-only tokens like `e01`/`e02` with a season hint from the folder.
  - Accepts lowercase group suffixes like `-tmsf`.
  - More robust with source formats like `4sf-bs-720p-s01e05`.
### Tests
- New/updated tests in:
  - `tests/download-manager.test.ts` (start-conflict skip keeps the package + partial queue)
  - `tests/auto-rename.test.ts` (e01/e02, lowercase suffix, odd source order)
## 1.4.66 - 2026-03-01
Hotfix for stuck "Link wird umgewandelt" cases (especially on the Mega-Web path), where only an app restart helped.
### Fixes
- Mega-Web unrestrict is now fully abortable and timeout-aware:
  - Abort signals are passed all the way into the Mega-Web fallback.
  - Running polling/fetch steps respect stop/timeout immediately.
  - Jobs waiting in the exclusive Mega-Web queue can cancel cleanly on abort.
- The download manager can therefore resolve stuck unrestrict phases automatically again via timeout + retry instead of staying in "Link wird umgewandelt" forever.
### Tests
- New tests cover the fix:
  - Abort propagation in Mega-Web unrestrict in `tests/debrid.test.ts`.
  - Abort during Mega-Web polling in `tests/mega-web-fallback.test.ts`.
## 1.4.33 - 2026-03-02
Hotfix release for two real production issues: wrong total statistics with an empty queue, and a silent DLC import failure on drag-and-drop.
### Fixes
- **Corrected stats display ("Gesamt" with an empty queue):**
  - When no packages/items remain, persisted run bytes and run timestamps are now cleanly reset to 0.
  - This removes ghost displays such as `Gesamt: 19.99 GB` alongside `Pakete: 0 / Dateien: 0`.
  - The reset applies in all relevant paths (`getStats`, `clearAll`, package removal, startup normalization).
- **Hardened DLC drag-and-drop import:**
  - Local DLC errors such as `Ungültiges DLC-Padding` no longer block the fallback to dcrypt.
  - Oversize/invalid-size DLCs are still handled defensively, but valid files in the same batch are no longer silently swallowed.
  - If all DLC imports fail, a clear error with the cause is now thrown instead of silently reporting `0 Paket(e), 0 Link(s)`.
- **Improved UI feedback:**
  - On a DLC import with `0` matches, the UI now shows a clear message (`Keine gültigen Links in den DLC-Dateien gefunden`) instead of a misleading success toast.
### Tests
- New/extended tests for:
  - Resetting `totalDownloadedBytes`/stats with an empty queue.
  - The DLC fallback path on local decrypt exceptions.
  - Error output when a DLC import fails entirely.
- Validation:
  - `npx tsc --noEmit` passed
  - `npm test` passed (`283/283`)
  - `npm run self-check` passed
## 1.4.32 - 2026-03-01
This version greatly extends the auto-renamer for real-world scene/TV release structures (nested and flat) and introduces an intensive renamer regression run with additional edge-case and stress checks.
### Renamer (download manager)
- Extended pattern recognition for nested and flat season folders with group suffixes (e.g. `-TMSF`, `-TVS`, `-TvR`, `-ZZGtv`, `-SunDry`).
- Episode tokens can now also be derived from compact codes in the source name (e.g. `301` -> `S03E01`, `211` -> `S02E11`, `101` -> `S01E01`), provided season hints are present.
- `Teil1/Teil2` and `Part1/Part2` are mapped to `SxxExx`, including season derivation from the folder structure.
- Unified repack handling across file name and folder structure (`rp`/`repack` -> consistent `REPACK` token in the target name).
- Flat season folders (files directly in the season folder) now get clean episode inlining instead of unspecific season file names.
- Hardened path-length protection on Windows: first the normal target name, then a deterministic package fallback (e.g. `Show.S08E20`), then a safe skip with a warning log instead of a broken rename.
### Covered real-world patterns (examples)
- Arrow / Gotham / Britannia / Legion / Lethal.Weapon / Agent.X / Last.Impact
- Nested subfolders with episode titles, and flat season folders with many episode files
- Inconsistent source names like `tvs-...-301`, `...-211`, `...teil1...`, `...rp...`
### Intensive bug testing
- Significantly expanded renamer unit tests (`tests/auto-rename.test.ts`) with additional real-world pattern and compact-code cases.
- Additional intensive scenario and stress checks run against temporary test files (nested/flat, repack, Teil/Part, compact codes, path length, collision protection).
- TypeScript typecheck passed.
- Full Vitest run passed (`279/279`).
- End-to-end self-check passed.
## 1.4.31 - 2026-03-01
Diese Version schliesst die komplette Bug-Audit-Runde (156 Punkte) ab und fokussiert auf Stabilitaet, Datenintegritaet, sauberes Abbruchverhalten und reproduzierbares Release-Verhalten.
### Audit-Abschluss
- Vollstaendige Abarbeitung der Audit-Liste `Bug-Audit-Komplett-156-Bugs.txt` ueber Main-Process, Renderer, Storage, Update, Integrity und Logger.
- Vereinheitlichte Fehlerbehandlung fuer Netzwerk-, Abort-, Retry- und Timeout-Pfade.
- Harte Regression-Absicherung ueber Typecheck, Unit-Tests und Release-Build.
### Download-Manager (Queue, Retry, Stop/Start, Post-Processing)
- Retry-Status ist jetzt item-gebunden statt call-lokal (kein Retry-Reset bei Requeue, keine Endlos-Retry-Schleifen mehr).
- Stop-zu-Start-Resume in derselben Session repariert (gestoppte Items werden wieder sauber gequeued).
- HTTP-416-Pfade gehaertet (Body-Konsum, korrektes Fehlerbild im letzten Attempt, Contribution-Reset bei Datei-Neustart).
- Target-Path-Reservierung gegen Race-Fenster verbessert (kein verfruehtes Release waehrend Retry-Delay).
- Scheduler-Verhalten bei Reconnect/Abort bereinigt, inklusive Status- und Speed-Resets in Abbruchpfaden.
- Post-Processing/Extraction-Abbruch und Paket-Lifecycle synchronisiert (inkl. Cleanup und Run-Finish-Konsistenz).
- `prepareForShutdown()` raeumt Persist- und State-Emitter-Timer jetzt vollstaendig auf.
- Read-only Queue-Checks entkoppelt von mutierenden Seiteneffekten.
### Extractor
- Cleanup-Modus `trash` ueberarbeitet (keine permanente Loeschung mehr im Trash-Pfad).
- Konfliktmodus-Weitergabe in ZIP- und External-Fallback-Pfaden konsistent gemacht.
- Fortschritts-Puls robust gegen callback-exceptions (kein unhandled crash durch `onProgress`).
- ZIP/Volume-Erkennung und Cleanup-Targets fuer Multi-Part-Archive erweitert.
- Schutz gegen gefaehrliche ZIP-Eintraege und Problemarchive weiter gehaertet.
### Debrid / RealDebrid
- Abort signals are consistently respected in filename resolution and provider fallback.
- Provider fallback stops immediately on abort instead of trying further providers.
- Rapidgator filename resolution hardened for content type, retry classes, and body handling.
- AllDebrid/BestDebrid URL validation improved (only valid HTTP(S) direct URLs).
- User-agent version drift eliminated (now centralized via `APP_VERSION`).
- RealDebrid retry backoff is abort-friendly (no unnecessary waiting after stop/abort).
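An abort-friendly backoff along those lines might look like this (a sketch; the function names and base/cap values are assumptions, not the shipped code):

```typescript
// Exponential backoff with a cap: 1s, 2s, 4s, ... up to 30s.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 30000): number {
  return Math.min(capMs, baseMs * 2 ** Math.max(0, attempt - 1));
}

// Sleep that rejects as soon as the signal fires, so stop/abort never
// has to wait out the full delay.
function abortableSleep(ms: number, signal?: AbortSignal): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    if (signal?.aborted) return reject(new Error("aborted"));
    const timer = setTimeout(() => { cleanup(); resolve(); }, ms);
    const onAbort = () => { cleanup(); reject(new Error("aborted")); };
    const cleanup = () => {
      clearTimeout(timer);
      signal?.removeEventListener("abort", onAbort);
    };
    signal?.addEventListener("abort", onAbort, { once: true });
  });
}
```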
### Storage / Session / Settings
- Temp file paths for session saves hardened against races and collisions.
- Session normalization and package-order deduplication stabilized.
- Settings normalization tightened (no uncontrolled property leaking).
- Import and update paths made robust against invalid input shapes.
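The collision-safe temp-path scheme for session saves can be sketched as follows (illustrative names; the actual code lives in the storage layer):

```typescript
import { randomUUID } from "node:crypto";
import { promises as fs } from "node:fs";
import path from "node:path";

// Build a temp path that cannot collide with a concurrent save:
// PID + UUID make the name unique per process and per call.
function tempPathFor(target: string): string {
  return path.join(
    path.dirname(target),
    `.${path.basename(target)}.${process.pid}.${randomUUID()}.tmp`
  );
}

// Write to the temp file first, then rename over the target; rename is
// atomic on the same filesystem, so readers never see a half-written file.
async function saveAtomically(target: string, data: string): Promise<void> {
  const tmp = tempPathFor(target);
  await fs.writeFile(tmp, data, "utf8");
  await fs.rename(tmp, target);
}
```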
### Main / App Controller / IPC
- IPC validation extended (payload types, string lengths, import size limits).
- Auto-resume start order corrected so the renderer reliably receives initial states.
- Window lifecycle handlers unified for newly created windows (including macOS activate-recreate).
- Clipboard normalization made unicode-safe (no surrogate split on truncation).
- Container path filter corrected so legitimate file names containing `..` are no longer wrongly rejected.
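Surrogate-safe truncation, as used for clipboard normalization, can be sketched as (hypothetical helper name):

```typescript
// Truncate to at most `max` UTF-16 code units without ever cutting
// between a high and a low surrogate, which would leave a lone
// (invalid) surrogate at the end of the string.
function truncateSafe(text: string, max: number): string {
  if (text.length <= max) return text;
  let end = max;
  const code = text.charCodeAt(end - 1);
  if (code >= 0xd800 && code <= 0xdbff) end -= 1; // cut lands on a high surrogate
  return text.slice(0, end);
}
```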
### Update System
- Filename hygiene for setup assets hardened (`basename` + sanitize against traversal/RCE paths).
- Target-path collisions eliminated (timestamp + PID + UUID).
- `spawn` error handling added (no unhandled EventEmitter crash on installer start).
- Download pipeline prepared for shutdown abort; active update downloads can be cancelled cleanly.
- Stream/timeout/retry handling consolidated for download and release fetch.
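The filename hygiene step (`basename` + sanitize) can be sketched as follows (a sketch; the character whitelist is an assumption):

```typescript
import path from "node:path";

// Strip any directory component (including Windows-style separators),
// then whitelist the remaining characters so a hostile asset name can
// neither traverse directories nor smuggle shell metacharacters.
function sanitizeAssetName(name: string): string {
  const base = path.basename(name.replace(/\\/g, "/"));
  return base.replace(/[^A-Za-z0-9._ -]/g, "_");
}
```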
### Integrity
- CRC32 computation optimized (lookup table + event-loop yield), far less UI/loop blocking on large files.
- Hash-manifest reads cached (reduced disk I/O during multi-file validation).
- Manifest key matching unified for relative paths and basenames.
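The lookup-table CRC32 with event-loop yields can be sketched as (standard CRC-32/IEEE; the chunk size and the yield mechanism are assumptions):

```typescript
// Precomputed lookup table: one-time cost, then one table hit per byte.
const CRC_TABLE = new Uint32Array(256);
for (let n = 0; n < 256; n++) {
  let c = n;
  for (let k = 0; k < 8; k++) c = c & 1 ? 0xedb88320 ^ (c >>> 1) : c >>> 1;
  CRC_TABLE[n] = c >>> 0;
}

// Incremental update: the pre/post xor cancel between calls, so
// crc32Update(crc32Update(0, a), b) equals crc32Update(0, a + b).
function crc32Update(crc: number, chunk: Uint8Array): number {
  let c = crc ^ 0xffffffff;
  for (let i = 0; i < chunk.length; i++) {
    c = CRC_TABLE[(c ^ chunk[i]) & 0xff] ^ (c >>> 8);
  }
  return (c ^ 0xffffffff) >>> 0;
}

// Process in chunks and yield between them so a multi-GB file does not
// block the event loop for seconds at a time.
async function crc32OfBuffer(buf: Uint8Array, chunkSize = 1 << 20): Promise<number> {
  let crc = 0;
  for (let off = 0; off < buf.length; off += chunkSize) {
    crc = crc32Update(crc, buf.subarray(off, off + chunkSize));
    await new Promise<void>((r) => setTimeout(r, 0)); // yield to the event loop
  }
  return crc;
}
```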
### Logger
- Rotation completed in both the async and the fallback path.
- Rotate checks are now tracked per file instead of being shared globally.
- Async flush made robust against log loss on write errors (pending lines are removed only after a successful write).
### Renderer (App.tsx)
- Theme toggle, optimistic sorting, and picker busy lifecycle hardened against race conditions.
- Mounted guards added for early unmount paths.
- Drag-and-drop holds the active tab reference robustly across async boundaries.
- Confirm dialog text renders line breaks correctly.
- PackageCard memo comparison extended (including file name) for correct re-renders.
- Human-size display hardened against negative/NaN inputs.
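Guarding a human-size formatter against bad inputs is as simple as (hypothetical helper, not the actual renderer code):

```typescript
function humanSize(bytes: number): string {
  // Guard against NaN, Infinity, and negative values from bad metadata.
  if (!Number.isFinite(bytes) || bytes < 0) return "0 B";
  const units = ["B", "KB", "MB", "GB", "TB"];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i++;
  }
  return `${i === 0 ? value : value.toFixed(1)} ${units[i]}`;
}
```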
### QA / Build / Release
- TypeScript typecheck passed.
- Full Vitest run passed (`262/262`).
- Windows release build succeeded (NSIS + portable).

23
CLAUDE.md Normal file
View File

@ -0,0 +1,23 @@
## Release + Update Source (Important)
- The primary platform is `https://git.24-music.de`
- Default repo: `Administrator/real-debrid-downloader`
- No longer release primarily via Codeberg/GitHub
## Releasing
1. Set the token:
- PowerShell: `$env:GITEA_TOKEN="<token>"`
2. Run the release:
- `npm run release:gitea -- <version> [notes]`
The script:
- bumps `package.json`
- builds the Windows artifacts
- pushes `main` + tag
- creates the release on `git.24-music.de`
- uploads the assets
## Auto-Update
- The updater currently uses `git.24-music.de` as the default source

View File

@ -1,6 +1,6 @@
# Multi Debrid Downloader
Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** with fast queue management, automatic extraction, and robust error handling.
Desktop downloader with fast queue management, automatic extraction, and robust error handling.
![Platform](https://img.shields.io/badge/platform-Windows%2010%2F11-0078D6)
![Electron](https://img.shields.io/badge/Electron-31.x-47848F)
@ -20,8 +20,10 @@ Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** w
- Package-based queue with file status, progress, ETA, speed, and retry counters.
- Start, pause, stop, and cancel for both single items and full packages.
- Multi-select via Ctrl+Click for batch operations on packages and items.
- Duplicate handling when adding links: keep, skip, or overwrite.
- Session recovery after restart, including optional auto-resume.
- Circuit breaker with escalating backoff cooldowns to handle provider outages gracefully.
### Debrid and link handling
@ -32,10 +34,30 @@ Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** w
### Extraction, cleanup, and quality
- JVM-based extraction backend using SevenZipJBinding + Zip4j (supports RAR, 7z, ZIP, and more).
- Automatic fallback to legacy UnRAR/7z CLI tools when JVM is unavailable.
- Auto-extract with separate target directory and conflict strategies.
- Hybrid extraction, optional removal of link artifacts and sample files.
- Hybrid extraction: simultaneous downloading and extracting with smart I/O priority throttling.
- Nested extraction: archives within archives are automatically extracted (one level deep).
- Pre-extraction disk space validation to prevent incomplete extracts.
- Right-click "Extract now" on any package with at least one completed item.
- Post-download integrity checks (`CRC32`, `MD5`, `SHA1`) with auto-retry on failures.
- Completed-item cleanup policy: `never`, `immediate`, `on_start`, `package_done`.
- Optional removal of link artifacts and sample files after extraction.
### Auto-rename
- Automatic renaming of extracted files based on series/episode patterns.
- Multi-episode token parsing for batch renames.
### UI and progress
- Visual progress bars with percentage overlay for packages and individual items.
- Real-time bandwidth chart showing current download speeds.
- Persistent download counters: all-time totals and per-session statistics.
- Download history for completed packages.
- Vertical sidebar with organized settings tabs.
- Hoster display showing both the original source and the debrid provider used.
### Convenience and automation
@ -43,17 +65,18 @@ Desktop downloader for **Real-Debrid, Mega-Debrid, BestDebrid, and AllDebrid** w
- Minimize-to-tray with tray menu controls.
- Speed limits globally or per download.
- Bandwidth schedules for time-based speed profiles.
- Built-in update checks via Codeberg Releases.
- Built-in auto-updater via `git.24-music.de` Releases.
- Long path support (>260 characters) on Windows.
## Installation
### Option A: prebuilt releases (recommended)
1. Download a release from the Codeberg Releases page.
1. Download a release from the `git.24-music.de` Releases page.
2. Run the installer or portable build.
3. Add your debrid tokens in Settings.
Releases: `https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases`
Releases: `https://git.24-music.de/Administrator/real-debrid-downloader/releases`
### Option B: build from source
@ -62,7 +85,8 @@ Requirements:
- Node.js `20+` (recommended `22+`)
- npm
- Windows `10/11` (for packaging and regular desktop use)
- Optional: 7-Zip/UnRAR for specific archive formats
- Java Runtime `8+` (for SevenZipJBinding sidecar backend)
- Optional fallback: 7-Zip/UnRAR if you force legacy extraction mode
```bash
npm install
@ -79,21 +103,34 @@ npm run dev
| `npm test` | Runs Vitest unit tests |
| `npm run self-check` | Runs integrated end-to-end self-checks |
| `npm run release:win` | Creates Windows installer and portable build |
| `npm run release:codeberg -- <version> [notes]` | One-command version bump + build + tag + Codeberg release upload |
| `npm run release:gitea -- <version> [notes]` | One-command version bump + build + tag + release upload to `git.24-music.de` |
| `npm run release:codeberg -- <version> [notes]` | Legacy path for old Codeberg workflow |
### One-command Codeberg release
### One-command git.24-music release
```bash
npm run release:codeberg -- 1.4.42 "- Maintenance update"
npm run release:gitea -- 1.6.31 "- Maintenance update"
```
This command will:
1. Bump `package.json` version.
2. Build setup/portable artifacts (`npm run release:win`).
3. Commit and push `main` to your Codeberg remote.
3. Commit and push `main` to your `git.24-music.de` remote.
4. Create and push tag `v<version>`.
5. Create/update the Codeberg release and upload required assets.
5. Create/update the Gitea release and upload required assets.
Required once before release:
```bash
git remote add gitea https://git.24-music.de/<user>/<repo>.git
```
PowerShell token setup:
```powershell
$env:GITEA_TOKEN="<your-token>"
```
## Typical workflow
@ -110,6 +147,7 @@ This command will:
- `src/renderer` - React UI
- `src/shared` - shared types and IPC contracts
- `tests` - unit tests and self-check tests
- `resources/extractor-jvm` - SevenZipJBinding + Zip4j sidecar JAR and native libraries
## Data and logs
@ -122,13 +160,43 @@ The app stores runtime files in Electron's `userData` directory, including:
## Troubleshooting
- Download does not start: verify token and selected provider in Settings.
- Extraction fails: check archive passwords and extraction tool availability.
- Extraction fails: check archive passwords and native extractor installation (7-Zip/WinRAR). Optional JVM extractor can be forced with `RD_EXTRACT_BACKEND=jvm`.
- Very slow downloads: check active speed limit and bandwidth schedules.
- Unexpected interruptions: enable reconnect and fallback providers.
- Stalled downloads: the app auto-detects stalls within 10 seconds and retries automatically.
## Changelog
Release history is available in `CHANGELOG.md` and on Codeberg Releases.
Release history is available on [git.24-music.de Releases](https://git.24-music.de/Administrator/real-debrid-downloader/releases).
### v1.6.61 (2026-03-05)
- Fixed leftover empty package folders in `Downloader Unfertig` after successful extraction.
- Resume marker files (`.rd_extract_progress*.json`) are now treated as ignorable for empty-folder cleanup.
- Deferred post-processing now clears resume markers before running empty-directory removal.
### v1.6.60 (2026-03-05)
- Added a package-scoped password cache for extraction: once the first archive in a package is solved, subsequent archives in the same package try that password first.
- Kept fallback behavior intact (`""` and other candidates are still tested), but moved empty-password probing behind the learned password to reduce per-archive delays.
- Added cache invalidation on real `wrong_password` failures so stale passwords are automatically discarded.
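A minimal sketch of that package-scoped cache (illustrative class and method names, not the actual extractor API):

```typescript
type ExtractResult = "ok" | "wrong_password" | "error";

class PackagePasswordCache {
  private learned = new Map<string, string>();

  // Candidate order for the next archive: the learned password first,
  // then the empty password, then the remaining configured candidates.
  candidatesFor(packageId: string, configured: string[]): string[] {
    const known = this.learned.get(packageId);
    const rest = ["", ...configured].filter((p) => p !== known);
    return known !== undefined ? [known, ...rest] : rest;
  }

  remember(packageId: string, password: string): void {
    this.learned.set(packageId, password);
  }

  // Invalidate on a real wrong_password failure so a stale entry is
  // discarded instead of being retried forever.
  reportFailure(packageId: string, password: string, result: ExtractResult): void {
    if (result === "wrong_password" && this.learned.get(packageId) === password) {
      this.learned.delete(packageId);
    }
  }
}
```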
### v1.6.59 (2026-03-05)
- Switched default extraction backend to native tools (`legacy`) for more stable archive-to-archive flow.
- Prioritized 7-Zip as primary native extractor, with WinRAR/UnRAR as fallback.
- JVM extractor remains available as opt-in via `RD_EXTRACT_BACKEND=jvm`.
### v1.6.58 (2026-03-05)
- Fixed extraction progress oscillation (`1% -> 100% -> 1%` loops) during password retries.
- Kept strict archive completion logic, but normalized in-progress archive percent to avoid false visual done states before real completion.
### v1.6.57 (2026-03-05)
- Fixed extraction flow so archives are marked done only on real completion, not on temporary `100%` progress spikes.
- Improved password handling: after the first successful archive, the discovered password is prioritized for subsequent archives.
- Fixed progress parsing for password retries (reset/restart handling), reducing visible and real gaps between archive extractions.
## License

View File

@ -1,75 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const credResult = spawnSync("git", ["credential", "fill"], {
input: "protocol=https\nhost=codeberg.org\n\n",
encoding: "utf8",
stdio: ["pipe", "pipe", "pipe"]
});
const creds = new Map();
for (const line of credResult.stdout.split(/\r?\n/)) {
if (line.includes("=")) {
const [k, v] = line.split("=", 2);
creds.set(k, v);
}
}
const auth = "Basic " + Buffer.from(creds.get("username") + ":" + creds.get("password")).toString("base64");
const owner = "Sucukdeluxe";
const repo = "real-debrid-downloader";
const tag = "v1.5.27";
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
async function main() {
await fetch(baseApi, {
method: "PATCH",
headers: { Authorization: auth, "Content-Type": "application/json" },
body: JSON.stringify({ has_releases: true })
});
const createRes = await fetch(`${baseApi}/releases`, {
method: "POST",
headers: { Authorization: auth, "Content-Type": "application/json", Accept: "application/json" },
body: JSON.stringify({
tag_name: tag,
target_commitish: "main",
name: tag,
body: "- Increase column spacing for Fortschritt/Größe/Geladen",
draft: false,
prerelease: false
})
});
const release = await createRes.json();
if (!createRes.ok) {
console.error("Create failed:", JSON.stringify(release));
process.exit(1);
}
console.log("Release created:", release.id);
const files = [
"Real-Debrid-Downloader Setup 1.5.27.exe",
"Real-Debrid-Downloader 1.5.27.exe",
"latest.yml",
"Real-Debrid-Downloader Setup 1.5.27.exe.blockmap"
];
for (const f of files) {
const filePath = path.join("release", f);
const data = fs.readFileSync(filePath);
const uploadUrl = `${baseApi}/releases/${release.id}/assets?name=${encodeURIComponent(f)}`;
const res = await fetch(uploadUrl, {
method: "POST",
headers: { Authorization: auth, "Content-Type": "application/octet-stream" },
body: data
});
if (res.ok) {
console.log("Uploaded:", f);
} else if (res.status === 409 || res.status === 422) {
console.log("Skipped existing:", f);
} else {
console.error("Upload failed for", f, ":", res.status);
}
}
console.log(`Done! https://codeberg.org/${owner}/${repo}/releases/tag/${tag}`);
}
main().catch(e => { console.error(e.message); process.exit(1); });

View File

@ -25,11 +25,11 @@ AppPublisher=Sucukdeluxe
DefaultDirName={autopf}\{#MyAppName}
DefaultGroupName={#MyAppName}
OutputDir={#MyOutputDir}
OutputBaseFilename=Real-Debrid-Downloader-Setup-{#MyAppVersion}
OutputBaseFilename=Real-Debrid-Downloader Setup {#MyAppVersion}
Compression=lzma
SolidCompression=yes
WizardStyle=modern
PrivilegesRequired=admin
PrivilegesRequired=lowest
ArchitecturesInstallIn64BitMode=x64compatible
UninstallDisplayIcon={app}\{#MyAppExeName}
SetupIconFile={#MyIconFile}
@ -39,8 +39,8 @@ Name: "german"; MessagesFile: "compiler:Languages\German.isl"
Name: "english"; MessagesFile: "compiler:Default.isl"
[Files]
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"; Flags: ignoreversion
Source: "{#MySourceDir}\\*"; DestDir: "{app}"; Flags: recursesubdirs createallsubdirs
Source: "{#MyIconFile}"; DestDir: "{app}"; DestName: "app_icon.ico"
[Icons]
Name: "{group}\{#MyAppName}"; Filename: "{app}\{#MyAppExeName}"; IconFilename: "{app}\app_icon.ico"

161
package-lock.json generated
View File

@ -1,12 +1,12 @@
{
"name": "real-debrid-downloader",
"version": "1.4.68",
"version": "1.5.66",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "real-debrid-downloader",
"version": "1.4.68",
"version": "1.5.66",
"license": "MIT",
"dependencies": {
"adm-zip": "^0.5.16",
@ -25,6 +25,7 @@
"cross-env": "^7.0.3",
"electron": "^31.7.7",
"electron-builder": "^25.1.8",
"rcedit": "^5.0.2",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
@ -64,7 +65,6 @@
"integrity": "sha512-CGOfOJqWjg2qW/Mb6zNsDm+u5vFQ8DxXfbM09z69p5Z6+mE1ikP2jUXw+j42Pf1XTYED2Rni5f95npYeuwMDQA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@babel/code-frame": "^7.29.0",
"@babel/generator": "^7.29.0",
@ -2043,7 +2043,6 @@
"integrity": "sha512-z9VXpC7MWrhfWipitjNdgCauoMLRdIILQsAEV+ZesIzBq/oUlxk0m3ApZuMFCXdnS4U7KrI+l3WRUEGQ8K1QKw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@types/prop-types": "*",
"csstype": "^3.2.2"
@ -2305,7 +2304,6 @@
"integrity": "sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"fast-deep-equal": "^3.1.1",
"fast-json-stable-stringify": "^2.0.0",
@ -2479,6 +2477,7 @@
"integrity": "sha512-+25nxyyznAXF7Nef3y0EbBeqmGZgeN/BxHX29Rs39djAfaFalmQ89SE6CWyDCHzGL0yt/ycBtNOmGTW0FyGWNw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"archiver-utils": "^2.1.0",
"async": "^3.2.4",
@ -2498,6 +2497,7 @@
"integrity": "sha512-bEL/yUb/fNNiNTuUz979Z0Yg5L+LzLxGJz8x79lYmR54fmTIb6ob/hNQgkQnIUDWIFjZVQwl9Xs356I6BAMHfw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"glob": "^7.1.4",
"graceful-fs": "^4.2.0",
@ -2520,6 +2520,7 @@
"integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"core-util-is": "~1.0.0",
"inherits": "~2.0.3",
@ -2535,7 +2536,8 @@
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/archiver-utils/node_modules/string_decoder": {
"version": "1.1.1",
@ -2543,6 +2545,7 @@
"integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "~5.1.0"
}
@ -2762,7 +2765,6 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"baseline-browser-mapping": "^2.9.0",
"caniuse-lite": "^1.0.30001759",
@ -3344,6 +3346,7 @@
"integrity": "sha512-D3uMHtGc/fcO1Gt1/L7i1e33VOvD4A9hfQLP+6ewd+BvG/gQ84Yh4oftEhAdjSMgBgwGL+jsppT7JYNpo6MHHg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"buffer-crc32": "^0.2.13",
"crc32-stream": "^4.0.2",
@ -3517,6 +3520,7 @@
"integrity": "sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==",
"dev": true,
"license": "Apache-2.0",
"peer": true,
"bin": {
"crc32": "bin/crc32.njs"
},
@ -3530,6 +3534,7 @@
"integrity": "sha512-NT7w2JVU7DFroFdYkeq8cywxrgjPHWkdX1wjpRQXPX5Asews3tA+Ght6lddQO5Mkumffp3X7GEqku3epj2toIw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"crc-32": "^1.2.0",
"readable-stream": "^3.4.0"
@ -3572,6 +3577,54 @@
"node": ">= 8"
}
},
"node_modules/cross-spawn-windows-exe": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/cross-spawn-windows-exe/-/cross-spawn-windows-exe-1.2.0.tgz",
"integrity": "sha512-mkLtJJcYbDCxEG7Js6eUnUNndWjyUZwJ3H7bErmmtOYU/Zb99DyUkpamuIZE0b3bhmJyZ7D90uS6f+CGxRRjOw==",
"dev": true,
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/malept"
},
{
"type": "tidelift",
"url": "https://tidelift.com/subscription/pkg/npm-cross-spawn-windows-exe?utm_medium=referral&utm_source=npm_fund"
}
],
"license": "Apache-2.0",
"dependencies": {
"@malept/cross-spawn-promise": "^1.1.0",
"is-wsl": "^2.2.0",
"which": "^2.0.2"
},
"engines": {
"node": ">= 10"
}
},
"node_modules/cross-spawn-windows-exe/node_modules/@malept/cross-spawn-promise": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@malept/cross-spawn-promise/-/cross-spawn-promise-1.1.1.tgz",
"integrity": "sha512-RTBGWL5FWQcg9orDOCcp4LvItNzUPcyEU9bwaeJX0rJ1IQxzucC48Y0/sQLp/g6t99IQgAlGIaesJS+gTn7tVQ==",
"dev": true,
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/malept"
},
{
"type": "tidelift",
"url": "https://tidelift.com/subscription/pkg/npm-.malept-cross-spawn-promise?utm_medium=referral&utm_source=npm_fund"
}
],
"license": "Apache-2.0",
"dependencies": {
"cross-spawn": "^7.0.1"
},
"engines": {
"node": ">= 10"
}
},
"node_modules/csstype": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz",
@ -3780,7 +3833,6 @@
"integrity": "sha512-NoXo6Liy2heSklTI5OIZbCgXC1RzrDQsZkeEwXhdOro3FT1VBOvbubvscdPnjVuQ4AMwwv61oaH96AbiYg9EnQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"app-builder-lib": "25.1.8",
"builder-util": "25.1.7",
@ -3976,6 +4028,7 @@
"integrity": "sha512-2ntkJ+9+0GFP6nAISiMabKt6eqBB0kX1QqHNWFWAXgi0VULKGisM46luRFpIBiU3u/TDmhZMM8tzvo2Abn3ayg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"app-builder-lib": "25.1.8",
"archiver": "^5.3.1",
@ -3989,6 +4042,7 @@
"integrity": "sha512-oRXApq54ETRj4eMiFzGnHWGy+zo5raudjuxN0b8H7s/RU2oW0Wvsx9O0ACRN/kRq9E8Vu/ReskGB5o3ji+FzHQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
@ -4004,6 +4058,7 @@
"integrity": "sha512-FGuPw30AdOIUTRMC2OMRtQV+jkVj2cfPqSeWXv1NEAJ1qZ5zb1X6z1mFhbfOB/iy3ssJCD+3KuZ8r8C3uVFlAg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"universalify": "^2.0.0"
},
@ -4017,6 +4072,7 @@
"integrity": "sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 10.0.0"
}
@ -4253,7 +4309,6 @@
"dev": true,
"hasInstallScript": true,
"license": "MIT",
"peer": true,
"bin": {
"esbuild": "bin/esbuild"
},
@ -4539,7 +4594,8 @@
"resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz",
"integrity": "sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/fs-extra": {
"version": "8.1.0",
@ -5146,6 +5202,22 @@
"is-ci": "bin.js"
}
},
"node_modules/is-docker": {
"version": "2.2.1",
"resolved": "https://registry.npmjs.org/is-docker/-/is-docker-2.2.1.tgz",
"integrity": "sha512-F+i2BKsFrH66iaUFc0woD8sLy8getkwTwtOBjvs56Cx4CgJDeKQeqfz8wAYiSb8JOprWhHH5p77PbmYCvvUuXQ==",
"dev": true,
"license": "MIT",
"bin": {
"is-docker": "cli.js"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-fullwidth-code-point": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
@ -5186,12 +5258,26 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-wsl": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/is-wsl/-/is-wsl-2.2.0.tgz",
"integrity": "sha512-fKzAra0rGJUUBwGBgNkHZuToZcn+TtXHpeCgmkMJMMYx1sQDYaCSyjJBSCa2nH1DGm7s3n1oBnohoVTBaN7Lww==",
"dev": true,
"license": "MIT",
"dependencies": {
"is-docker": "^2.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/isarray": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
"integrity": "sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/isbinaryfile": {
"version": "5.0.7",
@ -5376,6 +5462,7 @@
"integrity": "sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"readable-stream": "^2.0.5"
},
@ -5389,6 +5476,7 @@
"integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"core-util-is": "~1.0.0",
"inherits": "~2.0.3",
@ -5404,7 +5492,8 @@
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/lazystream/node_modules/string_decoder": {
"version": "1.1.1",
@ -5412,6 +5501,7 @@
"integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "~5.1.0"
}
@ -5458,35 +5548,40 @@
"resolved": "https://registry.npmjs.org/lodash.defaults/-/lodash.defaults-4.2.0.tgz",
"integrity": "sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/lodash.difference": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/lodash.difference/-/lodash.difference-4.5.0.tgz",
"integrity": "sha512-dS2j+W26TQ7taQBGN8Lbbq04ssV3emRw4NY58WErlTO29pIqS0HmoT5aJ9+TUQ1N3G+JOZSji4eugsWwGp9yPA==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/lodash.flatten": {
"version": "4.4.0",
"resolved": "https://registry.npmjs.org/lodash.flatten/-/lodash.flatten-4.4.0.tgz",
"integrity": "sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/lodash.isplainobject": {
"version": "4.0.6",
"resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/lodash.union": {
"version": "4.6.0",
"resolved": "https://registry.npmjs.org/lodash.union/-/lodash.union-4.6.0.tgz",
"integrity": "sha512-c4pB2CdGrGdjMKYLA+XiRDO7Y0PRQbm/Gzg8qMj+QH+pFVAoTp5sBpO0odL3FjoPCGjK96p6qsP+yQoiLoOBcw==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/log-symbols": {
"version": "4.1.0",
@ -6050,6 +6145,7 @@
"integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=0.10.0"
}
@ -6310,7 +6406,6 @@
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=12"
},
@ -6375,7 +6470,6 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"nanoid": "^3.3.11",
"picocolors": "^1.1.1",
@ -6433,7 +6527,8 @@
"resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz",
"integrity": "sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/progress": {
"version": "2.0.3",
@ -6507,12 +6602,24 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/rcedit": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/rcedit/-/rcedit-5.0.2.tgz",
"integrity": "sha512-dgysxaeXZ4snLpPjn8aVtHvZDCx+aRcvZbaWBgl1poU6OPustMvOkj9a9ZqASQ6i5Y5szJ13LSvglEOwrmgUxA==",
"dev": true,
"license": "MIT",
"dependencies": {
"cross-spawn-windows-exe": "^1.1.0"
},
"engines": {
"node": ">= 22.12.0"
}
},
"node_modules/react": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz",
"integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"loose-envify": "^1.1.0"
},
@ -6577,6 +6684,7 @@
"integrity": "sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA==",
"dev": true,
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"minimatch": "^5.1.0"
}
@ -6586,7 +6694,8 @@
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/readdir-glob/node_modules/brace-expansion": {
"version": "2.0.2",
@ -6594,6 +6703,7 @@
"integrity": "sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"balanced-match": "^1.0.0"
}
@ -6604,6 +6714,7 @@
"integrity": "sha512-7o1wEA2RyMP7Iu7GNba9vc0RWWGACJOCZBJX2GJWip0ikV+wcOsgVuY9uE8CPiyQhkGFSlhuSkZPavN7u1c2Fw==",
"dev": true,
"license": "ISC",
"peer": true,
"dependencies": {
"brace-expansion": "^2.0.1"
},
@ -7284,6 +7395,7 @@
"integrity": "sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"bl": "^4.0.3",
"end-of-stream": "^1.4.1",
@ -7568,7 +7680,6 @@
"integrity": "sha512-5C1sg4USs1lfG0GFb2RLXsdpXqBSEhAaA/0kPL01wxzpMqLILNxIxIOKiILz+cdg/pLnOUxFYOR5yhHU666wbw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "~0.27.0",
"get-tsconfig": "^4.7.5"
@ -7751,7 +7862,6 @@
"integrity": "sha512-+Oxm7q9hDoLMyJOYfUYBuHQo+dkAloi33apOPP56pzj+vsdJDzr+j1NISE5pyaAuKL4A3UD34qd0lx5+kfKp2g==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "^0.25.0",
"fdir": "^6.4.4",
@ -9361,7 +9471,6 @@
"integrity": "sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "^0.21.3",
"postcss": "^8.4.43",
@ -9619,6 +9728,7 @@
"integrity": "sha512-9qv4rlDiopXg4E69k+vMHjNN63YFMe9sZMrdlvKnCjlCRWeCBswPPMPUfx+ipsAWq1LXHe70RcbaHdJJpS6hyQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"archiver-utils": "^3.0.4",
"compress-commons": "^4.1.2",
@ -9634,6 +9744,7 @@
"integrity": "sha512-KVgf4XQVrTjhyWmx6cte4RxonPLR9onExufI1jhvw/MQ4BB6IsZD5gT8Lq+u/+pRkWna/6JoHpiQioaqFP5Rzw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"glob": "^7.2.3",
"graceful-fs": "^4.2.0",

View File

@ -1,7 +1,7 @@
{
"name": "real-debrid-downloader",
"version": "1.5.34",
"description": "Real-Debrid Downloader Desktop (Electron + React + TypeScript)",
"version": "1.6.66",
"description": "Desktop downloader",
"main": "build/main/main/main.js",
"author": "Sucukdeluxe",
"license": "MIT",
@ -17,7 +17,8 @@
"test": "vitest run",
"self-check": "tsx tests/self-check.ts",
"release:win": "npm run build && electron-builder --publish never --win nsis portable",
"release:codeberg": "node scripts/release_codeberg.mjs"
"release:gitea": "node scripts/release_gitea.mjs",
"release:forgejo": "node scripts/release_gitea.mjs"
},
"dependencies": {
"adm-zip": "^0.5.16",
@ -36,6 +37,7 @@
"cross-env": "^7.0.3",
"electron": "^31.7.7",
"electron-builder": "^25.1.8",
"rcedit": "^5.0.2",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
@ -53,8 +55,12 @@
"files": [
"build/main/**/*",
"build/renderer/**/*",
"resources/extractor-jvm/**/*",
"package.json"
],
"asarUnpack": [
"resources/extractor-jvm/**/*"
],
"win": {
"target": [
"nsis",
@ -68,6 +74,7 @@
"perMachine": false,
"allowToChangeInstallationDirectory": true,
"createDesktopShortcut": true
}
},
"afterPack": "scripts/afterPack.cjs"
}
}

View File

@ -0,0 +1,22 @@
# JVM extractor runtime
This directory contains the Java sidecar runtime used by `src/main/extractor.ts`.
## Included backends
- `sevenzipjbinding` for the primary extraction path (RAR/7z/ZIP and others)
- `zip4j` for ZIP multipart handling (JD-style split ZIP behavior)
## Layout
- `classes/` compiled `JBindExtractorMain` classes
- `lib/` runtime jars required by the sidecar
- `src/` Java source for the sidecar
## Rebuild notes
The checked-in classes are Java 8 compatible and built from:
`resources/extractor-jvm/src/com/sucukdeluxe/extractor/JBindExtractorMain.java`
If you need to rebuild, compile against the jars in `lib/` with a Java 8-compatible compiler.

View File

@ -0,0 +1,12 @@
Bundled JVM extractor dependencies:
1) sevenzipjbinding (16.02-2.01)
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding
- Maven artifact: net.sf.sevenzipjbinding:sevenzipjbinding-all-platforms
- Upstream: https://sevenzipjbind.sourceforge.net/
2) zip4j (2.11.5)
- Maven artifact: net.lingala.zip4j:zip4j
- Upstream: https://github.com/srikanth-lingala/zip4j
Please review upstream licenses and notices before redistribution.

Binary file not shown.

Binary file not shown.

File diff suppressed because it is too large Load Diff

18
scripts/afterPack.cjs Normal file
View File

@ -0,0 +1,18 @@
const path = require("path");
const { rcedit } = require("rcedit");
module.exports = async function afterPack(context) {
const productFilename = context.packager?.appInfo?.productFilename;
if (!productFilename) {
console.warn(" • rcedit: skipped — productFilename not available");
return;
}
const exePath = path.join(context.appOutDir, `${productFilename}.exe`);
const iconPath = path.resolve(__dirname, "..", "assets", "app_icon.ico");
console.log(` • rcedit: patching icon → ${exePath}`);
try {
await rcedit(exePath, { icon: iconPath });
} catch (error) {
console.warn(` • rcedit: failed — ${String(error)}`);
}
};

View File

@ -31,6 +31,7 @@ async function main(): Promise<void> {
login: settings.megaLogin,
password: settings.megaPassword
}));
try {
const service = new DebridService(settings, {
megaWebUnrestrict: (link) => megaWeb.unrestrict(link)
});
@ -42,7 +43,9 @@ async function main(): Promise<void> {
console.log(`[FAIL] ${String(error)}`);
}
}
} finally {
megaWeb.dispose();
}
}
void main();
main().catch(e => { console.error(e); process.exit(1); });

View File

@ -16,8 +16,8 @@ function sleep(ms) {
}
function cookieFrom(headers) {
const raw = headers.get("set-cookie") || "";
return raw.split(",").map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
const cookies = headers.getSetCookie();
return cookies.map((x) => x.split(";")[0].trim()).filter(Boolean).join("; ");
}
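The switch from comma-splitting the folded `set-cookie` header to `Headers.getSetCookie()` (available since Node 18.17) matters because cookie attributes can themselves contain commas. A standalone sketch with invented cookie names and values:

```javascript
// Why getSetCookie() is safer: an Expires attribute contains a comma,
// so splitting the folded "set-cookie" header on "," corrupts cookies.
const headers = new Headers();
headers.append("set-cookie", "sid=abc123; Path=/; Expires=Wed, 21 Oct 2026 07:28:00 GMT");
headers.append("set-cookie", "csrf=xyz; Path=/");

// Old approach: comma-split of the folded header value.
const naive = (headers.get("set-cookie") || "")
  .split(",")
  .map((x) => x.split(";")[0].trim())
  .filter(Boolean)
  .join("; ");
// naive becomes "sid=abc123; 21 Oct 2026 07:28:00 GMT; csrf=xyz",
// with a date fragment masquerading as a cookie.

// New approach: one array entry per Set-Cookie header, commas preserved.
const safe = headers.getSetCookie()
  .map((x) => x.split(";")[0].trim())
  .filter(Boolean)
  .join("; ");
console.log(safe); // → "sid=abc123; csrf=xyz"
```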
function parseDebridCodes(html) {
@ -47,6 +47,9 @@ async function resolveCode(cookie, code) {
});
const text = (await res.text()).trim();
if (text === "reload") {
if (attempt % 5 === 0) {
console.log(` [retry] code=${code} attempt=${attempt}/50 (waiting for server)`);
}
await sleep(800);
continue;
}
@ -98,7 +101,13 @@ async function main() {
redirect: "manual"
});
if (loginRes.status >= 400) {
throw new Error(`Login failed with HTTP ${loginRes.status}`);
}
const cookie = cookieFrom(loginRes.headers);
if (!cookie) {
throw new Error("Login returned no session cookie");
}
console.log("login", loginRes.status, loginRes.headers.get("location") || "");
const debridRes = await fetch("https://www.mega-debrid.eu/index.php?form=debrid", {
@ -136,4 +145,4 @@ async function main() {
}
}
await main();
await main().catch((e) => { console.error(e); process.exit(1); });

View File

@ -66,6 +66,8 @@ async function callRealDebrid(link) {
};
}
// megaCookie is intentionally cached at module scope so that multiple
// callMegaDebrid() invocations reuse the same session cookie.
async function callMegaDebrid(link) {
if (!megaCookie) {
const loginRes = await fetch("https://www.mega-debrid.eu/index.php?form=login", {
@ -77,13 +79,15 @@ async function callMegaDebrid(link) {
body: new URLSearchParams({ login: megaLogin, password: megaPassword, remember: "on" }),
redirect: "manual"
});
megaCookie = (loginRes.headers.get("set-cookie") || "")
.split(",")
if (loginRes.status >= 400) {
return { ok: false, error: `Mega-Web login failed with HTTP ${loginRes.status}` };
}
megaCookie = loginRes.headers.getSetCookie()
.map((chunk) => chunk.split(";")[0].trim())
.filter(Boolean)
.join("; ");
if (!megaCookie) {
return { ok: false, error: "Mega-Web login failed" };
return { ok: false, error: "Mega-Web login returned no session cookie" };
}
}
@ -290,4 +294,4 @@ async function main() {
}
}
await main();
await main().catch((e) => { console.error(e); process.exit(1); });

View File

@ -2,7 +2,15 @@ import fs from "node:fs";
import path from "node:path";
import { spawnSync } from "node:child_process";
const NPM_EXECUTABLE = process.platform === "win32" ? "npm.cmd" : "npm";
const NPM_RELEASE_WIN = process.platform === "win32"
? {
command: process.env.ComSpec || "cmd.exe",
args: ["/d", "/s", "/c", "npm run release:win"]
}
: {
command: "npm",
args: ["run", "release:win"]
};
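For context on the wrapper above: since Node's fix for CVE-2024-27980, spawning `npm.cmd` directly without `shell: true` throws `EINVAL` on Windows, so the script routes npm through `cmd.exe` (`/d` skips AutoRun scripts, `/s` preserves quoting, `/c` runs the command and exits). A minimal sketch of the invocation builder; the helper name is illustrative, not from the repo:

```javascript
// Builds the platform-specific npm invocation. On Windows, npm is a
// .cmd batch file, which recent Node refuses to spawn without a shell,
// hence the explicit cmd.exe wrapper.
function npmInvocation(npmArgs) {
  return process.platform === "win32"
    ? {
        command: process.env.ComSpec || "cmd.exe",
        args: ["/d", "/s", "/c", `npm ${npmArgs.join(" ")}`]
      }
    : { command: "npm", args: npmArgs };
}

const { command, args } = npmInvocation(["run", "release:win"]);
console.log(command, args.join(" "));
```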
function run(command, args, options = {}) {
const result = spawnSync(command, args, {
@ -37,7 +45,8 @@ function runWithInput(command, args, input) {
cwd: process.cwd(),
encoding: "utf8",
input,
stdio: ["pipe", "pipe", "pipe"]
stdio: ["pipe", "pipe", "pipe"],
timeout: 10000
});
if (result.status !== 0) {
const stderr = String(result.stderr || "").trim();
@ -59,37 +68,74 @@ function parseArgs(argv) {
return { help: false, dryRun, version, notes };
}
function parseCodebergRemote(url) {
function parseRemoteUrl(url) {
const raw = String(url || "").trim();
const httpsMatch = raw.match(/^https?:\/\/(?:www\.)?codeberg\.org\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
const httpsMatch = raw.match(/^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (httpsMatch) {
return { owner: httpsMatch[1], repo: httpsMatch[2] };
return { host: httpsMatch[1], owner: httpsMatch[2], repo: httpsMatch[3] };
}
const sshMatch = raw.match(/^git@codeberg\.org:([^/]+)\/([^/]+?)(?:\.git)?$/i);
const sshMatch = raw.match(/^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshMatch) {
return { owner: sshMatch[1], repo: sshMatch[2] };
return { host: sshMatch[1], owner: sshMatch[2], repo: sshMatch[3] };
}
throw new Error(`Cannot parse Codeberg remote URL: ${raw}`);
const sshAltMatch = raw.match(/^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i);
if (sshAltMatch) {
return { host: sshAltMatch[1], owner: sshAltMatch[2], repo: sshAltMatch[3] };
}
throw new Error(`Cannot parse remote URL: ${raw}`);
}
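The generalized `parseRemoteUrl()` now accepts three remote shapes: https, scp-style ssh, and `ssh://` with an optional port. A self-contained check using the same regexes (hosts and names below are made up):

```javascript
// The three regexes mirror the ones in release_gitea.mjs; each yields
// { host, owner, repo } with an optional trailing ".git" stripped.
const HTTPS_RE = /^https?:\/\/([^/]+)\/([^/]+)\/([^/]+?)(?:\.git)?$/i;
const SSH_RE = /^git@([^:]+):([^/]+)\/([^/]+?)(?:\.git)?$/i;
const SSH_ALT_RE = /^ssh:\/\/git@([^/:]+)(?::\d+)?\/([^/]+)\/([^/]+?)(?:\.git)?$/i;

function parseRemoteUrl(url) {
  for (const re of [HTTPS_RE, SSH_RE, SSH_ALT_RE]) {
    const m = String(url || "").trim().match(re);
    if (m) return { host: m[1], owner: m[2], repo: m[3] };
  }
  throw new Error(`Cannot parse remote URL: ${url}`);
}

console.log(parseRemoteUrl("https://git.example.org/alice/app.git"));
console.log(parseRemoteUrl("git@git.example.org:alice/app.git"));
console.log(parseRemoteUrl("ssh://git@git.example.org:2222/alice/app"));
// each prints { host: "git.example.org", owner: "alice", repo: "app" }
```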
function getCodebergRepo() {
const remotes = ["codeberg", "origin"];
function normalizeBaseUrl(url) {
const raw = String(url || "").trim().replace(/\/+$/, "");
if (!raw) {
return "";
}
if (!/^https?:\/\//i.test(raw)) {
throw new Error("GITEA_BASE_URL must start with http:// or https://");
}
return raw;
}
function getGiteaRepo() {
const forcedRemote = String(process.env.GITEA_REMOTE || process.env.FORGEJO_REMOTE || "").trim();
const remotes = forcedRemote
? [forcedRemote]
: ["gitea", "forgejo", "origin", "github-new", "codeberg"];
const preferredBase = normalizeBaseUrl(process.env.GITEA_BASE_URL || process.env.FORGEJO_BASE_URL || "https://git.24-music.de");
const preferredProtocol = preferredBase ? new URL(preferredBase).protocol : "https:";
for (const remote of remotes) {
try {
const remoteUrl = runCapture("git", ["remote", "get-url", remote]);
if (/codeberg\.org/i.test(remoteUrl)) {
const parsed = parseCodebergRemote(remoteUrl);
return { remote, ...parsed };
const parsed = parseRemoteUrl(remoteUrl);
const remoteBase = `https://${parsed.host}`.toLowerCase();
if (preferredBase && remoteBase !== preferredBase.toLowerCase().replace(/^http:/, "https:")) {
continue;
}
return { remote, ...parsed, baseUrl: `${preferredProtocol}//${parsed.host}` };
} catch {
// try next remote
}
}
throw new Error("No Codeberg remote found. Add one with: git remote add codeberg https://codeberg.org/<owner>/<repo>.git");
if (preferredBase) {
throw new Error(
`No remote found for ${preferredBase}. Add one with: git remote add gitea ${preferredBase}/<owner>/<repo>.git`
);
}
throw new Error("No suitable remote found. Set GITEA_REMOTE or GITEA_BASE_URL.");
}
function getCodebergAuthHeader() {
const credentialText = runWithInput("git", ["credential", "fill"], "protocol=https\nhost=codeberg.org\n\n");
function getAuthHeader(host) {
const explicitToken = String(process.env.GITEA_TOKEN || process.env.FORGEJO_TOKEN || "").trim();
if (explicitToken) {
return `token ${explicitToken}`;
}
const credentialText = runWithInput("git", ["credential", "fill"], `protocol=https\nhost=${host}\n\n`);
const map = new Map();
for (const line of credentialText.split(/\r?\n/)) {
if (!line.includes("=")) {
@ -101,7 +147,9 @@ function getCodebergAuthHeader() {
const username = map.get("username") || "";
const password = map.get("password") || "";
if (!username || !password) {
throw new Error("Missing Codeberg credentials in git credential helper");
throw new Error(
`Missing credentials for ${host}. Set GITEA_TOKEN or store credentials for this host in git credential helper.`
);
}
const token = Buffer.from(`${username}:${password}`, "utf8").toString("base64");
return `Basic ${token}`;
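The two header shapes `getAuthHeader()` can produce, side by side (token and credentials below are placeholders):

```javascript
// Gitea/Forgejo accept either a personal access token header or plain
// HTTP Basic built from the git credential helper's username/password.
function authHeaderFromToken(token) {
  return `token ${token}`;
}
function authHeaderFromCredentials(username, password) {
  return `Basic ${Buffer.from(`${username}:${password}`, "utf8").toString("base64")}`;
}

console.log(authHeaderFromToken("abc123")); // → "token abc123"
console.log(authHeaderFromCredentials("alice", "s3cret")); // → "Basic YWxpY2U6czNjcmV0"
```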
@ -142,7 +190,8 @@ function updatePackageVersion(rootDir, version) {
const packagePath = path.join(rootDir, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
if (String(packageJson.version || "") === version) {
throw new Error(`package.json is already at version ${version}`);
process.stdout.write(`package.json is already at version ${version}, skipping update.\n`);
return;
}
packageJson.version = version;
fs.writeFileSync(packagePath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
@ -197,8 +246,7 @@ function ensureTagMissing(tag) {
}
}
async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
async function createOrGetRelease(baseApi, tag, authHeader, notes) {
const byTag = await apiRequest("GET", `${baseApi}/releases/tags/${encodeURIComponent(tag)}`, authHeader);
if (byTag.ok) {
return byTag.body;
@ -218,13 +266,34 @@ async function createOrGetRelease(owner, repo, tag, authHeader, notes) {
return created.body;
}
async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDir, files) {
const baseApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
async function uploadReleaseAssets(baseApi, releaseId, authHeader, releaseDir, files) {
for (const fileName of files) {
const filePath = path.join(releaseDir, fileName);
const fileData = fs.readFileSync(filePath);
const fileSize = fs.statSync(filePath).size;
const uploadUrl = `${baseApi}/releases/${releaseId}/assets?name=${encodeURIComponent(fileName)}`;
const response = await apiRequest("POST", uploadUrl, authHeader, fileData, "application/octet-stream");
// Stream large files instead of loading them entirely into memory
const fileStream = fs.createReadStream(filePath);
const response = await fetch(uploadUrl, {
method: "POST",
headers: {
Accept: "application/json",
Authorization: authHeader,
"Content-Type": "application/octet-stream",
"Content-Length": String(fileSize)
},
body: fileStream,
duplex: "half"
});
const text = await response.text();
let parsed;
try {
parsed = text ? JSON.parse(text) : null;
} catch {
parsed = text;
}
if (response.ok) {
process.stdout.write(`Uploaded: ${fileName}\n`);
continue;
@ -233,7 +302,7 @@ async function uploadReleaseAssets(owner, repo, releaseId, authHeader, releaseDi
process.stdout.write(`Skipped existing asset: ${fileName}\n`);
continue;
}
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(response.body)}`);
throw new Error(`Asset upload failed for ${fileName} (${response.status}): ${JSON.stringify(parsed)}`);
}
}
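The asset upload above replaces `fs.readFileSync` with a request body backed by `fs.createReadStream` (Node's fetch requires `duplex: "half"` for stream bodies). The memory effect can be sketched offline with a throwaway temp file, no HTTP involved:

```javascript
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

// 1 MiB dummy asset (placeholder file, not a real release artifact).
const filePath = path.join(os.tmpdir(), "asset-demo.bin");
fs.writeFileSync(filePath, Buffer.alloc(1024 * 1024, 7));

// Content-Length comes from statSync, as in the release script...
const fileSize = fs.statSync(filePath).size;

// ...while the body is consumed chunk by chunk: at most 64 KiB of file
// data is resident at a time instead of the whole asset.
let streamed = 0;
for await (const chunk of fs.createReadStream(filePath, { highWaterMark: 64 * 1024 })) {
  streamed += chunk.length;
}
console.log(streamed === fileSize); // true
fs.rmSync(filePath);
```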
@ -241,46 +310,44 @@ async function main() {
const rootDir = process.cwd();
const args = parseArgs(process.argv);
if (args.help) {
process.stdout.write("Usage: npm run release:codeberg -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Example: npm run release:codeberg -- 1.4.42 \"- Small fixes\"\n");
process.stdout.write("Usage: npm run release:gitea -- <version> [release notes] [--dry-run]\n");
process.stdout.write("Env: GITEA_BASE_URL, GITEA_REMOTE, GITEA_TOKEN\n");
process.stdout.write("Compatibility envs still supported: FORGEJO_BASE_URL, FORGEJO_REMOTE, FORGEJO_TOKEN\n");
process.stdout.write("Example: npm run release:gitea -- 1.6.31 \"- Bugfixes\"\n");
return;
}
const version = ensureVersionString(args.version);
const tag = `v${version}`;
const releaseNotes = args.notes || `- Release ${tag}`;
const { remote, owner, repo } = getCodebergRepo();
const repo = getGiteaRepo();
ensureNoTrackedChanges();
ensureTagMissing(tag);
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_EXECUTABLE, ["run", "release:win"]);
const assets = ensureAssetsExist(rootDir, version);
if (args.dryRun) {
process.stdout.write(`Dry run complete. Assets exist for ${tag}.\n`);
process.stdout.write(`Dry run: would release ${tag}. No changes made.\n`);
return;
}
updatePackageVersion(rootDir, version);
process.stdout.write(`Building release artifacts for ${tag}...\n`);
run(NPM_RELEASE_WIN.command, NPM_RELEASE_WIN.args);
const assets = ensureAssetsExist(rootDir, version);
run("git", ["add", "package.json"]);
run("git", ["commit", "-m", `Release ${tag}`]);
run("git", ["push", remote, "main"]);
run("git", ["push", repo.remote, "main"]);
run("git", ["tag", tag]);
run("git", ["push", remote, tag]);
run("git", ["push", repo.remote, tag]);
const authHeader = getCodebergAuthHeader();
const baseRepoApi = `https://codeberg.org/api/v1/repos/${owner}/${repo}`;
const patchReleaseEnabled = await apiRequest("PATCH", baseRepoApi, authHeader, JSON.stringify({ has_releases: true }));
if (!patchReleaseEnabled.ok) {
throw new Error(`Failed to enable releases (${patchReleaseEnabled.status}): ${JSON.stringify(patchReleaseEnabled.body)}`);
}
const authHeader = getAuthHeader(repo.host);
const baseApi = `${repo.baseUrl}/api/v1/repos/${repo.owner}/${repo.repo}`;
const release = await createOrGetRelease(baseApi, tag, authHeader, releaseNotes);
await uploadReleaseAssets(baseApi, release.id, authHeader, assets.releaseDir, assets.files);
const release = await createOrGetRelease(owner, repo, tag, authHeader, releaseNotes);
await uploadReleaseAssets(owner, repo, release.id, authHeader, assets.releaseDir, assets.files);
process.stdout.write(`Release published: ${release.html_url}\n`);
process.stdout.write(`Release published: ${release.html_url || `${repo.baseUrl}/${repo.owner}/${repo.repo}/releases/tag/${tag}`}\n`);
}
main().catch((error) => {

View File

@ -1,24 +0,0 @@
import fs from "node:fs";
import path from "node:path";
const version = process.argv[2];
if (!version) {
console.error("Usage: node scripts/set_version_node.mjs <version>");
process.exit(1);
}
const root = process.cwd();
const packageJsonPath = path.join(root, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, "utf8"));
packageJson.version = version;
fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, "utf8");
const constantsPath = path.join(root, "src", "main", "constants.ts");
const constants = fs.readFileSync(constantsPath, "utf8").replace(
/APP_VERSION = "[^"]+"/,
`APP_VERSION = "${version}"`
);
fs.writeFileSync(constantsPath, constants, "utf8");
console.log(`Set version to ${version}`);

View File

@ -4,6 +4,8 @@ import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
ParsedPackageInput,
SessionStats,
StartConflictEntry,
@ -18,8 +20,9 @@ import { APP_VERSION } from "./constants";
import { DownloadManager } from "./download-manager";
import { parseCollectorInput } from "./link-parser";
import { configureLogger, getLogFilePath, logger } from "./logger";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "./session-log";
import { MegaWebFallback } from "./mega-web-fallback";
import { createStoragePaths, loadSession, loadSettings, normalizeSettings, saveSession, saveSettings } from "./storage";
import { addHistoryEntry, cancelPendingAsyncSaves, clearHistory, createStoragePaths, loadHistory, loadSession, loadSettings, normalizeLoadedSession, normalizeLoadedSessionTransientFields, normalizeSettings, removeHistoryEntry, saveSession, saveSettings } from "./storage";
import { abortActiveUpdateDownload, checkGitHubUpdate, installLatestUpdate } from "./update";
import { startDebugServer, stopDebugServer } from "./debug-server";
@ -51,6 +54,7 @@ export class AppController {
public constructor() {
configureLogger(this.storagePaths.baseDir);
initSessionLog(this.storagePaths.baseDir);
this.settings = loadSettings(this.storagePaths);
const session = loadSession(this.storagePaths);
this.megaWebFallback = new MegaWebFallback(() => ({
@ -59,7 +63,10 @@ export class AppController {
}));
this.manager = new DownloadManager(this.settings, session, this.storagePaths, {
megaWebUnrestrict: (link: string, signal?: AbortSignal) => this.megaWebFallback.unrestrict(link, signal),
invalidateMegaSession: () => this.megaWebFallback.invalidateSession()
invalidateMegaSession: () => this.megaWebFallback.invalidateSession(),
onHistoryEntry: (entry: HistoryEntry) => {
addHistoryEntry(this.storagePaths, entry);
}
});
this.manager.on("state", (snapshot: UiSnapshot) => {
this.onStateHandler?.(snapshot);
@ -75,8 +82,15 @@ export class AppController {
void this.manager.getStartConflicts().then((conflicts) => {
const hasConflicts = conflicts.length > 0;
if (this.hasAnyProviderToken(this.settings) && !hasConflicts) {
// If the onState handler is already set (renderer connected), start immediately.
// Otherwise mark as pending so the onState setter triggers the start.
if (this.onStateHandler) {
logger.info("Auto-Resume beim Start aktiviert (nach Konflikt-Check)");
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
} else {
this.autoResumePending = true;
logger.info("Auto-Resume beim Start vorgemerkt");
}
} else if (hasConflicts) {
logger.info("Auto-Resume übersprungen: Start-Konflikte erkannt");
}
@ -91,6 +105,8 @@ export class AppController {
|| (settings.megaLogin.trim() && settings.megaPassword.trim())
|| settings.bestToken.trim()
|| settings.allDebridToken.trim()
|| (settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim())
|| settings.oneFichierApiKey.trim()
);
}
@ -106,6 +122,9 @@ export class AppController {
this.autoResumePending = false;
void this.manager.start().catch((err) => logger.warn(`Auto-Resume Start Fehler: ${String(err)}`));
logger.info("Auto-Resume beim Start aktiviert");
} else {
// Trigger pending extractions without starting the session
this.manager.triggerIdleExtractions();
}
}
}
@ -152,6 +171,12 @@ export class AppController {
}
public async installUpdate(onProgress?: (progress: UpdateInstallProgress) => void): Promise<UpdateInstallResult> {
// Stop active downloads before installing. Extractions may continue briefly
// until prepareForShutdown() is called during app quit.
if (this.manager.isSessionRunning()) {
this.manager.stop();
}
const cacheAgeMs = Date.now() - this.lastUpdateCheckAt;
const cached = this.lastUpdateCheck && !this.lastUpdateCheck.error && cacheAgeMs <= 10 * 60 * 1000
? this.lastUpdateCheck
@ -200,6 +225,14 @@ export class AppController {
await this.manager.start();
}
public async startPackages(packageIds: string[]): Promise<void> {
await this.manager.startPackages(packageIds);
}
public async startItems(itemIds: string[]): Promise<void> {
await this.manager.startItems(itemIds);
}
public stop(): void {
this.manager.stop();
}
@ -216,6 +249,10 @@ export class AppController {
this.manager.extractNow(packageId);
}
public resetPackage(packageId: string): void {
this.manager.resetPackage(packageId);
}
public cancelPackage(packageId: string): void {
this.manager.cancelPackage(packageId);
}
@ -249,7 +286,14 @@ export class AppController {
}
public exportBackup(): string {
const settings = this.settings;
const settings = { ...this.settings };
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = settings[key];
if (typeof val === "string" && val.length > 0) {
(settings as Record<string, unknown>)[key] = `***${val.slice(-4)}`;
}
}
const session = this.manager.getSession();
return JSON.stringify({ version: 1, settings, session }, null, 2);
}
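The masking convention used here pairs with the import path: export replaces each secret with `***` plus its last four characters, and import treats any `***`-prefixed value as a placeholder to swap back for the locally stored secret. In isolation (token values are invented):

```javascript
// Mask for export; keep the tail so the user can still recognize
// which account a backup belongs to.
function maskSecret(value) {
  return value.length > 0 ? `***${value.slice(-4)}` : value;
}

// On import, a still-masked value means "keep what is stored locally".
function restoreSecret(imported, current) {
  return typeof imported === "string" && imported.startsWith("***") ? current : imported;
}

console.log(maskSecret("rd_1234567890abcd")); // → "***abcd"
console.log(restoreSecret("***abcd", "rd_1234567890abcd")); // local secret kept
console.log(restoreSecret("fresh-token", "old-token")); // → "fresh-token"
```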
@ -264,20 +308,77 @@ export class AppController {
if (!parsed || typeof parsed !== "object" || !parsed.settings || !parsed.session) {
return { restored: false, message: "Kein gültiges Backup (settings/session fehlen)" };
}
const restoredSettings = normalizeSettings(parsed.settings as AppSettings);
const importedSettings = parsed.settings as AppSettings;
const SENSITIVE_KEYS: (keyof AppSettings)[] = ["token", "megaLogin", "megaPassword", "bestToken", "allDebridToken", "ddownloadLogin", "ddownloadPassword", "oneFichierApiKey"];
for (const key of SENSITIVE_KEYS) {
const val = (importedSettings as Record<string, unknown>)[key];
if (typeof val === "string" && val.startsWith("***")) {
(importedSettings as Record<string, unknown>)[key] = (this.settings as Record<string, unknown>)[key];
}
}
const restoredSettings = normalizeSettings(importedSettings);
this.settings = restoredSettings;
saveSettings(this.storagePaths, this.settings);
this.manager.setSettings(this.settings);
const restoredSession = parsed.session as ReturnType<typeof loadSession>;
// Full stop including extraction abort — the old session is being replaced,
// so no extraction tasks from it should keep running.
this.manager.stop();
this.manager.abortAllPostProcessing();
// Cancel any deferred persist timer and queued async writes so the old
// in-memory session does not overwrite the restored session file on disk.
this.manager.clearPersistTimer();
cancelPendingAsyncSaves();
const restoredSession = normalizeLoadedSessionTransientFields(
normalizeLoadedSession(parsed.session)
);
saveSession(this.storagePaths, restoredSession);
// Prevent prepareForShutdown from overwriting the restored session file
// with the old in-memory session when the app quits after backup restore.
this.manager.skipShutdownPersist = true;
// Block all persistence (including persistSoon from any IPC operations
// the user might trigger before restarting) to protect the restored backup.
this.manager.blockAllPersistence = true;
return { restored: true, message: "Backup wiederhergestellt. Bitte App neustarten." };
}
public getSessionLogPath(): string | null {
return getSessionLogPath();
}
public shutdown(): void {
stopDebugServer();
abortActiveUpdateDownload();
this.manager.prepareForShutdown();
this.megaWebFallback.dispose();
shutdownSessionLog();
logger.info("App beendet");
}
public getHistory(): HistoryEntry[] {
return loadHistory(this.storagePaths);
}
public clearHistory(): void {
clearHistory(this.storagePaths);
}
public setPackagePriority(packageId: string, priority: PackagePriority): void {
this.manager.setPackagePriority(packageId, priority);
}
public skipItems(itemIds: string[]): void {
this.manager.skipItems(itemIds);
}
public resetItems(itemIds: string[]): void {
this.manager.resetItems(itemIds);
}
public removeHistoryEntry(entryId: string): void {
removeHistoryEntry(this.storagePaths, entryId);
}
public addToHistory(entry: HistoryEntry): void {
addHistoryEntry(this.storagePaths, entry);
}
}

src/main/backup-crypto.ts (new file, 66 lines)
View File

@ -0,0 +1,66 @@
import crypto from "node:crypto";
export const SENSITIVE_KEYS = [
"token",
"megaLogin",
"megaPassword",
"bestToken",
"allDebridToken",
"archivePasswordList"
] as const;
export type SensitiveKey = (typeof SENSITIVE_KEYS)[number];
export interface EncryptedCredentials {
salt: string;
iv: string;
tag: string;
data: string;
}
const PBKDF2_ITERATIONS = 100_000;
const KEY_LENGTH = 32; // 256 bit
const IV_LENGTH = 12; // 96 bit for GCM
const SALT_LENGTH = 16;
function deriveKey(username: string, salt: Buffer): Buffer {
return crypto.pbkdf2Sync(username, salt, PBKDF2_ITERATIONS, KEY_LENGTH, "sha256");
}
export function encryptCredentials(
fields: Record<string, string>,
username: string
): EncryptedCredentials {
const salt = crypto.randomBytes(SALT_LENGTH);
const iv = crypto.randomBytes(IV_LENGTH);
const key = deriveKey(username, salt);
const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
const plaintext = JSON.stringify(fields);
const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();
return {
salt: salt.toString("hex"),
iv: iv.toString("hex"),
tag: tag.toString("hex"),
data: encrypted.toString("hex")
};
}
export function decryptCredentials(
encrypted: EncryptedCredentials,
username: string
): Record<string, string> {
const salt = Buffer.from(encrypted.salt, "hex");
const iv = Buffer.from(encrypted.iv, "hex");
const tag = Buffer.from(encrypted.tag, "hex");
const data = Buffer.from(encrypted.data, "hex");
const key = deriveKey(username, salt);
const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(data), decipher.final()]);
return JSON.parse(decrypted.toString("utf8")) as Record<string, string>;
}
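A roundtrip of the scheme above, condensed into one sketch: PBKDF2-SHA256 (100 000 iterations) derives a 256-bit key from the username and a random salt, AES-256-GCM encrypts the JSON-serialized fields, and the auth tag makes tampering or a wrong username fail loudly:

```javascript
import crypto from "node:crypto";

// Key derivation mirrors backup-crypto.ts: PBKDF2-SHA256, 100k rounds,
// 32-byte key. Salt and IV are random per encryption and stored as hex.
function deriveKey(username, salt) {
  return crypto.pbkdf2Sync(username, salt, 100_000, 32, "sha256");
}

function encryptCredentials(fields, username) {
  const salt = crypto.randomBytes(16);
  const iv = crypto.randomBytes(12); // 96-bit IV, the GCM recommendation
  const cipher = crypto.createCipheriv("aes-256-gcm", deriveKey(username, salt), iv);
  const data = Buffer.concat([cipher.update(JSON.stringify(fields), "utf8"), cipher.final()]);
  return {
    salt: salt.toString("hex"),
    iv: iv.toString("hex"),
    tag: cipher.getAuthTag().toString("hex"),
    data: data.toString("hex")
  };
}

function decryptCredentials(encrypted, username) {
  const key = deriveKey(username, Buffer.from(encrypted.salt, "hex"));
  const decipher = crypto.createDecipheriv("aes-256-gcm", key, Buffer.from(encrypted.iv, "hex"));
  decipher.setAuthTag(Buffer.from(encrypted.tag, "hex"));
  const plain = Buffer.concat([decipher.update(Buffer.from(encrypted.data, "hex")), decipher.final()]);
  return JSON.parse(plain.toString("utf8"));
}

const enc = encryptCredentials({ token: "secret-token" }, "alice");
console.log(decryptCredentials(enc, "alice").token); // → "secret-token"
// decryptCredentials(enc, "mallory") throws: a wrong key fails the GCM tag check.
```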

View File

@ -16,20 +16,26 @@ export const DLC_AES_IV = Buffer.from("9bc24cb995cb8db3", "utf8");
export const REQUEST_RETRIES = 3;
export const CHUNK_SIZE = 512 * 1024;
export const WRITE_BUFFER_SIZE = 512 * 1024; // 512 KB write buffer (JDownloader: 500 KB)
export const WRITE_FLUSH_TIMEOUT_MS = 2000; // 2s flush timeout
export const ALLOCATION_UNIT_SIZE = 4096; // 4 KB NTFS alignment
export const STREAM_HIGH_WATER_MARK = 512 * 1024; // 512 KB stream buffer — lower than before (2 MB) so backpressure triggers sooner when disk is slow
export const DISK_BUSY_THRESHOLD_MS = 300; // Show "Warte auf Festplatte" if writableLength > 0 for this long
export const SAMPLE_DIR_NAMES = new Set(["sample", "samples"]);
export const SAMPLE_VIDEO_EXTENSIONS = new Set([".mkv", ".mp4", ".avi", ".mov", ".wmv", ".m4v", ".ts", ".m2ts", ".webm"]);
export const LINK_ARTIFACT_EXTENSIONS = new Set([".url", ".webloc", ".dlc", ".rsdf", ".ccf"]);
export const SAMPLE_TOKEN_RE = /(^|[._\-\s])sample([._\-\s]|$)/i;
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz"]);
export const ARCHIVE_TEMP_EXTENSIONS = new Set([".rar", ".zip", ".7z", ".tmp", ".part", ".tar", ".gz", ".bz2", ".xz", ".rev"]);
export const RAR_SPLIT_RE = /\.r\d{2,3}$/i;
export const MAX_MANIFEST_FILE_BYTES = 5 * 1024 * 1024;
export const MAX_LINK_ARTIFACT_BYTES = 256 * 1024;
export const SPEED_WINDOW_SECONDS = 3;
export const SPEED_WINDOW_SECONDS = 1;
export const CLIPBOARD_POLL_INTERVAL_MS = 2000;
export const DEFAULT_UPDATE_REPO = "Sucukdeluxe/real-debrid-downloader";
export const DEFAULT_UPDATE_REPO = "Administrator/real-debrid-downloader";
export function defaultSettings(): AppSettings {
const baseDir = path.join(os.homedir(), "Downloads", "RealDebrid");
@ -39,6 +45,9 @@ export function defaultSettings(): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: "",
archivePasswordList: "",
rememberToken: true,
providerPrimary: "realdebrid",
@ -64,6 +73,7 @@ export function defaultSettings(): AppSettings {
reconnectWaitSeconds: 45,
completedCleanupPolicy: "never",
maxParallel: 4,
maxParallelExtract: 2,
retryLimit: 0,
speedLimitEnabled: false,
speedLimitKbps: 0,
@ -77,6 +87,9 @@ export function defaultSettings(): AppSettings {
autoSkipExtracted: false,
confirmDeleteSelection: true,
totalDownloadedAllTime: 0,
bandwidthSchedules: []
bandwidthSchedules: [],
columnOrder: ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"],
extractCpuPriority: "high",
autoExtractWhenStopped: true
};
}

View File

@ -164,7 +164,7 @@ async function decryptDlcLocal(filePath: string): Promise<ParsedPackageInput[]>
const dlcData = content.slice(0, -88);
const rcUrl = DLC_SERVICE_URL.replace("{KEY}", encodeURIComponent(dlcKey));
const rcResponse = await fetch(rcUrl, { method: "GET" });
const rcResponse = await fetch(rcUrl, { method: "GET", signal: AbortSignal.timeout(30000) });
if (!rcResponse.ok) {
return [];
}
@ -217,7 +217,8 @@ async function tryDcryptUpload(fileContent: Buffer, fileName: string): Promise<s
const response = await fetch(DCRYPT_UPLOAD_URL, {
method: "POST",
body: form
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;
@ -235,7 +236,8 @@ async function tryDcryptPaste(fileContent: Buffer): Promise<string[] | null> {
const response = await fetch(DCRYPT_PASTE_URL, {
method: "POST",
body: form
body: form,
signal: AbortSignal.timeout(30000)
});
if (response.status === 413) {
return null;

View File

@ -11,11 +11,16 @@ const RAPIDGATOR_SCAN_MAX_BYTES = 512 * 1024;
const BEST_DEBRID_API_BASE = "https://bestdebrid.com/api/v1";
const ALL_DEBRID_API_BASE = "https://api.alldebrid.com/v4";
const ONEFICHIER_API_BASE = "https://api.1fichier.com/v1";
const ONEFICHIER_URL_RE = /^https?:\/\/(?:www\.)?(?:1fichier\.com|alterupload\.com|cjoint\.net|desfichiers\.com|dfichiers\.com|megadl\.fr|mesfichiers\.org|piecejointe\.net|pjointe\.com|tenvoi\.com|dl4free\.com)\/\?([a-z0-9]{5,20})$/i;
const PROVIDER_LABELS: Record<DebridProvider, string> = {
realdebrid: "Real-Debrid",
megadebrid: "Mega-Debrid",
bestdebrid: "BestDebrid",
alldebrid: "AllDebrid"
alldebrid: "AllDebrid",
ddownload: "DDownload",
onefichier: "1Fichier"
};
interface ProviderUnrestrictedLink extends UnrestrictedLink {
@ -226,7 +231,9 @@ function isRapidgatorLink(link: string): boolean {
return hostname === "rapidgator.net"
|| hostname.endsWith(".rapidgator.net")
|| hostname === "rg.to"
|| hostname.endsWith(".rg.to");
|| hostname.endsWith(".rg.to")
|| hostname === "rapidgator.asia"
|| hostname.endsWith(".rapidgator.asia");
} catch {
return false;
}
@ -315,7 +322,7 @@ async function runWithConcurrency<T>(items: T[], concurrency: number, worker: (i
let index = 0;
let firstError: unknown = null;
const next = (): T | undefined => {
if (index >= items.length) {
if (firstError || index >= items.length) {
return undefined;
}
const item = items[index];
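The `firstError` short-circuit added to `runWithConcurrency` means that once any worker fails, `next()` stops handing out items instead of draining the whole queue. A self-contained reconstruction; the parts of the function body not shown in the hunk are paraphrased, not verbatim:

```javascript
// Work-stealing pool: each lane pulls items from next() until the queue
// is empty or a failure has been recorded.
async function runWithConcurrency(items, concurrency, worker) {
  let index = 0;
  let firstError = null;
  const next = () => {
    if (firstError || index >= items.length) return undefined;
    return items[index++];
  };
  const lanes = Array.from({ length: concurrency }, async () => {
    for (let item = next(); item !== undefined; item = next()) {
      try {
        await worker(item);
      } catch (err) {
        if (!firstError) firstError = err;
      }
    }
  });
  await Promise.all(lanes);
  if (firstError) throw firstError;
}

let processed = 0;
try {
  await runWithConcurrency([1, 2, 3, 4, 5], 1, async (n) => {
    processed += 1;
    if (n === 2) throw new Error("boom");
  });
} catch {
  // expected: the recorded firstError is rethrown
}
console.log(processed); // 2: items 3..5 were never started
```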
@ -415,6 +422,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt < REQUEST_RETRIES + 2) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
@ -430,9 +438,11 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
&& !contentType.includes("text/plain")
&& !contentType.includes("text/xml")
&& !contentType.includes("application/xml")) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
if (!contentType && Number.isFinite(contentLength) && contentLength > RAPIDGATOR_SCAN_MAX_BYTES) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return "";
}
@ -444,7 +454,7 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
return "";
} catch (error) {
const errorText = compactErrorText(error);
if (/aborted/i.test(errorText)) {
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
if (attempt >= REQUEST_RETRIES + 2 || !isRetryableErrorText(errorText)) {
@ -460,6 +470,138 @@ async function resolveRapidgatorFilename(link: string, signal?: AbortSignal): Pr
return "";
}
export interface RapidgatorCheckResult {
online: boolean;
fileName: string;
fileSize: string | null;
}
const RG_FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;
const RG_FILE_NOT_FOUND_RE = />\s*404\s*File not found/i;
const RG_FILESIZE_RE = /File\s*size:\s*<strong>([^<>"]+)<\/strong>/i;
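Quick sanity checks for the two patterns above; the sample URL and HTML snippet are invented, not real Rapidgator responses:

```javascript
// File id is either a 32-char lowercase hex/alnum slug or a numeric id;
// the size is scraped from a <strong> tag in the file page HTML.
const RG_FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;
const RG_FILESIZE_RE = /File\s*size:\s*<strong>([^<>"]+)<\/strong>/i;

const idMatch = "https://rapidgator.net/file/0123456789abcdef0123456789abcdef/x.rar".match(RG_FILE_ID_RE);
console.log(idMatch?.[1]); // → "0123456789abcdef0123456789abcdef"

const sizeMatch = "File size: <strong>4.37 GB</strong>".match(RG_FILESIZE_RE);
console.log(sizeMatch?.[1]); // → "4.37 GB"
```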
export async function checkRapidgatorOnline(
link: string,
signal?: AbortSignal
): Promise<RapidgatorCheckResult | null> {
if (!isRapidgatorLink(link)) {
return null;
}
const fileIdMatch = link.match(RG_FILE_ID_RE);
if (!fileIdMatch) {
return null;
}
const fileId = fileIdMatch[1];
const headers = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36",
Accept: "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
"Accept-Language": "en-US,en;q=0.9,de;q=0.8"
};
// Fast path: HEAD request (no body download, much faster)
for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
const response = await fetch(link, {
method: "HEAD",
redirect: "follow",
headers,
signal: withTimeoutSignal(signal, 15000)
});
if (response.status === 404) {
return { online: false, fileName: "", fileSize: null };
}
if (response.ok) {
const finalUrl = response.url || link;
if (!finalUrl.includes(fileId)) {
return { online: false, fileName: "", fileSize: null };
}
// HEAD 200 + URL still contains file ID → online
const fileName = filenameFromRapidgatorUrlPath(link);
return { online: true, fileName, fileSize: null };
}
// Non-OK, non-404: retry or give up
if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
// HEAD inconclusive — fall through to GET
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
break; // fall through to GET
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
// Slow path: GET request (downloads HTML, more thorough)
for (let attempt = 1; attempt <= REQUEST_RETRIES + 1; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
const response = await fetch(link, {
method: "GET",
redirect: "follow",
headers,
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (response.status === 404) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
if (!response.ok) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
if (shouldRetryStatus(response.status) && attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelayForResponse(response, attempt), signal);
continue;
}
return null;
}
const finalUrl = response.url || link;
if (!finalUrl.includes(fileId)) {
try { await response.body?.cancel(); } catch { /* drain socket */ }
return { online: false, fileName: "", fileSize: null };
}
const html = await readResponseTextLimited(response, RAPIDGATOR_SCAN_MAX_BYTES, signal);
if (RG_FILE_NOT_FOUND_RE.test(html)) {
return { online: false, fileName: "", fileSize: null };
}
const fileName = extractRapidgatorFilenameFromHtml(html) || filenameFromRapidgatorUrlPath(link);
const sizeMatch = html.match(RG_FILESIZE_RE);
const fileSize = sizeMatch ? sizeMatch[1].trim() : null;
return { online: true, fileName, fileSize };
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) throw error;
if (attempt > REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
return null;
}
}
if (attempt <= REQUEST_RETRIES) {
await sleepWithSignal(retryDelay(attempt), signal);
}
}
return null;
}
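The online check above keys everything off the file ID in the URL: Rapidgator redirects removed files away from `/file/<id>`, so a final URL that no longer contains the ID is treated as offline. A minimal standalone sketch of that check (the regex is copied from `RG_FILE_ID_RE` above; the example URLs in the comments are hypothetical):

```typescript
// Same pattern as RG_FILE_ID_RE: 32-char hex ID or a numeric ID after /file/.
const FILE_ID_RE = /\/file\/([a-z0-9]{32}|\d+)/i;

// Returns true when the final URL after redirects still references the
// requested file ID; a redirect to e.g. the start page means the file is gone.
function finalUrlStillPointsAtFile(link: string, finalUrl: string): boolean {
  const match = link.match(FILE_ID_RE);
  if (!match) return false;
  return finalUrl.includes(match[1]);
}
```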
function buildBestDebridRequests(link: string, token: string): BestDebridRequest[] {
const linkParam = encodeURIComponent(link);
const safeToken = String(token || "").trim();
@@ -503,7 +645,7 @@ class MegaDebridClient {
throw new Error("Mega-Web Antwort ohne Download-Link");
}
if (!lastError) {
lastError = web ? "Mega-Web Antwort ohne Download-Link" : "Mega-Web Antwort leer";
lastError = "Mega-Web Antwort leer";
}
// Don't retry permanent hoster errors (dead link, file removed, etc.)
if (/permanent ungültig|hosternotavailable|file.?not.?found|file.?unavailable|link.?is.?dead/i.test(lastError)) {
@@ -513,7 +655,7 @@ class MegaDebridClient {
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(lastError || "Mega-Web Unrestrict fehlgeschlagen");
throw new Error(String(lastError || "Mega-Web Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
@@ -532,7 +674,11 @@ class BestDebridClient {
try {
return await this.tryRequest(request, link, signal);
} catch (error) {
lastError = compactErrorText(error);
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
lastError = errorText;
}
}
@@ -597,7 +743,7 @@ class BestDebridClient {
throw new Error("BestDebrid Antwort ohne Download-Link");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(lastError)) {
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -691,7 +837,7 @@ class AllDebridClient {
break;
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(errorText)) {
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(errorText)) {
@@ -803,7 +949,7 @@ class AllDebridClient {
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(lastError)) {
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -813,7 +959,257 @@
}
}
throw new Error(lastError || "AllDebrid Unrestrict fehlgeschlagen");
throw new Error(String(lastError || "AllDebrid Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
// ── 1Fichier Client ──
class OneFichierClient {
private apiKey: string;
public constructor(apiKey: string) {
this.apiKey = apiKey;
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
if (!ONEFICHIER_URL_RE.test(link)) {
throw new Error("Kein 1Fichier-Link");
}
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
if (signal?.aborted) throw new Error("aborted:debrid");
try {
const res = await fetch(`${ONEFICHIER_API_BASE}/download/get_token.cgi`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${this.apiKey}`
},
body: JSON.stringify({ url: link, pretty: 1 }),
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const json = await res.json() as Record<string, unknown>;
if (json.status === "KO" || json.error) {
const msg = String(json.message || json.error || "Unbekannter 1Fichier-Fehler");
throw new Error(msg);
}
const directUrl = String(json.url || "");
if (!directUrl) {
throw new Error("1Fichier: Keine Download-URL in Antwort");
}
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
throw error;
}
if (attempt < REQUEST_RETRIES) {
await sleepWithSignal(retryDelay(attempt), signal);
}
}
}
throw new Error(`1Fichier-Unrestrict fehlgeschlagen: ${lastError}`);
}
}
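The success/error branching on the `get_token.cgi` JSON can be isolated as a pure function. This is a sketch assuming only the fields the class above reads (`status`, `error`, `message`, `url`), not a complete 1Fichier API schema:

```typescript
// Mirrors the response handling in OneFichierClient.unrestrictLink.
// Assumption: only status/error/message/url are inspected, as in the class above.
function parseOneFichierTokenResponse(json: Record<string, unknown>): string {
  if (json.status === "KO" || json.error) {
    throw new Error(String(json.message || json.error || "Unbekannter 1Fichier-Fehler"));
  }
  const directUrl = String(json.url || "");
  if (!directUrl) {
    throw new Error("1Fichier: Keine Download-URL in Antwort");
  }
  return directUrl;
}
```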
const DDOWNLOAD_URL_RE = /^https?:\/\/(?:www\.)?(?:ddownload\.com|ddl\.to)\/([a-z0-9]+)/i;
const DDOWNLOAD_WEB_BASE = "https://ddownload.com";
const DDOWNLOAD_WEB_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36";
class DdownloadClient {
private login: string;
private password: string;
private cookies: string = "";
public constructor(login: string, password: string) {
this.login = login;
this.password = password;
}
private async webLogin(signal?: AbortSignal): Promise<void> {
// Step 1: GET login page to extract form token
const loginPageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/login.html`, {
headers: { "User-Agent": DDOWNLOAD_WEB_UA },
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
const loginPageHtml = await loginPageRes.text();
const tokenMatch = loginPageHtml.match(/name="token" value="([^"]+)"/);
const pageCookies = (loginPageRes.headers.getSetCookie?.() || []).map((c: string) => c.split(";")[0]).join("; ");
// Step 2: POST login
const body = new URLSearchParams({
op: "login",
token: tokenMatch?.[1] || "",
rand: "",
redirect: "",
login: this.login,
password: this.password
});
const loginRes = await fetch(`${DDOWNLOAD_WEB_BASE}/`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
...(pageCookies ? { Cookie: pageCookies } : {})
},
body: body.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Drain body
try { await loginRes.text(); } catch { /* ignore */ }
const setCookies = loginRes.headers.getSetCookie?.() || [];
const xfss = setCookies.find((c: string) => c.startsWith("xfss="));
const loginCookie = setCookies.find((c: string) => c.startsWith("login="));
if (!xfss) {
throw new Error("DDownload Login fehlgeschlagen (kein Session-Cookie)");
}
this.cookies = [loginCookie, xfss].filter(Boolean).map((c: string) => c.split(";")[0]).join("; ");
}
public async unrestrictLink(link: string, signal?: AbortSignal): Promise<UnrestrictedLink> {
const match = link.match(DDOWNLOAD_URL_RE);
if (!match) {
throw new Error("Kein DDownload-Link");
}
const fileCode = match[1];
let lastError = "";
for (let attempt = 1; attempt <= REQUEST_RETRIES; attempt += 1) {
try {
if (signal?.aborted) throw new Error("aborted:debrid");
// Login if no session yet
if (!this.cookies) {
await this.webLogin(signal);
}
// Step 1: GET file page to extract form fields
const filePageRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
Cookie: this.cookies
},
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
// Premium with direct downloads enabled → redirect immediately
if (filePageRes.status >= 300 && filePageRes.status < 400) {
const directUrl = filePageRes.headers.get("location") || "";
try { await filePageRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: filenameFromUrl(directUrl) || filenameFromUrl(link),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const html = await filePageRes.text();
// Check for file not found
if (/File Not Found|file was removed|file was banned/i.test(html)) {
throw new Error("DDownload: Datei nicht gefunden");
}
// Extract form fields
const idVal = html.match(/name="id" value="([^"]+)"/)?.[1] || fileCode;
const randVal = html.match(/name="rand" value="([^"]+)"/)?.[1] || "";
const fileNameMatch = html.match(/class="file-info-name"[^>]*>([^<]+)</);
const fileName = fileNameMatch?.[1]?.trim() || filenameFromUrl(link);
// Step 2: POST download2 for premium download
const dlBody = new URLSearchParams({
op: "download2",
id: idVal,
rand: randVal,
referer: "",
method_premium: "1",
adblock_detected: "0"
});
const dlRes = await fetch(`${DDOWNLOAD_WEB_BASE}/${fileCode}`, {
method: "POST",
headers: {
"User-Agent": DDOWNLOAD_WEB_UA,
"Content-Type": "application/x-www-form-urlencoded",
Cookie: this.cookies,
Referer: `${DDOWNLOAD_WEB_BASE}/${fileCode}`
},
body: dlBody.toString(),
redirect: "manual",
signal: withTimeoutSignal(signal, API_TIMEOUT_MS)
});
if (dlRes.status >= 300 && dlRes.status < 400) {
const directUrl = dlRes.headers.get("location") || "";
try { await dlRes.text(); } catch { /* drain */ }
if (directUrl) {
return {
fileName: fileName || filenameFromUrl(directUrl),
directUrl,
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
}
const dlHtml = await dlRes.text();
// Try to find direct URL in response HTML
const directMatch = dlHtml.match(/https?:\/\/[a-z0-9]+\.(?:dstorage\.org|ddownload\.com|ddl\.to|ucdn\.to)[^\s"'<>]+/i);
if (directMatch) {
return {
fileName,
directUrl: directMatch[0],
fileSize: null,
retriesUsed: attempt - 1,
skipTlsVerify: true
};
}
// Check for error messages
const errMatch = dlHtml.match(/class="err"[^>]*>([^<]+)</i);
if (errMatch) {
throw new Error(`DDownload: ${errMatch[1].trim()}`);
}
throw new Error("DDownload: Kein Download-Link erhalten");
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
// Re-login on auth errors
if (/login|session|cookie/i.test(lastError)) {
this.cookies = "";
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
break;
}
await sleepWithSignal(retryDelay(attempt), signal);
}
}
throw new Error(String(lastError || "DDownload Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}
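When neither request redirects, the client falls back to scanning the response HTML for a CDN URL. That scan is a plain regex match and can be sketched standalone (the host list is copied from the code above; the sample HTML in the test is hypothetical):

```typescript
// Same pattern as DdownloadClient's HTML fallback: the first URL on a known
// DDownload CDN host wins; whitespace, quotes and angle brackets end the match.
const DDL_DIRECT_URL_RE = /https?:\/\/[a-z0-9]+\.(?:dstorage\.org|ddownload\.com|ddl\.to|ucdn\.to)[^\s"'<>]+/i;

function findDirectDownloadUrl(html: string): string | null {
  const match = html.match(DDL_DIRECT_URL_RE);
  return match ? match[0] : null;
}
```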
@@ -822,6 +1218,9 @@ export class DebridService {
private options: DebridServiceOptions;
private cachedDdownloadClient: DdownloadClient | null = null;
private cachedDdownloadKey = "";
public constructor(settings: AppSettings, options: DebridServiceOptions = {}) {
this.settings = cloneSettings(settings);
this.options = options;
@@ -831,6 +1230,16 @@
this.settings = cloneSettings(next);
}
private getDdownloadClient(login: string, password: string): DdownloadClient {
const key = `${login}\0${password}`;
if (this.cachedDdownloadClient && this.cachedDdownloadKey === key) {
return this.cachedDdownloadClient;
}
this.cachedDdownloadClient = new DdownloadClient(login, password);
this.cachedDdownloadKey = key;
return this.cachedDdownloadClient;
}
public async resolveFilenames(
links: string[],
onResolved?: (link: string, fileName: string) => void,
@@ -865,7 +1274,7 @@
}
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(errorText)) {
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// ignore and continue with host page fallback
@@ -883,6 +1292,46 @@
public async unrestrictLink(link: string, signal?: AbortSignal, settingsSnapshot?: AppSettings): Promise<ProviderUnrestrictedLink> {
const settings = settingsSnapshot ? cloneSettings(settingsSnapshot) : cloneSettings(this.settings);
// 1Fichier is a direct file hoster. If the link is a 1fichier.com URL
// and the API key is configured, use 1Fichier directly before debrid providers.
if (ONEFICHIER_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "onefichier")) {
try {
const result = await this.unrestrictViaProvider(settings, "onefichier", link, signal);
return {
...result,
provider: "onefichier",
providerLabel: PROVIDER_LABELS["onefichier"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain
}
}
// DDownload is a direct file hoster, not a debrid service.
// If the link is a ddownload.com/ddl.to URL and the account is configured,
// use DDownload directly before trying any debrid providers.
if (DDOWNLOAD_URL_RE.test(link) && this.isProviderConfiguredFor(settings, "ddownload")) {
try {
const result = await this.unrestrictViaProvider(settings, "ddownload", link, signal);
return {
...result,
provider: "ddownload",
providerLabel: PROVIDER_LABELS["ddownload"]
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
// Fall through to normal provider chain (debrid services may also support ddownload links)
}
}
const order = toProviderOrder(
settings.providerPrimary,
settings.providerSecondary,
@@ -910,7 +1359,11 @@
providerLabel: PROVIDER_LABELS[primary]
};
} catch (error) {
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${compactErrorText(error)}`);
const errorText = compactErrorText(error);
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
throw new Error(`Unrestrict fehlgeschlagen: ${PROVIDER_LABELS[primary]}: ${errorText}`);
}
}
@@ -940,7 +1393,7 @@
};
} catch (error) {
const errorText = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(errorText)) {
if (signal?.aborted || (/aborted/i.test(errorText) && !/timeout/i.test(errorText))) {
throw error;
}
attempts.push(`${PROVIDER_LABELS[provider]}: ${compactErrorText(error)}`);
@@ -964,6 +1417,12 @@
if (provider === "alldebrid") {
return Boolean(settings.allDebridToken.trim());
}
if (provider === "ddownload") {
return Boolean(settings.ddownloadLogin.trim() && settings.ddownloadPassword.trim());
}
if (provider === "onefichier") {
return Boolean(settings.oneFichierApiKey.trim());
}
return Boolean(settings.bestToken.trim());
}
@@ -977,6 +1436,12 @@
if (provider === "alldebrid") {
return new AllDebridClient(settings.allDebridToken).unrestrictLink(link, signal);
}
if (provider === "ddownload") {
return this.getDdownloadClient(settings.ddownloadLogin, settings.ddownloadPassword).unrestrictLink(link, signal);
}
if (provider === "onefichier") {
return new OneFichierClient(settings.oneFichierApiKey).unrestrictLink(link, signal);
}
return new BestDebridClient(settings.bestToken).unrestrictLink(link, signal);
}
}


@@ -261,7 +261,7 @@ export function startDebugServer(mgr: DownloadManager, baseDir: string): void {
const port = getPort(baseDir);
server = http.createServer(handleRequest);
server.listen(port, "0.0.0.0", () => {
server.listen(port, "127.0.0.1", () => {
logger.info(`Debug-Server gestartet auf Port ${port}`);
});
server.on("error", (err) => {

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -8,12 +8,19 @@ const LOG_BUFFER_LIMIT_CHARS = 1_000_000;
const LOG_MAX_FILE_BYTES = 10 * 1024 * 1024;
const rotateCheckAtByFile = new Map<string, number>();
type LogListener = (line: string) => void;
let logListener: LogListener | null = null;
let pendingLines: string[] = [];
let pendingChars = 0;
let flushTimer: NodeJS.Timeout | null = null;
let flushInFlight = false;
let exitHookAttached = false;
export function setLogListener(listener: LogListener | null): void {
logListener = listener;
}
export function configureLogger(baseDir: string): void {
logFilePath = path.join(baseDir, "rd_downloader.log");
const cwdLogPath = path.resolve(process.cwd(), "rd_downloader.log");
@@ -188,6 +195,10 @@ function write(level: "INFO" | "WARN" | "ERROR", message: string): void {
pendingLines.push(line);
pendingChars += line.length;
if (logListener) {
try { logListener(line); } catch { /* ignore */ }
}
while (pendingChars > LOG_BUFFER_LIMIT_CHARS && pendingLines.length > 1) {
const removed = pendingLines.shift();
if (!removed) {


@@ -7,6 +7,7 @@ import { IPC_CHANNELS } from "../shared/ipc";
import { getLogFilePath, logger } from "./logger";
import { APP_NAME } from "./constants";
import { extractHttpLinksFromText } from "./utils";
import { cleanupStaleSubstDrives, shutdownDaemon } from "./extractor";
/* ── IPC validation helpers ────────────────────────────────────── */
function validateString(value: unknown, name: string): string {
@@ -50,6 +51,7 @@ process.on("unhandledRejection", (reason) => {
let mainWindow: BrowserWindow | null = null;
let tray: Tray | null = null;
let clipboardTimer: ReturnType<typeof setInterval> | null = null;
let updateQuitTimer: ReturnType<typeof setTimeout> | null = null;
let lastClipboardText = "";
const controller = new AppController();
const CLIPBOARD_MAX_TEXT_CHARS = 50_000;
@@ -80,7 +82,7 @@ function createWindow(): BrowserWindow {
responseHeaders: {
...details.responseHeaders,
"Content-Security-Policy": [
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu"
"default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' https://api.real-debrid.com https://codeberg.org https://bestdebrid.com https://api.alldebrid.com https://www.mega-debrid.eu https://git.24-music.de https://ddownload.com https://ddl.to"
]
}
});
@@ -129,7 +131,7 @@ function createTray(): void {
const contextMenu = Menu.buildFromTemplate([
{ label: "Anzeigen", click: () => { mainWindow?.show(); mainWindow?.focus(); } },
{ type: "separator" },
{ label: "Start", click: () => { controller.start(); } },
{ label: "Start", click: () => { void controller.start().catch((err) => logger.warn(`Tray Start Fehler: ${String(err)}`)); } },
{ label: "Stop", click: () => { controller.stop(); } },
{ type: "separator" },
{ label: "Beenden", click: () => { app.quit(); } }
@@ -187,7 +189,12 @@ function startClipboardWatcher(): void {
}
lastClipboardText = normalizeClipboardText(clipboard.readText());
clipboardTimer = setInterval(() => {
const text = normalizeClipboardText(clipboard.readText());
let text: string;
try {
text = normalizeClipboardText(clipboard.readText());
} catch {
return;
}
if (text === lastClipboardText || !text.trim()) {
return;
}
@@ -236,9 +243,9 @@ function registerIpcHandlers(): void {
mainWindow.webContents.send(IPC_CHANNELS.UPDATE_INSTALL_PROGRESS, progress);
});
if (result.started) {
setTimeout(() => {
updateQuitTimer = setTimeout(() => {
app.quit();
}, 800);
}, 2500);
}
return result;
});
@@ -288,6 +295,14 @@ function registerIpcHandlers(): void {
});
ipcMain.handle(IPC_CHANNELS.CLEAR_ALL, () => controller.clearAll());
ipcMain.handle(IPC_CHANNELS.START, () => controller.start());
ipcMain.handle(IPC_CHANNELS.START_PACKAGES, (_event: IpcMainInvokeEvent, packageIds: string[]) => {
validateStringArray(packageIds ?? [], "packageIds");
return controller.startPackages(packageIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.START_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.startItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.STOP, () => controller.stop());
ipcMain.handle(IPC_CHANNELS.TOGGLE_PAUSE, () => controller.togglePause());
ipcMain.handle(IPC_CHANNELS.CANCEL_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
@@ -322,7 +337,45 @@ function registerIpcHandlers(): void {
validateString(packageId, "packageId");
return controller.extractNow(packageId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, () => controller.exportQueue());
ipcMain.handle(IPC_CHANNELS.RESET_PACKAGE, (_event: IpcMainInvokeEvent, packageId: string) => {
validateString(packageId, "packageId");
return controller.resetPackage(packageId);
});
ipcMain.handle(IPC_CHANNELS.SET_PACKAGE_PRIORITY, (_event: IpcMainInvokeEvent, packageId: string, priority: string) => {
validateString(packageId, "packageId");
validateString(priority, "priority");
if (priority !== "high" && priority !== "normal" && priority !== "low") {
throw new Error("priority muss 'high', 'normal' oder 'low' sein");
}
return controller.setPackagePriority(packageId, priority);
});
ipcMain.handle(IPC_CHANNELS.SKIP_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.skipItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.RESET_ITEMS, (_event: IpcMainInvokeEvent, itemIds: string[]) => {
validateStringArray(itemIds ?? [], "itemIds");
return controller.resetItems(itemIds ?? []);
});
ipcMain.handle(IPC_CHANNELS.GET_HISTORY, () => controller.getHistory());
ipcMain.handle(IPC_CHANNELS.CLEAR_HISTORY, () => controller.clearHistory());
ipcMain.handle(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, (_event: IpcMainInvokeEvent, entryId: string) => {
validateString(entryId, "entryId");
return controller.removeHistoryEntry(entryId);
});
ipcMain.handle(IPC_CHANNELS.EXPORT_QUEUE, async () => {
const options = {
defaultPath: "rd-queue-export.json",
filters: [{ name: "Queue Export", extensions: ["json"] }]
};
const result = mainWindow ? await dialog.showSaveDialog(mainWindow, options) : await dialog.showSaveDialog(options);
if (result.canceled || !result.filePath) {
return { saved: false };
}
const json = controller.exportQueue();
await fs.promises.writeFile(result.filePath, json, "utf8");
return { saved: true };
});
ipcMain.handle(IPC_CHANNELS.IMPORT_QUEUE, (_event: IpcMainInvokeEvent, json: string) => {
validateString(json, "json");
const bytes = Buffer.byteLength(json, "utf8");
@@ -386,6 +439,13 @@ function registerIpcHandlers(): void {
await shell.openPath(logPath);
});
ipcMain.handle(IPC_CHANNELS.OPEN_SESSION_LOG, async () => {
const logPath = controller.getSessionLogPath();
if (logPath) {
await shell.openPath(logPath);
}
});
ipcMain.handle(IPC_CHANNELS.IMPORT_BACKUP, async () => {
const options = {
properties: ["openFile"] as Array<"openFile">,
@@ -399,6 +459,11 @@ function registerIpcHandlers(): void {
return { restored: false, message: "Abgebrochen" };
}
const filePath = result.filePaths[0];
const stat = await fs.promises.stat(filePath);
const BACKUP_MAX_BYTES = 50 * 1024 * 1024;
if (stat.size > BACKUP_MAX_BYTES) {
return { restored: false, message: `Backup-Datei zu groß (max 50 MB, Datei hat ${(stat.size / 1024 / 1024).toFixed(1)} MB)` };
}
const json = await fs.promises.readFile(filePath, "utf8");
return controller.importBackup(json);
});
@@ -422,6 +487,7 @@ app.on("second-instance", () => {
});
app.whenReady().then(() => {
cleanupStaleSubstDrives();
registerIpcHandlers();
mainWindow = createWindow();
bindMainWindowLifecycle(mainWindow);
@@ -434,6 +500,9 @@ app.whenReady().then(() => {
bindMainWindowLifecycle(mainWindow);
}
});
}).catch((error) => {
console.error("App startup failed:", error);
app.quit();
});
app.on("window-all-closed", () => {
@@ -443,8 +512,10 @@ app.on("window-all-closed", () => {
});
app.on("before-quit", () => {
if (updateQuitTimer) { clearTimeout(updateQuitTimer); updateQuitTimer = null; }
stopClipboardWatcher();
destroyTray();
shutdownDaemon();
try {
controller.shutdown();
} catch (error) {


@@ -228,22 +228,23 @@ export class MegaWebFallback {
}
public async unrestrict(link: string, signal?: AbortSignal): Promise<UnrestrictedLink | null> {
const overallSignal = withTimeoutSignal(signal, 180000);
return this.runExclusive(async () => {
throwIfAborted(signal);
throwIfAborted(overallSignal);
const creds = this.getCredentials();
if (!creds.login.trim() || !creds.password.trim()) {
return null;
}
if (!this.cookie || Date.now() - this.cookieSetAt > 20 * 60 * 1000) {
await this.login(creds.login, creds.password, signal);
await this.login(creds.login, creds.password, overallSignal);
}
const generated = await this.generate(link, signal);
const generated = await this.generate(link, overallSignal);
if (!generated) {
this.cookie = "";
await this.login(creds.login, creds.password, signal);
const retry = await this.generate(link, signal);
await this.login(creds.login, creds.password, overallSignal);
const retry = await this.generate(link, overallSignal);
if (!retry) {
return null;
}
@@ -261,7 +262,7 @@
fileSize: null,
retriesUsed: 0
};
}, signal);
}, overallSignal);
}
public invalidateSession(): void {


@@ -8,6 +8,7 @@ export interface UnrestrictedLink {
directUrl: string;
fileSize: number | null;
retriesUsed: number;
skipTlsVerify?: boolean;
}
function shouldRetryStatus(status: number): boolean {
@@ -62,7 +63,8 @@ function isRetryableErrorText(text: string): boolean {
|| lower.includes("aborted")
|| lower.includes("econnreset")
|| lower.includes("enotfound")
|| lower.includes("etimedout");
|| lower.includes("etimedout")
|| lower.includes("html statt json");
}
function withTimeoutSignal(signal: AbortSignal | undefined, timeoutMs: number): AbortSignal {
@@ -77,6 +79,11 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
await sleep(ms);
return;
}
// Check before entering the Promise constructor to avoid a race where the timer
// resolves before the aborted check runs (especially when ms=0).
if (signal.aborted) {
throw new Error("aborted");
}
await new Promise<void>((resolve, reject) => {
let timer: NodeJS.Timeout | null = setTimeout(() => {
timer = null;
@@ -93,10 +100,6 @@ async function sleepWithSignal(ms: number, signal?: AbortSignal): Promise<void>
reject(new Error("aborted"));
};
if (signal.aborted) {
onAbort();
return;
}
signal.addEventListener("abort", onAbort, { once: true });
});
}
@@ -165,6 +168,15 @@ export class RealDebridClient {
if (!directUrl) {
throw new Error("Unrestrict ohne Download-URL");
}
try {
const parsedUrl = new URL(directUrl);
if (parsedUrl.protocol !== "https:" && parsedUrl.protocol !== "http:") {
throw new Error(`Ungültiges Download-URL-Protokoll (${parsedUrl.protocol})`);
}
} catch (urlError) {
if (urlError instanceof Error && urlError.message.includes("Protokoll")) throw urlError;
throw new Error("Real-Debrid Antwort enthält keine gültige Download-URL");
}
const fileName = String(payload.filename || "download.bin").trim() || "download.bin";
const fileSizeRaw = Number(payload.filesize ?? NaN);
@@ -176,7 +188,7 @@
};
} catch (error) {
lastError = compactErrorText(error);
if (signal?.aborted || /aborted/i.test(lastError)) {
if (signal?.aborted || (/aborted/i.test(lastError) && !/timeout/i.test(lastError))) {
break;
}
if (attempt >= REQUEST_RETRIES || !isRetryableErrorText(lastError)) {
@@ -186,6 +198,6 @@
}
}
throw new Error(lastError || "Unrestrict fehlgeschlagen");
throw new Error(String(lastError || "Unrestrict fehlgeschlagen").replace(/^Error:\s*/i, ""));
}
}

src/main/session-log.ts (new file, 128 lines)

@@ -0,0 +1,128 @@
import fs from "node:fs";
import path from "node:path";
import { setLogListener } from "./logger";
const SESSION_LOG_FLUSH_INTERVAL_MS = 200;
let sessionLogPath: string | null = null;
let sessionLogsDir: string | null = null;
let pendingLines: string[] = [];
let flushTimer: NodeJS.Timeout | null = null;
function formatTimestamp(): string {
const now = new Date();
const y = now.getFullYear();
const mo = String(now.getMonth() + 1).padStart(2, "0");
const d = String(now.getDate()).padStart(2, "0");
const h = String(now.getHours()).padStart(2, "0");
const mi = String(now.getMinutes()).padStart(2, "0");
const s = String(now.getSeconds()).padStart(2, "0");
return `${y}-${mo}-${d}_${h}-${mi}-${s}`;
}
function flushPending(): void {
if (pendingLines.length === 0 || !sessionLogPath) {
return;
}
const chunk = pendingLines.join("");
pendingLines = [];
try {
fs.appendFileSync(sessionLogPath, chunk, "utf8");
} catch {
// ignore write errors
}
}
function scheduleFlush(): void {
if (flushTimer) {
return;
}
flushTimer = setTimeout(() => {
flushTimer = null;
flushPending();
}, SESSION_LOG_FLUSH_INTERVAL_MS);
}
function appendToSessionLog(line: string): void {
if (!sessionLogPath) {
return;
}
pendingLines.push(line);
scheduleFlush();
}
async function cleanupOldSessionLogs(dir: string, maxAgeDays: number): Promise<void> {
try {
const files = await fs.promises.readdir(dir);
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
for (const file of files) {
if (!file.startsWith("session_") || !file.endsWith(".txt")) {
continue;
}
const filePath = path.join(dir, file);
try {
const stat = await fs.promises.stat(filePath);
if (stat.mtimeMs < cutoff) {
await fs.promises.unlink(filePath);
}
} catch {
// ignore - file may be locked
}
}
} catch {
// ignore - dir may not exist
}
}
export function initSessionLog(baseDir: string): void {
sessionLogsDir = path.join(baseDir, "session-logs");
try {
fs.mkdirSync(sessionLogsDir, { recursive: true });
} catch {
sessionLogsDir = null;
return;
}
const timestamp = formatTimestamp();
sessionLogPath = path.join(sessionLogsDir, `session_${timestamp}.txt`);
const isoTimestamp = new Date().toISOString();
try {
fs.writeFileSync(sessionLogPath, `=== Session gestartet: ${isoTimestamp} ===\n`, "utf8");
} catch {
sessionLogPath = null;
return;
}
setLogListener((line) => appendToSessionLog(line));
void cleanupOldSessionLogs(sessionLogsDir, 7);
}
export function getSessionLogPath(): string | null {
return sessionLogPath;
}
export function shutdownSessionLog(): void {
if (!sessionLogPath) {
return;
}
// Flush any pending lines
if (flushTimer) {
clearTimeout(flushTimer);
flushTimer = null;
}
flushPending();
// Write closing line
const isoTimestamp = new Date().toISOString();
try {
fs.appendFileSync(sessionLogPath, `=== Session beendet: ${isoTimestamp} ===\n`, "utf8");
} catch {
// ignore
}
setLogListener(null);
sessionLogPath = null;
}
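The session file naming used by `initSessionLog` can be expressed as a small pure helper. This sketch reproduces the behavior of `formatTimestamp` above (local time, zero-padded components) purely for illustration:

```typescript
// Produces names like session_2026-03-05_17-32-42.txt, matching the
// session_<timestamp>.txt pattern that cleanupOldSessionLogs filters on.
function sessionLogFileName(now: Date): string {
  const pad = (n: number): string => String(n).padStart(2, "0");
  const stamp =
    `${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}` +
    `_${pad(now.getHours())}-${pad(now.getMinutes())}-${pad(now.getSeconds())}`;
  return `session_${stamp}.txt`;
}
```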


@@ -1,21 +1,24 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, PackageEntry, SessionState } from "../shared/types";
import { AppSettings, BandwidthScheduleEntry, DebridProvider, DownloadItem, DownloadStatus, HistoryEntry, PackageEntry, PackagePriority, SessionState } from "../shared/types";
import { defaultSettings } from "./constants";
import { logger } from "./logger";
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_PRIMARY_PROVIDERS = new Set(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_FALLBACK_PROVIDERS = new Set(["none", "realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_CLEANUP_MODES = new Set(["none", "trash", "delete"]);
const VALID_CONFLICT_MODES = new Set(["overwrite", "skip", "rename", "ask"]);
const VALID_FINISHED_POLICIES = new Set(["never", "immediate", "on_start", "package_done"]);
const VALID_SPEED_MODES = new Set(["global", "per_download"]);
const VALID_THEMES = new Set(["dark", "light"]);
const VALID_EXTRACT_CPU_PRIORITIES = new Set(["high", "middle", "low"]);
const VALID_PACKAGE_PRIORITIES = new Set<string>(["high", "normal", "low"]);
const VALID_DOWNLOAD_STATUSES = new Set<DownloadStatus>([
"queued", "validating", "downloading", "paused", "reconnect_wait", "extracting", "integrity_check", "completed", "failed", "cancelled"
]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid"]);
const VALID_ITEM_PROVIDERS = new Set<DebridProvider>(["realdebrid", "megadebrid", "bestdebrid", "alldebrid", "ddownload", "onefichier"]);
const VALID_ONLINE_STATUSES = new Set(["online", "offline", "checking"]);
function asText(value: unknown): string {
return String(value ?? "").trim();
@@ -65,6 +68,41 @@ function normalizeAbsoluteDir(value: unknown, fallback: string): string {
return path.resolve(text);
}
const DEFAULT_COLUMN_ORDER = ["name", "size", "progress", "hoster", "account", "prio", "status", "speed"];
const ALL_VALID_COLUMNS = new Set([...DEFAULT_COLUMN_ORDER, "added"]);
function normalizeColumnOrder(raw: unknown): string[] {
if (!Array.isArray(raw) || raw.length === 0) {
return [...DEFAULT_COLUMN_ORDER];
}
const valid = ALL_VALID_COLUMNS;
const seen = new Set<string>();
const result: string[] = [];
for (const col of raw) {
if (typeof col === "string" && valid.has(col) && !seen.has(col)) {
seen.add(col);
result.push(col);
}
}
// "name" is mandatory — ensure it's always present
if (!seen.has("name")) {
result.unshift("name");
}
return result;
}
const DEPRECATED_UPDATE_REPOS = new Set([
"sucukdeluxe/real-debrid-downloader"
]);
function migrateUpdateRepo(raw: string, fallback: string): string {
const trimmed = raw.trim();
if (!trimmed || DEPRECATED_UPDATE_REPOS.has(trimmed.toLowerCase())) {
return fallback;
}
return trimmed;
}
export function normalizeSettings(settings: AppSettings): AppSettings {
const defaults = defaultSettings();
const normalized: AppSettings = {
@@ -73,7 +111,10 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
megaPassword: asText(settings.megaPassword),
bestToken: asText(settings.bestToken),
allDebridToken: asText(settings.allDebridToken),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n/g, "\n"),
ddownloadLogin: asText(settings.ddownloadLogin),
ddownloadPassword: asText(settings.ddownloadPassword),
oneFichierApiKey: asText(settings.oneFichierApiKey),
archivePasswordList: String(settings.archivePasswordList ?? "").replace(/\r\n|\r/g, "\n"),
rememberToken: Boolean(settings.rememberToken),
providerPrimary: settings.providerPrimary,
providerSecondary: settings.providerSecondary,
@@ -96,6 +137,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
autoResumeOnStart: Boolean(settings.autoResumeOnStart),
autoReconnect: Boolean(settings.autoReconnect),
maxParallel: clampNumber(settings.maxParallel, defaults.maxParallel, 1, 50),
maxParallelExtract: clampNumber(settings.maxParallelExtract, defaults.maxParallelExtract, 1, 8),
retryLimit: clampNumber(settings.retryLimit, defaults.retryLimit, 0, 99),
reconnectWaitSeconds: clampNumber(settings.reconnectWaitSeconds, defaults.reconnectWaitSeconds, 10, 600),
completedCleanupPolicy: settings.completedCleanupPolicy,
@@ -103,7 +145,7 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
speedLimitKbps: clampNumber(settings.speedLimitKbps, defaults.speedLimitKbps, 0, 500000),
speedLimitMode: settings.speedLimitMode,
autoUpdateCheck: Boolean(settings.autoUpdateCheck),
updateRepo: asText(settings.updateRepo) || defaults.updateRepo,
updateRepo: migrateUpdateRepo(asText(settings.updateRepo), defaults.updateRepo),
clipboardWatch: Boolean(settings.clipboardWatch),
minimizeToTray: Boolean(settings.minimizeToTray),
collapseNewPackages: settings.collapseNewPackages !== undefined ? Boolean(settings.collapseNewPackages) : defaults.collapseNewPackages,
@@ -111,7 +153,10 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
confirmDeleteSelection: settings.confirmDeleteSelection !== undefined ? Boolean(settings.confirmDeleteSelection) : defaults.confirmDeleteSelection,
totalDownloadedAllTime: typeof settings.totalDownloadedAllTime === "number" && settings.totalDownloadedAllTime >= 0 ? settings.totalDownloadedAllTime : defaults.totalDownloadedAllTime,
theme: VALID_THEMES.has(settings.theme) ? settings.theme : defaults.theme,
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules)
bandwidthSchedules: normalizeBandwidthSchedules(settings.bandwidthSchedules),
columnOrder: normalizeColumnOrder(settings.columnOrder),
extractCpuPriority: settings.extractCpuPriority,
autoExtractWhenStopped: settings.autoExtractWhenStopped !== undefined ? Boolean(settings.autoExtractWhenStopped) : defaults.autoExtractWhenStopped
};
if (!VALID_PRIMARY_PROVIDERS.has(normalized.providerPrimary)) {
@@ -141,6 +186,9 @@ export function normalizeSettings(settings: AppSettings): AppSettings {
if (!VALID_SPEED_MODES.has(normalized.speedLimitMode)) {
normalized.speedLimitMode = defaults.speedLimitMode;
}
if (!VALID_EXTRACT_CPU_PRIORITIES.has(normalized.extractCpuPriority)) {
normalized.extractCpuPriority = defaults.extractCpuPriority;
}
return normalized;
}
@@ -156,7 +204,9 @@ function sanitizeCredentialPersistence(settings: AppSettings): AppSettings {
megaPassword: "",
bestToken: "",
allDebridToken: "",
archivePasswordList: ""
ddownloadLogin: "",
ddownloadPassword: "",
oneFichierApiKey: ""
};
}
@@ -164,13 +214,15 @@ export interface StoragePaths {
baseDir: string;
configFile: string;
sessionFile: string;
historyFile: string;
}
export function createStoragePaths(baseDir: string): StoragePaths {
return {
baseDir,
configFile: path.join(baseDir, "rd_downloader_config.json"),
sessionFile: path.join(baseDir, "rd_session_state.json")
sessionFile: path.join(baseDir, "rd_session_state.json"),
historyFile: path.join(baseDir, "rd_history.json")
};
}
@@ -198,7 +250,7 @@ function readSettingsFile(filePath: string): AppSettings | null {
}
}
function normalizeLoadedSession(raw: unknown): SessionState {
export function normalizeLoadedSession(raw: unknown): SessionState {
const fallback = emptySession();
const parsed = asRecord(raw);
if (!parsed) {
@@ -224,6 +276,8 @@ function normalizeLoadedSession(raw: unknown): SessionState {
const status: DownloadStatus = VALID_DOWNLOAD_STATUSES.has(statusRaw) ? statusRaw : "queued";
const providerRaw = asText(item.provider) as DebridProvider;
const onlineStatusRaw = asText(item.onlineStatus);
itemsById[id] = {
id,
packageId,
@@ -241,6 +295,7 @@ function normalizeLoadedSession(raw: unknown): SessionState {
attempts: clampNumber(item.attempts, 0, 0, 10_000),
lastError: asText(item.lastError),
fullStatus: asText(item.fullStatus),
onlineStatus: VALID_ONLINE_STATUSES.has(onlineStatusRaw) ? onlineStatusRaw as "online" | "offline" | "checking" : undefined,
createdAt: clampNumber(item.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(item.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -271,6 +326,7 @@ function normalizeLoadedSession(raw: unknown): SessionState {
.filter((value) => value.length > 0),
cancelled: Boolean(pkg.cancelled),
enabled: pkg.enabled === undefined ? true : Boolean(pkg.enabled),
priority: VALID_PACKAGE_PRIORITIES.has(asText(pkg.priority)) ? asText(pkg.priority) as PackagePriority : "normal",
createdAt: clampNumber(pkg.createdAt, now, 0, Number.MAX_SAFE_INTEGER),
updatedAt: clampNumber(pkg.updatedAt, now, 0, Number.MAX_SAFE_INTEGER)
};
@@ -301,7 +357,8 @@ function normalizeLoadedSession(raw: unknown): SessionState {
return true;
});
for (const packageId of Object.keys(packagesById)) {
if (!packageOrder.includes(packageId)) {
if (!seenOrder.has(packageId)) {
seenOrder.add(packageId);
packageOrder.push(packageId);
}
}
@@ -373,7 +430,7 @@ function sessionBackupPath(sessionFile: string): string {
return `${sessionFile}.bak`;
}
function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
export function normalizeLoadedSessionTransientFields(session: SessionState): SessionState {
// Reset transient fields that may be stale from a previous crash
const ACTIVE_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const item of Object.values(session.items)) {
@@ -385,6 +442,19 @@ function normalizeLoadedSessionTransientFields(session: SessionState): SessionSt
item.speedBps = 0;
}
// Reset package-level active statuses to queued (mirrors item reset above)
const ACTIVE_PKG_STATUSES = new Set(["downloading", "validating", "extracting", "integrity_check", "paused", "reconnect_wait"]);
for (const pkg of Object.values(session.packages)) {
if (ACTIVE_PKG_STATUSES.has(pkg.status)) {
pkg.status = "queued";
}
pkg.postProcessLabel = undefined;
}
// Clear stale session-level running/paused flags
session.running = false;
session.paused = false;
return session;
}
@@ -410,12 +480,17 @@ export function saveSettings(paths: StoragePaths, settings: AppSettings): void {
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
const tempPath = `${paths.configFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.configFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSettingsSaveRunning = false;
let asyncSettingsSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let asyncSettingsSaveQueued: { paths: StoragePaths; settings: AppSettings } | null = null;
async function writeSettingsPayload(paths: StoragePaths, payload: string): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
@@ -429,6 +504,7 @@ async function writeSettingsPayload(paths: StoragePaths, payload: string): Promi
await fsp.copyFile(tempPath, paths.configFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -438,7 +514,7 @@ export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettin
const persisted = sanitizeCredentialPersistence(normalizeSettings(settings));
const payload = JSON.stringify(persisted, null, 2);
if (asyncSettingsSaveRunning) {
asyncSettingsSaveQueued = { paths, payload };
asyncSettingsSaveQueued = { paths, settings };
return;
}
asyncSettingsSaveRunning = true;
@@ -451,7 +527,7 @@ export async function saveSettingsAsync(paths: StoragePaths, settings: AppSettin
if (asyncSettingsSaveQueued) {
const queued = asyncSettingsSaveQueued;
asyncSettingsSaveQueued = null;
void writeSettingsPayload(queued.paths, queued.payload).catch((err) => logger.error(`Async Settings-Save (queued) fehlgeschlagen: ${String(err)}`));
void saveSettingsAsync(queued.paths, queued.settings);
}
}
}
@@ -504,6 +580,7 @@ export function loadSession(paths: StoragePaths): SessionState {
}
export function saveSession(paths: StoragePaths, session: SessionState): void {
syncSaveGeneration += 1;
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.sessionFile)) {
try {
@@ -514,25 +591,41 @@ export function saveSession(paths: StoragePaths, session: SessionState): void {
}
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
const tempPath = sessionTempPath(paths.sessionFile, "sync");
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.sessionFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
let asyncSaveRunning = false;
let asyncSaveQueued: { paths: StoragePaths; payload: string } | null = null;
let syncSaveGeneration = 0;
async function writeSessionPayload(paths: StoragePaths, payload: string): Promise<void> {
async function writeSessionPayload(paths: StoragePaths, payload: string, generation: number): Promise<void> {
await fs.promises.mkdir(paths.baseDir, { recursive: true });
await fsp.copyFile(paths.sessionFile, sessionBackupPath(paths.sessionFile)).catch(() => {});
const tempPath = sessionTempPath(paths.sessionFile, "async");
await fsp.writeFile(tempPath, payload, "utf8");
// If a synchronous save occurred after this async save started, discard the stale write
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
try {
await fsp.rename(tempPath, paths.sessionFile);
} catch (renameError: unknown) {
if (renameError && typeof renameError === "object" && "code" in renameError && (renameError as NodeJS.ErrnoException).code === "EXDEV") {
if (generation < syncSaveGeneration) {
await fsp.rm(tempPath, { force: true }).catch(() => {});
return;
}
await fsp.copyFile(tempPath, paths.sessionFile);
await fsp.rm(tempPath, { force: true }).catch(() => {});
} else {
await fsp.rm(tempPath, { force: true }).catch(() => {});
throw renameError;
}
}
@@ -544,8 +637,9 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
return;
}
asyncSaveRunning = true;
const gen = syncSaveGeneration;
try {
await writeSessionPayload(paths, payload);
await writeSessionPayload(paths, payload, gen);
} catch (error) {
logger.error(`Async Session-Save fehlgeschlagen: ${String(error)}`);
} finally {
@@ -558,7 +652,98 @@ async function saveSessionPayloadAsync(paths: StoragePaths, payload: string): Pr
}
}
export function cancelPendingAsyncSaves(): void {
asyncSaveQueued = null;
asyncSettingsSaveQueued = null;
syncSaveGeneration += 1;
}
export async function saveSessionAsync(paths: StoragePaths, session: SessionState): Promise<void> {
const payload = JSON.stringify({ ...session, updatedAt: Date.now() });
await saveSessionPayloadAsync(paths, payload);
}
const MAX_HISTORY_ENTRIES = 500;
function normalizeHistoryEntry(raw: unknown, index: number): HistoryEntry | null {
const entry = asRecord(raw);
if (!entry) return null;
const id = asText(entry.id) || `hist-${Date.now().toString(36)}-${index}`;
const name = asText(entry.name) || "Unbenannt";
const providerRaw = asText(entry.provider);
return {
id,
name,
totalBytes: clampNumber(entry.totalBytes, 0, 0, Number.MAX_SAFE_INTEGER),
downloadedBytes: clampNumber(entry.downloadedBytes, 0, 0, Number.MAX_SAFE_INTEGER),
fileCount: clampNumber(entry.fileCount, 0, 0, 100000),
provider: VALID_ITEM_PROVIDERS.has(providerRaw as DebridProvider) ? providerRaw as DebridProvider : null,
completedAt: clampNumber(entry.completedAt, Date.now(), 0, Number.MAX_SAFE_INTEGER),
durationSeconds: clampNumber(entry.durationSeconds, 0, 0, Number.MAX_SAFE_INTEGER),
status: entry.status === "deleted" ? "deleted" : "completed",
outputDir: asText(entry.outputDir),
urls: Array.isArray(entry.urls) ? (entry.urls as unknown[]).map(String).filter(Boolean) : undefined
};
}
export function loadHistory(paths: StoragePaths): HistoryEntry[] {
ensureBaseDir(paths.baseDir);
if (!fs.existsSync(paths.historyFile)) {
return [];
}
try {
const raw = JSON.parse(fs.readFileSync(paths.historyFile, "utf8")) as unknown;
if (!Array.isArray(raw)) return [];
const entries: HistoryEntry[] = [];
for (let i = 0; i < raw.length && entries.length < MAX_HISTORY_ENTRIES; i++) {
const normalized = normalizeHistoryEntry(raw[i], i);
if (normalized) entries.push(normalized);
}
return entries;
} catch {
return [];
}
}
export function saveHistory(paths: StoragePaths, entries: HistoryEntry[]): void {
ensureBaseDir(paths.baseDir);
const trimmed = entries.slice(0, MAX_HISTORY_ENTRIES);
const payload = JSON.stringify(trimmed, null, 2);
const tempPath = `${paths.historyFile}.tmp`;
try {
fs.writeFileSync(tempPath, payload, "utf8");
syncRenameWithExdevFallback(tempPath, paths.historyFile);
} catch (error) {
try { fs.rmSync(tempPath, { force: true }); } catch { /* ignore */ }
throw error;
}
}
export function addHistoryEntry(paths: StoragePaths, entry: HistoryEntry): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = [entry, ...existing].slice(0, MAX_HISTORY_ENTRIES);
saveHistory(paths, updated);
return updated;
}
export function removeHistoryEntry(paths: StoragePaths, entryId: string): HistoryEntry[] {
const existing = loadHistory(paths);
const updated = existing.filter(e => e.id !== entryId);
saveHistory(paths, updated);
return updated;
}
export function clearHistory(paths: StoragePaths): void {
ensureBaseDir(paths.baseDir);
if (fs.existsSync(paths.historyFile)) {
try {
fs.unlinkSync(paths.historyFile);
} catch {
// ignore
}
}
}

@@ -14,8 +14,32 @@ const DOWNLOAD_BODY_IDLE_TIMEOUT_MS = 45000;
const RETRIES_PER_CANDIDATE = 3;
const RETRY_DELAY_MS = 1500;
const UPDATE_USER_AGENT = `RD-Node-Downloader/${APP_VERSION}`;
const UPDATE_WEB_BASE = "https://codeberg.org";
const UPDATE_API_BASE = "https://codeberg.org/api/v1";
type UpdateSource = {
name: string;
webBase: string;
apiBase: string;
};
const UPDATE_SOURCES: UpdateSource[] = [
{
name: "git24",
webBase: "https://git.24-music.de",
apiBase: "https://git.24-music.de/api/v1"
},
{
name: "codeberg",
webBase: "https://codeberg.org",
apiBase: "https://codeberg.org/api/v1"
},
{
name: "github",
webBase: "https://github.com",
apiBase: "https://api.github.com"
}
];
const PRIMARY_UPDATE_SOURCE = UPDATE_SOURCES[0];
const UPDATE_WEB_BASE = PRIMARY_UPDATE_SOURCE.webBase;
const UPDATE_API_BASE = PRIMARY_UPDATE_SOURCE.apiBase;
let activeUpdateAbortController: AbortController | null = null;
@@ -57,9 +81,9 @@ export function normalizeUpdateRepo(repo: string): string {
const normalizeParts = (input: string): string => {
const cleaned = input
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com):/i, "")
.replace(/^https?:\/\/(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^(?:www\.)?(?:codeberg\.org|github\.com|git\.24-music\.de)\//i, "")
.replace(/^git@(?:codeberg\.org|github\.com|git\.24-music\.de):/i, "")
.replace(/\.git$/i, "")
.replace(/^\/+|\/+$/g, "");
const parts = cleaned.split("/").filter(Boolean);
@@ -76,7 +100,13 @@ export function normalizeUpdateRepo(repo: string): string {
try {
const url = new URL(raw);
const host = url.hostname.toLowerCase();
if (host === "codeberg.org" || host === "www.codeberg.org" || host === "github.com" || host === "www.github.com") {
if (
host === "codeberg.org"
|| host === "www.codeberg.org"
|| host === "github.com"
|| host === "www.github.com"
|| host === "git.24-music.de"
) {
const normalized = normalizeParts(url.pathname);
if (normalized) {
return normalized;
@@ -306,6 +336,8 @@ function parseReleasePayload(payload: Record<string, unknown>, fallback: UpdateC
const releaseUrl = String(payload.html_url || fallback.releaseUrl);
const setup = pickSetupAsset(readReleaseAssets(payload));
const body = typeof payload.body === "string" ? payload.body.trim() : "";
return {
updateAvailable: isRemoteNewer(APP_VERSION, latestVersion),
currentVersion: APP_VERSION,
@@ -314,7 +346,8 @@
releaseUrl,
setupAssetUrl: setup?.browser_download_url || "",
setupAssetName: setup?.name || "",
setupAssetDigest: setup?.digest || ""
setupAssetDigest: setup?.digest || "",
releaseNotes: body || undefined
};
}
@@ -761,7 +794,8 @@ async function downloadFile(url: string, targetPath: string, onProgress?: Update
};
const reader = response.body.getReader();
const chunks: Buffer[] = [];
const tempPath = targetPath + ".tmp";
const writeStream = fs.createWriteStream(tempPath);
try {
resetIdleTimer();
@@ -775,27 +809,39 @@
break;
}
const buf = Buffer.from(value.buffer, value.byteOffset, value.byteLength);
chunks.push(buf);
if (!writeStream.write(buf)) {
await new Promise<void>((resolve) => writeStream.once("drain", resolve));
}
downloadedBytes += buf.byteLength;
resetIdleTimer();
emitDownloadProgress(false);
}
} catch (error) {
writeStream.destroy();
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw error;
} finally {
clearIdleTimer();
}
await new Promise<void>((resolve, reject) => {
writeStream.end(() => resolve());
writeStream.on("error", reject);
});
if (idleTimedOut) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download Body Timeout nach ${Math.ceil(idleTimeoutMs / 1000)}s`);
}
const fileBuffer = Buffer.concat(chunks);
if (totalBytes && fileBuffer.byteLength !== totalBytes) {
throw new Error(`Update Download unvollständig (${fileBuffer.byteLength} / ${totalBytes} Bytes)`);
if (totalBytes && downloadedBytes !== totalBytes) {
await fs.promises.rm(tempPath, { force: true }).catch(() => {});
throw new Error(`Update Download unvollständig (${downloadedBytes} / ${totalBytes} Bytes)`);
}
await fs.promises.writeFile(targetPath, fileBuffer);
await fs.promises.rename(tempPath, targetPath);
emitDownloadProgress(true);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${fileBuffer.byteLength} Bytes)`);
logger.info(`Update-Download abgeschlossen: ${targetPath} (${downloadedBytes} Bytes)`);
return { expectedBytes: totalBytes };
}

@@ -3,6 +3,8 @@ import {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
@@ -29,6 +31,7 @@ const api: ElectronApi = {
ipcRenderer.invoke(IPC_CHANNELS.RESOLVE_START_CONFLICT, packageId, policy),
clearAll: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_ALL),
start: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START),
startPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_PACKAGES, packageIds),
stop: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.STOP),
togglePause: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PAUSE),
cancelPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CANCEL_PACKAGE, packageId),
@@ -36,7 +39,7 @@ const api: ElectronApi = {
reorderPackages: (packageIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REORDER_PACKAGES, packageIds),
removeItem: (itemId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_ITEM, itemId),
togglePackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_PACKAGE, packageId),
exportQueue: (): Promise<string> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
exportQueue: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_QUEUE),
importQueue: (json: string): Promise<{ addedPackages: number; addedLinks: number }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_QUEUE, json),
toggleClipboard: (): Promise<boolean> => ipcRenderer.invoke(IPC_CHANNELS.TOGGLE_CLIPBOARD),
pickFolder: (): Promise<string | null> => ipcRenderer.invoke(IPC_CHANNELS.PICK_FOLDER),
@@ -47,8 +50,17 @@ const api: ElectronApi = {
exportBackup: (): Promise<{ saved: boolean }> => ipcRenderer.invoke(IPC_CHANNELS.EXPORT_BACKUP),
importBackup: (): Promise<{ restored: boolean; message: string }> => ipcRenderer.invoke(IPC_CHANNELS.IMPORT_BACKUP),
openLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_LOG),
openSessionLog: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.OPEN_SESSION_LOG),
retryExtraction: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RETRY_EXTRACTION, packageId),
extractNow: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.EXTRACT_NOW, packageId),
resetPackage: (packageId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_PACKAGE, packageId),
getHistory: (): Promise<HistoryEntry[]> => ipcRenderer.invoke(IPC_CHANNELS.GET_HISTORY),
clearHistory: (): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.CLEAR_HISTORY),
removeHistoryEntry: (entryId: string): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.REMOVE_HISTORY_ENTRY, entryId),
setPackagePriority: (packageId: string, priority: PackagePriority): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SET_PACKAGE_PRIORITY, packageId, priority),
skipItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.SKIP_ITEMS, itemIds),
resetItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.RESET_ITEMS, itemIds),
startItems: (itemIds: string[]): Promise<void> => ipcRenderer.invoke(IPC_CHANNELS.START_ITEMS, itemIds),
onStateUpdate: (callback: (snapshot: UiSnapshot) => void): (() => void) => {
const listener = (_event: unknown, snapshot: UiSnapshot): void => callback(snapshot);
ipcRenderer.on(IPC_CHANNELS.STATE_UPDATE, listener);

File diff suppressed because it is too large

@@ -0,0 +1,25 @@
import type { PackageEntry } from "../shared/types";
export function reorderPackageOrderByDrop(order: string[], draggedPackageId: string, targetPackageId: string): string[] {
const fromIndex = order.indexOf(draggedPackageId);
const toIndex = order.indexOf(targetPackageId);
if (fromIndex < 0 || toIndex < 0 || fromIndex === toIndex) {
return order;
}
const next = [...order];
const [dragged] = next.splice(fromIndex, 1);
const insertIndex = Math.max(0, Math.min(next.length, toIndex));
next.splice(insertIndex, 0, dragged);
return next;
}
export function sortPackageOrderByName(order: string[], packages: Record<string, PackageEntry>, descending: boolean): string[] {
const sorted = [...order];
sorted.sort((a, b) => {
const nameA = (packages[a]?.name ?? "").toLowerCase();
const nameB = (packages[b]?.name ?? "").toLowerCase();
const cmp = nameA.localeCompare(nameB, undefined, { numeric: true, sensitivity: "base" });
return descending ? -cmp : cmp;
});
return sorted;
}

@@ -344,6 +344,15 @@ body,
background: rgba(244, 63, 94, 0.1);
}
.ctrl-icon-btn.ctrl-move:not(:disabled) {
color: var(--accent);
}
.ctrl-icon-btn.ctrl-move:hover:not(:disabled) {
border-color: var(--accent);
background: color-mix(in srgb, var(--accent) 10%, transparent);
}
.ctrl-icon-btn.ctrl-speed.active {
color: #f59e0b;
border-color: rgba(245, 158, 11, 0.5);
@@ -577,7 +586,7 @@ body,
.pkg-column-header {
display: grid;
grid-template-columns: 1fr 70px 160px 220px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
padding: 5px 12px;
background: var(--card);
@@ -589,6 +598,17 @@ body,
user-select: none;
}
.pkg-column-header .pkg-col-progress,
.pkg-column-header .pkg-col-size,
.pkg-column-header .pkg-col-hoster,
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
text-align: center;
}
.pkg-column-header .sortable {
cursor: pointer;
}
@@ -603,7 +623,7 @@ body,
.pkg-columns {
display: grid;
grid-template-columns: 1fr 70px 160px 220px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
align-items: center;
min-width: 0;
@@ -626,17 +646,22 @@ body,
.pkg-columns .pkg-col-progress,
.pkg-columns .pkg-col-size,
.pkg-columns .pkg-col-hoster,
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed {
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
font-size: 13px;
color: var(--muted);
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
text-align: center;
}
.pkg-col-progress {
font-variant-numeric: tabular-nums;
padding-right: 12px;
}
.progress-size {
@@ -848,7 +873,7 @@ body,
.status-bar {
display: flex;
flex-wrap: wrap;
gap: 16px;
gap: 8px 16px;
align-items: center;
color: var(--muted);
font-size: 12px;
@@ -859,6 +884,16 @@ body,
margin: 0 -14px -10px;
}
.footer-spacer {
flex: 1;
}
.footer-btn {
font-size: 11px;
padding: 2px 8px;
min-height: 0;
}
.settings-shell {
display: grid;
grid-template-rows: auto 1fr;
@@ -1103,6 +1138,11 @@ body,
font-size: 13px;
}
.package-card header .progress-inline-text-filled,
.package-card header .progress-size-text-filled {
color: #0a0f1a;
}
.pkg-toggle {
display: inline-flex;
@@ -1267,7 +1307,7 @@ td {
.item-row {
display: grid;
grid-template-columns: 1fr 70px 160px 220px 180px 100px;
/* grid-template-columns set via inline style from columnOrder */
gap: 8px;
align-items: center;
margin: 0 -12px;
@@ -1286,6 +1326,11 @@ td {
white-space: nowrap;
color: var(--muted);
font-size: 13px;
text-align: center;
}
.item-row .pkg-col-name {
text-align: left;
}
.item-row .pkg-col-name {
@@ -1293,6 +1338,89 @@ td {
padding-left: 32px;
}
.link-status-dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
margin-right: 6px;
flex-shrink: 0;
vertical-align: middle;
}
.link-status-dot.online {
background: #22c55e;
box-shadow: 0 0 4px #22c55e80;
}
.link-status-dot.offline {
background: #ef4444;
box-shadow: 0 0 4px #ef444480;
}
.link-status-dot.checking {
background: #f59e0b;
box-shadow: 0 0 4px #f59e0b80;
}
.prio-high {
color: #f59e0b !important;
font-weight: 700;
}
.prio-low {
color: #64748b !important;
}
.pkg-col-dragging {
opacity: 0.4;
}
.pkg-col-drop-target {
box-shadow: -2px 0 0 0 var(--accent);
}
.pkg-column-header .pkg-col {
cursor: grab;
}
.pkg-column-header .pkg-col.sortable {
cursor: pointer;
}
.ctx-menu-sub {
position: relative;
}
.ctx-menu-sub > .ctx-menu-item::after {
content: "";
}
.ctx-menu-sub-items {
display: none;
position: absolute;
left: 100%;
top: 0;
min-width: 120px;
background: var(--card);
border: 1px solid var(--border);
border-radius: 6px;
padding: 4px 0;
box-shadow: 0 4px 12px rgba(0,0,0,.3);
z-index: 1001;
}
.ctx-menu-sub:hover .ctx-menu-sub-items {
display: block;
}
.ctx-menu-active {
color: var(--accent) !important;
}
.ctx-menu-disabled {
opacity: 0.4;
cursor: not-allowed !important;
pointer-events: none;
}
.item-remove {
background: none;
border: none;
@@ -1511,6 +1639,7 @@ td {
border-radius: 12px;
padding: 10px 14px;
box-shadow: 0 16px 30px rgba(0, 0, 0, 0.35);
z-index: 50;
}
.ctx-menu {
@@ -1620,6 +1749,7 @@ td {
font-weight: 600;
pointer-events: none;
backdrop-filter: blur(2px);
z-index: 200;
}
.modal-backdrop {
@@ -1634,6 +1764,8 @@ td {
.modal-card {
width: min(560px, 100%);
max-height: calc(100vh - 40px);
overflow-y: auto;
border: 1px solid var(--border);
border-radius: 14px;
background: linear-gradient(180deg, color-mix(in srgb, var(--card) 98%, transparent), color-mix(in srgb, var(--surface) 98%, transparent));
@@ -1652,6 +1784,34 @@ td {
color: var(--muted);
}
.modal-details {
border: 1px solid var(--border);
border-radius: 6px;
padding: 0;
}
.modal-details summary {
padding: 6px 10px;
cursor: pointer;
font-size: 13px;
color: var(--muted);
user-select: none;
}
.modal-details summary:hover {
color: var(--text);
}
.modal-details pre {
margin: 0;
padding: 8px 10px;
border-top: 1px solid var(--border);
font-size: 12px;
line-height: 1.5;
white-space: pre-wrap;
word-break: break-word;
max-height: 260px;
overflow-y: auto;
color: var(--muted);
}
.modal-path {
font-size: 12px;
word-break: break-all;
@@ -1722,23 +1882,30 @@ td {
}
.pkg-columns,
.pkg-column-header {
grid-template-columns: 1fr;
.pkg-column-header,
.item-row {
grid-template-columns: 1fr !important;
}
.pkg-column-header .pkg-col-progress,
.pkg-column-header .pkg-col-size,
.pkg-column-header .pkg-col-hoster,
.pkg-column-header .pkg-col-account,
.pkg-column-header .pkg-col-prio,
.pkg-column-header .pkg-col-status,
.pkg-column-header .pkg-col-speed {
.pkg-column-header .pkg-col-speed,
.pkg-column-header .pkg-col-added {
display: none;
}
.pkg-columns .pkg-col-progress,
.pkg-columns .pkg-col-size,
.pkg-columns .pkg-col-hoster,
.pkg-columns .pkg-col-account,
.pkg-columns .pkg-col-prio,
.pkg-columns .pkg-col-status,
.pkg-columns .pkg-col-speed {
.pkg-columns .pkg-col-speed,
.pkg-columns .pkg-col-added {
display: none;
}

View File

@ -12,6 +12,7 @@ export const IPC_CHANNELS = {
RESOLVE_START_CONFLICT: "queue:resolve-start-conflict",
CLEAR_ALL: "queue:clear-all",
START: "queue:start",
START_PACKAGES: "queue:start-packages",
STOP: "queue:stop",
TOGGLE_PAUSE: "queue:toggle-pause",
CANCEL_PACKAGE: "queue:cancel-package",
@ -32,6 +33,15 @@ export const IPC_CHANNELS = {
EXPORT_BACKUP: "app:export-backup",
IMPORT_BACKUP: "app:import-backup",
OPEN_LOG: "app:open-log",
OPEN_SESSION_LOG: "app:open-session-log",
RETRY_EXTRACTION: "queue:retry-extraction",
EXTRACT_NOW: "queue:extract-now"
EXTRACT_NOW: "queue:extract-now",
RESET_PACKAGE: "queue:reset-package",
GET_HISTORY: "history:get",
CLEAR_HISTORY: "history:clear",
REMOVE_HISTORY_ENTRY: "history:remove-entry",
SET_PACKAGE_PRIORITY: "queue:set-package-priority",
SKIP_ITEMS: "queue:skip-items",
RESET_ITEMS: "queue:reset-items",
START_ITEMS: "queue:start-items"
} as const;
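The `as const` on the channel map above is what makes the new entries (`START_PACKAGES`, `SKIP_ITEMS`, …) type-safe on both sides of the IPC boundary. A minimal sketch of the pattern, using made-up channel names in the same style (this is an illustration of the `as const` technique, not the app's actual wiring):

```typescript
// Sketch: "as const" freezes each value to a string-literal type, so the
// channel strings double as a union type and a typo in a channel name
// becomes a compile-time error instead of a silent runtime no-op.
const CHANNELS = {
  START_PACKAGES: "queue:start-packages",
  SKIP_ITEMS: "queue:skip-items",
  GET_HISTORY: "history:get"
} as const;

// Union of all channel strings: "queue:start-packages" | "queue:skip-items" | "history:get"
type ChannelName = (typeof CHANNELS)[keyof typeof CHANNELS];

// Stand-in for ipcRenderer.invoke; only the channel-name typing matters here.
function invoke(channel: ChannelName, ...args: unknown[]): string {
  return `${channel}(${args.length})`;
}

const call = invoke(CHANNELS.START_PACKAGES, ["pkg1", "pkg2"]);
// call is "queue:start-packages(1)" — one argument, the package-id array
```

Passing a raw string like `invoke("queue:start-pakages")` would fail to compile, which is the point of routing every channel through the map.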

View File

@ -2,6 +2,8 @@ import type {
AddLinksPayload,
AppSettings,
DuplicatePolicy,
HistoryEntry,
PackagePriority,
SessionStats,
StartConflictEntry,
StartConflictResolutionResult,
@ -24,6 +26,7 @@ export interface ElectronApi {
resolveStartConflict: (packageId: string, policy: DuplicatePolicy) => Promise<StartConflictResolutionResult>;
clearAll: () => Promise<void>;
start: () => Promise<void>;
startPackages: (packageIds: string[]) => Promise<void>;
stop: () => Promise<void>;
togglePause: () => Promise<boolean>;
cancelPackage: (packageId: string) => Promise<void>;
@ -31,7 +34,7 @@ export interface ElectronApi {
reorderPackages: (packageIds: string[]) => Promise<void>;
removeItem: (itemId: string) => Promise<void>;
togglePackage: (packageId: string) => Promise<void>;
exportQueue: () => Promise<string>;
exportQueue: () => Promise<{ saved: boolean }>;
importQueue: (json: string) => Promise<{ addedPackages: number; addedLinks: number }>;
toggleClipboard: () => Promise<boolean>;
pickFolder: () => Promise<string | null>;
@ -42,8 +45,17 @@ export interface ElectronApi {
exportBackup: () => Promise<{ saved: boolean }>;
importBackup: () => Promise<{ restored: boolean; message: string }>;
openLog: () => Promise<void>;
openSessionLog: () => Promise<void>;
retryExtraction: (packageId: string) => Promise<void>;
extractNow: (packageId: string) => Promise<void>;
resetPackage: (packageId: string) => Promise<void>;
getHistory: () => Promise<HistoryEntry[]>;
clearHistory: () => Promise<void>;
removeHistoryEntry: (entryId: string) => Promise<void>;
setPackagePriority: (packageId: string, priority: PackagePriority) => Promise<void>;
skipItems: (itemIds: string[]) => Promise<void>;
resetItems: (itemIds: string[]) => Promise<void>;
startItems: (itemIds: string[]) => Promise<void>;
onStateUpdate: (callback: (snapshot: UiSnapshot) => void) => () => void;
onClipboardDetected: (callback: (links: string[]) => void) => () => void;
onUpdateInstallProgress: (callback: (progress: UpdateInstallProgress) => void) => () => void;
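The `onStateUpdate` / `onClipboardDetected` signatures above follow the subscribe-and-dispose convention: registering a callback returns a `() => void` that unregisters it. A self-contained sketch of that contract (an illustration of the pattern, not the app's preload code):

```typescript
// Minimal emitter whose on() returns a disposer, matching the
// (callback) => () => void shape of onStateUpdate above.
type Listener<T> = (value: T) => void;

function createEmitter<T>() {
  const listeners = new Set<Listener<T>>();
  return {
    on(cb: Listener<T>): () => void {
      listeners.add(cb);
      return () => { listeners.delete(cb); };  // disposer removes exactly this listener
    },
    emit(value: T): void {
      for (const cb of listeners) cb(value);
    }
  };
}

const emitter = createEmitter<number>();
const seen: number[] = [];
const off = emitter.on((v) => seen.push(v));
emitter.emit(1);  // delivered
off();            // unsubscribed
emitter.emit(2);  // not delivered
// seen is [1]
```

Returning the disposer from the register call keeps renderer cleanup trivial, e.g. as a React `useEffect` teardown.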

View File

@ -14,9 +14,11 @@ export type CleanupMode = "none" | "trash" | "delete";
export type ConflictMode = "overwrite" | "skip" | "rename" | "ask";
export type SpeedMode = "global" | "per_download";
export type FinishedCleanupPolicy = "never" | "immediate" | "on_start" | "package_done";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid";
export type DebridProvider = "realdebrid" | "megadebrid" | "bestdebrid" | "alldebrid" | "ddownload" | "onefichier";
export type DebridFallbackProvider = DebridProvider | "none";
export type AppTheme = "dark" | "light";
export type PackagePriority = "high" | "normal" | "low";
export type ExtractCpuPriority = "high" | "middle" | "low";
export interface BandwidthScheduleEntry {
id: string;
@ -40,6 +42,9 @@ export interface AppSettings {
megaPassword: string;
bestToken: string;
allDebridToken: string;
ddownloadLogin: string;
ddownloadPassword: string;
oneFichierApiKey: string;
archivePasswordList: string;
rememberToken: boolean;
providerPrimary: DebridProvider;
@ -65,6 +70,7 @@ export interface AppSettings {
reconnectWaitSeconds: number;
completedCleanupPolicy: FinishedCleanupPolicy;
maxParallel: number;
maxParallelExtract: number;
retryLimit: number;
speedLimitEnabled: boolean;
speedLimitKbps: number;
@ -79,6 +85,9 @@ export interface AppSettings {
confirmDeleteSelection: boolean;
totalDownloadedAllTime: number;
bandwidthSchedules: BandwidthScheduleEntry[];
columnOrder: string[];
extractCpuPriority: ExtractCpuPriority;
autoExtractWhenStopped: boolean;
}
export interface DownloadItem {
@ -100,6 +109,7 @@ export interface DownloadItem {
fullStatus: string;
createdAt: number;
updatedAt: number;
onlineStatus?: "online" | "offline" | "checking";
}
export interface PackageEntry {
@ -111,6 +121,8 @@ export interface PackageEntry {
itemIds: string[];
cancelled: boolean;
enabled: boolean;
priority: PackagePriority;
postProcessLabel?: string;
createdAt: number;
updatedAt: number;
}
@ -211,6 +223,7 @@ export interface UpdateCheckResult {
setupAssetUrl?: string;
setupAssetName?: string;
setupAssetDigest?: string;
releaseNotes?: string;
error?: string;
}
@ -255,3 +268,22 @@ export interface SessionStats {
activeDownloads: number;
queuedDownloads: number;
}
export interface HistoryEntry {
id: string;
name: string;
totalBytes: number;
downloadedBytes: number;
fileCount: number;
provider: DebridProvider | null;
completedAt: number;
durationSeconds: number;
status: "completed" | "deleted";
outputDir: string;
urls?: string[];
}
export interface HistoryState {
entries: HistoryEntry[];
maxEntries: number;
}

View File

@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/App";
import { reorderPackageOrderByDrop, sortPackageOrderByName } from "../src/renderer/package-order";
describe("reorderPackageOrderByDrop", () => {
it("moves adjacent package down by one on drop", () => {
@ -25,9 +25,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg3", "pkg1", "pkg2"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
false
);
@ -38,9 +38,9 @@ describe("sortPackageOrderByName", () => {
const sorted = sortPackageOrderByName(
["pkg1", "pkg2", "pkg3"],
{
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, createdAt: 0, updatedAt: 0 }
pkg1: { id: "pkg1", name: "Alpha", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg2: { id: "pkg2", name: "beta", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 },
pkg3: { id: "pkg3", name: "Gamma", outputDir: "", extractDir: "", status: "queued", itemIds: [], cancelled: false, enabled: true, priority: "normal", createdAt: 0, updatedAt: 0 }
},
true
);

View File

@ -62,6 +62,22 @@ describe("extractEpisodeToken", () => {
it("extracts from episode token at end of string", () => {
expect(extractEpisodeToken("show.s02e03")).toBe("S02E03");
});
it("extracts double episode token s01e01e02", () => {
expect(extractEpisodeToken("tvr-mammon-s01e01e02-720p")).toBe("S01E01E02");
});
it("extracts double episode with dot separators", () => {
expect(extractEpisodeToken("Show.S01E03E04.720p")).toBe("S01E03E04");
});
it("extracts double episode at end of string", () => {
expect(extractEpisodeToken("show.s02e05e06")).toBe("S02E05E06");
});
it("extracts double episode with single-digit numbers", () => {
expect(extractEpisodeToken("show-s1e1e2-720p")).toBe("S01E01E02");
});
});
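The new double-episode cases above (`s01e01e02`, `s1e1e2`) imply a token pattern with an optional second episode group and zero-padding of one-digit numbers. A hedged sketch of a regex satisfying these tests (the real `extractEpisodeToken` may differ, e.g. in how it bounds the match):

```typescript
// Assumed pattern: S<season>E<episode>[E<episode2>], case-insensitive,
// 1-2 digit seasons, 1-3 digit episodes, numbers padded to width 2.
const EPISODE_RE = /s(\d{1,2})e(\d{1,3})(?:e(\d{1,3}))?/i;

function extractEpisodeToken(name: string): string | null {
  const m = EPISODE_RE.exec(name);
  if (!m) return null;
  const pad = (n: string) => n.padStart(2, "0");
  let token = `S${pad(m[1])}E${pad(m[2])}`;
  if (m[3]) token += `E${pad(m[3])}`;  // optional double-episode suffix
  return token;
}

extractEpisodeToken("tvr-mammon-s01e01e02-720p");  // "S01E01E02"
extractEpisodeToken("show-s1e1e2-720p");           // "S01E01E02" (padded)
extractEpisodeToken("show.s02e03");                // "S02E03"
```

`padStart(2, "0")` leaves 3-digit episodes untouched, which is consistent with the `S99E999` expectation in the `buildAutoRenameBaseName` tests further down.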
describe("applyEpisodeTokenToFolderName", () => {
@ -96,6 +112,21 @@ describe("applyEpisodeTokenToFolderName", () => {
it("is case-insensitive for -4SF/-4SJ suffix", () => {
expect(applyEpisodeTokenToFolderName("Show.720p-4SF", "S01E01")).toBe("Show.720p.S01E01-4SF");
});
it("applies double episode token to season-only folder", () => {
expect(applyEpisodeTokenToFolderName("Mammon.S01.German.1080P.Bluray.x264-SMAHD", "S01E01E02"))
.toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("replaces existing double episode in folder with new token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01E02.720p-4sf", "S01E03E04"))
.toBe("Show.S01E03E04.720p-4sf");
});
it("replaces existing single episode in folder with double episode token", () => {
expect(applyEpisodeTokenToFolderName("Show.S01E01.720p-4sf", "S01E01E02"))
.toBe("Show.S01E01E02.720p-4sf");
});
});
describe("sourceHasRpToken", () => {
@ -238,6 +269,7 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S99.720p-4sf", "show.s99e999.720p.mkv");
// SCENE_EPISODE_RE allows up to 3-digit episodes and 2-digit seasons
expect(result).not.toBeNull();
expect(result!).toContain("S99E999");
});
// Real-world scene release patterns
@ -312,6 +344,7 @@ describe("buildAutoRenameBaseName", () => {
const result = buildAutoRenameBaseName("Show.S01-4sf", "show.s01e01.mkv");
// "mkv" should not be treated as part of the filename match
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
});
it("does not match episode-like patterns in codec strings", () => {
@ -342,6 +375,7 @@ describe("buildAutoRenameBaseName", () => {
// Extreme edge case - sanitizeFilename trims leading dots
expect(result).not.toBeNull();
expect(result!).toContain("S01E01");
expect(result!).toContain("-4sf");
expect(result!).not.toContain(".S01E01.S01E01"); // no duplication
});
@ -556,4 +590,105 @@ describe("buildAutoRenameBaseNameFromFolders", () => {
);
expect(result).toBe("Cheat.der.Betrug.S01E01.GERMAN.720p.WEB.h264-tmsf");
});
it("renames double episode file into season folder (Mammon style)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e01e02-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E01E02.German.1080P.Bluray.x264-SMAHD");
});
it("renames second double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e03e04-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E03E04.German.1080P.Bluray.x264-SMAHD");
});
it("renames third double episode file correctly", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mammon.S01.German.1080P.Bluray.x264-SMAHD"
],
"tvr-mammon-s01e05e06-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mammon.S01E05E06.German.1080P.Bluray.x264-SMAHD");
});
// Last-resort fallback: folder has season but no scene group suffix (user-renamed packages)
it("renames when folder has season but no scene group suffix (Mystery Road case)", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Mystery Road S02E05");
});
it("renames with season-only folder and custom name without dots", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Meine Serie S03"],
"meine-serie-s03e10-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Meine Serie S03E10");
});
it("prefers scene-group folder over season-only fallback", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
[
"Mystery Road S02",
"Mystery.Road.S02.GERMAN.DL.AC3.720p.HDTV.x264-hrs"
],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: true }
);
// Should use the scene-group folder (hrs), not the custom one
expect(result).toBe("Mystery.Road.S02E05.GERMAN.DL.AC3.720p.HDTV.x264-hrs");
});
it("does not use season-only fallback when forceEpisodeForSeasonFolder is false", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Mystery Road S02"],
"myst.road.de.dl.hdtv.7p-s02e05",
{ forceEpisodeForSeasonFolder: false }
);
expect(result).toBeNull();
});
it("renames Riviera S02 with single-digit episode s02e2", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Riviera.S02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP"],
"tvp-riviera-s02e2-720p",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Riviera.S02E02.GERMAN.DUBBED.DL.720p.WebHD.x264-TVP");
});
it("renames Room 104 abbreviated source r104.de.dl.web.7p-s04e02", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"r104.de.dl.web.7p-s04e02",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E02.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
it("renames Room 104 wayne source with episode", () => {
const result = buildAutoRenameBaseNameFromFoldersWithOptions(
["Room.104.S04.GERMAN.DL.720p.WEBRiP.x264-LAW"],
"room.104.s04e01.german.dl.720p.web.h264-wayne",
{ forceEpisodeForSeasonFolder: true }
);
expect(result).toBe("Room.104.S04E01.GERMAN.DL.720p.WEBRiP.x264-LAW");
});
});

View File

@ -25,15 +25,15 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(dir, "movie.mkv"))).toBe(true);
});
it("removes sample artifacts and link files", () => {
it("removes sample artifacts and link files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
fs.mkdirSync(path.join(dir, "Samples"), { recursive: true });
fs.writeFileSync(path.join(dir, "Samples", "demo-sample.mkv"), "x");
fs.writeFileSync(path.join(dir, "download_links.txt"), "https://example.com/a\n");
const links = removeDownloadLinkArtifacts(dir);
const samples = removeSampleArtifacts(dir);
const links = await removeDownloadLinkArtifacts(dir);
const samples = await removeSampleArtifacts(dir);
expect(links).toBeGreaterThan(0);
expect(samples.files + samples.dirs).toBeGreaterThan(0);
});
@ -66,7 +66,7 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(sub2, "subtitle.srt"))).toBe(true);
});
it("detects link artifacts by URL content in text files", () => {
it("detects link artifacts by URL content in text files", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
tempDirs.push(dir);
@ -81,7 +81,7 @@ describe("cleanup", () => {
// .dlc files should always be removed
fs.writeFileSync(path.join(dir, "container.dlc"), "encrypted-data");
const removed = removeDownloadLinkArtifacts(dir);
const removed = await removeDownloadLinkArtifacts(dir);
expect(removed).toBeGreaterThanOrEqual(3); // download_links.txt + bookmark.url + container.dlc
expect(fs.existsSync(path.join(dir, "download_links.txt"))).toBe(false);
expect(fs.existsSync(path.join(dir, "bookmark.url"))).toBe(false);
@ -90,7 +90,7 @@ describe("cleanup", () => {
expect(fs.existsSync(path.join(dir, "readme.txt"))).toBe(true);
});
it("does not recurse into sample symlink or junction targets", () => {
it("does not recurse into sample symlink or junction targets", async () => {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-"));
const external = fs.mkdtempSync(path.join(os.tmpdir(), "rd-clean-ext-"));
tempDirs.push(dir, external);
@ -102,7 +102,7 @@ describe("cleanup", () => {
const linkType: fs.symlink.Type = process.platform === "win32" ? "junction" : "dir";
fs.symlinkSync(external, linkedSampleDir, linkType);
const result = removeSampleArtifacts(dir);
const result = await removeSampleArtifacts(dir);
expect(result.files).toBe(0);
expect(fs.existsSync(outsideFile)).toBe(true);
});

View File

@ -317,7 +317,7 @@ describe("debrid service", () => {
const controller = new AbortController();
const abortTimer = setTimeout(() => {
controller.abort("test");
}, 25);
}, 200);
try {
await expect(service.unrestrictLink("https://rapidgator.net/file/abort-mega-web", controller.signal)).rejects.toThrow(/aborted/i);

View File

@ -177,7 +177,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "retry", links: ["https://dummy/retry"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -253,7 +253,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "same-name", links: ["https://dummy/first", "https://dummy/second"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const items = Object.values(manager.getSnapshot().session.items);
@ -465,7 +465,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "stall", links: ["https://dummy/stall"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -563,7 +563,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "connect-stall", links: ["https://dummy/connect-stall"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -666,7 +666,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "stall-connect", links: ["https://dummy/stall-connect"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -765,7 +765,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "watchdog-stall", links: ["https://dummy/watchdog-stall"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -880,7 +880,14 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "drain-stall", links: ["https://dummy/drain-stall"] }]);
const queuedSnapshot = manager.getSnapshot();
const packageId = queuedSnapshot.session.packageOrder[0] || "";
const itemId = queuedSnapshot.session.packages[packageId]?.itemIds[0] || "";
manager.start();
await waitFor(() => {
const status = manager.getSnapshot().session.items[itemId]?.fullStatus || "";
return status.includes("Warte auf Festplatte");
}, 12000);
await waitFor(() => !manager.getSnapshot().session.running, 40000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -962,7 +969,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "content-name", links: ["https://rapidgator.net/file/6f09df2984fe01378537c7cd8d7fa7ce"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -1092,7 +1099,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1225,7 +1232,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1369,7 +1376,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const item = manager.getSnapshot().session.items[itemId];
@ -1484,7 +1491,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 45000);
const item = manager.getSnapshot().session.items[itemId];
@ -1565,7 +1572,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "status-retry", links: ["https://dummy/status-retry"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -1785,7 +1792,7 @@ describe("download manager", () => {
expect(fs.existsSync(targetPath)).toBe(false);
});
it("detects start conflicts when extract output already exists", () => {
it("detects start conflicts when extract output already exists", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
@ -1844,7 +1851,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
const conflicts = manager.getStartConflicts();
const conflicts = await manager.getStartConflicts();
expect(conflicts.length).toBe(1);
expect(conflicts[0]?.packageId).toBe(packageId);
});
@ -2488,7 +2495,7 @@ describe("download manager", () => {
createStoragePaths(path.join(root, "state"))
);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 5000);
const snapshot = manager.getSnapshot();
@ -2720,7 +2727,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "new", links: ["https://dummy/new"] }]);
manager.start();
await manager.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const runningSnapshot = manager.getSnapshot();
@ -2798,7 +2805,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "fresh-retry", links: ["https://dummy/fresh"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const item = Object.values(manager.getSnapshot().session.items)[0];
@ -2883,7 +2890,7 @@ describe("download manager", () => {
expect(extractDir).toBeTruthy();
expect(fs.existsSync(extractDir)).toBe(false);
manager.start();
await manager.start();
await new Promise((resolve) => setTimeout(resolve, 140));
expect(fs.existsSync(extractDir)).toBe(false);
@ -2892,14 +2899,14 @@ describe("download manager", () => {
const snapshot = manager.getSnapshot();
const item = Object.values(snapshot.session.items)[0];
expect(item?.status).toBe("completed");
expect(item?.fullStatus).toBe("Entpackt");
expect(item?.fullStatus).toBe("Entpackt - Done");
expect(fs.existsSync(extractDir)).toBe(true);
expect(fs.existsSync(path.join(extractDir, "inside.txt"))).toBe(true);
} finally {
server.close();
await once(server, "close");
}
});
}, 35000);
it("keeps accurate summary when completed items are cleaned immediately", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
@ -2960,7 +2967,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "cleanup", links: ["https://dummy/cleanup"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const snapshot = manager.getSnapshot();
@ -3041,7 +3048,7 @@ describe("download manager", () => {
);
manager.addPackages([{ name: "cleanup-package", links: ["https://dummy/cleanup-package"] }]);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 30000);
const snapshot = manager.getSnapshot();
@ -3054,7 +3061,7 @@ describe("download manager", () => {
server.close();
await once(server, "close");
}
});
}, 35000);
it("counts queued package cancellations in run summary", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
@ -3206,7 +3213,7 @@ describe("download manager", () => {
const disabledItemId = initial.session.packages[disabledPkgId]?.itemIds[0] || "";
manager.togglePackage(disabledPkgId);
manager.start();
await manager.start();
await waitFor(() => !manager.getSnapshot().session.running, 25000);
const snapshot = manager.getSnapshot();
@ -3698,7 +3705,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(path.join(extractDir, "episode.txt")), 25000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
});
it("does not fail startup post-processing when source package dir is missing but extract output exists", async () => {
@ -3765,7 +3772,7 @@ describe("download manager", () => {
await waitFor(() => manager.getSnapshot().session.items[itemId]?.fullStatus.startsWith("Entpackt"), 12000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
});
it("marks missing source package dir as extracted instead of failed", async () => {
@ -4005,7 +4012,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(expectedPath), 12000);
const snapshot = manager.getSnapshot();
expect(snapshot.session.packages[packageId]?.status).toBe("completed");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(snapshot.session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(fs.existsSync(expectedPath)).toBe(true);
expect(fs.existsSync(originalExtractedPath)).toBe(false);
});
@ -4128,7 +4135,7 @@ describe("download manager", () => {
await waitFor(() => fs.existsSync(flattenedPath), 12000);
expect(manager.getSnapshot().session.packages[packageId]?.status).toBe("completed");
expect(manager.getSnapshot().session.items[itemId]?.fullStatus).toBe("Entpackt");
expect(manager.getSnapshot().session.items[itemId]?.fullStatus).toBe("Entpackt - Done");
expect(fs.existsSync(flattenedPath)).toBe(true);
expect(fs.existsSync(originalExtractedPath)).toBe(false);
});
@ -4299,7 +4306,7 @@ describe("download manager", () => {
expect(internal.globalSpeedLimitNextAt).toBeGreaterThan(start);
});
it("resets speed window head when start finds no runnable items", () => {
it("resets speed window head when start finds no runnable items", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
@ -4323,12 +4330,69 @@ describe("download manager", () => {
internal.speedEventsHead = 5;
internal.speedBytesLastWindow = 999;
manager.start();
await manager.start();
expect(internal.speedEventsHead).toBe(0);
expect(internal.speedEvents.length).toBe(0);
expect(internal.speedBytesLastWindow).toBe(0);
});
it("does not trigger global stall abort while write-buffer is disk-blocked", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);
const previousGlobalWatchdog = process.env.RD_GLOBAL_STALL_TIMEOUT_MS;
process.env.RD_GLOBAL_STALL_TIMEOUT_MS = "2500";
try {
const manager = new DownloadManager(
{
...defaultSettings(),
token: "rd-token",
outputDir: path.join(root, "downloads"),
extractDir: path.join(root, "extract")
},
emptySession(),
createStoragePaths(path.join(root, "state"))
);
manager.addPackages([{ name: "disk-block-guard", links: ["https://dummy/disk-block-guard"] }]);
const snapshot = manager.getSnapshot();
const packageId = snapshot.session.packageOrder[0] || "";
const itemId = snapshot.session.packages[packageId]?.itemIds[0] || "";
const internal = manager as unknown as any;
internal.session.running = true;
internal.session.paused = false;
internal.session.reconnectUntil = 0;
internal.session.totalDownloadedBytes = 0;
internal.session.items[itemId].status = "downloading";
internal.lastGlobalProgressBytes = 0;
internal.lastGlobalProgressAt = Date.now() - 10000;
const abortController = new AbortController();
internal.activeTasks.set(itemId, {
itemId,
packageId,
abortController,
abortReason: "none",
resumable: true,
nonResumableCounted: false,
blockedOnDiskWrite: true,
blockedOnDiskSince: Date.now() - 5000
});
internal.runGlobalStallWatchdog(Date.now());
expect(abortController.signal.aborted).toBe(false);
expect(internal.lastGlobalProgressAt).toBeGreaterThan(Date.now() - 2000);
} finally {
if (previousGlobalWatchdog === undefined) {
delete process.env.RD_GLOBAL_STALL_TIMEOUT_MS;
} else {
process.env.RD_GLOBAL_STALL_TIMEOUT_MS = previousGlobalWatchdog;
}
}
});
it("cleans run tracking when start conflict is skipped", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dm-"));
tempDirs.push(root);

tests/extractor-jvm.test.ts (new file, 204 lines)
View File

@ -0,0 +1,204 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { spawnSync } from "node:child_process";
import AdmZip from "adm-zip";
import { afterEach, describe, expect, it } from "vitest";
import { extractPackageArchives } from "../src/main/extractor";
const tempDirs: string[] = [];
const originalBackend = process.env.RD_EXTRACT_BACKEND;
function hasJavaRuntime(): boolean {
const result = spawnSync("java", ["-version"], { stdio: "ignore" });
return result.status === 0;
}
function hasJvmExtractorRuntime(): boolean {
const root = path.join(process.cwd(), "resources", "extractor-jvm");
const classesMain = path.join(root, "classes", "com", "sucukdeluxe", "extractor", "JBindExtractorMain.class");
const requiredLibs = [
path.join(root, "lib", "sevenzipjbinding.jar"),
path.join(root, "lib", "sevenzipjbinding-all-platforms.jar"),
path.join(root, "lib", "zip4j.jar")
];
return fs.existsSync(classesMain) && requiredLibs.every((libPath) => fs.existsSync(libPath));
}
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalBackend;
}
});
describe.skipIf(!hasJavaRuntime() || !hasJvmExtractorRuntime())("extractor jvm backend", () => {
it("extracts zip archives through SevenZipJBinding backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zipPath = path.join(packageDir, "release.zip");
const zip = new AdmZip();
zip.addFile("episode.txt", Buffer.from("ok"));
zip.writeZip(zipPath);
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "episode.txt"))).toBe(true);
});
it("emits progress callbacks with archiveName and percent", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-progress-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create a ZIP with some content to trigger progress
const zipPath = path.join(packageDir, "progress-test.zip");
const zip = new AdmZip();
zip.addFile("file1.txt", Buffer.from("Hello World ".repeat(100)));
zip.addFile("file2.txt", Buffer.from("Another file ".repeat(100)));
zip.writeZip(zipPath);
const progressUpdates: Array<{
archiveName: string;
percent: number;
phase: string;
archivePercent?: number;
}> = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
progressUpdates.push({
archiveName: update.archiveName,
percent: update.percent,
phase: update.phase,
archivePercent: update.archivePercent,
});
},
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
// Should have at least preparing, extracting, and done phases
const phases = new Set(progressUpdates.map((u) => u.phase));
expect(phases.has("preparing")).toBe(true);
expect(phases.has("extracting")).toBe(true);
// Extracting phase should include the archive name
const extracting = progressUpdates.filter((u) => u.phase === "extracting" && u.archiveName === "progress-test.zip");
expect(extracting.length).toBeGreaterThan(0);
// Should end at 100%
const lastExtracting = extracting[extracting.length - 1];
expect(lastExtracting.archivePercent).toBe(100);
// Files should exist
expect(fs.existsSync(path.join(targetDir, "file1.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "file2.txt"))).toBe(true);
});
it("extracts multiple archives sequentially with progress for each", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-multi-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create two separate ZIP archives
const zip1 = new AdmZip();
zip1.addFile("episode01.txt", Buffer.from("ep1 content"));
zip1.writeZip(path.join(packageDir, "archive1.zip"));
const zip2 = new AdmZip();
zip2.addFile("episode02.txt", Buffer.from("ep2 content"));
zip2.writeZip(path.join(packageDir, "archive2.zip"));
const archiveNames = new Set<string>();
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
onProgress: (update) => {
if (update.phase === "extracting" && update.archiveName) {
archiveNames.add(update.archiveName);
}
},
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
// Both archive names should have appeared in progress
expect(archiveNames.has("archive1.zip")).toBe(true);
expect(archiveNames.has("archive2.zip")).toBe(true);
// Both files extracted
expect(fs.existsSync(path.join(targetDir, "episode01.txt"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "episode02.txt"))).toBe(true);
});
it("respects ask/skip conflict mode in jvm backend", async () => {
process.env.RD_EXTRACT_BACKEND = "jvm";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-jvm-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zipPath = path.join(packageDir, "conflict.zip");
const zip = new AdmZip();
zip.addFile("same.txt", Buffer.from("new"));
zip.writeZip(zipPath);
const existingPath = path.join(targetDir, "same.txt");
fs.writeFileSync(existingPath, "old", "utf8");
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "ask",
removeLinks: false,
removeSamples: false
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
expect(fs.readFileSync(existingPath, "utf8")).toBe("old");
});
});


@@ -2,15 +2,41 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import AdmZip from "adm-zip";
-import { afterEach, describe, expect, it } from "vitest";
-import { buildExternalExtractArgs, collectArchiveCleanupTargets, extractPackageArchives } from "../src/main/extractor";
+import { afterEach, beforeEach, describe, expect, it } from "vitest";
+import {
+buildExternalExtractArgs,
+collectArchiveCleanupTargets,
+extractPackageArchives,
+archiveFilenamePasswords,
+detectArchiveSignature,
+classifyExtractionError,
+findArchiveCandidates,
+} from "../src/main/extractor";
const tempDirs: string[] = [];
const originalExtractBackend = process.env.RD_EXTRACT_BACKEND;
const originalStatfs = fs.promises.statfs;
const originalZipEntryMemoryLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
beforeEach(() => {
process.env.RD_EXTRACT_BACKEND = "legacy";
});
afterEach(() => {
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
if (originalExtractBackend === undefined) {
delete process.env.RD_EXTRACT_BACKEND;
} else {
process.env.RD_EXTRACT_BACKEND = originalExtractBackend;
}
(fs.promises as any).statfs = originalStatfs;
if (originalZipEntryMemoryLimit === undefined) {
delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
} else {
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = originalZipEntryMemoryLimit;
}
});
describe("extractor", () => {
@@ -556,7 +582,6 @@ describe("extractor", () => {
});
it("keeps original ZIP size guard error when external fallback is unavailable", async () => {
-const previousLimit = process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = "8";
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
@@ -570,7 +595,6 @@ describe("extractor", () => {
zip.addFile("large.bin", Buffer.alloc(9 * 1024 * 1024, 7));
zip.writeZip(zipPath);
-try {
const result = await extractPackageArchives({
packageDir,
targetDir,
@@ -582,20 +606,9 @@ describe("extractor", () => {
expect(result.extracted).toBe(0);
expect(result.failed).toBe(1);
expect(String(result.lastError)).toMatch(/ZIP-Eintrag.*groß/i);
-} finally {
-if (previousLimit === undefined) {
-delete process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB;
-} else {
-process.env.RD_ZIP_ENTRY_MEMORY_LIMIT_MB = previousLimit;
-}
-}
});
-it("matches resume-state archive names case-insensitively on Windows", async () => {
-if (process.platform !== "win32") {
-return;
-}
+it.skipIf(process.platform !== "win32")("matches resume-state archive names case-insensitively on Windows", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-extract-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
@@ -618,4 +631,459 @@ describe("extractor", () => {
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
describe("disk space check", () => {
it("aborts extraction when disk space is insufficient", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
zip.writeZip(path.join(packageDir, "test.zip"));
(fs.promises as any).statfs = async () => ({ bfree: 1, bsize: 1 });
await expect(
extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
})
).rejects.toThrow(/Nicht genug Speicherplatz/);
});
it("proceeds when disk space is sufficient", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-diskspace-ok-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("test.txt", Buffer.alloc(1024, 0x41));
zip.writeZip(path.join(packageDir, "test.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
});
describe("nested extraction", () => {
it("extracts archives found inside extracted output", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const innerZip = new AdmZip();
innerZip.addFile("deep.txt", Buffer.from("deep content"));
const outerZip = new AdmZip();
outerZip.addFile("inner.zip", innerZip.toBuffer());
outerZip.writeZip(path.join(packageDir, "outer.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
expect(fs.existsSync(path.join(targetDir, "deep.txt"))).toBe(true);
});
it("does not extract blacklisted extensions like .iso", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-nested-bl-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
fs.mkdirSync(targetDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("disc.iso", Buffer.alloc(64, 0));
zip.addFile("readme.txt", Buffer.from("hello"));
zip.writeZip(path.join(packageDir, "package.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none" as any,
conflictMode: "overwrite" as any,
removeLinks: false,
removeSamples: false,
});
expect(result.extracted).toBe(1);
expect(fs.existsSync(path.join(targetDir, "disc.iso"))).toBe(true);
expect(fs.existsSync(path.join(targetDir, "readme.txt"))).toBe(true);
});
});
describe("archiveFilenamePasswords", () => {
it("extracts stem and spaced variant from archive name", () => {
const result = archiveFilenamePasswords("MyRelease.S01E01.rar");
expect(result).toContain("MyRelease.S01E01");
expect(result).toContain("MyRelease S01E01");
});
it("strips multipart rar suffix", () => {
const result = archiveFilenamePasswords("Show.S02E03.part01.rar");
expect(result).toContain("Show.S02E03");
expect(result).toContain("Show S02E03");
});
it("strips .zip.001 suffix", () => {
const result = archiveFilenamePasswords("Movie.2024.zip.001");
expect(result).toContain("Movie.2024");
});
it("strips .tar.gz suffix", () => {
const result = archiveFilenamePasswords("backup.tar.gz");
expect(result).toContain("backup");
});
it("returns empty array for empty input", () => {
expect(archiveFilenamePasswords("")).toEqual([]);
});
it("returns single entry when no dots/underscores", () => {
const result = archiveFilenamePasswords("simple.zip");
expect(result).toEqual(["simple"]);
});
it("replaces underscores with spaces", () => {
const result = archiveFilenamePasswords("my_archive_name.7z");
expect(result).toContain("my_archive_name");
expect(result).toContain("my archive name");
});
});
describe(".rev cleanup", () => {
it("collects .rev files for single RAR cleanup", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-rev-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const mainRar = path.join(packageDir, "show.rar");
const rev = path.join(packageDir, "show.rev");
const r00 = path.join(packageDir, "show.r00");
fs.writeFileSync(mainRar, "a", "utf8");
fs.writeFileSync(rev, "b", "utf8");
fs.writeFileSync(r00, "c", "utf8");
const targets = new Set(collectArchiveCleanupTargets(mainRar));
expect(targets.has(mainRar)).toBe(true);
expect(targets.has(rev)).toBe(true);
expect(targets.has(r00)).toBe(true);
});
it("collects .rev files for multipart RAR cleanup", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-rev-mp-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const part1 = path.join(packageDir, "show.part01.rar");
const part2 = path.join(packageDir, "show.part02.rar");
const rev = path.join(packageDir, "show.rev");
fs.writeFileSync(part1, "a", "utf8");
fs.writeFileSync(part2, "b", "utf8");
fs.writeFileSync(rev, "c", "utf8");
const targets = new Set(collectArchiveCleanupTargets(part1));
expect(targets.has(part1)).toBe(true);
expect(targets.has(part2)).toBe(true);
expect(targets.has(rev)).toBe(true);
});
});
describe("generic .001 split cleanup", () => {
it("collects all numbered parts for generic splits", () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-split-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
const p001 = path.join(packageDir, "movie.001");
const p002 = path.join(packageDir, "movie.002");
const p003 = path.join(packageDir, "movie.003");
const other = path.join(packageDir, "other.001");
fs.writeFileSync(p001, "a", "utf8");
fs.writeFileSync(p002, "b", "utf8");
fs.writeFileSync(p003, "c", "utf8");
fs.writeFileSync(other, "x", "utf8");
const targets = new Set(collectArchiveCleanupTargets(p001));
expect(targets.has(p001)).toBe(true);
expect(targets.has(p002)).toBe(true);
expect(targets.has(p003)).toBe(true);
expect(targets.has(other)).toBe(false);
});
});
describe("detectArchiveSignature", () => {
it("detects RAR signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.rar");
// RAR 4.x signature: 52 61 72 21 1A 07 00 (the RAR5 marker ends in 01 00)
fs.writeFileSync(filePath, Buffer.from("526172211a0700", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("rar");
});
it("detects ZIP signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.zip");
fs.writeFileSync(filePath, Buffer.from("504b030414000000", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("zip");
});
it("detects 7z signature", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.7z");
fs.writeFileSync(filePath, Buffer.from("377abcaf271c0004", "hex"));
const sig = await detectArchiveSignature(filePath);
expect(sig).toBe("7z");
});
it("returns null for non-archive files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-sig-"));
tempDirs.push(root);
const filePath = path.join(root, "test.txt");
fs.writeFileSync(filePath, "Hello World", "utf8");
const sig = await detectArchiveSignature(filePath);
expect(sig).toBeNull();
});
it("returns null for non-existent file", async () => {
const sig = await detectArchiveSignature("/nonexistent/file.rar");
expect(sig).toBeNull();
});
});
describe("findArchiveCandidates extended formats", () => {
it("finds .tar.gz files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-tar-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "backup.tar.gz"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "readme.txt"), "info", "utf8");
const candidates = await findArchiveCandidates(packageDir);
expect(candidates.map((c) => path.basename(c))).toContain("backup.tar.gz");
});
it("finds .tar.bz2 files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-tar-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "archive.tar.bz2"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
expect(candidates.map((c) => path.basename(c))).toContain("archive.tar.bz2");
});
it("finds generic .001 split files", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-split-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "movie.001"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "movie.002"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
const names = candidates.map((c) => path.basename(c));
expect(names).toContain("movie.001");
// .002 should NOT be in candidates (only .001 is the entry point)
expect(names).not.toContain("movie.002");
});
it("does not duplicate .zip.001 as generic split", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-dedup-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
fs.mkdirSync(packageDir, { recursive: true });
fs.writeFileSync(path.join(packageDir, "movie.zip.001"), "data", "utf8");
fs.writeFileSync(path.join(packageDir, "movie.zip.002"), "data", "utf8");
const candidates = await findArchiveCandidates(packageDir);
const names = candidates.map((c) => path.basename(c));
// .zip.001 should appear once from zipSplit detection, not duplicated by genericSplit
expect(names.filter((n) => n === "movie.zip.001")).toHaveLength(1);
});
});
describe("classifyExtractionError", () => {
it("classifies CRC errors", () => {
expect(classifyExtractionError("CRC failed for file.txt")).toBe("crc_error");
expect(classifyExtractionError("Checksum error in data")).toBe("crc_error");
});
it("classifies wrong password", () => {
expect(classifyExtractionError("Wrong password")).toBe("wrong_password");
expect(classifyExtractionError("Falsches Passwort")).toBe("wrong_password");
});
it("classifies missing parts", () => {
expect(classifyExtractionError("Missing volume: part2.rar")).toBe("missing_parts");
expect(classifyExtractionError("Unexpected end of archive")).toBe("missing_parts");
});
it("classifies unsupported format", () => {
expect(classifyExtractionError("kein RAR-Archiv")).toBe("unsupported_format");
expect(classifyExtractionError("UNSUPPORTEDMETHOD")).toBe("unsupported_format");
});
it("classifies disk full", () => {
expect(classifyExtractionError("Nicht genug Speicherplatz")).toBe("disk_full");
expect(classifyExtractionError("No space left on device")).toBe("disk_full");
});
it("classifies timeout", () => {
expect(classifyExtractionError("Entpacken Timeout nach 360s")).toBe("timeout");
});
it("classifies abort", () => {
expect(classifyExtractionError("aborted:extract")).toBe("aborted");
});
it("classifies no extractor", () => {
expect(classifyExtractionError("WinRAR/UnRAR nicht gefunden")).toBe("no_extractor");
});
it("returns unknown for unrecognized errors", () => {
expect(classifyExtractionError("something weird happened")).toBe("unknown");
});
});
describe("password discovery", () => {
it("extracts first archive serially before parallel pool when multiple passwords", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
// Create 3 zip archives
for (const name of ["ep01.zip", "ep02.zip", "ep03.zip"]) {
const zip = new AdmZip();
zip.addFile(`${name}.txt`, Buffer.from(name));
zip.writeZip(path.join(packageDir, name));
}
const seenOrder: string[] = [];
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 2,
passwordList: "pw1|pw2|pw3",
onProgress: (update) => {
if (update.phase !== "extracting" || !update.archiveName) return;
if (seenOrder[seenOrder.length - 1] !== update.archiveName) {
seenOrder.push(update.archiveName);
}
}
});
expect(result.extracted).toBe(3);
expect(result.failed).toBe(0);
// First archive should be ep01 (natural order, extracted serially for discovery)
expect(seenOrder[0]).toBe("ep01.zip");
});
it("skips discovery when only one password candidate", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-skip-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
for (const name of ["a.zip", "b.zip"]) {
const zip = new AdmZip();
zip.addFile(`${name}.txt`, Buffer.from(name));
zip.writeZip(path.join(packageDir, name));
}
// No passwordList → only empty string → length=1 → no discovery phase
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 4
});
expect(result.extracted).toBe(2);
expect(result.failed).toBe(0);
});
it("skips discovery when only one archive", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "rd-pwdisc-one-"));
tempDirs.push(root);
const packageDir = path.join(root, "pkg");
const targetDir = path.join(root, "out");
fs.mkdirSync(packageDir, { recursive: true });
const zip = new AdmZip();
zip.addFile("single.txt", Buffer.from("single"));
zip.writeZip(path.join(packageDir, "only.zip"));
const result = await extractPackageArchives({
packageDir,
targetDir,
cleanupMode: "none",
conflictMode: "overwrite",
removeLinks: false,
removeSamples: false,
maxParallel: 4,
passwordList: "pw1|pw2|pw3"
});
expect(result.extracted).toBe(1);
expect(result.failed).toBe(0);
});
});
});


@@ -166,7 +166,7 @@ describe("mega-web-fallback", () => {
const controller = new AbortController();
const timer = setTimeout(() => {
controller.abort("test");
-}, 30);
+}, 200);
try {
await expect(fallback.unrestrict("https://mega.debrid/link2", controller.signal)).rejects.toThrow(/aborted/i);


@@ -0,0 +1,188 @@
import { describe, expect, it } from "vitest";
import { resolveArchiveItemsFromList } from "../src/main/download-manager";
type MinimalItem = {
targetPath?: string;
fileName?: string;
[key: string]: unknown;
};
function makeItems(names: string[]): MinimalItem[] {
return names.map((name) => ({
targetPath: `C:\\Downloads\\Package\\${name}`,
fileName: name,
id: name,
status: "completed",
}));
}
describe("resolveArchiveItemsFromList", () => {
// ── Multipart RAR (.partN.rar) ──
it("matches multipart .part1.rar archives", () => {
const items = makeItems([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
"Other.rar",
]);
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(3);
expect(result.map((i: any) => i.fileName)).toEqual([
"Movie.part1.rar",
"Movie.part2.rar",
"Movie.part3.rar",
]);
});
it("matches multipart .part01.rar archives (zero-padded)", () => {
const items = makeItems([
"Film.part01.rar",
"Film.part02.rar",
"Film.part10.rar",
"Unrelated.zip",
]);
const result = resolveArchiveItemsFromList("Film.part01.rar", items as any);
expect(result).toHaveLength(3);
});
// ── Old-style RAR (.rar + .r00, .r01, etc.) ──
it("matches old-style .rar + .rNN volumes", () => {
const items = makeItems([
"Archive.rar",
"Archive.r00",
"Archive.r01",
"Archive.r02",
"Other.zip",
]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(4);
});
// ── Single RAR ──
it("matches a single .rar file", () => {
const items = makeItems(["SingleFile.rar", "Other.mkv"]);
const result = resolveArchiveItemsFromList("SingleFile.rar", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("SingleFile.rar");
});
// ── Split ZIP ──
it("matches split .zip.NNN files", () => {
const items = makeItems([
"Data.zip",
"Data.zip.001",
"Data.zip.002",
"Data.zip.003",
]);
const result = resolveArchiveItemsFromList("Data.zip.001", items as any);
expect(result).toHaveLength(4);
});
// ── Split 7z ──
it("matches split .7z.NNN files", () => {
const items = makeItems([
"Backup.7z.001",
"Backup.7z.002",
]);
const result = resolveArchiveItemsFromList("Backup.7z.001", items as any);
expect(result).toHaveLength(2);
});
// ── Generic .NNN splits ──
it("matches generic .NNN split files", () => {
const items = makeItems([
"video.001",
"video.002",
"video.003",
]);
const result = resolveArchiveItemsFromList("video.001", items as any);
expect(result).toHaveLength(3);
});
// ── Exact filename match ──
it("matches a single .zip by exact name", () => {
const items = makeItems(["myarchive.zip", "other.rar"]);
const result = resolveArchiveItemsFromList("myarchive.zip", items as any);
expect(result).toHaveLength(1);
expect((result[0] as any).fileName).toBe("myarchive.zip");
});
// ── Case insensitivity ──
it("matches case-insensitively", () => {
const items = makeItems([
"MOVIE.PART1.RAR",
"MOVIE.PART2.RAR",
]);
const result = resolveArchiveItemsFromList("movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Stem-based fallback ──
it("uses stem-based fallback when exact patterns fail", () => {
// Simulate a mismatch: the archive on disk is "Movie.part1.rar",
// but the item list only contains a differently named "Movie.rar"
const items = makeItems([
"Movie.rar",
]);
// The archive on disk is "Movie.part1.rar" but there's no item matching the
// .partN pattern. The stem "movie" should match "Movie.rar" via fallback.
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
// stem fallback: "movie" starts with "movie" and ends with .rar
expect(result).toHaveLength(1);
});
// ── Single item fallback ──
it("returns single archive item when no pattern matches", () => {
const items = makeItems(["totally-different-name.rar"]);
const result = resolveArchiveItemsFromList("Original.rar", items as any);
// Single item in list with archive extension → return it
expect(result).toHaveLength(1);
});
// ── Empty when no match ──
it("returns empty when items have no archive extensions", () => {
const items = makeItems(["video.mkv", "subtitle.srt"]);
const result = resolveArchiveItemsFromList("Archive.rar", items as any);
expect(result).toHaveLength(0);
});
// ── Items without targetPath ──
it("falls back to fileName when targetPath is missing", () => {
const items = [
{ fileName: "Movie.part1.rar", id: "1", status: "completed" },
{ fileName: "Movie.part2.rar", id: "2", status: "completed" },
];
const result = resolveArchiveItemsFromList("Movie.part1.rar", items as any);
expect(result).toHaveLength(2);
});
// ── Multiple archives, should not cross-match ──
it("does not cross-match different archive groups", () => {
const items = makeItems([
"Episode.S01E01.part1.rar",
"Episode.S01E01.part2.rar",
"Episode.S01E02.part1.rar",
"Episode.S01E02.part2.rar",
]);
const result1 = resolveArchiveItemsFromList("Episode.S01E01.part1.rar", items as any);
expect(result1).toHaveLength(2);
expect(result1.every((i: any) => i.fileName.includes("S01E01"))).toBe(true);
const result2 = resolveArchiveItemsFromList("Episode.S01E02.part1.rar", items as any);
expect(result2).toHaveLength(2);
expect(result2.every((i: any) => i.fileName.includes("S01E02"))).toBe(true);
});
});


@@ -153,7 +153,7 @@ async function main(): Promise<void> {
createStoragePaths(path.join(tempRoot, "state-pause"))
);
manager2.addPackages([{ name: "pause", links: ["https://dummy/slow"] }]);
-manager2.start();
+await manager2.start();
await new Promise((resolve) => setTimeout(resolve, 120));
const paused = manager2.togglePause();
assert(paused, "Pause konnte nicht aktiviert werden");

tests/session-log.test.ts Normal file

@@ -0,0 +1,163 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { initSessionLog, getSessionLogPath, shutdownSessionLog } from "../src/main/session-log";
import { setLogListener } from "../src/main/logger";
const tempDirs: string[] = [];
afterEach(() => {
// Ensure session log is shut down between tests
shutdownSessionLog();
// Ensure listener is cleared between tests
setLogListener(null);
for (const dir of tempDirs.splice(0)) {
fs.rmSync(dir, { recursive: true, force: true });
}
});
describe("session-log", () => {
it("initSessionLog creates directory and file", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath();
expect(logPath).not.toBeNull();
expect(fs.existsSync(logPath!)).toBe(true);
expect(fs.existsSync(path.join(baseDir, "session-logs"))).toBe(true);
expect(path.basename(logPath!)).toMatch(/^session_\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}\.txt$/);
const content = fs.readFileSync(logPath!, "utf8");
expect(content).toContain("=== Session gestartet:");
shutdownSessionLog();
});
it("logger listener writes to session log", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
// Simulate a log line via the listener
const { logger } = await import("../src/main/logger");
logger.info("Test-Nachricht für Session-Log");
// Wait for flush (200ms interval + margin)
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("Test-Nachricht für Session-Log");
shutdownSessionLog();
});
it("shutdownSessionLog writes closing line", () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
const content = fs.readFileSync(logPath, "utf8");
expect(content).toContain("=== Session beendet:");
});
it("shutdownSessionLog removes listener", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const logPath = getSessionLogPath()!;
shutdownSessionLog();
// Log after shutdown - should NOT appear in session log
const { logger } = await import("../src/main/logger");
logger.info("Nach-Shutdown-Nachricht");
await new Promise((resolve) => setTimeout(resolve, 500));
const content = fs.readFileSync(logPath, "utf8");
expect(content).not.toContain("Nach-Shutdown-Nachricht");
});
it("cleanupOldSessionLogs deletes old files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a fake old session log
const oldFile = path.join(logsDir, "session_2020-01-01_00-00-00.txt");
fs.writeFileSync(oldFile, "old session");
// Set mtime to 30 days ago
const oldTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
fs.utimesSync(oldFile, oldTime, oldTime);
// Create a recent file
const newFile = path.join(logsDir, "session_2099-01-01_00-00-00.txt");
fs.writeFileSync(newFile, "new session");
// initSessionLog triggers cleanup
initSessionLog(baseDir);
// Wait for async cleanup
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(oldFile)).toBe(false);
expect(fs.existsSync(newFile)).toBe(true);
shutdownSessionLog();
});
it("cleanupOldSessionLogs keeps recent files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
const logsDir = path.join(baseDir, "session-logs");
fs.mkdirSync(logsDir, { recursive: true });
// Create a file from 2 days ago (should be kept)
const recentFile = path.join(logsDir, "session_2025-12-01_00-00-00.txt");
fs.writeFileSync(recentFile, "recent session");
const recentTime = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000);
fs.utimesSync(recentFile, recentTime, recentTime);
initSessionLog(baseDir);
await new Promise((resolve) => setTimeout(resolve, 300));
expect(fs.existsSync(recentFile)).toBe(true);
shutdownSessionLog();
});
it("multiple sessions create different files", async () => {
const baseDir = fs.mkdtempSync(path.join(os.tmpdir(), "rd-slog-"));
tempDirs.push(baseDir);
initSessionLog(baseDir);
const path1 = getSessionLogPath();
shutdownSessionLog();
// Small delay to ensure different timestamp
await new Promise((resolve) => setTimeout(resolve, 1100));
initSessionLog(baseDir);
const path2 = getSessionLogPath();
shutdownSessionLog();
expect(path1).not.toBeNull();
expect(path2).not.toBeNull();
expect(path1).not.toBe(path2);
expect(fs.existsSync(path1!)).toBe(true);
expect(fs.existsSync(path2!)).toBe(true);
});
});


@@ -22,7 +22,7 @@ afterEach(() => {
describe("update", () => {
it("normalizes update repo input", () => {
-expect(normalizeUpdateRepo("")).toBe("Sucukdeluxe/real-debrid-downloader");
+expect(normalizeUpdateRepo("")).toBe("Administrator/real-debrid-downloader");
expect(normalizeUpdateRepo("owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://codeberg.org/owner/repo")).toBe("owner/repo");
expect(normalizeUpdateRepo("https://www.codeberg.org/owner/repo")).toBe("owner/repo");
@@ -31,14 +31,14 @@ describe("update", () => {
expect(normalizeUpdateRepo("git@codeberg.org:owner/repo.git")).toBe("owner/repo");
});
-it("uses normalized repo slug for Codeberg API requests", async () => {
+it("uses normalized repo slug for API requests", async () => {
let requestedUrl = "";
globalThis.fetch = (async (input: RequestInfo | URL): Promise<Response> => {
requestedUrl = typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
return new Response(
JSON.stringify({
tag_name: `v${APP_VERSION}`,
-html_url: "https://codeberg.org/owner/repo/releases/tag/v1.0.0",
+html_url: "https://git.24-music.de/owner/repo/releases/tag/v1.0.0",
assets: []
}),
{
@@ -48,8 +48,8 @@
);
}) as typeof fetch;
-const result = await checkGitHubUpdate("https://codeberg.org/owner/repo/releases");
-expect(requestedUrl).toBe("https://codeberg.org/api/v1/repos/owner/repo/releases/latest");
+const result = await checkGitHubUpdate("https://git.24-music.de/owner/repo/releases");
+expect(requestedUrl).toBe("https://git.24-music.de/api/v1/repos/owner/repo/releases/latest");
expect(result.currentVersion).toBe(APP_VERSION);
expect(result.latestVersion).toBe(APP_VERSION);
expect(result.updateAvailable).toBe(false);
@@ -484,14 +484,14 @@ describe("normalizeUpdateRepo extended", () => {
});
it("returns default for malformed inputs", () => {
-expect(normalizeUpdateRepo("just-one-part")).toBe("Sucukdeluxe/real-debrid-downloader");
-expect(normalizeUpdateRepo(" ")).toBe("Sucukdeluxe/real-debrid-downloader");
+expect(normalizeUpdateRepo("just-one-part")).toBe("Administrator/real-debrid-downloader");
+expect(normalizeUpdateRepo(" ")).toBe("Administrator/real-debrid-downloader");
});
it("rejects traversal-like owner or repo segments", () => {
-expect(normalizeUpdateRepo("../owner/repo")).toBe("Sucukdeluxe/real-debrid-downloader");
-expect(normalizeUpdateRepo("owner/../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
-expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Sucukdeluxe/real-debrid-downloader");
+expect(normalizeUpdateRepo("../owner/repo")).toBe("Administrator/real-debrid-downloader");
+expect(normalizeUpdateRepo("owner/../repo")).toBe("Administrator/real-debrid-downloader");
+expect(normalizeUpdateRepo("https://codeberg.org/owner/../../repo")).toBe("Administrator/real-debrid-downloader");
});
it("handles www prefix", () => {


@@ -12,5 +12,5 @@
"isolatedModules": true,
"types": ["node", "vite/client"]
},
-"include": ["src", "tests", "vite.config.ts"]
+"include": ["src", "tests", "vite.config.mts"]
}


@@ -1,58 +0,0 @@
import crypto from "node:crypto";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";
const localPath = "release/Real-Debrid-Downloader Setup 1.4.61.exe";
const remoteUrl = "https://codeberg.org/Sucukdeluxe/real-debrid-downloader/releases/download/v1.4.61/Real-Debrid-Downloader%20Setup%201.4.61.exe";
const tmpPath = path.join(os.tmpdir(), "rd-verify-1.4.61.exe");
// Local file info
const localSize = fs.statSync(localPath).size;
const localHash = crypto.createHash("sha512");
localHash.update(fs.readFileSync(localPath));
const localSha = localHash.digest("hex");
console.log("Local file size:", localSize);
console.log("Local SHA512:", localSha.substring(0, 40) + "...");
// Download from Codeberg
console.log("\nDownloading from Codeberg...");
const resp = await fetch(remoteUrl, { redirect: "follow" });
console.log("Status:", resp.status);
console.log("Content-Length:", resp.headers.get("content-length"));
const source = Readable.fromWeb(resp.body);
const target = fs.createWriteStream(tmpPath);
await pipeline(source, target);
const remoteSize = fs.statSync(tmpPath).size;
const remoteHash = crypto.createHash("sha512");
remoteHash.update(fs.readFileSync(tmpPath));
const remoteSha = remoteHash.digest("hex");
console.log("\nRemote file size:", remoteSize);
console.log("Remote SHA512:", remoteSha.substring(0, 40) + "...");
console.log("\nSize match:", localSize === remoteSize);
console.log("SHA512 match:", localSha === remoteSha);
if (localSha !== remoteSha) {
console.log("\n!!! FILE ON CODEBERG IS CORRUPTED !!!");
console.log("The upload to Codeberg damaged the file.");
// Find first difference
const localBuf = fs.readFileSync(localPath);
const remoteBuf = fs.readFileSync(tmpPath);
for (let i = 0; i < Math.min(localBuf.length, remoteBuf.length); i++) {
if (localBuf[i] !== remoteBuf[i]) {
console.log(`First byte difference at offset ${i}: local=0x${localBuf[i].toString(16)} remote=0x${remoteBuf[i].toString(16)}`);
break;
}
}
} else {
console.log("\n>>> File on Codeberg is identical to local file <<<");
console.log("The problem is on the user's server (network/proxy issue).");
}
fs.unlinkSync(tmpPath);