v2024.11.06

This commit is contained in:
kittykat 2024-11-08 09:32:54 +00:00
parent 1f87d598a4
commit 808d64768b
118 changed files with 28958 additions and 0 deletions

@ -0,0 +1,54 @@
------ Basic Usage (Downloads) -----------------------
# Process urls and download files from urls.txt
./mad.sh urls.txt
# Process specific host urls in urls.txt
./mad.sh <hostcode> urls.txt
(ie ./mad.sh kraken urls.txt, ./mad.sh hex urls.txt)
# Launch 4 terminals to process specific host urls in urls.txt (fallback to allhosts)
./mad.sh multi auto 4 urls.txt
# Launch 1 terminal per host to process urls in urls.txt (fallback to allhosts)
./mad.sh multi auto urls.txt
# Show the status of urls in urls.txt
./mad.sh status urls.txt
# Reset any #RETRY# lines in urls.txt
./mad.sh reset urls.txt
------ Basic Usage (Uploads) -------------------------
# Display MAD UI Uploader (process files in ./uploads/ folder to selected hosts)
./mad.sh upload
# Use MAD file processing (batch like the downloader)
./mad.sh upload uploads.txt
# Show the status of uploads in uploads.txt
./mad.sh upload status uploads.txt
# Reset any #RETRY# lines in uploads.txt
./mad.sh upload reset uploads.txt
------ Informational Display -----------------------
# Display Host Modules and their internal description
./mad.sh hosts
# Display Plugins and their internal description
./mad.sh plugins
------ Other Arguments ------------------------------
Install curl_impersonate: Downloads the latest curl_impersonate binary from the GitHub repo
./mad.sh install_curl_impersonate
MAD Clipboard Monitor: Monitor clipboard for supported urls and add them to file (requires xclip -- apt install xclip)
./mad.sh clipmon urls.txt
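For reference, a hypothetical urls.txt combining the line formats described in this help and the changelog (#key=value variables, folder= moves, |filename.ext overrides, direct= links). All urls and names below are placeholders, not real files:

```
# Comment lines, blank lines, and non-http garbage lines are ignored
#pw=MyArchivePassword
folder=My Folder (batch 1)
http://krakenfiles.com/view/abcd123456/file.html
http://oshi.at/abcd/1abc.7z|NewFilename.001
direct=http://somehost.onion/abcD|filename.part1.rar
```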

@ -0,0 +1,480 @@
# Additions by kittykat
# Tail format (newest to oldest)
#
# ---------- Initial release with MAD Uploader functionality ----------
# 2024.09.30 - [up_firestorage] Add firestorage.jp as upload host
# 2024.09.29 - [free4e/up_free4e] Add free4e.com as download and upload host
# 2024.09.29 - [harrault/up_harrault] Add harrault.fr as download and upload host
# 2024.09.29 - [acid/up_acid] Add acid.fr as download and upload host
# 2024.09.29 - [mad] Fix duplicate rename with literal chars in url
# 2024.09.28 - [dataupload/up_dataupload] Add dataupload.net as download and upload host
# 2024.09.27 - [netlib/up_netlib] Add mhep.netlib.re as download and upload host
# 2024.09.27 - [filesquid/up_filesquid] Add filesquid.net as download and upload host
# 2024.09.27 - [soyjak/up_soyjak] Add soyjak.download as download and upload host
# 2024.09.27 - [linxx/up_linxx] Add linxx.net as download and upload host
# 2024.09.27 - [nantes/up_nantes] Add nantes.cloud as download and upload host
# 2024.09.27 - [depotkaz/up_depotkaz] Add depot.kaz.bzh as download and upload host
# 2024.09.27 - [anarchaserver/up_anarchaserver] Add transitional.anarchaserver.org as download and upload host
# 2024.09.26 - [AutoResetAndRetryDownloads] Add autoloop handling to Doze&Retry / Ticket Expiry
# 2024.09.26 - [bowfile] Add handling of 'File has been removed due to inactivity'
# 2024.09.26 - [dailyuploads] Fix parsing blank referer
# 2024.09.26 - [dosya] Improve cookie cleanup
# 2024.09.26 - [1fichier] Improve cookie cleanup for exit node process
# 2024.09.26 - [mad] Fix direct= onion addresses (revert back to http)
# 2024.09.26 - [mad] Add additional direct= filename cleaning
# 2024.09.26 - [SkipUrlsInDownloadsCompletedTxt] Fix detection of already completed "direct=" urls
# 2024.09.25 - [bowfile] Add bowfile as download host (finally)
# 2024.09.25 - [mad + hosts] Do not remove file lock on Skip if another term is actively downloading the file
# 2024.09.25 - [click] Add clickndownload.name and clicknupload.name domains
# 2024.09.25 - [mad] Add global $UrlsVars that can be accessed from any function / plugin (code beautfar)
# - Any #key=value line added to urls.txt is parsed into this variable and their current value
# is accessible as ${UrlsVars["$key"]} -- ie. ${UrlsVars[pw]}
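The #key=value mechanism can be sketched roughly as follows. This is an illustrative reimplementation, not MAD's actual code; the parse_urls_vars helper and the demo file are invented for the example:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: parse "#key=value" lines from a urls.txt-style file
# into a global UrlsVars associative array (not MAD's real implementation).
declare -A UrlsVars

parse_urls_vars() {
  local line
  while IFS= read -r line; do
    # Match lines like "#pw=secret": '#', a key, '=', then the value
    if [[ "$line" =~ ^#([A-Za-z_][A-Za-z0-9_]*)=(.*)$ ]]; then
      UrlsVars["${BASH_REMATCH[1]}"]="${BASH_REMATCH[2]}"
    fi
  done < "$1"
}

# Demo: a two-line file with one variable line
demo="$(mktemp)"
printf '%s\n' '#pw=secret123' 'http://example.com/file.7z' > "$demo"
parse_urls_vars "$demo"
echo "${UrlsVars[pw]}"
```

Plain url lines fall through the regex untouched, so only commented key=value lines populate the array.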
# 2024.09.25 - [mad] Fix ScriptDir ref when mad.sh is ran as a soft link (code beautfar)
# 2024.09.25 - [mad] Fix passing return code from hooked functions (thanks beautfar)
# 2024.09.25 - [uflix] Add server general error response handling
# 2024.09.25 - [ocr_captcha] Fix temp filename issue from decluttering / renaming
# 2024.09.24 - Update help, and documentation
# 2024.09.24 - Decluttered MAD folder structure and naming:
# (* READ the document on migration in the ./documentation folder *)
# (* REVIEW FolderStructure Pictures in documentation as well *)
# 2024.09.24 - [*all plugins / all hosts*] Updates to use the new decluttered folder structure and names
# 2024.09.24 - [SkipUrlsInDownloadsCompletedTxt] Add #REMOVED# to the Skip Url check
# 2024.09.24 - [up_gofile] Attempt to retrieve best upload server prior to file send
# 2024.09.23 - [mad] Add MAD Upload Reset (to reset #RETRY# lines in uploads.txt)
# * ./mad.sh upload reset uploads.txt
# 2024.09.23 - [kraken] Add cleanup of extra chars added to token
# 2024.09.23 - [filedot] Url encode user / pass in request
# 2024.09.23 - [mad] Complete MAD Upload Status (for uploads.txt)
# * ./mad.sh upload status uploads.txt
# 2024.09.23 - [mad] Fix trimming #pw= lines with special characters (beautfar)
# 2024.09.22 - [mad] Add extended upload argument (filepath) to process uploads in uploads.txt
# * ./mad.sh upload uploads.txt
# * This will process any line not starting with '#' and containing a '|'
# # Required format:
# * filename|HostCode (defaults in the ./uploads folder)
# * filepath|HostCode (uses file path passed in)
# * ie.
# MyArchive01.7z|oshi
# MyArchive01.7z|1f
# MyArchive01.7z|bow
# ! This functionality is quite new and likely I will find something I need to fix. Please
# report anything you encounter.
# 2024.09.22 - [*all upload hosts*] Updates to handle uploads.txt file processing
# 2024.09.22 - [mad] Add one more hookable function: PostFailRetryUpload()
# This is unused currently, but will be implemented in file processing in a future update
# 2024.09.22 - [mad] Modify plugin loading system: allow multiple plugins to hook the same "hookable" functions
# * Same hookable functions:
# OnLoad(), BeginProcessing(), PreProcessUrl(), PostSuccessfulDownload(), PostFailedDownload(),
# PostFailRetryDownload(), DoneProcessingAllUrls(), PostSuccessfulUpload(), PostFailedUpload()
# Summary of changes:
# * To hook a function, it must be named "HookName_" and be unique. Best practice is to use filename
# ie. OnLoad_MyFilename()
# * NOTE: To upgrade any current plugins you wrote to function this way, just add "_<yourfilename>"
# to the hooked function name in your plugin.
# * (Review ExampleMainHooks for more details)
# 2024.09.22 - [*all plugins*] Modified function names to use the new v2 hook mechanism, allowing for multiple
# hooks of the same function.
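A rough sketch of how such a v2 dispatch could work (the plugin names and the RunHooks helper are invented for illustration; MAD's actual loader may differ):

```shell
#!/usr/bin/env bash
# Illustrative sketch of multi-plugin hook dispatch: each plugin defines a
# uniquely suffixed function (HookName_<filename>), and the core fires every
# defined function whose name starts with "<HookName>_".

# Two hypothetical plugins both hooking OnLoad:
OnLoad_PluginA() { echo "PluginA: OnLoad"; }
OnLoad_PluginB() { echo "PluginB: OnLoad"; }

RunHooks() {
  local hook="$1" fn
  # compgen -A function lists all defined functions with the given prefix
  for fn in $(compgen -A function "${hook}_"); do
    "$fn" || return $?   # propagate a hook's failure return code
  done
}

RunHooks OnLoad
```

Because lookup is by name prefix, any number of plugins can hook the same function without colliding, which matches the uniqueness requirement described above.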
# 2024.09.21 - [mad] Sanitize printf in success/fail/retry/etc messaging [code: beautfar]
# 2024.09.21 - [mad] Add '#ref=' keyword to store links (like folder=) to "$CurrentRef" [code: beautfar]
# 2024.09.21 - [dbree] - Add dbree.me download host (zipcluster)
# 2024.09.21 - [up_dbree] - Add dbree.me upload host (zipcluster)
# 2024.09.21 - [nofile] - Add nofile.org download host (zipcluster)
# 2024.09.21 - [up_nofile] - Add nofile.org upload host (zipcluster)
# 2024.09.21 - [shareonline] - Add shareonline download host (zipcluster)
# 2024.09.21 - [up_shareonline] - Add shareonline upload host (zipcluster)
# 2024.09.21 - [up_yolobit] - Add yolobit upload host (zipcluster)
# 2024.09.20 - [yolobit] Add new host domain -- download files from yolobit.com (zipcluster)
# 2024.09.20 - [mad] Changed default UploadSpeedMin to 100 for uploads with RateMonitor (still catch stale uploads)
# 2024.09.20 - [lainsafe_onion] - Add lainsafe.kallist4mcluuxbjnr5p2asdlmdhaos3pcrvhk3fbzmiiiftwg6zncid.onion
# 2024.09.20 - [SkipOkUrlsInResultsTxt, SkipUrlsInCompletedTxt] - Add line verification prior to check
# 2024.09.20 - [nippy] Handle 302 response on download from some servers
# 2024.09.19 - [ocr_captcha] Create new plugin to perform OCR on images (primarily for dailyuploads)
# new image captcha system -- (WIP, accuracy maybe 25-35%, but it is all local)
# * Add "LoadPlugins=ocr_captcha.sh" to use
# * Dependencies: tesseract-ocr & imagemagick
# * (sudo apt-get install tesseract-ocr, sudo apt-get install imagemagick)
# 2024.09.19 - [dailyuploads] Fix dailyuploads captcha process -- was changed to an image captcha.
# (image captcha requires ocr_captcha plugin. not perfect -- maybe 25%-35% accuracy)
# 2024.09.18 - [mad] Add '[', ']' to literalize_string func
# 2024.09.18 - [up_uploadflix] Updated the response parsing.. working now
# 2024.09.17 - [dosya] Fix potential issue getting dosya filename
# 2024.09.17 - [mad] Fix LoopThroughFileUntilComplete=false not processing initially
# 2024.09.16 - Lots of pre-release updates & cleanup
# 2024.09.16 - Add bowfile as upload host
# 2024.09.16 - Add 3 new upload hosts -- dailyuploads, filehaus (down atm), uploadflix (down atm)
# 2024.09.16 - Add nippy upload (zipcluster: random, nippydrive, nippyshare, nippybox, nippyspace, nippyfile)
# 2024.09.16 - Add 3 new upload hosts -- hexload, gofile, dosya upload host
# 2024.09.16 - Add debug message to plugins to help locate issues if plugin has any errors
# 2024.09.16 - Add detection of failed uploads to hosts so other terminals will not attempt to upload
# 2024.09.16 - Add flock upload ticket detection and notification
# 2024.09.16 - Create 3 initial working upload hosts (1F, oshi, kraken). Also an example upload host.
# 2024.09.16 - Categorized the Options in the script and config into sections (minimized the config)
# 2024.09.16 - Added an Uploads section with 2 options in script and config
# - MaxUploadRetries (default=3) max tries to upload a file to each host
# - DefaultUploadHosts (default=1f,kraken,oshi)
# * This allows fast selection / entry of hosts at the prompt by typing 'd'
# 2024.09.15 - Add 3 new arguments / functionality to mad
# 1. ./mad.sh hosts -- displays all host modules and some internal details:
# (hostcode, nick, prefix, functions, and upload specific info, etc.)
# 2. ./mad.sh plugins -- displays all plugins and internal details:
# (hostcode, nick, prefix, functions, entrypoint)
# 3. ./mad.sh upload -- This begins the batch upload processing:
# * Batch uploads will pickup any supported filetypes in the ./uploads folder
# (.7z, .rar, .001 - .009)
# * Once an upload is successfully uploaded, the download link and info is displayed,
# and a ticket is created in the ./uploads folder with all the details as well.
# * On completion, or already uploaded, or fail/retry, or failure, all information is
# logged to the ./results_upload.txt file in shortform, and detailed information is
# written to the ./uploads/uploads_processed.txt file.
# * The ./uploads/uploads_processed.txt file is used to ensure files are not uploaded
# more than once to each selected host. To re-up, the file can be edited to remove lines,
# or simply deleted. Its main purpose is to function until all files are uploaded in
# that batch, and then the folder cleaned for the next round.
# --- @ Uploading has several safety measures in place:
# 1. Supported file extension checking
# 2. A 2-step batch begin process: (require user to type 'y' to proceed selecting hosts,
# and then also require the user to type in the hostcodes to upload to).
# 3. Prior to the prompts, all files to be uploaded are displayed on the screen with details
# 4. Prior to hostcode input, all available upload hostcodes and hostnicks are displayed.
# 5. All the other MAD features inherent in initialization
# ** That said, be certain you take your own safety measures uploading:
# - Remove metadata from images, password protect your archives, etc.
# 2024.09.15 - Updates to the SkipUrlsInCompletedTxt.sh plugin to be more robust
# 2024.09.15 - Build out upload hosts templates "./hosts/up_<host>.sh"
# ('up_' prefix is reserved for upload host modules)
# 2024.09.15 - Add MAD Upload functionality
# 2024.09.15 - Add MAD Host Details (run ./mad.sh hostdetails)
# Displays host information queried from all available host modules (./hosts/)
# 2024.09.15 - Add MAD Plugin Details (run ./mad.sh plugindetails)
# Displays available plugins (./plugins/) and their hooked functions
#
# ---------- Initial release with MAD Hosts functionality ----------
# 2024.09.14 - Few small plugin updates (only functionality change is in SkipUrlsInCompletedTxt:
# include matching line number in output)
# 2024.09.14 - Port clipmon functionality to use dynamic hosts
# 2024.09.13 - Change running in UrlOnly mode (passing in a url to process), to allow a second argument
# for the filename override (ie. ./mad.sh http://oshi.at/abcd "my filename.7z")
# 2024.09.13 - Port arguments to process a specific host urls to use dynamic hosts
# 2024.09.13 - Port ./mad.sh help to use dynamic host data
# 2024.09.13 - Lots of testing.. lots of changes.. (all is passing so far.. though expect updates)
# 2024.09.13 - Add VerboseLoading option (display all hosts / plugin loads, or only FAIL statuses)
# 2024.09.13 - Added verification to hosts.sh file loading (check format, ensure unique HostFuncPrefix)
# 2024.09.13 - Created an example host with some descriptive help
# 2024.09.13 - Moved hosts functions into individual loadable host files
# (additional supported hosts can be added with the example template "./hosts/Examples/ExampleNewHost.host")
# 2024.09.13 - Created a host folder and LoadHosts() function to load *.host files into mad.sh
# 2024.09.13 - Initial port of all host data (HostCode, HostNick, HostDomainRegex) into a modular string
# 2024.09.13 - [Major Update]: Host processing and code (modularized, moved into loadable hosts)
# - Created ListHostAndDomainRegexes object to allow modularization:
# - Allow loading hosts (and creating additional hosts) similarly to plugins
# - Merge ~4000 lines of host url checks and processing to make script more maintainable
# 2024.09.13 - Add detection of duplicate hook usage (functions) during plugin load and disallow
# 2024.09.12 - Created a few working plugins and one example plugin with helpful information for builders
# ** Plugins have passed args available, as well as all mad.sh vars and functions available **
# - AutoResetAndRetryDownloads: Runs mad.sh reset after processing all urls and then relaunches MAD
# - CatnapCtrlC: Keeps mad.sh running until Ctrl-C, waiting for urls.txt updates
# - ExamplesMainHooks: Examples of the 7 main hooks
# - SkipOkUrlsInResultsTxt: Skips urls that already exist in results.txt with an #OK# flag
# - SkipUrlsInCompletedTxt: Better version of SkipOkUrlsInResultsTxt (uses new completed.txt)
# - UnzipAfterCompleted: (WIP) Unzips archives marked #OK# in urls.txt immediately after they are successfully
# downloaded and marked #OK# [this is not working yet]
# 2024.09.11 - Added completed.txt logging with more detailed info.
# (helpful for plugins such as unzipping and skip already downloaded urls as it contains filepath / date)
# 2024.09.11 - Worked with beautfar to build out ability to skip downloads already successfully downloaded
# in the results.txt (via SkipOkUrlsInResultsTxt.sh plugin).
# 2024.09.11 - Designed plugins framework in code: plugins folder, loading plugins, 5 main hooks (see readme)
# * The plugin system was designed to allow intermediate coders to implement workflows to fit their needs
#
# 2024.09.10 - Updates to nippy host processing (multi-domain, retries on unavailable response)
# 2024.09.10 - Add additional nippy hosts (nippybox.com, nippyfile.com, nippyspace.com)
# 2024.09.09 - Add retries to hexload head (ran into an issue where the cdn was not resolvable--likely gone)
# 2024.09.08 - Sanitize all vars written to urls.txt (prevent failures leaving a flock)
# 2024.09.07 - Add additional uflix responses
# 2024.09.06 - Add wait time response to hex and handling
# 2024.09.06 - Sanitize logging for unknown (html) errors with hexload
# 2024.09.05 - Update MinimumAllowedFilesize check for all hosts (1KB default)
# 2024.09.03 - Add new host up2sha.re
# 2024.09.03 - Replace strings dependency for bad html detection (code by beautfar)
# 2024.09.02 - Add nippyshare.me
# 2024.09.02 - Add handling of "download is temporarily unavailable" response from nippy
# 2024.09.01 - Fix MadStatus line #
# 2024.08.30 - Speed up MadStatus check / report
# 2024.08.30 - Add WorkDirOverride option to allow the working directory to be somewhere other than ScriptDir
# 2024.08.30 - Complete overhaul of ScriptDir / WorkDir to allow specifying locations
# 2024.08.30 - Converted hundreds of unary operations to be more robust
# 2024.08.30 - Merge redundant shared code for maintainability and to reduce script size (~3000 lines)
# 2024.08.30 - Moved random functions out from the script configurables
# ** If you are using mad.config, it will need to be updated (grab the new one and update or merge)
# 2024.08.29 - Add handling 522 response for kraken
# 2024.08.29 - Add fdot download-limit reached response detection and removing user for further sessions
# 2024.08.29 - Add additional status [FAIL] to allow for unavailable / no retry links
# 2024.08.29 - Add fdot response handling for premium users only files
# 2024.08.28 - Add the ability to pass in a URL to simply process it instead of urls.txt
# * ./mad.sh http://oshi.at/ZZZZ
# * ./mad.sh http://oshi.at/ZZZZ\|MyFileName.7z (override filename -- don't forget the cli escape '\|' )
# 2024.08.28 - Stringify all the rm commands for best practice (flocks, etc.)
# 2024.08.27 - Update which for curl_impersonate to look in ScriptDir
# 2024.08.26 - Updates to dailyuploads.net response parsing
# 2024.08.25 - Add option to specify terms to auto start in "multi auto"
# ./mad.sh multi auto # urls.txt
# 2024.08.24 - Add new host -- dailyuploads.net
# 2024.08.23 - Fix specific host processing completion (switch back to processing allhosts)
# 2024.08.23 - Update LaunchTerminal / ReloadScript args processing
# 2024.08.23 - clipmon: If specified urls.txt file does not exist, create it
# 2024.08.22 - Update curl_impersonate forks (cleanup / testing)
# 2024.08.22 - Add handling for multi-link download.gg urls (2 or more download files available on page)
# 2024.08.22 - Limit filehaus "no response" retries--server is likely down--mark Retry later
# 2024.08.21 - Modify catnapping message to not keep scrolling while waiting for downloads to finish
# 2024.08.21 - Make direct= download header retrieval and response check more robust
# 2024.08.21 - Revert the multi # urls.txt argument order (it was that, or change the documentation)
# * ./mad.sh multi # urls.txt
# * ./mad.sh multi host # urls.txt
# 2024.08.20 - Add several more 1F family domains
# * alterupload.com, cjoint.net, desfichiers.com, dfichiers.com, megadl.fr, mesfichiers.org,
# piecejointe.net, pjointe.com, dl4free.com
# 2024.08.20 - Fix script reload with multiple args
# 2024.08.20 - Fix Launch Terminal with multi # args
# 2024.08.20 - Dosya working again.. (up to 60 second delay on cdn server sending file)
# 2024.08.20 - Fix input file quick url count after initial argument parsing
# 2024.08.20 - Fix host parsing of args -- multi # host
# 2024.08.19 - Clean gofile filename
# 2024.08.19 - Fix download.gg post url for files with meta characters (ie. spaces)
# 2024.08.18 - Fix first line bash
# 2024.08.18 - Fix possible gofile cdn parsing issue
# 2024.08.18 - Updates to click file not found responses
# 2024.08.18 - Add clicknupload.site / clickndownload.site domain
# 2024.08.18 - Clean download.gg filename
# 2024.08.18 - Add download.gg Removed and Error responses
# 2024.08.17 - Fix flocks for active downloads with AutoRenameDuplicateFilenames=true
# (Only allow one download per unique url -- including dupes)
# 2024.08.16 - Add AutoRenameDuplicateFilenames option (default=false)
# For any download filename that is a duplicate, this will prepend the filename with a random string
# ie. MyFile.rar --> 20240801124552305_renamed_MyFile.rar
# ** NOTE: Enabling AutoRenameDuplicateFilenames will result in downloading every url regardless
# of whether it is a duplicate or not.
# Enabled:
# (+) No need to skip simultaneous downloads of same named files
# (+) Less concern for unique filename overrides or collisions
# (-) Cannot use the |fname.ext override to try multiple download urls in order.
# ie.
# http://hosturl1.com/abcd|myfile.rar
# http://hosturl2.com/abcd|myfile.rar
# http://hosturl3.com/abcd|myfile.rar
# -- instead, use comments and uncomment if necessary --
# http://hosturl1.com/abcd|myfile.rar
# # alt http://hosturl2.com/abcd|myfile.rar
# # alt http://hosturl3.com/abcd|myfile.rar
# Disabled: (normal / previous functionality)
# (+) Can use the |fname.ext override to try multiple download urls in order.
# (+) More control over downloads and the expected end result
# (-) More concern for unique filename overrides or collisions
# (-) Have to wait for duplicate filenames to finish downloading before starting the next.
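A minimal sketch of the rename behavior described above (the unique_download_name helper is an assumed name; MAD's real implementation may differ):

```shell
#!/usr/bin/env bash
# Illustrative sketch of AutoRenameDuplicateFilenames: if the target name
# already exists in the download dir, prepend a millisecond timestamp tag.
# Not MAD's actual code.
unique_download_name() {
  local dir="$1" fname="$2"
  if [[ -e "$dir/$fname" ]]; then
    # e.g. MyFile.rar --> 20240801124552305_renamed_MyFile.rar
    # (%3N = milliseconds; GNU date)
    echo "$(date +%Y%m%d%H%M%S%3N)_renamed_${fname}"
  else
    echo "$fname"
  fi
}

dldir="$(mktemp -d)"
touch "$dldir/MyFile.rar"
unique_download_name "$dldir" "MyFile.rar"   # gets a timestamp prefix
unique_download_name "$dldir" "Other.rar"    # no collision: Other.rar
```

The timestamp makes every colliding name unique, which is why enabling the option downloads every url regardless of duplication.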
# 2024.08.15 - Add tenvoi urls as download host (1F)
# 2024.08.14 - Add OshiBaseUrlOverride option to allow using the input url or forcing to oshi.at or oshi.onion
# 2024.08.12 - Fix file.flock check (needs to happen prior to downloads exist check)
# (This fixes issues where two downloads with the same filename occur and the second is marked failed/retry)
# 2024.08.11 - Fix gofile possible head filename parsing (new filename*=)
# 2024.08.10 - Add new host -- offshore.cat
# 2024.08.06 - Fix mad.config override for UseTorCurlImpersonate
# 2024.08.05 - Set curl_ff109 priority in which check
# 2024.08.03 - Let click resolve url domain for first 3 attempts, then fallback to .org
# 2024.08.02 - Add curl_impersonate menu choices lwthiker (orig), lexiforest (fork)
# 2024.07.30 - Add curl_impersonate lexiforest fork (more active, upgraded curl 8.7.1)
# 2024.07.28 - Fix possible 9saves fetch error
# 2024.07.26 - Fix upload.ee fileinfo request
# 2024.07.20 - Oshi file removed / no filename in header
# 2024.07.18 - Fix click dns resolution for alternate orgs that often fail
# 2024.07.16 - Fix for oshi and https cert error
# 2024.07.13 - Allow nippydrive downloads as well (ala nippyshare)
# 2024.07.12 - Fix click post for filenames with url metacharacters
# 2024.07.12 - Fix to not add blank lines at the end of processing list (from reloads)
# 2024.07.12 - Ensure url flock exists prior to download start
# 2024.07.06 - Sanitize clicknupload cdn url (fix for filenames with spaces and metacharacters)
# 2024.07.05 - Rework the reload function and terminal launcher
# 2024.07.05 - Fix bad partial detection logic / add "Too many connections"
# 2024.07.05 - Fix handling of an unexpected head query response for click and most other hosts
# 2024.07.04 - Add new host -- clicknupload / clickndownload
# 2024.07.03 - Remove nekofiles (host is gone)
# 2024.07.02 - Add new host -- gofile.io
# 2024.07.02 - Make reload script more dynamic
# 2024.07.01 - Add new host -- nippyshare.com
# 2024.06.30 - Add several direct hosts so urls can just be added
# -- Neko, lainsafe, FileDoge, Eternal, DiscreetShare
# 2024.06.28 - Add new host -- download.gg (works for good links, still needs file removed response detection)
# 2024.06.28 - Add new host -- firestorage (works for good links, still needs file removed response detection)
# 2024.06.27 - Fixes for ninesaves, biteblob, and other additions that were not tested long enough
# 2024.06.25 - Updates to filename handling with url-encoding
# 2024.06.25 - Updates to biteblob.com url handling and responses
# 2024.06.24 - Add new host -- biteblob.com
# 2024.06.24 - Add new host -- 9saves.com
# 2024.06.23 - Add clipboard monitoring option (rudimentary for now). Run in a separate terminal.
# * Dependencies: xclip (sudo apt install xclip)
# ie. ./mad.sh clipmon urls.txt
# 2024.06.22 - Add addtl pixeldrain and uploadhive file removal responses
# 2024.06.18 - Add a check to fix url flock for direct downloads with no head (fix for last update)
# 2024.06.18 - Update downloads folder to use script dir instead of pwd
# 2024.06.17 - Update to download curl_impersonate (retrieve version/date)
# 2024.06.16 - Add addtl uploadflix removed response
# 2024.06.16 - Update uploadhive removed file detection and head 500 server response
# 2024.06.16 - Fix detection of already completed if in downloads and size is equal
# 2024.06.15 - Updates to direct to handle no head response (api.discreetshare.com, and others)
# 2024.06.14 - Modify bad partial detection
# 2024.06.14 - Add debug logging to bad html check to show bad lines found
# 2024.06.14 - Updates to file downloads (generic and specific -- pd, direct)
# 2024.06.14 - Only use agent-specific header overrides if not using curl_impersonate (they are already handled)
# 2024.06.12 - Handle incorrect head response from pixeldrain
# 2024.06.12 - Make pixeldrain bypass an option (default false)
# 2024.06.11 - Make direct downloads more robust (perform no-resume downloads where no content-length is sent,
# ie. filedoge.com -- this fixes filedoge.com downloads using direct=http://api.filedoge.com/)
# 2024.06.08 - Add notification option to install curl_impersonate if option is set to true and it is not found
# 2024.06.08 - Add a option to download / extract curl_impersonate (using tor+curl) to the script
# 2024.06.08 - Fix youdbox removal detection when no response
# 2024.06.04 - Fix detect direct urls if no other url types exist in inputfile
# 2024.06.04 - Remove unnecessary filename parsing when a filename override is used
# 2024.06.03 - Better handling of hexload download2 cdns
# 2024.06.01 - Add additional file removal response checks for youdbox
# 2024.05.30 - Attempt to fix incorrect kraken urls, make fileid more robust
# 2024.05.28 - Add filedot.top (filedot.to)
# 2024.05.26 - Add "file was deleted because of being against Upload.ee rules" catch
# 2024.05.26 - Re-incorporate new pixeldrain viewpump functionality (use PhantomJSCloud)
# 2024.05.26 - Add retry/fail if filesize parsing fails
# 2024.05.25 - Add check for "Too many connections from your IP" to partial repairing
# 2024.05.20 - Small fix for html detecting in partials and repairing (trunc)
# 2024.05.19 - Fix filehaus head response check
# 2024.05.19 - Make filesize parser more robust for all hosts
# 2024.05.18 - Updated random user-agents (remove mobile/linux and use the top 10 -- 2024/05)
# 2024.05.18 - Changed head query of dosya to better handle response (a location with no filesize updates the head
# query with the new location)
# 2024.05.16 - Fix null error when running without curl_impersonate
# 2024.05.16 - Add optional loading of saved mad.config variables from mad.config file if it exists to allow
# upgrading without having to reconfigure all the settings.
# 2024.05.15 - Allow RateMonitor on kraken if not resuming (issues only occur if a partial exists and the cdn
# server connected to does not support byte resume correctly.. which tends to be about half the time).
# 2024.05.11 - Addition of "direct=" keyword in urls.txt to download from a direct link or cdn
# - If the direct url doesn't end in the filename, it is highly recommended to override it with |filename.ext
# - ie. direct=http://somehost.onion/abcD|filename.part1.rar
# 2024.05.11 - Disable RateMonitor for kraken (as not all servers support byte resume correctly)
# 2024.05.09 - Sanitize urls to handle potential non-acceptable chars
# 2024.05.09 - Fix possible dosya cdn issue
# 2024.05.08 - Fix to allow inputfile not being in the script directory
# 2024.05.08 - Fix detecting corrupt partial with html (and trunc logging)
# 2024.05.07 - Add OsType (used for launching terminals with "multi" argument)
# 2024.05.06 - Fdot settings check, format updates, etc.
# 2024.05.05 - Add pixeldrain ip / rate limited failure (captcha locked). View pump broke. Bypass still in testing
# 2024.05.03 - Fix possible 1F filesize check failure
# 2024.05.02 - Add kraken detection of cloudflare server issues (521: Web server down)
# 2024.05.01 - Add detection and repair of html corruption in bad partial downloads
# 2024.04.28 - Update pixeldrain (viewpump broke, use bypass)
# 2024.04.26 - Host fixes (upload.ee, hex, 1F, uhive)
# 2024.04.20 - Add youdbox.site as host
# 2024.04 - Add filedot.to as host (integration with user/pass login)
# 2024.04 - Add AutoRepairBadPartials (deprecated backup/restore)
# 2024.04 - Add download file retries (quick retries)
# 2024.04 - Add auto-switching to .top/.su domains for filehaus on excessive retries
# 2024.04 - Additional url hardening
# 2024.04 - Add uploadflix.cc / uploadflix.org
# 2024.04 - Add uploadhive.com
# 2024.04 - Add upload.ee
# 2024.04 - Add random user agent for usage
# 2024.04 - Add dosyaupload.com
# Detection of html pollution in downloads
# Updates to pixeldrain bypass
# Catch kraken "Oops" server alert
# Add "LoopThroughFileUntilComplete" option to continue processing urls.txt until it has no urls to process
# (retry skipped collisions / allow multiple host for a file)
# Add download / inprogress file downloading to handle collisions and to allow multi-host options for a file.
# ie.
# http://oshi.at/eeaa/12345.rar|My file!.rar
# http://krakenfiles.com/view/abcd123456/file.html|My file!.rar
# * First will lock and begin downloading, the second will skip it and move on, eventually coming back around to it.
# * When it comes back around, if it is completed, it will be marked #OK# My file!.rar (File exists)
# Add hexupload.net
# Make hosts unique, cleanup cookies and temp
# Add kraken downloading (kraken)
# Add ability to specify download filename by adding "|filename.ext" to the end of the url in file.
# ie. http://oshi.at/abcd/1abc.7z|NewFilename.001
# Add filehaus downloading (fh)
# Add mad.sh reset urls.txt
# - Reverts all "#RETRY#" commented lines back so it can be downloaded again.
# - Removes the _flocks folder to clear any stale tickets/locks.
# *deprecated* Add PdAutoBackupRestorePartials option that will backup / restore partial pixeldrain downloads to prevent bad api data
# leaking into the file. It also allows for resuming from a bad node, where if it is off, the download must restart.
# Add multi-terminal / single-host (1 per host) downloading
# (OS agnostic, run in X separate terminals)
# ./mad 1f urls.txt
# ./mad hex urls.txt
# ./mad pd urls.txt
# ./mad oshi urls.txt
# ./mad fh urls.txt
# (OS agnostic, run in X or more separate terminals)
# ./mad urls.txt
# ./mad urls.txt
# ./mad urls.txt
# ./mad urls.txt
# (OS dependent, X terminals for all hosts -- whonix tested)
# ./mad multi [2-8] urls.txt
# (OS dependent, X terminals for a specific host -- whonix tested)
# ./mad multi 1f 2 urls.txt
# ./mad multi hex 2 urls.txt
# ./mad multi pd 2 urls.txt
# ./mad multi oshi 2 urls.txt
# ./mad multi fh 2 urls.txt
# (OS dependent, 1 terminal per host -- whonix tested)
# ./mad multi auto urls.txt
# Add oshi downloading (oshi)
# Add pixeldrain downloading (pd)
# Add 1F french bytes conversion and potential incorrect download filesize detection (1Flove)
# Add hexload downloading functionality and integrated logging, moving, etc.
# Add multi-host downloading (1F, Hexload)
# More verbose logging on Retry/Fail reason
# Additional cdn debugging
# Add auto-commenting in urls.txt on completed / failed downloads
# Add check for completed download in the MoveToFolder (log and continue)
# Cleanup and debug additions
# Add verbose results logging
# Add optional minimum download size check
# Add retry attempts to acquire filename, filesize, and header
# Add retry on initial status attempts
# Add resume downloads (auto-resume by default)
# Incorporate multi-process download to find an empty slot faster (thanks 1flove devs)
# Add 1F Url validation
# Add STOP! keyword to allow ending after a specified download (perhaps drive limitations or another reason)
# Cleanup
# *deprecated* Update connection headers 2023.11
# folder="" --> Keeps downloads in initial downloads directory
# - ex. folder=Folder1 Name (desc) --> Creates a folder "Folder1 Name (desc)" and moves downloads there
# Add folder=<name> option to allow moving downloads into specified folders upon completion (blank to reset to downloads)
# *deprecated* Add option to keep partial downloads (move to *.partial folder) -- may use up space for large downloads
# *deprecated* Add multiple text recode options (some require apt install recode, or apt install html2text)
# *deprecated* Fixes for latin charset (UseRecode=html2iso8859)
# Add option to clear screen on filelist reload
# Add ability to auto-reload urls.txt if modifications are detected
# Allow reloading/restarting script (updated urls.txt) after finished processing current url "reload" or "restart" file exists
# Allow clearing the screen if "clear" file exists
# Allow aborting/stopping processing remaining urls if "stop" file exists
# Add allow comment lines (#), blank lines, and garbage lines (non-http starting)
# Add skipping when file has been deleted
# Add skipping when file was removed by owner or does not exist
# Add skipping of removed files from host
# Output download status into results.txt in script directory
# Display try # on which successfully retrieved a valid circuit
# Add fixing (autoconverting) http:// --> https://
# Added debug option (save html response in _debug folder)
# Added more output verbosity
# Added configurable failed download retries
# Increased connection retries / configurable connection timeout
# Fixes to output and code

1.) Read and understand the changes -- see ./documentation/FolderStructure-Default (AIO) or
./documentation/FolderStructure-WorkDirOverride to visualize the changes.
Summary:
--------
* Renamed _debug, _temp, _flocks to .debug, .temp, .flocks (so you can hide them if desired)
* Renamed 'downloads' folder (transferring files) to .inflight
* Renamed 'completed' folder to downloads (makes more sense, as there are now uploads and downloads, plus .inflight)
* Moved "plugins" and "hosts" folders to ScriptDir in code, so they must reside in the same folder as
mad.sh (along with curl_impersonate and mad.config, if you choose to use them)
* Added ./data folder in WorkDir (the downloads_completed, uploads_completed reside here). It is intended
to be more long-term data storage (things that plugins do or will use).
* Moved results.txt to ./downloads for downloads and ./uploads for uploads. This file is the verbose
processing of lines (OK, SKIP, FAIL, RETRY, NOCDN, PASSWORD, RENAME, etc). It is helpful to see an outline
of all the terminal processing.
* A summary of all the uploaded download links is created in ./uploads/results-links.txt (only successes)
* Upload completed tickets are put in the ./uploads/_tickets folder to keep it less cluttered.
* If uploads are done in manual mode (through the UI ./mad.sh upload), then a file named
./uploads/temp_upload_handler.txt is created and used to ensure multiple terminals do not reprocess
completed / failed upload files. If using uploads.txt, this file is not created and not necessary, as it is
handled directly in uploads.txt
2.) End any mad.sh processes
3.) Delete your old hosts & plugins folders (save off any plugins you are working on or wish to keep). If you
have plugins that reference any of the old completed.txt, results.txt, or results_uploads.txt, they will need
to be updated to the new locations: $WorkDir/data/downloads_completed.txt,
$WorkDir/downloads/results.txt, $WorkDir/uploads/results.txt
4.) Delete _debug, _temp, _flocks in your old folder
5.) Rename 'downloads' folder to '.inflight' (or delete if empty)
6.) Rename 'completed' folder to 'downloads'
7.) Copy all the files in this bundle and move to your MAD locations (overwrite anything there)
8.) Rename completed.txt to downloads_completed.txt and move to ./data
9.) If you have LoadPlugins="SkipUrlsInCompletedTxt.sh" in your mad.sh or mad.config, the name has changed to
SkipUrlsInDownloadsCompletedTxt.sh, so update it in the LoadPlugins="" line.
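The folder moves in the steps above can be sketched as follows (this demo runs against a throwaway directory; point WORKDIR at your real WorkDir instead, and back it up first):

```shell
# Demo of the rename/move steps against a throwaway directory.
# WORKDIR is a placeholder -- set it to your real MAD WorkDir.
WORKDIR=$(mktemp -d)
mkdir -p "$WORKDIR/_debug" "$WORKDIR/_temp" "$WORKDIR/_flocks" \
         "$WORKDIR/downloads" "$WORKDIR/completed"
touch "$WORKDIR/completed.txt"

cd "$WORKDIR"
rm -rf _debug _temp _flocks                     # delete old work folders
mv downloads .inflight                          # in-transfer files now live in .inflight
mv completed downloads                          # finished files now live in downloads
mkdir -p data
mv completed.txt data/downloads_completed.txt   # long-term stat data moves to ./data
ls -A
```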
Why?
* Less clutter everywhere (especially for those who use the WorkDirOverride)
- Scripts are in ScriptDir (mad.sh, mad.config, curl_impersonate, plugins, hosts)
- Downloads (complete) and all their result data is in downloads folder (except urls.txt, can be anywhere)
- Downloads can work in batches; once done, extracted, and moved, the contents can be purged
- Downloads that are being transferred / resuming are in the .inflight folder.
- Once a batch of downloads is complete, this can be purged. There shouldn't be anything in it unless
something is processing, or failed while processing and is waiting for resume.
- Uploads (files and all the result data / result-links.txt) are in the uploads folder (except uploads.txt)
- Uploads can work in batches; once done and the download links handled, this can be purged for the next batch
- Upload tickets are saved in ./uploads/_tickets
- All .folders can be hidden; they are also generally short-term (really for processing a batch)
- Once processing is complete (no uploads or downloads active), these can be purged
- New data folder is for longer-term storage of data such as completed uploads / downloads for plugins
* That's it for now..

documentation/!README.txt

:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:
: Multi-host Auto Downloader [aka MAD] (by kittykat) :
:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:
Setup:
------
1. Extract the archive to a folder of your choice.
2. Ensure the script is executable
# Open terminal in script folder (or open terminal and cd into script folder).
# Run "chmod +x mad.sh" to give the script execute permission.
3. Configure settings in mad.sh script that you desire.
* Optional config:
- Copy mad.config to script directory from the Documentation folder and configure settings there instead. This
file will load and override any settings configured in the script -- this allows upgrading mad.sh without
having to reconfigure the settings across versions.
* Optional curl_impersonate:
- See Optional Dependencies section below.
4. Add urls to urls.txt or any file you wish to use.
5. Run "./mad.sh urls.txt" (see usage section below for additional commands or run "./mad.sh ?" for help)
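Put together, the setup amounts to something like this (a self-contained sketch in a throwaway directory; the sample url is a placeholder):

```shell
# Run these from the folder you extracted the bundle into;
# a throwaway directory and an empty mad.sh stand in here.
cd "$(mktemp -d)"
touch mad.sh                                     # stands in for the extracted script
chmod +x mad.sh                                  # step 2: grant execute permission
echo 'https://oshi.at/abcd/file.7z' > urls.txt   # step 4: one url per line
test -x mad.sh && echo "ready: ./mad.sh urls.txt"
```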
Optional Dependencies:
----------------------
Some hosts use CloudFlare to detect and block scripts (such as hexload).
To get around it, this script needs to impersonate a browser.
You'll need to download "curl-impersonate".
It can be obtained on GitHub, search online for "curl-impersonate"
To access the releases on GitHub without javascript, do this:
1. Visit the GitHub page of curl-impersonate and add "/releases/latest/" at end of URL.
2. You'll be redirected to the latest version, e.g: "/releases/tag/vx.x.x"
3. In the URL replace "tag" with "expanded_assets", e.g. "/releases/expanded_assets/v0.5.4"
- Download archive "curl-impersonate-vX.Y.Z.x86_64-linux-gnu.tar.gz".
- Extract files "curl-impersonate-ff" and "curl_ff109" next to this script or into your PATH.
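As an illustration, fetching a release could look like the sketch below. The version tag and the repository path (lwthiker/curl-impersonate) are assumptions here -- verify both against the releases page found as described above.

```shell
# VER is an example tag; substitute the latest release you found.
VER=v0.5.4
TARBALL="curl-impersonate-${VER}.x86_64-linux-gnu.tar.gz"
URL="https://github.com/lwthiker/curl-impersonate/releases/download/${VER}/${TARBALL}"
echo "would fetch: $URL"
# Then, next to mad.sh:
#   curl -LO "$URL"
#   tar -xzf "$TARBALL" curl-impersonate-ff curl_ff109
```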
Usage (urls.txt):
-----------------
- ENTER 1f, hexload, pixeldrain, kraken, dosya, filehaus, oshi, upload.ee, uploadhive, or uploadflix urls
in urls.txt (one url per line).
- ! No trailing spaces BUT requires at least one empty newline at the end of file
! Accepts/ignores comment lines and garbage lines (lines beginning with '#' or non http)
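Because the file must end with a newline, here is a small sketch that appends one only when missing (the temp file stands in for your urls.txt):

```shell
f=$(mktemp)
printf 'https://oshi.at/abcd/file.7z' > "$f"   # sample url, no trailing newline
# Append a final newline only if the last byte is not already one
if [ -n "$(tail -c 1 "$f")" ]; then
  printf '\n' >> "$f"
fi
tail -c 1 "$f" | od -An -c                     # the last byte is now a newline
```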
Keywords (urls.txt):
--------------------
folder=
- Changes save to folder (Lines beginning with keyword "folder=")
ie. folder=This is my folder! (vids) [2024]
|filename.ext
- Overrides filename to save download as (add the suffix "|Filename.ext" to a url)
ie. http://oshi.at/abcd|My new filename.rar
direct=
- Will download directly from url (special processing for Lainsafe, FileDoge, NekoFile, DiscreetShare, and others).
ie. direct=https://oshi.at/abcd/ABCD
#pw=
- Will update the $CurrentZipPassword variable in mad.sh (can be accessed in plugins)
STOP!
- Stops processing at this line
RELOAD!
- Forces a reload of the script and urls.txt file with the same commandline it started with
Example:
-----------
folder=New folder 01
# pw: **1234567890$$
# ref: http://reference.source.url/here.html
https://1fichier.com/?123456789abcdefghijk
http://hexload.com/123456789abc
folder=New folder 02
# pw: 4444555551-1
http://5ety7tpkim5me6eszuwcje7bmy25pbtrjtue7zkqqgziljwqy3rrikqd.onion/ZaZa/12az.rar
http://oshi.at/AAzz/11ZZ.rar|File - Set 001 (2001).7z
http://oshi.at/AAyy/11YY.rar|File - Set 002 (2001).7z
http://pixeldrain.com/u/ZZaa0011
folder=Direct link fun
# pw= 2022234092
direct=http://pomf2.lain.la/f/abcd123456789.7z
direct=http://pomf2.lain.la/f/ABCD998877000.rar|This is it [2022].rar
------ Informational Display -------------------------
[Status] of urls in urls.txt
./mad.sh status urls.txt
[Reset] failed / retry urls in urls.txt
./mad.sh reset urls.txt
[Host] Modules and their internal description
./mad.sh hosts
[Plugins] and their internal description
./mad.sh plugins
------ Basic Usage (Uploads) -------------------------
[Upload] launch MAD Uploader (process files in ./uploads/ folder to selected hosts)
./mad.sh upload
------ Basic Usage (Downloads) -----------------------
[Run]
./mad.sh urls.txt
## Multi Runs: (multi-terminals / all-hosts / specific-host) ##
---------------------------------------------------------------
[Normal Mode] Process urls.txt in order with multiple terminals downloading
(OS agnostic, run in X or more separate terminals)
./mad urls.txt
./mad urls.txt
(OS dependent, X terminals for all hosts -- whonix tested)
./mad multi [2-8] urls.txt
[Specific Host] Process only X host in terminal
(OS agnostic, run in X separate terminals)
./mad 1f urls.txt
./mad hex urls.txt
./mad pd urls.txt
./mad kraken urls.txt
./mad dosya urls.txt
./mad fh urls.txt
./mad oshi urls.txt
./mad upee urls.txt
./mad uphive urls.txt
./mad upflix urls.txt
[**Multi Specific Host] Create X terminals for a specific host and process downloads in order
(**OS dependent, X terminals for a specific host -- whonix tested)
./mad multi 1f 2 urls.txt
./mad multi hex 2 urls.txt
./mad multi pd 2 urls.txt
./mad multi kraken 2 urls.txt
./mad multi dosya 2 urls.txt
./mad multi fh 2 urls.txt
./mad multi oshi 2 urls.txt
./mad multi upee 2 urls.txt
./mad multi uphive 2 urls.txt
./mad multi upflix 2 urls.txt
[**Multi Auto] Create 1 terminal for each host and process downloads in order
(**OS dependent, 1 terminal per host -- whonix tested)
./mad multi auto urls.txt
[**Multi Auto] Create 4 terminals to process specific-host urls in order (fallback to all hosts)
(**OS dependent -- whonix tested)
./mad multi auto 4 urls.txt

# Upload Hosts / HostCodes (by Retention, Max Size)
# -------------------------------------------------
# Long Retention -----------------------------------------------------------------------
Max Size . HostCode . Nickname . Notes
# ---------------------------------------------------------------------------------------
300GB 1f 1fichier.com 15d expiry free accounts
300GB fh filehaus.top (.su) ?? expiry
20GB rz ranoz.gg ?? expiry
10GB gofile gofile.io ?? expiry
10GB tmpme tempfile.me 3mo expiry (tend to ban 7z faster)
5GB uhive uploadhive
5GB uflix uploadflix.cc 7d inactive expiry
5GB oshi oshi.at (.onion) 1000 file hits
- 4GB bd bedrive.ru ?? expiry
- 4GB daily dailyuploads.net ?? expiry
- 2GB hex hexload.com 30d inactive expiry
2GB dosya dosyaupload.com 45d inactive expiry
2GB fs firestorage.jp 90d+ inactive expiry
* 2GB axfc axfc.net 90d+ inactive expiry
- 1GB kraken krakenfiles.com 90d inactive expiry
1GB ansh anonsharing.com 6mo expiry
300MB trbo turbo.onion ~40d expiry
250MB upev uploadev.org 90d inactive expiry
* 240MB ko kouploader.jp 5mo expiry (240MB max)
100MB bow bowfile.com 20d inactive expiry
100MB yolo yolobit ?? expiry
100MB nofile nofile.org ?? expiry
100MB so share-online.vg ?? expiry
100MB inno innocent.onion ?? expiry
# Short Retention ----------------------------------------------------------------------
Max Size . HostCode . Nickname . Notes
# ---------------------------------------------------------------------------------------
10GB nant fichiers.nantes.cloud ~1mo or less expiry, jirafrau
10GB anarc anarchaserver.org ~1mo or less expiry, jirafrau
10GB nlib netlib.re ~1mo or less expiry, jirafrau
* 10GB raja uploadraja.com 4d inactive expiry
5GB squid filesquid.net ~1mo or less expiry, jirafrau
4GB tmpsh temp.sh 3d expiry
1GB kaz depot.kaz.bzh ~1mo or less expiry, jirafrau
512MB linx linxx.net ~1mo or less expiry, jirafrau
500MB soy soyjak.download ~1mo or less expiry, jirafrau
195MB dup dataupload.net ?? expiry
100MB nippy nippy* ?? expiry, (file, share, box, drive, space)
100MB dbree dbree.me ?? expiry
?? harr files.harrault.fr ~1mo or less expiry, jirafrau
?? acid dl.acid.fr ~1mo or less expiry, no resume, jirafrau
?? fr4e sendfree4e.fr ~1wk or less expiry, jirafrau
Failing (-):
----------------
daily dailyuploads.net (MAD download failing -- JS required / Google Recaptcha)
kraken krakenfiles.com (MAD download failing -- JS required / Google Recaptcha)
hex hexload.com (MAD download failing -- JS required / Google Recaptcha)
bd bedrive.ru (MAD download failing -- JS required / Google Recaptcha)
NOTES (*):
----------------
raja uploadraja.com (MAD download not implemented)
ko kouploader.jp (MAD download not implemented)
axfc axfc.net (MAD download not implemented)