Aranea can be used as an additional OSINT tool in web application investigations: it crawls a web app's links and examines its JavaScript files for potentially useful data.
Clone the Repo:

git clone https://github.com/leddcode/Aranea

Install requirements:

pip3 install -r requirements.txt

usage: aranea.py [-h] (-u URL | -ul URLLIST) -m MODE [-t THREADS] [--headers HEADERS] [-s] [--mainonly]

optional arguments:
  -h, --help            show this help message and exit
  -u URL, --url URL     Target URL
  -ul URLLIST, --urllist URLLIST
                        Path to file containing list of URLs (one per line)
  -m MODE, --mode MODE  Available Modes: crawl (or c), analysis (or a)
  -t THREADS, --threads THREADS
                        Default configuration: 10 threads
  --headers HEADERS     Should be a string as in the example:
                        "Authorization:Bearer ey..,Cookie:role=admin;"
  -s, --strict          For analysis mode: the URL will be parsed even if it
                        does not have a JS extension.
  --mainonly            For analysis mode: only the main.js file will be parsed.
  -c, --continuous      For analysis mode: recursively parse found JS files.

Crawl mode crawls the target URL and discovers internal/external links. All results are stored in the scans directory.
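For intuition, here is a minimal sketch of what such a link-discovery pass can look like. It is an illustration only, not Aranea's implementation; the requests dependency and the discover_links helper are assumptions.

```python
# Illustration only: fetch one page and sort its links into internal vs.
# external, which is conceptually what a crawl pass does.
import re
from urllib.parse import urljoin, urlparse

import requests  # assumed dependency for this sketch


def discover_links(url):
    """Return (internal, external) link sets found on a single page."""
    html = requests.get(url, timeout=10).text
    host = urlparse(url).netloc
    internal, external = set(), set()
    # Naive href extraction; a real crawler would use a proper HTML parser.
    for href in re.findall(r'href=["\'](.+?)["\']', html):
        absolute = urljoin(url, href)
        if urlparse(absolute).netloc == host:
            internal.add(absolute)
        else:
            external.add(absolute)
    return internal, external


if __name__ == "__main__":
    internal, external = discover_links("https://example.com")
    print(f"{len(internal)} internal, {len(external)} external links")
```

A full crawl would keep following the internal links it finds and, as noted above, write the results to the scans directory.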
Analysis mode analyzes JavaScript files to extract:
- API endpoints and paths
- Configuration objects
- Authentication-related paths
- User-related endpoints
- Token-related data
- Other potentially sensitive information
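To make these categories concrete, the sketch below shows the kind of regex-based extraction that can surface such strings. The patterns and category names are assumptions for illustration, not Aranea's actual rules.

```python
# Illustrative patterns for pulling likely-interesting strings out of a JS
# file; the regexes and category names are assumptions, not Aranea's rules.
import re

PATTERNS = {
    "api_paths": re.compile(r'["\'](/api/[^"\']+)["\']'),
    "auth_paths": re.compile(r'["\']([^"\']*(?:login|logout|auth)[^"\']*)["\']', re.I),
    "tokens": re.compile(r'(?:token|api[_-]?key)["\']?\s*[:=]\s*["\']([^"\']+)["\']', re.I),
}


def analyze_js(source):
    """Group unique matches per category, dropping empty categories."""
    findings = {}
    for category, pattern in PATTERNS.items():
        matches = sorted(set(pattern.findall(source)))
        if matches:
            findings[category] = matches
    return findings


if __name__ == "__main__":
    sample = 'fetch("/api/v1/users"); const config = { token: "ey123" };'
    print(analyze_js(sample))  # {'api_paths': ['/api/v1/users'], 'tokens': ['ey123']}
```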
How it works:
- If you provide a direct .js URL, it analyzes that file immediately.
- If you provide a webpage URL, it discovers all JS files on the page.
- For each discovered JS file, you'll be prompted: Parse this file? y/N:
- Press Enter or type n to skip (default).
- Type y or yes to analyze the file.
- Use the --mainonly flag to filter only files containing "main" in their name.
- Use -c or --continuous to recursively discover and queue new JS files found within analyzed scripts.
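The flow above can be pictured with a short sketch. The discovery helper, the prompt text, and the placeholder analysis step are simplified assumptions for illustration, not Aranea's own code.

```python
# Sketch of the interactive selection flow: discover <script> sources, apply
# the "main" filter, then ask before analyzing each file.
import re
from urllib.parse import urljoin

import requests  # assumed dependency for this sketch


def find_js_urls(page_url):
    """Collect the src attributes of <script> tags (very simplified)."""
    html = requests.get(page_url, timeout=10).text
    srcs = re.findall(r'<script[^>]+src=["\'](.+?)["\']', html)
    return [urljoin(page_url, src) for src in srcs]


def select_and_analyze(page_url, main_only=False):
    js_urls = find_js_urls(page_url)
    if main_only:  # mimic --mainonly: keep files with "main" in their name
        js_urls = [u for u in js_urls if "main" in u.rsplit("/", 1)[-1]]
    for js_url in js_urls:
        answer = input(f"Parse {js_url}? y/N: ").strip().lower()
        if answer in ("y", "yes"):  # Enter or anything else skips (the default)
            print(f"[+] would analyze {js_url}")  # actual analysis omitted here
```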
When enabled with -c, Aranea becomes smarter:
- It parses selected JS files for references to other JS files.
- It automatically adds these new findings to the analysis queue.
- It handles relative path resolution (./app.js -> https://example.com/app.js).
- It tracks visited files to prevent infinite loops.
- It notifies you of how many new files were found and how many are left in the queue.
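Conceptually, this is a breadth-first walk over script references. The sketch below illustrates the queue, the relative-path resolution via urljoin, and the visited set; the reference regex and the reporting line are assumptions, not Aranea's code.

```python
# Sketch of the recursive idea behind -c: resolve relative references, skip
# files already seen, and keep a queue of scripts still to analyze.
import re
from collections import deque
from urllib.parse import urljoin

import requests  # assumed dependency for this sketch

JS_REF = re.compile(r'["\']([^"\']+\.js)["\']')  # naive "something.js" matcher


def continuous_analysis(start_url):
    queue = deque([start_url])
    visited = set()
    while queue:
        url = queue.popleft()
        if url in visited:  # the visited set prevents infinite loops
            continue
        visited.add(url)
        source = requests.get(url, timeout=10).text
        # ... the actual JS analysis of `source` would happen here ...
        refs = {urljoin(url, ref) for ref in JS_REF.findall(source)}  # ./app.js -> absolute URL
        fresh = refs - visited
        queue.extend(fresh)
        print(f"[{url}] {len(fresh)} new files found, {len(queue)} left in queue")
```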
Crawl a website with 100 threads (results stored in the scans directory):
python3 aranea.py -u https://example.com -m crawl -t 100

Analyze a webpage and interactively choose which JS files to parse:
python3 aranea.py -u https://example.com -m analysis

This will discover all JS files and prompt you for each one. Press Enter to skip or type y to analyze.
Filter to only main.js files before prompting:
python3 aranea.py -u https://example.com -m analysis --mainonly

Analyze a specific JavaScript file directly:
python3 aranea.py -u https://example.com/static/bundle.js -m analysis

Use the -s flag if the JS file doesn't have a .js extension:
python3 aranea.py -u https://example.com/script -m analysis -s

Include authentication or custom headers:
python3 aranea.py -u https://example.com -m analysis --headers "Authorization:Bearer token123,Cookie:session=abc"
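As a side note, the documented --headers format (comma-separated pairs, with a colon between name and value) could be turned into request headers roughly like this; this is a sketch of the format, not necessarily how Aranea parses it.

```python
# Sketch of parsing the documented --headers string format.
def parse_headers(raw):
    headers = {}
    for pair in raw.split(","):
        if ":" in pair:
            name, value = pair.split(":", 1)  # split on the first colon only,
            headers[name.strip()] = value.strip()  # since values may contain ':'
    return headers


print(parse_headers("Authorization:Bearer token123,Cookie:session=abc"))
# {'Authorization': 'Bearer token123', 'Cookie': 'session=abc'}
```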
Recursively find and analyze JS files referenced by other scripts:

python3 aranea.py -u https://example.com -m a -c

You can use a for analysis and c for crawl:
python3 aranea.py -u https://example.com -m c
python3 aranea.py -u https://example.com -m a

Process multiple URLs from a file (one URL per line, lines starting with # are ignored):
Create a file urls.txt:
https://example.com
https://another-site.com
https://third-site.com
# This is a comment and will be ignored
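For reference, a list in this format could be loaded with a few lines of Python; this is an illustrative sketch (skipping blank lines is an extra assumption), not Aranea's own loader.

```python
# Sketch of reading the URL list: one URL per line, '#' lines ignored.
def load_urls(path):
    urls = []
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.strip()
            if line and not line.startswith("#"):  # skip comments and blanks
                urls.append(line)
    return urls


print(load_urls("urls.txt"))
# ['https://example.com', 'https://another-site.com', 'https://third-site.com']
```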
Run analysis on all URLs:
python3 aranea.py -ul urls.txt -m analysis

Run crawl on all URLs:
python3 aranea.py -ul urls.txt -m crawl -t 50

To contribute:

- Fork it (https://github.com/leddcode/Aranea)
- Create your feature branch (git checkout -b feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin feature)
- Create a new Pull Request