cariddi

Take a list of domains, crawl urls and scan for endpoints, secrets, api keys, file extensions, tokens and more...

Coded with :blue_heart: by edoardottt.

Preview · Install · Get Started · Examples · Contributing · License

Preview :bar_chart:

Installation :satellite:

Using Docker

docker build -t cariddi .
docker run cariddi -h
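
cariddi reads its target list from standard input, so when running it in Docker the container needs stdin attached; a minimal sketch, assuming the image built above and a local urls file:

cat urls | docker run -i cariddi -s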

Building from source

You need Go.

  • Linux

    • git clone https://github.com/edoardottt/cariddi.git
    • cd cariddi
    • go get
    • make linux (to install)
    • make unlinux (to uninstall)

    Or in one line: git clone https://github.com/edoardottt/cariddi.git; cd cariddi; go get; make linux

  • Windows (the executable works only inside the cariddi folder)

    • git clone https://github.com/edoardottt/cariddi.git
    • cd cariddi
    • go get
    • .\make.bat windows (to install)
    • .\make.bat unwindows (to uninstall)
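
Either way, a quick sanity check that the build worked (assuming cariddi ended up on your PATH, or running .\cariddi.exe from the cariddi folder on Windows):

cariddi -version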

Get Started :tada:

cariddi -h prints the help in the command line.

Usage of cariddi:
  -c int
        Concurrency level. (default 20)
  -cache
        Use the .cariddi_cache folder as cache.
  -d int
        Delay between a page crawled and another.
  -e    Hunt for juicy endpoints.
  -ef string
        Use an external file (txt, one per line) to use custom parameters for endpoints hunting.
  -examples
        Print the examples.
  -ext int
        Hunt for juicy file extensions. Integer from 1(juicy) to 7(not juicy).
  -h    Print the help.
  -i string
        Ignore the URL containing at least one of the elements of this array.
  -intensive
        Crawl searching for resources matching 2nd level domain.
  -it string
        Ignore the URL containing at least one of the lines of this file.
  -oh string
        Write the output into an HTML file.
  -ot string
        Write the output into a TXT file.
  -plain
        Print only the results.
  -proxy string
        Set a Proxy to be used (http and socks5 supported).
  -rua
        Use a random browser user agent on every request.
  -s    Hunt for secrets.
  -sf string
        Use an external file (txt, one per line) to use custom regexes for secrets hunting.
  -t int
        Set timeout for the requests. (default 10)
  -version
        Print the version.
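
The flags compose in a single run; a minimal sketch, assuming a urls file of targets (the output name target_name is arbitrary, as in the examples below):

cat urls | cariddi -s -e -ext 2 -d 1 -ot target_name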

Examples :bulb:

  • cariddi -version (Print the version)

  • cariddi -h (Print the help)

  • cariddi -examples (Print the examples)

  • cat urls | cariddi -s (Hunt for secrets)

  • cat urls | cariddi -d 2 (2 seconds between a page crawled and another)

  • cat urls | cariddi -c 200 (Set the concurrency level to 200)

  • cat urls | cariddi -e (Hunt for juicy endpoints)

  • cat urls | cariddi -plain (Print only the results)

  • cat urls | cariddi -ot target_name (Results in txt file)

  • cat urls | cariddi -oh target_name (Results in html file)

  • cat urls | cariddi -ext 2 (Hunt for juicy (level 2 out of 7) files)

  • cat urls | cariddi -e -ef endpoints_file (Hunt for custom endpoints; file format sketched after this list)

  • cat urls | cariddi -s -sf secrets_file (Hunt for custom secrets; file format sketched after this list)

  • cat urls | cariddi -i forum,blog,community,open (Ignore urls containing these words)

  • cat urls | cariddi -it ignore_file (Ignore urls containing at least one line of the input file; file format sketched after this list)

  • cat urls | cariddi -cache (Use the .cariddi_cache folder as cache)

  • cat urls | cariddi -t 5 (Set the timeout for the requests)

  • cat urls | cariddi -intensive (Crawl searching for any resource under 2nd level domain (*.target.com))

  • cat urls | cariddi -rua (Use a random browser user agent on every request)

  • cat urls | cariddi -proxy http://127.0.0.1:8080 (Set a Proxy to be used (http and socks5 supported))

  • For Windows:

    • use powershell.exe -Command "cat urls | .\cariddi.exe" inside the Command prompt
    • or just cat urls | cariddi.exe using PowerShell
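
The external files used by -ef, -sf and -it are plain text, one entry per line. Hypothetical contents, for illustration only:

endpoints_file (custom parameters to hunt for):

    admin
    debug
    redirect_url

secrets_file (custom regexes; the pattern below is an illustrative AWS-style key match, not one of cariddi's built-in rules):

    AKIA[0-9A-Z]{16}

ignore_file (urls containing any of these lines are skipped):

    forum
    blog
    community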

Contributing :hammer_and_wrench:

Just open an issue/pull request.
See also CONTRIBUTING and CODE OF CONDUCT files.

Help me build this!


To do:

  • Tests :joy:

  • Tor support

  • Cookie support

  • Proxy support

  • Ignore specific types of urls

  • Plain output (print only results)

  • HTML output

  • Output color

  • Endpoints (parameters) scan

  • Secrets scan

  • Extensions scan

  • TXT output

License :memo:

This repository is under GNU General Public License v3.0.
Visit edoardoottavianelli.it to contact me.

