Table of Contents
- About The Project
- Getting Started
- Docker
- Usage
- Roadmap
- Contributing
- License
- Preview
- Translation (Russian)
About The Project
I've seen multiple deep-web crawler projects on GitHub, but most of them did not meet my standards for OSINT on the deep web, so I decided to create my own deep-web OSINT tool.

This tool serves as a reminder that the best practices of OPSEC should always be followed on the deep web.

The author of this project is not responsible for any harm caused by the use of this tool.

Prying Deep crawls dark-net and clear-net platforms and extracts as much intelligence from them as possible.
Getting Started
Prerequisites
Before you can use our OSINT tool, please ensure you have the following dependencies installed:
- Docker (optional)
  - You can download and install Docker by following the official installation instructions for your operating system: Docker Installation Guide.
- Go (required)
- PostgreSQL (required if you don't use Docker)
  - Make sure the environment variables in your pryingdeep.yaml match the environment in docker-compose.yaml.
  - PostgreSQL Installation
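As an illustration of what "matching" means here, the database settings in both files might line up as below. The key names and values are assumptions for illustration (based on the standard Postgres Docker image variables), not the project's actual schema; check your own pryingdeep.yaml and docker-compose.yaml.

```yaml
# docker-compose.yaml (excerpt) -- hypothetical values
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_USER: pryingdeep
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: pryingdeep

# pryingdeep.yaml (excerpt) -- the database section must mirror the values above,
# otherwise the crawler will fail to connect to the container:
# db:
#   user: pryingdeep
#   password: changeme
#   name: pryingdeep
```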
Binary Installation
- Install the binary via:

      go install -v github.com/iudicium/pryingdeep/cmd/pryingdeep@latest

- Install the configuration files by running:

      pryingdeep install

- Adjust the values inside the config folder to your needs.
Manual Installation
- Clone the repo:

      git clone https://github.com/iudicium/pryingdeep.git

- Adjust the values in the .yaml configuration, either through flags or manually. The database, logger, and tor sections all require manual configuration. You will need to read the Colly Docs; also refer to the Config Explanation.
- Build the binary via:

      go build

  inside the cmd/pryingdeep directory, or

      go build cmd/pryingdeep/pryingdeep.go

  from the root directory (the binary will also be placed there).
Docker
To run pryingdeep inside a Docker container, use this command:

    docker-compose up
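For example, a typical workflow is to start the stack in the background and follow its logs. These are standard docker-compose flags; the service names and output depend on the project's docker-compose.yaml.

```sh
# Build images (if needed) and start all services in the background
docker-compose up -d --build

# Follow the logs of the running containers
docker-compose logs -f

# Stop and remove the containers when you are done
docker-compose down
```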
Config
Read more about each parameter here: config
Tor
Read more about building and running our tor container here:
Tor
Usage
Pryingdeep specializes in collecting information about dark-web and clear-net websites. The tool was specifically built to extract as much information as possible from a .onion website.

    Usage:
      pryingdeep [command]

    Available Commands:
      completion  Generate the autocompletion script for the specified shell
      crawl       Start the crawling process
      export      Export the collected data into a file
      help        Help about any command
      install     Installation of config files

    Flags:
      -c, --config string   Path to the .yaml configuration (default "pryingdeep.yaml")
      -h, --help            help for pryingdeep
      -z, --save-config     Save chosen options to your .yaml configuration
      -s, --silent          Disable logging and run silently
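As a sketch of how these pieces combine, a run using only the global flags documented above might look like this. Subcommand-specific flags for crawl and export are not shown here, since they are not listed in this help output; see the config documentation for those.

```sh
# Crawl using an alternate configuration file, with logging disabled
pryingdeep -c /path/to/pryingdeep.yaml -s crawl

# Afterwards, export the collected data into a file
pryingdeep export
```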
Roadmap
- Add a search command that does not require any onion links
- Acquire a Shodan API key for testing the favicon module
- Think of a way to acquire the IP address of the server
- Implement a scan command
- Implement file identification and search
- Find a way to suppress GORM unique-constraint duplicate errors, as they take up half the screen
Contributing
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
License
Distributed under the GPL-3.0 license. See LICENSE for more information.
Video Preview
Preview
Support
If you have found this repository useful and feel generous, you can donate some Monero (XMR) to the following address:
48bEkvkzP3W4SGKSJAkWx2V8s4axCKwpDFf7ZmwBawg5DBSq2imbcZVKNzMriukuPqjCyf2BSax1D3AktiUq5vWk1satWJt
Thank you!