Commit 6af04db
committed
refactor(docs): rewrite the documentation with the new command system
1 parent 8c8a051 commit 6af04db

14 files changed

Lines changed: 156 additions & 235 deletions

File tree

docs/docs/api.md renamed to docs/docs/api.mdx

Lines changed: 5 additions & 2 deletions
@@ -1,6 +1,7 @@
 # API
 
-ServerRawler currently focuses on its core crawling functionality. A Web API is planned for future development to allow programmatic access to the collected server data.
+ServerRawler currently focuses on its core crawling functionality.
+A Web API is planned for future development to allow programmatic access to the collected server data.
 
 ## Planned Web API
 
@@ -12,4 +13,6 @@ The Web API will provide endpoints to:
 
 ## Stay Tuned
 
-Details regarding the API endpoints, authentication, and usage examples will be provided here once the Web API is implemented. Please check back for updates as the project progresses.
+Details regarding the API endpoints, authentication, and usage examples will be provided here once the Web API is implemented.
+Please check back for updates as the project progresses.
+See the [roadmap](https://github.com/users/Cyberdolfi/projects/2) for upcoming features and developments.

docs/docs/configuration/config.mdx

Lines changed: 55 additions & 0 deletions
@@ -0,0 +1,55 @@
---
title: config.toml
---

# Main Configuration

:::tip[Configuration System]
ServerRawler uses a folder where all configuration files are stored. The folder is named `config`.
:::

If you run ServerRawler for the first time, it will automatically create a `config.toml` file with default values in the root directory of your project.
It should look like this:

```toml title="config.toml" showLineNumbers
# ServerRawler configuration file
# Github: https://github.com/Cyberdolfi/ServerRawler
# Read the docs here: https://cyberdolfi.github.io/ServerRawler/docs/configuration/config

[crawler]
ips_per_iteration = 1000000
max_tasks = 0
runs = 0
time_between_runs = 0
default_ports = [25565]

[scanner]
max_tasks = 0
default_ports = [25565]

[network]
max_tasks = 2000
timeout = 3000
```

Simply fill in the correct values for your crawling and scanning needs and save the file.

### Crawler Configuration
- `ips_per_iteration` determines how many IP addresses the crawler will generate and process in each iteration.
- `max_tasks` sets the maximum number of concurrent tasks the crawler will execute. Setting it to 0 will use the value from the `[network]` section.
- `runs` specifies how many iterations the crawler will perform. Setting it to 0 will run the crawler indefinitely.
- `time_between_runs` defines the cooldown time in seconds between each crawling iteration. Setting it to 0 will disable the cooldown.
- `default_ports` is a list of Minecraft server ports that the crawler will scan by default.
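To make the run semantics concrete, here is a hypothetical sketch of how `runs` and `time_between_runs` interact, based only on the descriptions above (the function name is illustrative, not ServerRawler's actual code):

```python
# Hypothetical illustration of the documented `runs` / `time_between_runs`
# semantics; ServerRawler's real implementation may differ.
import itertools
import time

def crawl_iterations(runs: int, time_between_runs: int):
    """Yield one index per crawling iteration, honoring the config semantics."""
    iterations = itertools.count() if runs == 0 else range(runs)  # 0 = run indefinitely
    for i in iterations:
        yield i  # here the crawler would generate and process `ips_per_iteration` IPs
        if time_between_runs > 0:  # 0 disables the cooldown
            time.sleep(time_between_runs)

# With runs = 3 and no cooldown, exactly three iterations are performed:
print(list(crawl_iterations(3, 0)))  # [0, 1, 2]
```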
### Scanner Configuration
- `max_tasks` sets the maximum number of concurrent tasks the scanner will execute. Setting it to 0 will use the value from the `[network]` section.
- `default_ports` is a list of Minecraft server ports that the scanner will scan by default.

### Network Configuration
- `max_tasks` defines the maximum number of concurrent network tasks (e.g., pings, queries). Recommended values are between 1000 and 5000, and the value must be between 10 and 20000.
- `timeout` sets the timeout in milliseconds for server connections. The value must be between 80 and 15000.

### Using a Custom Configuration File Path

By default, ServerRawler looks for the `config.toml` file in the root directory of your project.
If you want to use a custom path for your configuration file, you can specify it using the [`--config` argument](../usage/arguments/config.md).
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
---
title: database.toml
---

# Database Configuration

:::info[Prerequisites]
Make sure to have a working PostgreSQL database set up and running before configuring ServerRawler to connect to it.
If you haven't set up a PostgreSQL database yet, go to the [Database Setup Guide](./database-setup).
:::

:::tip[Configuration System]
ServerRawler uses a folder where all configuration files are stored. The folder is named `config`.
:::

## Config

If you run ServerRawler for the first time, it will automatically create a `database.toml` file with default values in the root directory of your project.
It should look like this:

```toml title="database.toml" showLineNumbers {8}
# ServerRawler configuration file
# Github: https://github.com/Cyberdolfi/ServerRawler
# Read the docs here: https://cyberdolfi.github.io/ServerRawler/docs/getting-started/configuration

host = "localhost"
port = 5432
user = "postgres" # Don't use superuser in production
password = "your_strong_password" # Change this to your actual password
database = "serverrawler"
```

Simply fill in the correct connection details for your PostgreSQL database and save the file.
If you are not using a password, use `password = ""`.

### Using a Custom Configuration File Path

By default, ServerRawler looks for the `database.toml` file in the root directory of your project.
If you want to use a custom path for your configuration file, you can specify it using the [`--config` argument](../usage/arguments/config.md).

### Weak Credentials Warning
ServerRawler will automatically check whether the `database.toml` file contains default or weak credentials and will warn you if it detects any.
Make sure to use strong and unique credentials for your database connection to ensure the security of your data.
Lines changed: 6 additions & 6 deletions
@@ -1,6 +1,7 @@
 # Contributing
 
-We welcome contributions to ServerRawler! Whether it's reporting bugs, suggesting features, or submitting code, your help is valuable.
+We welcome contributions to ServerRawler!
+Whether it's reporting bugs, suggesting features, or submitting code, your help is valuable.
 
 ## How to Contribute
 
@@ -12,13 +13,12 @@ git checkout -b feature/your-feature-name
 ```
 4. **Make Your Changes:** Implement your changes, following the existing code style and conventions.
 5. **Test Your Changes:** Ensure your changes work as expected and don't introduce new issues.
-6. **Commit Your Changes:** Write clear and concise commit messages. (Use https://gist.github.com/Zekfad/f51cb06ac76e2457f11c80ed705c95a3)
+6. **Commit Your Changes:** Write clear and concise commit messages. (Use the [conventional commit message style](https://gist.github.com/Zekfad/f51cb06ac76e2457f11c80ed705c95a3))
 7. **Push to Your Fork:** Push your branch to your GitHub fork.
-8. **Open a Pull Request:** Open a pull request to the `main` branch of the original ServerRawler repository. Provide a clear description of your changes and why they are needed.
+8. **Open a Pull Request:** Open a pull request to the original ServerRawler repository. Provide a clear description of your changes and why they are needed.
 
-## Code of Conduct
-
-Please note that this project is released with a Contributor Code of Conduct. By participating in this project, you agree to abide by its terms.
+## Status
+You can find the current status of contributions, pull requests, and open issues on the [GitHub project board](https://github.com/users/Cyberdolfi/projects/2).
 
 ## Reporting Bugs
 
File renamed without changes.
Lines changed: 15 additions & 160 deletions
@@ -1,169 +1,24 @@
 ---
-sidebar_position: 3
-title: Configuration
+title: Setup Configurations
 ---
 
-import Tabs from '@theme/Tabs';
-import TabItem from '@theme/TabItem';
-
-# Configuration
-
-ServerRawler relies on a `config.toml` file to manage all settings in a file.
-Create a file named `config.toml` in the root directory of your project (or [use a custom path](#using-a-custom-configuration-file-path)).
-
-## Database
-
-ServerRawler is designed to store collected data in a PostgreSQL database. You'll need to provide your PostgreSQL connection details.
-[Setup database](./database-setup)
-
-:::tip
-Ensure your PostgreSQL database is running and accessible from where you run ServerRawler.
+:::info[Configuration System]
+ServerRawler uses a folder where all configuration files are stored. The folder is named `config`.
 :::
 
+## Using a Custom Configuration File Path
 
-<Tabs queryString="configstyle">
-<TabItem value="stepbystep" label="Step by Step" default>
-## Step by Step Configuration
-Here is a breakdown of the sections of the `config.toml` file:
-
-### Database Section
-The `[database]` section contains the connection details for your PostgreSQL database. Fill in the `host`, `port`, `user`, `password`, and `database` fields with your database credentials.
-```toml showLineNumbers=5 title="config.toml"
-[database]
-# Database configuration
-
-host = "localhost"
-port = 5432
-user = "postgres" # Don't use superuser in production
-password = "your_strong_password"
-database = "serverrawler"
-```
-
-### Crawler Section
-The `[crawler]` section allows you to configure the crawling behavior, including how many IPs to process per iteration, the number of concurrent tasks, and the ports to scan.
-```toml showLineNumbers=14 title="config.toml"
-[crawler]
-# Configuration specific to the crawling operations.
-# Number of IP addresses to generate and process per crawling iteration.
-ips_per_iteration = 1000000
-
-# Maximum concurrent tasks the crawler will execute.
-# Use 0 to use the value max_tasks from [network].
-max_tasks = 0
-
-# Number of loops. Use 0 to run infinitly.
-runs = 0
-
-# Time in seconds to wait between each crawling iteration.
-# Use 0 to disable the cooldown (maybe is this comment here useless).
-time_between_runs = 0
-
-# List of Minecraft server ports to scan.
-ports = [25565]
-```
-
-### Scanner Section
-The `[scanner]` section configures the scanning operations, including concurrent tasks and default ports to scan.
-```toml showLineNumbers=34 title="config.toml"
-[scanner]
-# Configuration specific to the scanning operations.
-
-# Maximum concurrent tasks the scanner will execute.
-# Use 0 to use the value max_tasks from [network].
-max_tasks = 0
-
-# List of Minecraft server ports to scan.
-# The port in the IP list file will have the highst priority
-# (This is called default btw.)
-default_ports = [25565]
-```
-
-### Network Section
-The `[network]` section defines the network connection settings, such as maximum concurrent tasks and connection timeout.
-```toml showLineNumbers=46 title="config.toml"
-[network]
-# Network connection configuration
-
-# Maximum concurrent network tasks (e.g., pings, queries).
-# Recommended values: 1000 - 5000.
-# Value must be between 10 and 20000.
-max_tasks = 2000
-
-# Timeout in milliseconds for server connections.
-# Value must be between 80 and 15000
-timeout = 3000
-```
-</TabItem>
-<TabItem value="everything" label="Complete Example">
-## Complete Example
-
-Here is a complete example of a `config.toml` file with both the `[database]` and `[crawler]` sections configured:
-
-```toml showLineNumbers title="config.toml"
-# ServerRawler configuration file
-# Github: https://github.com/Cyberdolfi/ServerRawler
-# Read the docs here: https://cyberdolfi.github.io/ServerRawler/docs/getting-started/configuration
-
-[database]
-# Database configuration
-
-host = "localhost"
-port = 5432
-user = "postgres" # Don't use superuser in production
-password = "your_strong_password"
-database = "serverrawler"
-
-[crawler]
-# Configuration specific to the crawling operations.
-
-# Number of IP addresses to generate and process per crawling iteration.
-ips_per_iteration = 1000000
-
-# Maximum concurrent tasks the crawler will execute.
-# Use 0 to use the value max_tasks from [network].
-max_tasks = 0
-
-# Number of loops. Use 0 to run infinitly.
-runs = 0
-
-# Time in seconds to wait between each crawling iteration.
-# Use 0 to disable the cooldown (maybe is this comment here useless).
-time_between_runs = 0
-
-# List of Minecraft server ports to scan.
-ports = [25565]
-
-[scanner]
-# Configuration specific to the scanning operations.
-
-# Maximum concurrent tasks the scanner will execute.
-# Use 0 to use the value max_tasks from [network].
-max_tasks = 0
-
-# List of Minecraft server ports to scan.
-# The port in the IP list file will have the highst priority
-# (This is called default btw.)
-default_ports = [25565]
-
-[network]
-# Network connection configuration
-
-# Maximum concurrent network tasks (e.g., pings, queries).
-# Recommended values: 1000 - 5000.
-# Value must be between 10 and 20000.
-max_tasks = 2000
-
-# Timeout in milliseconds for server connections.
-# Value must be between 80 and 15000
-timeout = 3000
-```
-</TabItem>
-</Tabs>
+To instruct ServerRawler to use your custom `path/to/your/custom_config_folder` directory, specify its path using the [`--config` argument](../usage/arguments/config).
 
-## Using a Custom Configuration File path
+:::warning[Important]
+The path you specify with the `--config` argument is the directory where the `config` folder is located, not the path to the configuration files themselves.
+For example, if your configuration files are located in `path/to/your/custom_config_folder/config`, you should specify `--config path/to/your/custom_config_folder` as the argument.
+:::
 
-To instruct ServerRawler to use your custom `config.toml` file, specify its path using the [`--config`](../usage/arguments/config) command-line argument:
+## Configuration Files
+ServerRawler uses two configuration files at the moment:
+- `config/`
+  - `config.toml`: This file contains the main configuration settings for the crawler and scanner. [Details here](../configuration/config).
+  - `database.toml`: This file contains the database connection details for PostgreSQL. [Details here](../configuration/database).
 
-```bash
-./ServerRawler --config /path/to/your/custom_config.toml
-```
+Click on the links above to learn how to set up each configuration file correctly.
docs/docs/getting-started/database-setup.mdx

Lines changed: 2 additions & 3 deletions
@@ -1,5 +1,4 @@
 ---
-sidebar_position: 2
 title: Database Setup
 ---
 
@@ -66,5 +65,5 @@ GRANT ALL PRIVILEGES ON DATABASE "serverrawler_db" TO "serverrawler_usr";
 
 ## 4. Configure ServerRawler for Database Connection
 
-The next step is now, to edit the `config.toml` file.
-Use the [Configuration Guide](./configuration?configstyle=stepbystep#database-section) to set up your database connection details.
+The next step is to edit the `database.toml` file.
+Use the [Database Configuration Guide](../configuration/database.mdx) to set up your database connection details.

0 commit comments