mirror of
https://github.com/akvorado/akvorado.git
synced 2025-12-11 22:14:02 +01:00
clickhouse: add a new Clickhouse component to help setting Clickhouse
This commit is contained in:
@@ -10,6 +10,7 @@ configured through a different section:
- `snmp`: [SNMP poller](#snmp)
- `geoip`: [GeoIP database](#geoip)
- `kafka`: [Kafka broker](#kafka)
- `clickhouse`: [Clickhouse helper](#clickhouse)
- `core`: [Core](#core)

You can get the default configuration with `./akvorado --dump --check`.

@@ -154,6 +155,11 @@ kafka:
    cleanup.policy: delete
```

## Clickhouse

The Clickhouse component exposes some useful HTTP endpoints to
configure a Clickhouse database. It takes no configuration.

## Core

The core orchestrates the remaining components. It receives the flows

33
docs/integration.md
Normal file
@@ -0,0 +1,33 @@
# Integrations

*Akvorado* needs some integration with external components to be
useful. The most important one is Kafka, but it can also integrate
with Clickhouse and Grafana.

## Kafka

The Kafka component sends flows to Kafka. Its
[configuration](configuration.md#kafka) mostly needs a topic name and a
list of brokers. It is possible to let *Akvorado* manage the topic with
the appropriate settings (number of partitions, replication factor, and
additional configuration entries). If the topic already exists,
*Akvorado* won't update the number of partitions or the replication
factor, but the other settings will be updated.

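As an illustration, the topic-management behaviour described above could be sketched in the configuration file like this. This is a minimal, hypothetical fragment: apart from `cleanup.policy` (which appears in the configuration documentation), every key name and value below is an assumption, not taken from this document.

```yaml
# Hypothetical sketch: let Akvorado manage the "flows" topic.
# Key names other than cleanup.policy are assumptions.
kafka:
  topic: flows
  brokers:
    - kafka-1:9092
    - kafka-2:9092
  topic-configuration:
    num-partitions: 8         # not updated if the topic already exists
    replication-factor: 2     # not updated if the topic already exists
    config-entries:
      cleanup.policy: delete  # updated even on an existing topic
```

With such a configuration, an existing topic would keep its partition count and replication factor, while the additional configuration entries would still be applied.
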
## Clickhouse

Clickhouse can collect the data from Kafka. To help with its
configuration, *Akvorado* exposes a few HTTP endpoints:

- `/api/v0/clickhouse/flow.proto` contains the schema
- `/api/v0/clickhouse/init.sh` contains the schema in the form of a
  script to execute during initialization
- `/api/v0/clickhouse/protocols.csv` contains a CSV with the mapping
  between protocol numbers and names
- `/api/v0/clickhouse/asns.csv` contains a CSV with the mapping
  between AS numbers and organization names

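A provisioning script could consume these endpoints when bringing up a
Clickhouse instance. The following is a hedged sketch only: the
`akvorado:8080` address and the target path for the schema are
assumptions, not taken from this document.

```
# Hypothetical provisioning sketch; host, port and paths are assumptions.
# Fetch the protobuf schema used to decode flows from Kafka.
curl -sf -o /var/lib/clickhouse/format_schemas/flow.proto \
    http://akvorado:8080/api/v0/clickhouse/flow.proto

# Run the generated initialization script.
curl -sf http://akvorado:8080/api/v0/clickhouse/init.sh | sh

# Inspect the protocol number/name mapping.
curl -sf http://akvorado:8080/api/v0/clickhouse/protocols.csv | head
```
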
## Grafana

No integration is currently done for Grafana, except a reverse proxy
configured in the [web section](configuration.md#web).

@@ -27,12 +27,11 @@ with `--check` if you don't want *Akvorado* to start.

### Exposed HTTP endpoints

The embedded HTTP server contains the endpoints listed on the [home
page](/). The [`/api/v0/flows`](/api/v0/flows?limit=1) endpoint
continuously prints the flows sent to Kafka (using [ndjson]). It also
accepts a `limit` argument to stop after emitting the specified number
of flows. This endpoint should not be used for anything other than
debugging: it can skip some flows and, if there are several users,
flows will be dispatched between them.
page](index.md). The [`/api/v0/flows`](/api/v0/flows?limit=1) endpoint
continuously prints the flows sent to Kafka (using [ndjson]). It also
accepts a `limit` argument to stop after emitting the specified number
of flows. This endpoint should not be used for anything other than
debugging: it can skip some flows and, if there are several users,
flows will be dispatched between them.

[ndjson]: http://ndjson.org/