352 Commits

Author SHA1 Message Date
Toni Uhlig
d629fda779 bump libnDPI to 75db1a8a66476b3c16cc1a8bf63ca2b0e2fba3ed
* incorporate upstream changes:
    - nDPI supports build directories now
    - set memory wrapper
    - classification states
    - process packet signature change

 * disabled fuzz-* test pcaps
    - causes timestamp diffs for some libpcap builds

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-11-18 09:54:15 +01:00
Toni Uhlig
643aa49d34 bump libnDPI to e9751cec26d80fe2d88706d4f7521a63ec12b3bb
* incorporate replacement of "TLS Susp ESNI Usage" with "Mismatching Protocol with server IP address"

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-10-29 13:51:07 +01:00
Toni Uhlig
8dfaa7c86c Fix CI
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-10-24 08:20:22 +02:00
Toni Uhlig
59caa5231e Dockerfile: build for ArchLinux as well
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-24 09:39:51 +02:00
Toni Uhlig
9c0f5141bc Fix "Potentially Dangerous" breed in c-notifyd
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-23 12:05:41 +02:00
Toni Uhlig
e8ef267e0a bump libnDPI to 560a4e4954e2db38d995d3cba2c1dcc4276f92d5
* fix some SonarCloud issues

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-17 10:37:51 +02:00
Toni Uhlig
2651833c58 CMake/CI: more robust against deprecations
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-16 11:19:02 +02:00
Toni Uhlig
bd7df393fe CI: ENABLE_CRYPTO for some builds
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-16 10:34:46 +02:00
Toni Uhlig
88cfecdf95 Remove CMake limitation
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 16:34:19 +02:00
Toni Uhlig
a91aab493c fixed spelling issue
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 15:09:18 +02:00
Toni Uhlig
fe42e998d0 fixed SonarCloud issues
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
22e44c1e0b removed crypto example
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
d8cad33a70 restored nio code
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
37989db0bb make TLS handshakes great again
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
19f80ba163 Added TLS ncrypt I/O
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
c8c58e0b16 nDPId crypto handshake done
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
6d3dc99fad Switch to OpenSSL for all crypto stuff
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
b8d3cf9e8f Added send packets with type i.e. keyex / json-data
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
510b03cbcd Added preps for different packet types + AAD (type+size)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
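The "(type, size)" framing with AAD mentioned above can be sketched as follows. This is a hypothetical Python illustration only; nDPId itself is written in C, and the concrete type values and header layout are assumptions not taken from this log. The idea is that an AEAD cipher would authenticate the plaintext header as Additional Authenticated Data, so tampering with the type or size fields is detected at decryption time.

```python
import struct

# Hypothetical packet types mirroring the commit message ("keyex" / "json-data");
# the real nDPId values are not shown in this log.
PKT_TYPE_KEYEX = 0x01
PKT_TYPE_JSON_DATA = 0x02

HEADER_FMT = "!BH"  # 1-byte type + 2-byte payload size, network byte order
HEADER_LEN = struct.calcsize(HEADER_FMT)


def frame_packet(pkt_type: int, payload: bytes) -> bytes:
    """Prepend a (type, size) header; an AEAD cipher would feed these
    header bytes in as AAD while encrypting only the payload."""
    return struct.pack(HEADER_FMT, pkt_type, len(payload)) + payload


def parse_packet(data: bytes) -> tuple[int, bytes]:
    """Split a framed packet back into (type, payload)."""
    pkt_type, size = struct.unpack_from(HEADER_FMT, data)
    payload = data[HEADER_LEN:HEADER_LEN + size]
    if len(payload) != size:
        raise ValueError("truncated packet")
    return pkt_type, payload
```

A receiver that sees an unknown type byte can skip the packet cleanly because the size field still tells it where the next frame begins.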
Toni Uhlig
66aca303b6 Added HKDF to uniformly distribute an X25519 shared key
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
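Why HKDF here: a raw X25519 shared secret is a curve point encoding, not a uniformly random byte string, so it should be run through a key derivation function before use as a symmetric key. A minimal stdlib sketch of HKDF (RFC 5869, HMAC-SHA256) — this is not nDPId's actual code, which uses OpenSSL per the commits above:

```python
import hashlib
import hmac


def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): PRK = HMAC-Hash(salt, IKM);
    # an empty salt defaults to HashLen zero bytes.
    if not salt:
        salt = b"\x00" * hashlib.sha256().digest_size
    return hmac.new(salt, ikm, hashlib.sha256).digest()


def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # HKDF-Expand (RFC 5869): T(i) = HMAC-Hash(PRK, T(i-1) | info | i)
    okm, t, i = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([i]), hashlib.sha256).digest()
        okm += t
        i += 1
    return okm[:length]


def derive_session_key(shared_secret: bytes, context: bytes, length: int = 32) -> bytes:
    """Turn a (non-uniform) X25519 shared secret into a uniformly
    distributed symmetric key; `context` binds the key to its use."""
    return hkdf_expand(hkdf_extract(b"", shared_secret), context, length)
```

The implementation can be checked against the RFC 5869 test vectors, which makes this kind of helper easy to validate in CI.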
Toni Uhlig
0e7e5216d8 Added preps for AAD/KeyEx
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
7ab7bb3772 Added some stats printing to c-decrypt
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
a47bc9caa3 Modified crypto to support multiple peers (multiple sender / multiple receiver) per ncrypt context
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:47 +02:00
Toni Uhlig
7d94632811 nDPId decryption example
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:46 +02:00
Toni Uhlig
2c81f116bf nDPId decryption example
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:46 +02:00
Toni Uhlig
49b058d2d3 Updated OpenWrt In-Source build patch
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:46 +02:00
Toni Uhlig
fea52d98ca Added nDPId decryption example
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:46 +02:00
Toni Uhlig
02b686241e initial nDPId UDP crypto [WiP!]
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 12:33:46 +02:00
Toni Uhlig
2cb0d7941b Improved/Updated Grafana Dashboard
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 10:22:17 +02:00
Toni Uhlig
97e60ad7ec Add security vuln reporting guide
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-15 08:07:48 +02:00
Toni Uhlig
eea5a49638 Fixed some example inconsistencies due to recent libnDPI / nDPId updates
* removed unused, unmaintained and erroneous py-flow-dashboard
 * adjusted Grafana dashboard flow breeds (flow categories will be done separately)
 * (C) update (a bit late)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-10 09:54:40 +02:00
Toni Uhlig
a9934e9c9e Removed nDPI/nDPId version/api serialization for nDPId-test to reduce result diffs
* fixed some SonarCloud complaints

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-09 12:52:58 +02:00
Toni Uhlig
644fa2dfb3 bump libnDPI to 1c1894720e3827857cfe1afd19bb7fb4618ee594
* fixes a build error with clang on ubuntu due to missing `static inline`s in header files

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-09 12:05:26 +02:00
Toni Uhlig
1a6b1feda9 Print NDPI_(C|LD)FLAGS
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-09 12:05:26 +02:00
Toni Uhlig
648dedc7ba bump libnDPI to 70536876f2f97b977ed43474872195bf756de67d
* fixes upstream compilation warning due to string truncation

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-09 12:05:26 +02:00
Toni Uhlig
19036951c7 bump libnDPI to 1216ec6a2719408a487f696f5b601bdb9eec727d
* incorporated upstream API changes related to detection protocol bitmasks
 * added missing flow detection categories

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-09-09 12:05:26 +02:00
Toni Uhlig
4e7e361d84 bump libnDPI to f8869cd670adc439cc41bde0bd04960e1befafc5
* fix API issue due to changed name of a public struct

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-20 11:05:53 +02:00
Toni Uhlig
9809ae4ea0 rs-simple: improved readability and stability
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-20 09:55:21 +02:00
Toni Uhlig
97387d0f1c rs-simple: added argh command line parser and "stable" flow table index
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-18 14:58:32 +02:00
Toni Uhlig
46ef266139 rs-simple: added DaemonEventStatus deserialization and statistics mgmt
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-16 17:48:51 +02:00
Toni Uhlig
ae6864d4e4 CI: build Rust examples
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-15 09:23:35 +02:00
Toni Uhlig
f3c8ffe6c1 rs-simple: added first/last seen and timeout in
* prettify units

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-15 08:10:14 +02:00
Toni Uhlig
07d6018109 rs-simple: make primitive flow table work
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-14 12:36:38 +02:00
Toni Uhlig
dd909adeb8 rs-simple: add flow mgmt w/ TTL hash maps (moka-future)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-05-03 15:22:57 +02:00
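The TTL-based flow management added to rs-simple (via the moka cache in Rust) can be illustrated with a small sketch. This is a hypothetical Python analogue, not the actual rs-simple code: flows are keyed by a flow tuple, refreshed on each packet, and evicted once idle longer than the TTL.

```python
import time


class TtlFlowTable:
    """Minimal TTL-evicting flow table sketch, loosely analogous to a
    time-to-live hash map such as moka's TTL cache."""

    def __init__(self, ttl_seconds: float, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable clock, handy for tests
        self._flows = {}            # flow_key -> (last_seen, flow_state)

    def upsert(self, flow_key, flow_state):
        """Insert or refresh a flow; last-seen is reset to now."""
        self._flows[flow_key] = (self.clock(), flow_state)

    def get(self, flow_key):
        """Return the flow state, lazily evicting it when expired."""
        entry = self._flows.get(flow_key)
        if entry is None:
            return None
        last_seen, state = entry
        if self.clock() - last_seen > self.ttl:
            del self._flows[flow_key]
            return None
        return state

    def evict_expired(self):
        """Sweep all expired flows; returns how many were removed."""
        now = self.clock()
        expired = [k for k, (ts, _) in self._flows.items() if now - ts > self.ttl]
        for k in expired:
            del self._flows[k]
        return len(expired)
```

Injecting the clock keeps eviction behavior deterministic in tests, which matters once flow timeouts become part of regression output.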
Toni Uhlig
8848420a72 CI: use FreeBSD vmactions main branch
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-04-30 23:00:53 +02:00
Toni Uhlig
f8181d7f6a Fix CI build with PF_RING (build userspace lib only)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-04-30 22:33:51 +02:00
Toni Uhlig
b747255a5d Add simple rust example (WiP)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-04-30 22:05:52 +02:00
Toni Uhlig
a52a37ef78 Fix CI
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-04-17 11:00:27 +02:00
Toni Uhlig
ae95c95617 bump libnDPI to c49d126d3642d5b1f5168d049e3ebf0ee3451edc
* fix API issue with a changed function signature

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-03-06 19:00:23 +01:00
Toni
42c54d3755 Initial tunnel decoding (GRE - Layer4 only atm) (#55)
Initial tunnel decoding (GRE - Layer4 only atm). Fixes #53
* finally make use of the thread distribution seed
 * Handle GRE/PPP subprotocol the right way
 * Add `-t` command line / config option
 * Removed duplicated and obsolete IP{4,6}_SIZE_SMALLER_THAN_HEADER which is the same as IP{4,6}_PACKET_TOO_SHORT
 * Updated error event schema

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-02-25 15:17:16 +01:00
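For context on the GRE decoding above: the base GRE header (RFC 2784/2890) is four bytes — flags/version plus the encapsulated protocol's EtherType — followed by optional fields whose presence the flag bits announce. A minimal sketch (illustrative only, not nDPId's C implementation):

```python
import struct

GRE_PROTO_PPP = 0x880B  # PPP over GRE needs special handling, per the commit


def parse_gre(payload: bytes):
    """Parse the fixed GRE header and skip optional fields.
    Returns (version, protocol_type, inner_payload_offset)."""
    flags_version, proto = struct.unpack_from("!HH", payload)
    version = flags_version & 0x0007
    offset = 4
    if flags_version & 0x8000:   # C bit: checksum + reserved (4 bytes)
        offset += 4
    if flags_version & 0x2000:   # K bit: key field (4 bytes)
        offset += 4
    if flags_version & 0x1000:   # S bit: sequence number (4 bytes)
        offset += 4
    return version, proto, offset
```

The returned offset tells the dissector where the inner (e.g. IPv4 or PPP) header begins, which is exactly the piece layer-4-only tunnel decoding needs.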
Toni Uhlig
bb870cb98f Add FreeBSD CI build
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-02-01 10:54:27 +01:00
Alex Eganov
e262227d65 Fix missing header file for build on freebsd (macos) (#60) 2025-01-31 23:02:13 +01:00
Toni Uhlig
899e5a80d6 CI: Fixed config tests
* set max dots per line to improve CI output
 * commented `flow_risk.crawler_bot.list.load` out

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-01-26 09:58:22 +01:00
Toni Uhlig
053818b242 CI: Added libnl-genl-3-dev to PF_RING build
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-01-26 07:59:55 +01:00
Toni Uhlig
4048a8c300 Set minimal required nDPI version to 4.14 (tarball) and 4.13 (git)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-01-26 01:10:30 +01:00
Toni Uhlig
09b246dbfa Temp disable flow_risk.crawler_bot.list.load in default config file
* currently broken in upstream

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-01-26 01:00:18 +01:00
Toni Uhlig
471ea83493 bump libnDPI to e946f49aca13e4447a7d7b2acae6323a4531fb55
* incorporated upstream changes

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2025-01-25 10:07:25 +01:00
Toni Uhlig
064bd3aefa fix config header
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-12-09 11:26:45 +01:00
Toni Uhlig
acd9e871b6 Added --no-blink and --hide-risk-info
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-12-09 11:09:34 +01:00
Toni Uhlig
b9465c09d8 Increased maximum value for max-flows-per-thread to 65k
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-12-03 21:02:24 +01:00
Toni Uhlig
3a4b7b0860 CI: make dist test (extract archive, run CMake)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-20 13:40:14 +01:00
Toni Uhlig
34f01b90e3 Fixed CMake warnings
* `make dist`: improved libnDPI git version naming

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-20 12:05:03 +01:00
Toni Uhlig
7b91ad8458 Added script to warn a user about issues regarding wrong umask and CPack
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-20 11:01:01 +01:00
Toni Uhlig
442900bc14 Dockerfile update
* gitlab-ci runner fix (single runner / multiple jobs)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-18 14:44:44 +01:00
Toni Uhlig
0a4f3cb0c8 Fix Gitlab CI build for some runners
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-18 13:51:06 +01:00
Toni Uhlig
4bed2a791f CMake/RPM integration
* CI integration
 * RPM (un)install scripts

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-17 17:12:06 +01:00
Toni Uhlig
1aa7d9bdb6 nDPId daemon status event: serialize nDPI API version + Size/Flow
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-17 13:12:33 +01:00
Toni Uhlig
bd269c9ead Added global stats diff test
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-14 14:33:27 +01:00
Toni Uhlig
7e4c69635a Use chmod_chown() API from utils
* `chmod_chown()` returns EINVAL if path is NULL

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-14 13:47:46 +01:00
Toni Uhlig
9105b393e1 Fixed some SonarCloud issues
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-14 10:21:35 +01:00
Toni Uhlig
9efdecf4ef bump libnDPI to 59ee1fe1156be234fed796972a29a31a0589e25a
* set minimum nDPI version to 4.12.0 (incompatible API changes)
 * fixed `ndpi_debug_printf()` function signature
 * JSON schema (flow): added risk `56`: "Obfuscated Traffic"
 * JSON schema (flow): added "domainame"
 * fixed OpenWrt build

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-13 17:23:31 +01:00
Toni Uhlig
8c114e4916 cosmetics
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-10 13:43:26 +01:00
Toni Uhlig
a733d536ad Added env check NDPID_STARTED_BY_SYSTEMD to prevent logging to stderr in such a case
* removed `nDPId` shutdown on poll/epoll error
 * fixed `chmod_chown()` rv check

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-07 11:32:42 +01:00
Toni Uhlig
9fc35e7a7e Add NUL to risks, not needed but better safe than sorry
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-05 14:20:30 +01:00
Toni Uhlig
ce9752af16 Fixed some SonarCloud issues
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-05 13:43:23 +01:00
Toni Uhlig
f7933d0fdb Slightly unified C example's logging
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-05 12:48:36 +01:00
Toni Uhlig
d5a84ce630 Temporarily disabled some OpenWrt builds
* See: https://github.com/openwrt/gh-action-sdk/issues/43

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-05 10:24:17 +01:00
Toni Uhlig
ce5f448d3b Switched OpenWrt GitHub Actions SDK to main branch
* fixed some SonarCloud complaints
 * added more systemd CI tests
 * fixed debian package scripts to obey remove/purge
 * changed `chmod_chown()` error handling

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-02 18:36:54 +01:00
Toni Uhlig
2b48eb0514 Added vlan_id dissection of the most outer (first) 802.1Q header. Fixes #50
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-02 15:48:45 +01:00
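The 802.1Q dissection above boils down to reading the EtherType after the two MAC addresses and, if it is 0x8100, taking the low 12 bits of the Tag Control Information as the VLAN ID. A self-contained sketch (hypothetical Python, not the C dissector itself):

```python
import struct

ETHERTYPE_8021Q = 0x8100


def outer_vlan_id(frame: bytes):
    """Return the VLAN ID of the outermost 802.1Q tag, or None when
    untagged. Only the first tag is inspected, matching the commit."""
    if len(frame) < 18:  # 12 MAC bytes + TPID + TCI + inner EtherType
        return None
    (ethertype,) = struct.unpack_from("!H", frame, 12)  # after dst+src MAC
    if ethertype != ETHERTYPE_8021Q:
        return None
    (tci,) = struct.unpack_from("!H", frame, 14)  # Tag Control Information
    return tci & 0x0FFF  # low 12 bits carry the VLAN ID; top 4 are PCP/DEI
```

Masking with 0x0FFF matters because the priority (PCP) and drop-eligible (DEI) bits share the same 16-bit field with the VLAN ID.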
Toni Uhlig
ddc96ba614 Adjusted SonarCloud config and CI
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-11-02 12:05:07 +01:00
Toni Uhlig
7b2cd268bf Updated JSON schema files and a test to make use of the UUID feature.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-29 15:25:19 +01:00
Toni Uhlig
817559ffa7 Set an optional UUID used within all events (similar to the "alias").
* added default values to usage
 * UUID can be either read from a file or used directly from option value
 * adjusted example config file

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-29 12:12:02 +01:00
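The "UUID from a file or directly from the option value" behavior can be sketched like this. The helper name and exact semantics are assumptions for illustration; only the file-or-literal idea comes from the commit message.

```python
import os
import uuid


def resolve_instance_uuid(value: str) -> uuid.UUID:
    """If `value` names an existing file, read the UUID from that file;
    otherwise treat `value` itself as the UUID string. Raises ValueError
    on malformed input, so a bad config fails loudly at startup."""
    if os.path.isfile(value):
        with open(value, "r", encoding="ascii") as f:
            value = f.read().strip()
    return uuid.UUID(value)
```

Validating through `uuid.UUID` (rather than passing the raw string along) guarantees every emitted event carries a well-formed identifier.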
Toni Uhlig
25944e2089 Fixed some SonarCloud issues
* fixed dependabot werkzeug (3.0.3 to 3.0.6)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-26 11:35:30 +02:00
Toni Uhlig
5423797267 Added nDPId ndpi_process_packet() LLVM fuzzer
* replaced dumb `dumb_fuzzer.sh`
 * fixed nDPId NULL pointer deref found by fuzzer
 * nDPI: `--enable-debug-build` and `--enable-debug-messages` for non release builds
 * nDPI: do not force `log.level` to `3` anymore, use config value instead

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-24 15:45:04 +02:00
Toni Uhlig
7e126c205e Added additional (libnDPI) config files for test runs.
* redirect `run_tests.sh` stderr to a filename prefixed with the config name

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-23 14:27:07 +02:00
Toni Uhlig
7d58703bdb Removed ENABLE_MEMORY_STATUS CMake option as it's now enabled for **all** builds
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-22 13:12:59 +02:00
Toni Uhlig
ae36f8df6c Added libnDPI global context init/deinit used for cache mgmt.
* support for adding *.ndpiconf for nDPI config tests
 * all other configs should have the suffix *.conf
* fixed nDPI malloc/free wrapper setup (it was set too late)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-21 16:17:01 +02:00
Toni Uhlig
8c5ee1f7bb Added config testing script.
* nDPId-test may now make use of an optional config file as cmd arg

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-21 16:10:09 +02:00
Toni Uhlig
9969f955dc Updated ReadMes, ToDos and ChangeLog.
* 1.7-release

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-21 15:44:42 +02:00
Toni Uhlig
2c771c54b0 Merge commit 'fb1dcc71de39e6dd5c11b8bc4288ec5e618fa946' 2024-10-17 12:16:40 +02:00
Toni Uhlig
fb1dcc71de Squashed 'dependencies/jsmn/' changes from 1aa2e8f8..25647e69
25647e69 Fix position of a comment in string parsing

git-subtree-dir: dependencies/jsmn
git-subtree-split: 25647e692c7906b96ffd2b05ca54c097948e879c
2024-10-17 12:16:40 +02:00
Toni Uhlig
071a9bcb91 Merge commit '9a14454d3c5589373253571cee7428c593adefd9' 2024-10-17 12:16:20 +02:00
Toni Uhlig
9a14454d3c Squashed 'dependencies/uthash/' changes from bf152630..f69112c0
f69112c0 utarray: Fix typo in docs
619fe95c Fix MSVC warning C4127 in HASH_BLOOM_TEST (#261)
eeba1961 uthash: Improve the docs for HASH_ADD_INORDER
ca98384c HASH_DEL should be able to delete a const-qualified node
095425f7 utlist: Add one more assertion in DL_DELETE2
399bf74b utarray: Stop making `oom` a synonym for `utarray_oom`
85bf75ab utarray_str_cpy: Remove strdup; utarray_oom() if strdup fails.
1a53f304 GitHub CI: Also test building the docs (#248)
4d01591e The MCST Elbrus C Compiler supports __typeof. (#247)
1e0baf06 CI: Add GitHub Actions CI
8844b529 Update test57.c per a suggestion by @mark-summerfield
44a66fe8 Update http:// URLs to https://, and copyright dates to 2022. NFC.

git-subtree-dir: dependencies/uthash
git-subtree-split: f69112c04f1b6e059b8071cb391a1fcc83791a00
2024-10-17 12:16:20 +02:00
Toni Uhlig
f9d9849300 Updated Grafana dashboard to make correct use of gauge max values.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-17 12:14:28 +02:00
Toni
efed6f196e Read and parse configuration files. Fixes #41. (#42)
Read and parse configuration files. Fixes #41.

 * supports nDPId / nDPIsrvd via command line parameter `-f`
 * nDPId: read general/tuning and libnDPI settings
 * support for setting the risk domains libnDPI option via config file or via `-R` (Fixes #45, thanks to @UnveilTech)
 * added some documentation in the config file
 * adjusted Systemd and Debian packaging to make use of config files

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-16 14:13:55 +02:00
Naix
3e2ce661f0 Added Filebeat Configuration (#44)
Added Filebeat Configuration

Co-authored-by: Toni <matzeton@googlemail.com>
2024-10-06 11:09:54 +02:00
Toni Uhlig
76e1ea0598 Updated Grafana dashboard.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-02 19:29:14 +02:00
Toni Uhlig
0e792ba301 Generate global stats with microseconds precision.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-10-01 11:58:39 +02:00
Toni Uhlig
9ef17b7bd8 Added some static assertion based sanity checks.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-11 13:28:20 +02:00
Toni Uhlig
1c9aa85485 Save hostname after detection finished for later use within analyse/end/idle flow events. Fixes #39.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-11 13:01:23 +02:00
Toni Uhlig
aef9d629f0 bump libnDPI to 92507c014626bc542f2ab11c729742802c0bc345
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-09 09:29:08 +02:00
Toni Uhlig
f97b3880b6 CI: Set nDPI minimum required version to 4.10
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-03 13:58:44 +02:00
Toni Uhlig
c55429c131 Updated flow event schema with risk names/severities.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-03 13:56:15 +02:00
Toni Uhlig
7bebd7b2c7 Fix OpenWrt package build.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-09-02 17:51:38 +02:00
Toni Uhlig
335708d3e3 Extend flow JSON schema with more properties from nDPI JSON serializer.
* unfortunately, JSON schema definitions could not be used to make this easier to read and maintain

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-22 17:36:59 +02:00
Toni Uhlig
2a0161c1bb Fix CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-20 12:17:25 +02:00
Toni Uhlig
adb8fe96f5 CMake: add coverage-clean target and fix coverage dependency issue.
* improve/fix README

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-20 11:49:38 +02:00
Toni Uhlig
4efe7e43a2 Improved installation instructions. Fixes #40.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-19 18:39:35 +02:00
Toni
5e4005162b Add PF_RING support. (#38) 2024-08-19 18:33:18 +02:00
Toni Uhlig
a230eaf061 Improved Keras Autoencoder hyper parameter.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-16 13:20:35 +02:00
Toni Uhlig
68e0c1f280 Fix SonarCloud complaint.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-16 13:19:13 +02:00
Toni Uhlig
8271f15e25 Fixed build error due to missing nDPI includes.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-16 13:14:21 +02:00
Toni Uhlig
f6f3a4daab Extended analyse application to write global stats to a CSV.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-16 12:33:46 +02:00
Toni Uhlig
762e6d36bf Some small fixes.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-08-09 11:09:39 +02:00
Toni Uhlig
930aaf9276 Added global (heap) memory stats for daemon status events.
* added new CMake option `ENABLE_MEMORY_STATUS` to restore the old behavior
   (and increase performance)
 * split `ENABLE_MEMORY_PROFILING` into `ENABLE_MEMORY_STATUS` and `ENABLE_MEMORY_PROFILING`

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-06-19 14:25:42 +02:00
Toni Uhlig
165b18c829 Fixed OpenWrt nDPId-testing build.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-06-12 15:07:17 +02:00
dependabot[bot]
1fbfd46fe8 Bump werkzeug from 3.0.1 to 3.0.3 in /examples/py-flow-dashboard (#37)
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 3.0.1 to 3.0.3.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/3.0.1...3.0.3)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-29 14:03:11 +02:00
Toni Uhlig
5290f76b5f flow-info.py: Set min risk severity required to print a risk.
* ReadMe update

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-05-08 00:25:31 +02:00
Toni Uhlig
f4d0f80711 CI: don't run systemd integration test on mac
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-05-07 09:42:30 +02:00
Toni Uhlig
187ebeb4df CI: add DYLD_LIBRARY_PATH to env (mac/unix)
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-05-07 09:27:46 +02:00
Toni Uhlig
71d2fcc491 CMake: set MacOS RPATH
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-05-06 09:57:00 +02:00
Toni Uhlig
86aaf0e808 Workaround for fixing GitHub runners on macOS
* See: https://github.com/ntop/nDPI/pull/2411

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-05-06 09:41:09 +02:00
Toni Uhlig
e822bb6145 Fix OpenWrt builds.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-17 09:25:37 +02:00
Toni Uhlig
4c91038274 Removed unmaintained C JSON dumper.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-17 01:47:31 +02:00
Toni Uhlig
53126a0af9 bump libnDPI to 142c8f5afb90629762920db6703831826513e00b
* fixed `git format` hash length

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-10 16:06:53 +02:00
Toni Uhlig
15608bb571 bump libnDPI to 09bb383437c11ef55e926ed15cdf986c0d426827
* fixed "unused function" warning in `ndpi_bitmap64_fuse.c`

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-04 21:13:33 +02:00
Toni Uhlig
e93a4c9a81 bump libnDPI to df29e12f5efbe84306c1ee7c011a197caec6de50
* fixed "unused function" warning in `roaring.h`

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-04 19:33:34 +02:00
Toni Uhlig
b46f15de03 bump libnDPI to 6e61368cd609899048560405ad792705fffb1f1a
* fixed "unused function" warning in `gcrypt_light.c`

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-04 14:08:34 +02:00
Toni Uhlig
c7eace426c bump libnDPI to 9185c2ccc402d3368fc28ac90ab281b4f951719e
* incorporated API changes from 41eef9246c6a3055e3876e3dd7aeaadecb4b76c0

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-04-04 11:49:48 +02:00
Toni Uhlig
33560d64d2 Fix example build error if memory profiling enabled.
* CI: build against libnDPI with `-DNDPI_NO_PKGCONFIG=ON` and `-DSTATIC_LIBNDPI_INSTALLDIR=/usr`
 * CI: `ENABLE_DBUS=ON` for most builds

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-21 07:26:22 +01:00
Toni Uhlig
675640b0e6 Fixed libpcre2 build.
* CI: build against libpcre2 / libmaxminddb

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-20 14:55:09 +01:00
Toni Uhlig
5e5f268b3c Build against the nDPI dev branch tarball if a newer release is required to build nDPId.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-16 18:45:11 +01:00
Toni Uhlig
7ef7667da3 Fix random sanitizer crashes caused by high-entropy ASLR on Ubuntu Github Runner.
* removed arch condition (c&p mistake)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-15 06:57:38 +01:00
Toni Uhlig
d43a3d1436 Fix random sanitizer crashes caused by high-entropy ASLR on Ubuntu Github Runner.
* See: https://github.com/actions/runner-images/issues/9491

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-14 18:26:31 +01:00
Toni Uhlig
b6e4162116 Extend CI pipeline build and test.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-03-07 17:46:31 +01:00
Toni Uhlig
717d66b0e7 Fixed missing statistics updating for unknown mapping keys in collectd/influxd.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-20 23:16:31 +01:00
Toni Uhlig
791b27219d CI maintenance
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-13 11:26:58 +01:00
Toni Uhlig
a487e53015 Added missing influxd test results.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-13 10:50:51 +01:00
Toni Uhlig
aeb6e6f536 Enable CURL in the CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-13 10:44:45 +01:00
Toni Uhlig
8af37b3770 Fix some SonarCloud complaints.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-13 07:21:47 +01:00
Toni Uhlig
8949ba39e6 Added test mode for influx push daemon.
* required for regression testing
 * added new confidence value (match by custom rule)
 * updated / tweaked grafana exported dashboard

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-08 01:01:35 +01:00
Toni Uhlig
ea968180a2 Read IPv6 address and netmask using getifaddrs() instead of reading /proc/net/if_inet6.
* fixes a compatibility issue with Mac OSX

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-07 14:25:14 +01:00
Toni Uhlig
556025b34d Removed API version macro check as it's inconsistent on different platforms.
* set min required nDPI version to 4.9.0

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-06 10:49:47 +01:00
Toni Uhlig
feb2583ef6 bump libnDPI to 4543385d107fcc5a7e8632e35d9a60bcc40cb4f4
* incorporated API changes from nDPI

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-02-06 10:34:52 +01:00
Toni Uhlig
7368f222db Fixed broken "not-detected" event/packet capture in captured example.
* aligned it with influxd example

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-01-29 18:11:34 +01:00
Toni Uhlig
a007a907da Fixed invalid flow risk aggregation in collectd/influxd examples.
* CI: build single nDPId executable with `-Wall -Wextra -std=gnu99`
 * fixed missing error events in influxd example
 * added additional test cases for collectd
 * extended grafana dashboard

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-01-06 19:32:47 +01:00
Toni Uhlig
876aef98e1 Improved collectd example.
* similar behavior to the influxd example
 * gauges and counters are now handled properly

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2024-01-05 11:26:53 +01:00
Toni Uhlig
88cf57a16f Added Grafana example dashboard image.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-20 19:19:08 +01:00
Toni Uhlig
7e81f5b1b7 Added Grafana nDPId dashboard.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-16 15:20:27 +01:00
Toni Uhlig
8acf2d7273 Improved InfluxDB push daemon.
* added proper gauge handling that enables pushing data w/o missing out on
   anything, e.g. short flows with a lifetime in-between two InfluxDB intervals

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-15 09:16:28 +01:00
Toni Uhlig
71d933b0cd Fixed an event issue.
* a "detection-update" event was thrown even if nothing changed
 * in some cases "not-detected" events were spammed if detection had not completed
 * tell `libnDPI` how many packets per flow we want to dissect
 * `nDPId-test` validates total active flows in the right way

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-14 15:45:08 +01:00
Toni Uhlig
fbe07fd882 Improved InfluxDB push daemon.
* fixed severity parsing and gauge handling
 * added flow state gauges
 * flow related gauges are only increased/decreased if a "new" event was seen (except for bytes xfer)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-14 15:38:38 +01:00
Toni Uhlig
5432b06665 Improved InfluxDB push daemon.
* fixed missing flow active gauge
 * fixed invalid flow risk severity gauges
 * fixed missing flow risk gauges

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-11 23:14:00 +01:00
Toni Uhlig
142a435bf6 Add InfluxDB push daemon.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-07 10:00:25 +01:00
Toni Uhlig
f5c5bc88a7 Replaced ambiguous naming of "JSON string" to more accurate "JSON message". #2
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-04 13:13:05 +01:00
Toni Uhlig
53d8a28582 Replaced ambiguous naming of "JSON string" to more accurate "JSON message".
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-04 13:01:27 +01:00
Toni Uhlig
37f3770e3e Improved zlib compression ratio.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-01 06:43:39 +01:00
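On the zlib ratio improvement above: for streams of similar JSON event messages, most of the win comes from compressing them through one shared stream (and a high compression level), so the dictionary window spans the repeated keys across messages. A sketch under those assumptions — the actual change in nDPId is not shown in this log:

```python
import zlib


def compress_events(messages: list[bytes], level: int = zlib.Z_BEST_COMPRESSION) -> bytes:
    """Compress a batch of JSON messages with a single zlib stream so
    repeated keys across messages land in the same dictionary window."""
    co = zlib.compressobj(level)
    out = b"".join(co.compress(m) for m in messages)
    return out + co.flush()  # flush emits any buffered tail + trailer
```

Compressing each message independently would reset the window every time and lose exactly the cross-message redundancy that JSON event streams are full of.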
Toni Uhlig
7368d34d8d c-collectd: Fixed missing escape char.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-01 06:43:39 +01:00
Toni Uhlig
ff77bab398 Warn about unused return values that are quite important.
* CI: ArchLinux build should now instrument `-Werror`
 * CI: Increased OpenWrt build verbosity

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-01 06:43:39 +01:00
Toni Uhlig
d274a06176 flow-info.py: Do not print any information if a flow is "empty" meaning no L4 payload seen so far.
* added JsonDecodeError to provide more information if the builtin JSON decoder fails

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-12-01 06:43:39 +01:00
Paul Donald
a5dcc17396 Update README.md (#32)
Spelling/grammar fixes.

Co-authored-by: Toni <matzeton@googlemail.com>
2023-11-27 09:08:25 +01:00
Toni Uhlig
3416db11dc Updated ReadMes, ToDos and ChangeLog.
* 1.6-release

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 23:39:47 +01:00
Toni Uhlig
830174c7b5 Fixed possible buffer underflow.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 17:52:30 +01:00
Toni Uhlig
bb9f02719d Added SonarCloud exclusions for third-party files and files lacking relevance.
* fixed two other "bugs"

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 14:51:59 +01:00
Toni Uhlig
f38f1ec37f Changed CI image from ubuntu-18.04 to ubuntu-20.04 as it is deprecated since '22.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 13:44:19 +01:00
Toni Uhlig
fa7e76cc75 Fixed SonarCloud complaints.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 13:40:48 +01:00
Toni Uhlig
b0c343a795 Workaround for libpcap (<1.9.0) on Ubuntu-18.04
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 12:41:24 +01:00
Toni Uhlig
d5266b7f44 Support simple config file reading via systemd environment file.
* cfg file path defaults to PREFIX/etc/default/ndpid

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-20 12:27:40 +01:00
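The systemd environment-file style config mentioned above is just `KEY=VALUE` lines with comments and optional quoting. A minimal parser sketch (hypothetical; the real nDPId config reader is C and may differ in details such as quoting rules):

```python
def parse_env_file(text: str) -> dict[str, str]:
    """Parse a systemd-style EnvironmentFile: KEY=VALUE per line,
    '#' comments and blank lines ignored, surrounding quotes stripped."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")  # split at the first '='
        env[key.strip()] = value.strip().strip('"\'')
    return env
```

Splitting only at the first `=` keeps values like command-line argument strings (which may themselves contain `=`) intact.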
Toni Uhlig
82934b7271 Fixed clang-tidy warnings.
* fixed/improved c-captured logging

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-13 13:52:42 +01:00
Toni Uhlig
4920b2a4be Use c-captured within test/run_tests.sh.
* Some logging related modifications were required to achieve this.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-11 19:16:07 +01:00
Toni Uhlig
8ebaccc27d py-flow-info: Improved analyse result printing.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-09 23:44:35 +01:00
Toni Uhlig
dcb595e161 bump libnDPI to b08c787fe267053afdea82701071f3878c09244b
* fix nDPI data analysis struct min/max issue
 * py-flow-info cosmetics in printing some information

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-09 19:52:36 +01:00
Toni Uhlig
b667f9e1da Forcefully reset NDPI_UNIDIRECTIONAL_TRAFFIC if classification was done after the first packet. Nonsense.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-08 17:07:20 +01:00
Toni Uhlig
55c8a848d3 Fixed missing deflate during flow event JSON serialization.
* caused by the recently added serialization of some nDPI data even while packet processing is still ongoing

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-08 13:16:57 +01:00
Toni Uhlig
d80ea84d2e Reset the Unidirectional Traffic risk if packets from both directions were processed.
* Fixed risk hash value calculation, which only used the lower 32 bits.
 * Reduced default reader threads count to two if cross compiling.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-08 01:27:42 +01:00
Toni Uhlig
b1e679b0bb Improved DBUS notification daemon.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-07 17:50:12 +01:00
Toni Uhlig
949fc0c35e bump libnDPI to 0db12b1390b1cc554b927230c76b05264c05b498
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-07 00:35:42 +01:00
Toni Uhlig
5d56288a11 Fixed more SonarCloud complaints.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-07 00:02:36 +01:00
Toni Uhlig
84b12cd02c Fixed some SonarCloud complaints.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-06 13:31:59 +01:00
Toni
93498fff02 Apple/BSD port (#30)
* Add macOS to GitHub CI builds.
* Fixed libnDPI-4.8 CI build.
* Fixed missing include for `struct sockaddr*`.
* Reworked IPv4 address and netmask retrieval.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-06 13:25:49 +01:00
Toni
1b67927169 Event I/O abstraction layer. (#28)
* Finalize Event I/O abstraction layer.
* Fix possible fd leakage, Gitlab-CI build and error logging.
* Fixed possible uninitialized signalfd variable.
* Fixed possible memory leak.
* Fixed some SonarCloud complaints.
* Fixed nDPId-test nDPIsrvd-arpa-mockup stuck indefinitely.
* Add nDPId / nDPIsrvd command line option to use poll() on Linux instead of the default epoll().

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-06 12:38:15 +01:00
Toni Uhlig
17c21e1d27 Updated ToDo and added ChangeLog.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-05 16:00:47 +01:00
Toni Uhlig
5fb706e9a6 Set timeout for nDPId-test runs.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-03 16:38:58 +01:00
Toni Uhlig
5335d84fe5 Add DBUS suspicious flow event notification daemon.
* nDPIsrvd.h: support for closing/resetting a nDPIsrvd_socket (required for a reconnect)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-03 16:07:28 +01:00
Toni Uhlig
32ab500eb0 Bump werkzeug to 3.0.1
* see #29

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-11-02 14:15:06 +01:00
Toni Uhlig
e124f2d660 Switched to UNIX socket use for tests.
* use `ss` to make sure the socket is no longer available after every single test

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-20 00:25:59 +02:00
Toni Uhlig
6ff8982ffb Fixed bug which may happen if additional write buffers are empty but main write buffer not.
* may cause nDPIsrvd to hang indefinitely if no more data is received from a collector

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-19 13:40:44 +02:00
Toni Uhlig
315dc32baf Improved syslog logging.
* fixed missing log level for non-error messages, which caused systemd to send broadcast messages
 * completely removed logging to stderr when started via systemd

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-19 13:38:26 +02:00
Toni Uhlig
3d0c06ef54 Disable SonarCloud Coverage generation.
* ToDo: Fix and Re-Enable?

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-18 22:48:44 +02:00
Toni Uhlig
8dca2b546a Added Coverage generation for SonarCloud.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-17 00:15:25 +02:00
Toni Uhlig
e134eef5bb Fixed Dockerfile related SonarCloud issues.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-16 20:38:54 +02:00
Toni Uhlig
d29efd4d7c Docker: Switched from Ubuntu 22.10 to 22.04 LTS.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-16 16:49:30 +02:00
Toni Uhlig
44adfc0b7d SonarCloud integration
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-05 17:37:42 +02:00
Toni Uhlig
dfd0449306 Fix issues detected by SonarCloud.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-05 17:26:06 +02:00
Toni Uhlig
07f2c2d9cc nDPId-test: ignore event handler failures caused by the arpa mockup
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-05 17:25:54 +02:00
Toni Uhlig
73b8c378f2 nDPId event I/O fixes.
* forcibly disable epoll even if available
 * nDPId-test event I/O selftest
 * CI event I/O tests

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-10-05 12:32:45 +02:00
Toni Uhlig
a0e0611c56 nDPIsrvd: Log an error if the collector unix socket cannot be removed.
* systemd: add post stop hook to forcefully remove the collector unix socket

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-11 16:58:13 +02:00
Toni Uhlig
7f8e01d442 Fix CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-11 12:55:54 +02:00
Toni Uhlig
835a7bafb1 Fix CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-11 05:14:31 +02:00
Toni Uhlig
a7ac83385b Fix systemd CI test.
* CI Fix #3

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 22:16:44 +02:00
Toni Uhlig
0a0342ce28 c-captured: Log only flows w/o packet data to syslog if in logging mode.
* CI Fix #2

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 21:46:51 +02:00
Toni Uhlig
7515c1b072 Fix CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 20:28:11 +02:00
Toni Uhlig
be07c16c0e sklearn-random-forest.py: Pretty-print false positives/negatives.
* added max tree depth command line argument
 * print a note if loading an existing model while using --sklearn-* command line options

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 20:20:25 +02:00
Toni Uhlig
e42e3fe406 Serialize nDPId / libnDPI versions within daemon events.
* changed nDPI version hints / requirements

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 19:32:08 +02:00
Toni Uhlig
96b0a8a474 Add event I/O abstraction.
* required to support non-Linux OSes, e.g. macOS / BSD
 * see Github issue #19

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-09-10 19:11:58 +02:00
Toni Uhlig
091fd4d116 Added CMake option BUILD_NDPI_FORCE_GIT_UPDATE to fix broken submodule caches in GitLab CIs.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-27 22:37:55 +02:00
Toni Uhlig
dfb8d3379f bump libnDPI to 1f693c3f5a5dcd9d69dffb610b9a81bd33f95382
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-27 21:45:14 +02:00
Toni Uhlig
a7bd3570b0 Enable custom JSON filter expressions for Python scripts.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-27 20:08:01 +02:00
Toni Uhlig
b01498f011 Fix some GCC-12 warnings.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-27 12:49:39 +02:00
Toni Uhlig
cc60e819e8 Fixed invalid base64 encoding in some rare cases.
* nDPId-test may also verify the correct encoding/decoding

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-24 22:51:57 +02:00
Toni Uhlig
5234f4621b keras-autoencoder.py: TensorBoard, SGD optimizer, KLDivergence loss function, EarlyStopping
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-23 22:56:59 +02:00
Toni Uhlig
86ac09a8db keras-autoencoder.py: Improved Model
* added initial learning rate for Adam
 * plot some metrics using pyplot

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-20 23:05:08 +02:00
Toni Uhlig
4b3031245d keras-autoencoder.py: fixed invalid preprocessing of received base64 packet data
* split logic into separate jobs: nDPIsrvd and Keras
 * nDPIsrvd: break event processing and re-run `epoll_wait()` after client disconnected

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-15 11:21:46 +02:00
Toni Uhlig
2b881d56e7 c-captured extension
* capture packets after error event occurred
 * add "logging" and "capture" mode

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-14 01:01:26 +02:00
Toni Uhlig
dd4357c238 CMake: install header files for experimental usage
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-05 01:15:48 +02:00
Toni Uhlig
7b15838696 Added docker build&push to the CI.
* update some git submodules

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-03 10:00:50 +02:00
Toni Uhlig
0e31829401 nDPId-test: threads should block all unix signals
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-02 23:34:10 +02:00
Toni Uhlig
d9f304e4b0 nDPId-test: print additional startup/init log messages
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-02 22:32:33 +02:00
Toni Uhlig
ebb439d959 Tiny improvements.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-02 21:00:39 +02:00
Toni Uhlig
79834df457 Removed CI matrix based jobs.
* Fixed multiple *.deb package upload issue.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-08-02 18:21:37 +02:00
Toni Uhlig
4b923bdf44 py-flow-info: print flow src/dst packets
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-28 13:16:29 +02:00
Toni Uhlig
ba8236c1f7 py-flow-info: print flow src/dst bytes/packets
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-28 11:24:09 +02:00
Toni Uhlig
d915530feb Circle CI integration
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-27 18:39:11 +02:00
Toni Uhlig
7bd8081cd2 bump libpcap dependency to 1.9.0
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-27 18:20:36 +02:00
Toni Uhlig
bc0a5782cc bump libnDPI to 2b230e28e0612e8654ad617534deb9aaaabd51b7
* fixes loading of gambling lists which increased nDPId's memory usage *a lot*
 * nDPId: handle EINTR correctly

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-23 18:43:58 +02:00
Toni Uhlig
8a8de12fb3 Keras AE supports loading/saving models.
* added training/batch size as cmdargs

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-22 09:25:11 +02:00
Toni Uhlig
c57ace2fd3 Correctly handle EINTR while doing I/O.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-18 13:51:00 +02:00
Toni Uhlig
344934b7d9 CI: Upload generated packages.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-18 10:57:14 +02:00
Toni Uhlig
22ba5d5103 Improved OpenWrt Makefile: set an optional libnDPI commit hash
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-18 01:25:06 +02:00
Toni Uhlig
7217b90cd1 nDPId: `-v' give information about libnDPI linkage
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-17 19:32:41 +02:00
Toni Uhlig
74a9f7d86b nDPId: `-v' also prints information about dependencies
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-17 19:19:52 +02:00
Toni Uhlig
57d8dda350 nDPId-test: Fixed invalid error retval when epoll_wait() returns EINTR.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-16 23:23:07 +02:00
Toni Uhlig
425617abdf Added GLFW/OpenGL stats drawer written in C++.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-16 22:19:20 +02:00
Toni Uhlig
92b3c76446 Added Keras based Autoencoder (Work-in-Progress!)
* minor fixes

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-16 22:06:36 +02:00
Toni Uhlig
967381a599 get-and-build-libndpi.sh uses GMake MAKEFLAGS for sub-make (required for e.g. jobserver)
* fixed invalid CMake `test -r ...`

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-07-16 22:06:31 +02:00
Toni Uhlig
d107560049 Updated OpenWrt In-Source build patch.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-06-12 19:56:02 +02:00
Toni Uhlig
c8ec505b9c bump libnDPI to 8ea0eaa0d0c4a3be05f67ef7fa1d22c2579cf7d1
* added build fix for Gitlab CI
 * added friendly C11 check
 * set required libnDPI version to 4.7
   (ArchLinux ndpi-git sets version to 4.7, which is not released yet)
 * reduced sklearn-random-forest memory consumption by adjusting min. sample leaf

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-06-12 19:05:33 +02:00
lns
2b1db0a556 Required libnDPI version increased (>=4.8.0) due to an API change.
* fix CI issues

Signed-off-by: lns <matzeton@googlemail.com>
2023-05-31 12:53:49 +02:00
lns
d8c20d37e5 Allow in-source builds required for OpenWrt toolchain.
Signed-off-by: lns <matzeton@googlemail.com>
2023-05-30 12:03:34 +02:00
lns
5a9b40779d bump libnDPI to 04f5c5196e790db8b8cc39e42c8645fb7f3dd141
* added custom nDPI logging callback

Signed-off-by: lns <matzeton@googlemail.com>
2023-05-30 09:30:24 +02:00
lns
d0c070a800 Added CentOS and ArchLinux to the CI.
* added some additional checks in get-and-build-libndpi.sh
 * CMake fallback library checks

Signed-off-by: lns <matzeton@googlemail.com>
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-05-28 00:46:58 +02:00
lns
8a936a5072 Fixed integer overflow for tcp timeout (>INT_MAX).
Signed-off-by: lns <matzeton@googlemail.com>
2023-05-26 11:17:38 +02:00
Toni Uhlig
c9514136b7 bump libnDPI to ...
* upstream changed regression test interface, needed to adapt
 * improved libnDPI helper build script
 * updated JSON schema

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-05-24 19:30:19 +02:00
Toni Uhlig
a4e5bab9b2 Fix CI.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-04-25 17:18:07 +02:00
Paul Spooren
b76a0c4607 Update build-openwrt.yml to use snapshot explicitly
Consciously use the (unstable) snapshot tag.
2023-04-25 16:57:47 +02:00
Toni Uhlig
c9da8b0fd9 Github Actions: update OpenWrt SDK to use main branch
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-04-21 17:03:34 +02:00
Toni Uhlig
ca355b1fdb Updated js-rt-analyzer and js-rt-analyzer-frontend examples.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-04-21 16:51:58 +02:00
Toni Uhlig
99accd03a2 Moved datalink json key/value from error to packet events and renamed it to pkt_datalink.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-04-21 16:48:40 +02:00
Toni Uhlig
225f4b3fb6 Github Actions: enable build against libnDPI-4.6, build nDPId executable from CLI
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-02-27 13:00:46 +01:00
Toni Uhlig
a8d46ef343 Merge branch 'main' of github.com:utoni/nDPId 2023-02-27 02:02:12 +01:00
Toni Uhlig
aafc72a44b Github Actions: enable build against libnDPI-4.6, build nDPId executable from CLI
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-02-27 02:01:18 +01:00
Toni Uhlig
0a959993bc Improved:
* Gitlab-CI: build nDPId executable from CLI
 * C-Simple: log affected JSON line on READ/PARSE error
 * Sklearn: quality of life changes

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-02-27 01:20:23 +01:00
dependabot[bot]
595bd5c5e3 Bump werkzeug from 2.0 to 2.2.3 in /examples/py-flow-dashboard
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 2.0 to 2.2.3.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/2.0.0...2.2.3)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-02-16 17:43:31 +01:00
Toni Uhlig
4236aafa0d py-machine-learning: Print CSV line numbers for invalid lines (SKLearn Random Forest Classifier).
* c-analysed: fix wrong length check

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-30 22:36:17 +01:00
Toni Uhlig
23816f1403 Revert "Revert "Minor fixes.""
This reverts commit 42aad33ec8.
2023-01-27 12:48:20 +01:00
Toni Uhlig
42aad33ec8 Revert "Minor fixes."
This reverts commit 58439a6761.
2023-01-27 02:02:16 +01:00
Toni Uhlig
c71284291e updated js-rt-analyzer*
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-27 01:47:49 +01:00
Toni Uhlig
58439a6761 Minor fixes.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-27 01:39:03 +01:00
Toni Uhlig
5e313f43f9 Small CI/CD/nDPIsrvd.py improvements.
* Updated examples/js-rt-analyzer and examples/js-rt-analyzer-frontend

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-17 21:01:47 +01:00
Toni Uhlig
a3d20c17d1 Improved collectd risk processing to be in sync with libnDPI risks.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-11 06:28:10 +01:00
Toni Uhlig
c0717c7e6c Gitlab-CI: Upload coverage report.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-11 06:14:44 +01:00
Toni Uhlig
470ed99eaf Added https://gitlab.com/verzulli/ndpid-rt-analyzer-frontend.git example.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-11 06:13:51 +01:00
Toni Uhlig
ac3757a367 Merge branch 'main' of github.com:utoni/nDPId 2023-01-10 10:13:57 +01:00
Toni Uhlig
07efb1efd4 Added distclean-libnDPI target to CMake.
* Gitlab-CI: Additional job for debian packages
 * Install Python examples iff BUILD_EXAMPLES=ON

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-10 01:03:34 +01:00
Macauley Cheng
afe873c0de Delete docker-compose.yml 2023-01-09 21:13:53 +01:00
macauley_cheng
3dcc13b052 add Docker related file 2023-01-09 21:13:53 +01:00
Toni Uhlig
464450486b bump libnDPI to a944514ddec73f79704f55aab1423e39f4ce7a03
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-09 11:24:40 +01:00
Toni Uhlig
655393e953 nDPId: Fixed base64encode bug which led to invalid base64 strings.
* py-semantic-validation: Decode base64 raw packet data as well
 * nDPIsrvd.py: Added PACKETS_PLEN_MAX
 * nDPIsrvd.py: Improved JSON parse error/exception handling

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2023-01-09 01:43:24 +01:00
Toni Uhlig
e9443d7618 Fix libnDPI build script.
* added ntop Webinar 2022 reference

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-26 19:35:12 +01:00
Toni Uhlig
4e19ab929c py-machine-learning / sklearn-random-forest: Quality-of-life improvements
* fixed libnDPI submodule build on some platforms

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-22 22:13:08 +01:00
Toni Uhlig
c5930e3510 Add collectd statistics diff test.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-06 19:51:53 +01:00
Toni Uhlig
d21a38cf02 Limit the size of base64 serialized raw packet data (8192 bytes per packet).
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-06 12:52:52 +01:00
Toni Uhlig
ced5f5d4b4 py-flow-info: ignore certain JSON lines that match various criteria
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-03 01:23:26 +01:00
Toni Uhlig
60741d5649 Strace support for diff tests.
* tiny README update

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-03 01:21:20 +01:00
Toni Uhlig
8b81b170d3 Updated Github/Gitlab CI
* instrument Clang's thread sanitizer for tests

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-02 23:24:06 +01:00
Toni Uhlig
2c95b31210 nDPId-test: Reworked I/O handling to prevent some endless loop scenarios. Fixed a race condition in the memory wrapper as well.
* nDPId: Instead of sending overly long JSON strings, log an error along with parts of the string.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-12-02 22:11:57 +01:00
Toni Uhlig
532961af33 Fixed MD format issues.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-21 11:34:10 +01:00
Toni Uhlig
64f6abfdbe Unified nDPId/nDPIsrvd command line argument storage.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-21 11:26:05 +01:00
Toni Uhlig
77ee336cc9 Added Network Buffer Size CI Check.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-20 22:42:06 +01:00
Toni Uhlig
9b78939096 Updated README's.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-20 22:25:18 +01:00
Toni Uhlig
57c5d8532b Test for diff's in flow-analyse CSV generator daemon.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-16 23:06:37 +01:00
Toni Uhlig
869d4de271 Improved make daemon / daemon.sh to accept nDPId / nDPIsrvd arguments via env.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-15 07:05:29 +01:00
Toni Uhlig
ce567ae5b7 Improved the point of time when to append the raw packet base64 data to the serializer.
* nDPId-test: Increased the max-packets-per-flow-to-send from 3 to 5.
   This is quite useful for TCP as the first 3 packets are usually part of the three-way handshake.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-15 06:25:16 +01:00
Toni Uhlig
36e428fc89 Sync unit tests.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-13 16:19:07 +01:00
Toni Uhlig
ea1698504c nDPIsrvd: Provide workaround for change user/group.
* nDPId/nDPIsrvd/c-examples: Parameter parsing needs to be improved
                              if `strdup()` in combination with static strings is used.
 * Other non-critical fixes.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-13 11:21:55 +01:00
Toni Uhlig
bc346a28f4 nDPId: Fixed base64 encoding issue.
* The issue can result in an error message like:
   `Base64 encoding failed with: Buffer too small.`
   and also in too big JSON strings generated by nDPId
   which nDPIsrvd does not like as its length is
   greater than `NETWORK_BUFFER_MAX_SIZE`.
 * nDPId will now obey `NETWORK_BUFFER_MAX_SIZE` while
   trying to base64 encode raw packet data.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-13 09:26:04 +01:00
Toni Uhlig
e629dd59cd nDPIsrvd.h: Provide two additional convenience API functions.
* nDPIsrvd_json_buffer_string
 * nDPIsrvd_json_buffer_length

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-06 13:19:29 +01:00
Toni Uhlig
7515c8aeec Experimental systemd support.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-06 12:58:55 +01:00
Toni Uhlig
25f4ef74ac Improved examples.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-11-02 00:01:57 +01:00
Toni Uhlig
d55e397929 bump libnDPI to db9f6ec1b4018164e5bff05f115dc60711bb711b
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-31 13:53:02 +01:00
Toni Uhlig
d3f99f21e6 Create pidfile iff daemon mode enabled.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-31 12:45:49 +01:00
Toni Uhlig
c63cbec26d Improved nDPIsrvd-collectd statistics.
* Improved RRD-Graph generation script and static WWW html files.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-31 12:45:15 +01:00
Toni Uhlig
805aef5de8 Increased network buffer size to 33792 bytes.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-30 22:13:07 +01:00
Toni Uhlig
2d14509f04 nDPId-test: add buffer test
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-30 20:12:17 +01:00
Toni Uhlig
916d2df6ea nDPId-test: Fixed thread sync/lock issue.
* rarely happens in CI

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-22 01:45:14 +02:00
Toni Uhlig
46c8fc5219 Merge branch 'main' of github.com:utoni/nDPId 2022-10-20 16:13:27 +02:00
Toni Uhlig
e5f4af4890 Special Thanks to Damiano Verzulli (@verzulli).
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-20 16:12:40 +02:00
lns
cd22d56056 Add ArchLinux PKGBUILD.
Signed-off-by: lns <matzeton@googlemail.com>
2022-10-19 18:40:52 +02:00
Toni Uhlig
49352698a0 nDPId: Added error event threshold to prevent event spamming which may be abused.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-17 06:36:30 +02:00
Toni Uhlig
6292102f93 py-machine-learning: load and save trained models
* added link to a pre-trained model

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-15 11:59:39 +02:00
Toni Uhlig
80f8448834 Removed discontinued examples from the ReadMe.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-13 16:47:03 +02:00
Toni Uhlig
9bf4f31418 Removed example py-ja3-checker.
* renamed sklearn-ml.py to sklearn-random-forest.py (there is more to come!)
 * force all protocol classes to lower case

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-13 00:12:22 +02:00
Toni Uhlig
4069816d69 Improved py-machine-learning example.
* colorize/prettify output
 * added sklearn controls/tuning options
 * disable IAT/Packet-Length features as default

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-11 20:20:01 +02:00
Toni Uhlig
bb633bde22 daemon.sh: fixed race condition
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-10 17:54:49 +02:00
Toni Uhlig
20fc74f527 Improved py-machine-learning example.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-10 16:44:12 +02:00
Toni Uhlig
2ede930eec daemon.sh: cat nDPId / nDPIsrvd log on failure
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-09 19:11:37 +02:00
Toni Uhlig
4654faf381 Improved py-machine-learning example.
* c-analysed: fixed quoting bug
 * nDPId: fixed invalid IAT storing/serialisation
 * nDPId: free data analysis after event was sent

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
Signed-off-by: lns <matzeton@googlemail.com>
2022-10-09 18:31:45 +02:00
Toni Uhlig
b7a17d62c7 Improved OpenWrt UCI/Initscript
* c-analysed: chuser()/chgroup()

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-06 06:54:01 +02:00
Toni Uhlig
ac46f3841f Fixed heap overflow on shutdown caused by missing remotes size/used reset.
* introduced with 22a8d04c74

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-05 00:14:46 +02:00
Toni Uhlig
be3f466373 OpenWrt UCI/Initscript
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-03 15:57:16 +02:00
lns
b7d8564b65 Generate code coverage w/o external shell script, use CMake.
* upload codecov/dist artifacts

Signed-off-by: lns <matzeton@googlemail.com>
2022-10-03 15:45:17 +02:00
lns
49ea4f8474 Small fixes.
Signed-off-by: lns <matzeton@googlemail.com>
2022-10-01 22:37:25 +02:00
Toni Uhlig
b6060b897e c-analysed: improved feature extraction from "analyse" events
* c-captured: update detected risks on "detection-update" events
 * c-collectd: added missing flow breed
 * c-collectd: PUTVAL macros are more flexible now

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-10-01 18:01:56 +02:00
Toni Uhlig
14f6b87551 Added nDPIsrvd-analysed to generate CSV files from analyse events.
* nDPIsrvd.h: iterate over JSON arrays
 * nDPId: calculate l3 payload packet entropies for analysis

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-30 19:28:49 +02:00
Toni Uhlig
74f71643da nDPId-test: Force collector blocking mode.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-25 16:24:05 +02:00
Toni Uhlig
2103ee0811 Refactored client distributor C API.
* Still not perfect, but the code before was not even able to deal with JSON arrays.
   Use common "speaking" function names for all functions in nDPIsrvd.h
 * Provide a more or less generic and easy extendable JSON walk function.
 * Modified C examples to align with the changed C API.
 * c-collectd: Reduced lots of code duplication by providing mapping tables.
 * nDPId: IAT array requires one slot less (the first packet always has an IAT of 0).

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-25 00:54:39 +02:00
Toni Uhlig
36f1786bde nDPIsrvd.h: Fixed bug during token parsing/hashing. Do not hash array contents.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-23 00:13:19 +02:00
Toni Uhlig
9a28475bba Improved flow analyse event:
* store packet directions
 * merged direction based IATs
 * merged direction based PKTLENs

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-22 19:07:08 +02:00
Toni Uhlig
28971cd764 flow-info.py: Command line arguments --no-color, --no-statusbar (both useful for tests/CI) and --print-analyse-results.
* run_tests.sh: Use flow-info.py for additional DIFF tests.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-22 08:00:21 +02:00
Toni Uhlig
3c7bd6a4ba Merge branch 'main' of github.com:utoni/nDPId 2022-09-19 19:39:54 +02:00
Toni Uhlig
08f263e409 nDPId: Reduced flow-updates for TCP flows to 1/4 of the timeout value.
* nDPId: Fixed broken validation tests.
 * nDPId: Removed TICK_RESOLUTION, not required anymore.
 * c-collectd: Improved total layer4 payload calculation/update handling.
 * c-collectd: Updated RRD Graph script according to total layer4 payload changes.
 * py-flow-info.py: Fixed several bugs and syntax errors.
 * Python scripts: Added dirname(argv[0]) as search path for nDPIsrvd.py.
 * nDPIsrvd&nDPId-test: Fixed missing EPOLLERR check.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-19 19:39:49 +02:00
Damiano Verzulli
ab7f7d05f3 Improve README
- links to already-existing JSON schemas have been added
- a graphical schema detailing the flow-events timeline has
  been added in both PNG and source Drawio formats.
  A link to the PNG has been included in the README
2022-09-19 17:23:11 +02:00
Toni Uhlig
015a739efd Added layer4 payload length bins.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-19 10:14:37 +02:00
Toni Uhlig
31715295d9 bump libnDPI to 174cd739dbb1358ab012c4779e42e0221bef835c
* ReadMe stuff
 * OpenWrt Makefile: set NEED_LINKING_AGAINST_LIBM=ON

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-18 13:34:43 +02:00
Toni Uhlig
06bce24c0e Add -Werror to OpenWrt package TARGET_CFLAGS.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-17 18:53:17 +02:00
Toni Uhlig
efaa76e978 Provide thread sync via locking on architectures that do not support Compare&Swap.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-17 18:27:17 +02:00
Toni Uhlig
b3e9af495c Add OpenWrt CI via Github Actions.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-17 10:31:26 +02:00
lns
b8cfe1d6d3 Fixed last pkt time.
Signed-off-by: lns <matzeton@googlemail.com>
2022-09-14 11:22:41 +02:00
Toni Uhlig
d4633c1192 New flow event: 'analysis'.
* The goal was to provide a separate event for extracted features that are not generally required
   and only useful to a few (e.g. someone who wants to do ML).
 * Increased network buffer size to 32kB (8192 * 4).
 * Switched timestamp precision from ms to us for *ALL* timestamps.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-13 22:05:08 +02:00
Toni Uhlig
aca1615dc1 OpenWrt packaging support.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-13 13:53:48 +02:00
Toni Uhlig
94aa02b298 nDPIsrvd-collectd: Stdout should be unbuffered.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-12 13:32:50 +02:00
Toni Uhlig
20ced3e636 nDPIsrvd-collectd: RRD Graph generation script and a basic static HTML5 website for viewing the generated image files.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-12 13:23:50 +02:00
Toni Uhlig
83409e5b79 Use CMake XCompile and collect host-triplet from ${CC}.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-11 16:50:56 +02:00
Toni Uhlig
3bc6627dcc nDPId: Removed thread_id nonsense as it does not provide any useful information and is not portable at all, not even on Linux systems.
* nDPId: Removed blocking I/O warning, which caused log spam.

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-10 23:11:03 +02:00
Toni Uhlig
7594180301 include fix
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-10 22:55:06 +02:00
Toni Uhlig
a992c79ab6 Fixed compilation warnings on linux32 platforms.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-10 22:45:12 +02:00
Toni Uhlig
6fe5d1da69 Do not use pthread_t as a numeric value. Some systems define pthread_t as a struct pointer.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-10 22:34:00 +02:00
Toni Uhlig
38c71af2f4 nDPIsrvd: Fixed a NULL pointer dereference during a logging attempt.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-10 16:46:25 +02:00
Toni Uhlig
ac2e5ed796 CI: fix minimum supported libnDPI version
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-06 16:56:35 +02:00
Toni Uhlig
f9bd7d29ce Bump libnDPI to 37f918322c0a489b5143a987c8f1a44a6f78a6f3 and updated flow json schema file.
* export env vars AR / CMAKE_C_COMPILER_AR and RANLIB / CMAKE_C_COMPILER_RANLIB while building libnDPI
 * nDPId checks the API version during startup (macro vs. function call) and prints a warning if they differ

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-06 14:50:46 +02:00
Toni Uhlig
c5c7d83c97 Added https://gitlab.com/verzulli/ndpid-rt-analyzer.git to examples.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-06 11:08:01 +02:00
Toni Uhlig
70f517b040 Merge branch 'main' of github.com:utoni/nDPId 2022-09-04 17:26:21 +02:00
Toni Uhlig
dcf78ad3ed Disable timestamp generation in nDPIsrvd-collectd as default.
* collectd's rrdtool write plugin silently fails with them (reason unknown)

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-09-04 17:24:10 +02:00
lns
d646ec5ab4 nDPId: Fixed fcntl() issue; invalid fcntl() flags were set after a blocking write.
* nDPId: improved collector socket error messages on connect/write/etc. failures
 * reverted `netcat` parts of the README

Signed-off-by: lns <matzeton@googlemail.com>
2022-08-29 15:29:07 +02:00
lns
dea30501a4 Add documentation about events and flow states.
Signed-off-by: lns <matzeton@googlemail.com>
2022-08-27 14:18:59 +02:00
lns
d9fadae718 nDPId: improved error messages if UNIX/UDP endpoint refuses connections/datagrams
Signed-off-by: lns <matzeton@googlemail.com>
2022-08-27 14:18:59 +02:00
Toni Uhlig
5e09a00062 nDPId: support for custom UDP endpoints
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-08-27 14:18:59 +02:00
lns
d0b0a50609 nDPId: improved error messages if UNIX/UDP endpoint refuses connections/datagrams
Signed-off-by: lns <matzeton@googlemail.com>
2022-08-27 13:04:17 +02:00
Toni Uhlig
e2e7c82d7f nDPId: support for custom UDP endpoints
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-08-27 13:04:17 +02:00
Toni Uhlig
0fd59f060e Split `*_l4_payload_len' into `*_src_l4_payload_len' and `*_dst_l4_payload_len'.
Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-08-15 22:55:19 +02:00
lns
905545487d Split `flow_packets_processed' into `flow_src_packets_processed' and `flow_dst_packets_processed'.
* no use for `flow_avg_l4_payload_len' -> removed
 * test/run_tests.sh does not fail if git worktrees are used

Signed-off-by: lns <matzeton@googlemail.com>
2022-08-15 18:36:49 +02:00
Toni Uhlig
2cb2c86cb5 c-collectd: fixed incorrect PUTVAL
* get rid of types.db

Signed-off-by: Toni Uhlig <matzeton@googlemail.com>
2022-08-15 16:42:59 +02:00
4634 changed files with 371462 additions and 75566 deletions

19
.circleci/config.yml Normal file

@@ -0,0 +1,19 @@
version: 2.1
jobs:
build:
docker:
- image: ubuntu:latest
steps:
- checkout
- run: export DEBIAN_FRONTEND=noninteractive
- run: apt-get update -qq
- run: |
env DEBIAN_FRONTEND=noninteractive \
apt-get install -y -qq \
coreutils wget git unzip make cmake binutils gcc g++ autoconf automake flex bison texinfo \
libtool pkg-config gettext libjson-c-dev flex bison libpcap-dev zlib1g-dev
- run: |
cmake -S . -B build -DENABLE_SYSTEMD=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON
- run: |
cmake --build build --verbose

38
.github/workflows/build-archlinux.yml vendored Normal file

@@ -0,0 +1,38 @@
name: ArchLinux PKGBUILD
on:
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
build:
runs-on: ubuntu-latest
env:
CMAKE_C_FLAGS: -Werror
steps:
- uses: actions/checkout@v4
with:
submodules: false
fetch-depth: 1
- name: Prepare for ArchLinux packaging
run: |
sudo chmod -R 0777 .
mv -v packages/archlinux packages/ndpid-testing
- uses: 2m/arch-pkgbuild-builder@v1.16
with:
debug: true
target: 'pkgbuild'
pkgname: 'packages/ndpid-testing'
- name: Upload PKG
uses: actions/upload-artifact@v4
with:
name: nDPId-archlinux-packages
path: packages/ndpid-testing/*.pkg.tar.zst

59
.github/workflows/build-centos.yml vendored Normal file

@@ -0,0 +1,59 @@
name: CentOs
on:
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
centos8:
runs-on: ubuntu-latest
container: 'centos:8'
steps:
- uses: actions/checkout@v4
with:
submodules: false
fetch-depth: 1
- name: Install CentOs Prerequisites
run: |
sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-*
sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-*
yum -y update
yum -y install curl gpg
curl 'https://packages.ntop.org/centos/ntop.repo' > /etc/yum.repos.d/ntop.repo
curl 'https://packages.ntop.org/centos/RPM-GPG-KEY-deri' | gpg --import
yum -y install yum-utils dnf-plugins-core epel-release
dnf config-manager --set-enabled powertools
yum -y update
yum -y install rpm-build gcc gcc-c++ autoconf automake make cmake flex bison gettext pkg-config libtool ndpi-dev libpcap-devel zlib-devel python3.8 git wget unzip /usr/lib64/libasan.so.5.0.0 /usr/lib64/libubsan.so.1.0.0
repoquery -l ndpi-dev
- name: Configure nDPId
run: |
mkdir build && cd build
cmake .. -DENABLE_SYSTEMD=ON -DBUILD_EXAMPLES=ON -DENABLE_SANITIZER=ON -DNDPI_NO_PKGCONFIG=ON -DSTATIC_LIBNDPI_INSTALLDIR=/usr
- name: Build nDPId
run: |
make -C build all VERBOSE=1
- name: CPack RPM
run: |
cd ./build && cpack -G RPM && cd ..
- name: Upload RPM
uses: actions/upload-artifact@v4
with:
name: nDPId-centos-packages
path: build/*.rpm
- name: Upload on Failure
uses: actions/upload-artifact@v4
if: failure()
with:
name: autoconf-config-log
path: |
build/CMakeCache.txt
libnDPI/config.log

25
.github/workflows/build-docker.yml vendored Normal file

@@ -0,0 +1,25 @@
name: Docker Build
on:
push:
branches:
- 'main'
jobs:
docker:
runs-on: ubuntu-latest
steps:
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and push
uses: docker/build-push-action@v4
with:
push: true
tags: utoni/ndpid:latest

39
.github/workflows/build-freebsd.yml vendored Normal file

@@ -0,0 +1,39 @@
name: FreeBSD Build
on:
schedule:
# At the end of every day
- cron: '0 0 * * *'
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
test:
runs-on: ubuntu-latest
name: Build and Test
steps:
- uses: actions/checkout@v4
- name: Test in FreeBSD
id: test
uses: vmactions/freebsd-vm@main
with:
usesh: true
prepare: |
pkg install -y bash autoconf automake cmake gmake libtool gettext pkgconf gcc \
git wget unzip flock \
json-c flex bison libpcap curl openssl dbus
run: |
echo "Working Directory: $(pwd)"
echo "User.............: $(whoami)"
echo "FreeBSD Version..: $(freebsd-version)"
# TODO: Make examples I/O event agnostic i.e. use nio
cmake -S . -B build -DBUILD_NDPI=ON -DBUILD_EXAMPLES=OFF #-DENABLE_CURL=ON -DENABLE_DBUS=ON
cmake --build build

54
.github/workflows/build-openwrt.yml vendored Normal file

@@ -0,0 +1,54 @@
name: OpenWrt Build
on:
schedule:
# At the end of every day
- cron: '0 0 * * *'
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
build:
name: ${{ matrix.arch }} ${{ matrix.target }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
include:
- arch: arm_cortex-a9_vfpv3-d16
target: mvebu-cortexa9
- arch: arm_cortex-a15_neon-vfpv4
target: armvirt-32
- arch: x86_64
target: x86-64
steps:
- uses: actions/checkout@v4
with:
submodules: false
fetch-depth: 1
- name: Build
uses: openwrt/gh-action-sdk@main
env:
ARCH: ${{ matrix.arch }}-snapshot
FEED_DIR: ${{ github.workspace }}/packages/openwrt
FEEDNAME: ndpid_openwrt_packages_ci
PACKAGES: nDPId-testing
V: s
- name: Store packages
uses: actions/upload-artifact@v4
with:
name: nDPId-${{ matrix.arch}}-${{ matrix.target }}
path: bin/packages/${{ matrix.arch }}/ndpid_openwrt_packages_ci/*.ipk

45
.github/workflows/build-rpm.yml vendored Normal file

@@ -0,0 +1,45 @@
name: RPM Build
on:
schedule:
# At the end of every day
- cron: '0 0 * * *'
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install Ubuntu Prerequisites
run: |
sudo apt-get update
sudo apt-get install fakeroot alien autoconf automake cmake libtool pkg-config gettext libjson-c-dev flex bison libpcap-dev zlib1g-dev libcurl4-openssl-dev libdbus-1-dev
- name: Build RPM package
run: |
cmake -S . -B build-rpm -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DCMAKE_BUILD_TYPE=Release
cmake --build build-rpm --parallel
cd build-rpm
cpack -G RPM
cd ..
- name: Convert/Install RPM package
run: |
fakeroot alien --scripts --to-deb --verbose ./build-rpm/nDPId-*.rpm
sudo dpkg -i ./ndpid_*.deb
- name: Upload RPM
uses: actions/upload-artifact@v4
with:
name: nDPId-rpm-packages
path: build-rpm/*.rpm


@@ -1,88 +1,409 @@
name: Build
on:
schedule:
# At the end of every day
- cron: '0 0 * * *'
push:
branches:
- main
- tmp
pull_request:
branches:
- main
types: [opened, synchronize, reopened]
release:
types: [created]
jobs:
test:
name: ${{ matrix.os }} ${{ matrix.gcrypt }}
name: ${{ matrix.os }} ${{ matrix.compiler }}
runs-on: ${{ matrix.os }}
env:
CMAKE_C_COMPILER: ${{ matrix.compiler }}
CMAKE_C_FLAGS: -Werror
CMAKE_C_FLAGS: -Werror ${{ matrix.cflags }}
CMAKE_C_EXE_LINKER_FLAGS: ${{ matrix.ldflags }}
CMAKE_MODULE_LINKER_FLAGS: ${{ matrix.ldflags }}
DYLD_LIBRARY_PATH: /usr/local/lib
strategy:
fail-fast: true
matrix:
os: ["ubuntu-latest", "ubuntu-18.04"]
ndpid_gcrypt: ["-DNDPI_WITH_GCRYPT=OFF", "-DNDPI_WITH_GCRYPT=ON"]
ndpid_zlib: ["-DENABLE_ZLIB=OFF", "-DENABLE_ZLIB=ON"]
ndpi_min_version: ["4.4"]
include:
- compiler: "default-cc"
- compiler: "gcc"
os: "ubuntu-latest"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: "-DBUILD_RUST_EXAMPLES=ON"
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: "-DENABLE_CRYPTO=ON"
sanitizer: "-DENABLE_SANITIZER=OFF -DENABLE_SANITIZER_THREAD=OFF"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=OFF"
upload: true
upload_suffix: ""
ndpi_min_version: "5.0"
- compiler: "gcc"
os: "ubuntu-latest"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=ON"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: "-DENABLE_CRYPTO=ON -DNDPI_WITH_MAXMINDDB=ON -DNDPI_WITH_PCRE=ON -DENABLE_MEMORY_PROFILING=ON"
sanitizer: "-DENABLE_SANITIZER=OFF -DENABLE_SANITIZER_THREAD=OFF"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=OFF"
upload: true
upload_suffix: "-host-gcrypt"
ndpi_min_version: "5.0"
- compiler: "clang"
os: "ubuntu-latest"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=OFF"
ndpid_extras: ""
sanitizer: "-DENABLE_SANITIZER=OFF -DENABLE_SANITIZER_THREAD=OFF"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=OFF"
upload: true
upload_suffix: "-no-zlib"
ndpi_min_version: "5.0"
- compiler: "gcc"
os: "ubuntu-latest"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: ""
sanitizer: "-DENABLE_SANITIZER=ON"
coverage: "-DENABLE_COVERAGE=ON"
poll: "-DFORCE_POLL=ON"
upload: false
ndpi_min_version: "5.0"
- compiler: "clang"
os: "ubuntu-latest"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: "-DENABLE_CRYPTO=ON"
sanitizer: "-DENABLE_SANITIZER=ON"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=OFF"
upload: false
ndpi_min_version: "5.0"
- compiler: "clang-12"
os: "ubuntu-latest"
os: "ubuntu-22.04"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: ""
sanitizer: "-DENABLE_SANITIZER_THREAD=ON"
coverage: "-DENABLE_COVERAGE=OFF"
poll:
upload: false
ndpi_min_version: "5.0"
- compiler: "gcc-10"
os: "ubuntu-latest"
- compiler: "gcc-7"
os: "ubuntu-latest"
os: "ubuntu-22.04"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=OFF"
ndpid_extras: ""
sanitizer: "-DENABLE_SANITIZER=ON"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=ON"
upload: false
ndpi_min_version: "5.0"
- compiler: "gcc-9"
os: "ubuntu-22.04"
ndpi_build: "-DBUILD_NDPI=ON"
ndpid_examples: "-DBUILD_EXAMPLES=ON"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: ""
sanitizer: "-DENABLE_SANITIZER=ON"
coverage: "-DENABLE_COVERAGE=OFF"
poll: "-DFORCE_POLL=OFF"
upload: false
ndpi_min_version: "5.0"
- compiler: "cc"
os: "macOS-13"
ndpi_build: "-DBUILD_NDPI=OFF"
ndpid_examples: "-DBUILD_EXAMPLES=OFF"
ndpid_rust_examples: ""
ndpid_gcrypt: "-DNDPI_WITH_GCRYPT=OFF"
ndpid_zlib: "-DENABLE_ZLIB=ON"
ndpid_extras: ""
examples: "-DBUILD_EXAMPLES=OFF"
sanitizer: "-DENABLE_SANITIZER=OFF"
coverage: "-DENABLE_COVERAGE=OFF"
poll:
upload: false
ndpi_min_version: "5.0"
steps:
- uses: actions/checkout@v2
- name: Print Matrix
run: |
echo '----------------------------------------'
echo '| OS.......: ${{ matrix.os }}'
echo '| CC.......: ${{ matrix.compiler }}'
echo "| CFLAGS...: $CMAKE_C_FLAGS"
echo "| LDFLAGS..: $CMAKE_C_EXE_LINKER_FLAGS"
echo '|---------------------------------------'
echo '| nDPI min.: ${{ matrix.ndpi_min_version }}'
echo '| GCRYPT...: ${{ matrix.ndpid_gcrypt }}'
echo '| ZLIB.....: ${{ matrix.ndpid_zlib }}'
echo '| Extras...: ${{ matrix.ndpid_extras }}'
echo '| ForcePoll: ${{ matrix.poll }}'
echo '|---------------------------------------'
echo '| SANITIZER: ${{ matrix.sanitizer }}'
echo '| COVERAGE.: ${{ matrix.coverage }}'
echo '|---------------------------------------'
echo '| UPLOAD...: ${{ matrix.upload }}'
echo '----------------------------------------'
- uses: actions/checkout@v4
with:
fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis
submodules: false
fetch-depth: 1
- name: Install MacOS Prerequisites
if: startsWith(matrix.os, 'macOS')
run: |
brew install coreutils automake make unzip
wget 'https://www.tcpdump.org/release/libpcap-1.10.4.tar.gz'
tar -xzvf libpcap-1.10.4.tar.gz
cd libpcap-1.10.4
./configure && make install
cd ..
wget 'https://github.com/ntop/nDPI/archive/refs/heads/dev.zip' -O libndpi-dev.zip
unzip libndpi-dev.zip
cd nDPI-dev
./autogen.sh
./configure --prefix=/usr/local --with-only-libndpi && make install
- name: Fix kernel mmap rnd bits on Ubuntu
if: startsWith(matrix.os, 'ubuntu')
run: |
# Workaround for compatibility between latest kernel and sanitizer
# See https://github.com/actions/runner-images/issues/9491
sudo sysctl vm.mmap_rnd_bits=28
- name: Install Ubuntu Prerequisites
if: startsWith(matrix.os, 'ubuntu')
run: |
sudo apt-get update
sudo apt-get install autoconf automake cmake libtool pkg-config gettext libjson-c-dev flex bison libpcap-dev zlib1g-dev
sudo apt-get install ${{ matrix.compiler }} lcov
sudo apt-get install autoconf automake cmake libtool pkg-config gettext libjson-c-dev flex bison libpcap-dev zlib1g-dev libcurl4-openssl-dev libdbus-1-dev
sudo apt-get install ${{ matrix.compiler }} lcov iproute2
- name: Install Ubuntu Prerequisites (Rust/Cargo)
if: startsWith(matrix.os, 'ubuntu') && startsWith(matrix.ndpid_rust_examples, '-DBUILD_RUST_EXAMPLES=ON')
run: |
sudo apt-get install cargo
- name: Install Ubuntu Prerequisites (libgcrypt)
if: startsWith(matrix.os, 'ubuntu') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=ON')
run: |
sudo apt-get install libgcrypt20-dev
- name: Install Ubuntu Prerequisities (zlib)
- name: Install Ubuntu Prerequisites (zlib)
if: startsWith(matrix.os, 'ubuntu') && startsWith(matrix.ndpid_zlib, '-DENABLE_ZLIB=ON')
run: |
sudo apt-get install zlib1g-dev
- name: Install Ubuntu Prerequisites (libmaxminddb, libpcre2)
if: startsWith(matrix.ndpid_extras, '-D')
run: |
sudo apt-get install libmaxminddb-dev libpcre2-dev
- name: Install Ubuntu Prerequisites (libnl-genl-3-dev)
if: startsWith(matrix.ndpi_build, '-DBUILD_NDPI=ON') && startsWith(matrix.coverage, '-DENABLE_COVERAGE=OFF') && startsWith(matrix.sanitizer, '-DENABLE_SANITIZER=ON') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF') && startsWith(matrix.ndpid_zlib, '-DENABLE_ZLIB=ON')
run: |
sudo apt-get install libnl-genl-3-dev
- name: Checking Network Buffer Size
run: |
C_VAL=$(cat config.h | sed -n 's/^#define\s\+NETWORK_BUFFER_MAX_SIZE\s\+\([0-9]\+\).*$/\1/gp')
PY_VAL=$(cat dependencies/nDPIsrvd.py | sed -n 's/^NETWORK_BUFFER_MAX_SIZE = \([0-9]\+\).*$/\1/gp')
test ${C_VAL} = ${PY_VAL}
- name: Configure nDPId
run: |
mkdir build && cd build
cmake .. -DENABLE_COVERAGE=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DENABLE_SANITIZER=ON ${{ matrix.ndpid_zlib }} ${{ matrix.ndpid_gcrypt }}
cmake -S . -B build -Werror=dev -Werror=deprecated -DCMAKE_C_COMPILER="$CMAKE_C_COMPILER" -DCMAKE_C_FLAGS="$CMAKE_C_FLAGS" -DCMAKE_MODULE_LINKER_FLAGS="$CMAKE_MODULE_LINKER_FLAGS" -DCMAKE_C_EXE_LINKER_FLAGS="$CMAKE_C_EXE_LINKER_FLAGS" \
-DENABLE_DBUS=ON -DENABLE_CURL=ON -DENABLE_SYSTEMD=ON \
${{ matrix.poll }} ${{ matrix.coverage }} ${{ matrix.sanitizer }} ${{ matrix.ndpi_build }} \
${{ matrix.ndpid_examples }} ${{ matrix.ndpid_rust_examples }} ${{ matrix.ndpid_zlib }} ${{ matrix.ndpid_gcrypt }} ${{ matrix.ndpid_extras }}
- name: Build nDPId
run: |
make -C build all VERBOSE=1
cmake --build build --verbose
- name: Build single nDPId/nDPIsrvd executables (invoke CC directly - dynamic nDPI lib)
if: startsWith(matrix.ndpi_build, '-DBUILD_NDPI=OFF') && startsWith(matrix.coverage, '-DENABLE_COVERAGE=OFF') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF')
run: |
pkg-config --cflags --libs libndpi
cc -Wall -Wextra -std=gnu99 \
${{ matrix.poll }} -DENABLE_MEMORY_PROFILING=1 \
nDPId.c nio.c utils.c \
$(pkg-config --cflags libndpi) -I. -I./dependencies -I./dependencies/jsmn -I./dependencies/uthash/include \
-o /tmp/a.out \
-lpcap $(pkg-config --libs libndpi) -pthread -lm
cc -Wall -Wextra -std=gnu99 \
${{ matrix.poll }} -DENABLE_MEMORY_PROFILING=1 \
nDPIsrvd.c nio.c utils.c \
-I. -I./dependencies -I./dependencies/jsmn -I./dependencies/uthash/include \
-o /tmp/a.out
- name: Build single nDPId/nDPIsrvd executables (invoke CC directly - static nDPI lib)
if: startsWith(matrix.ndpi_build, '-DBUILD_NDPI=ON') && startsWith(matrix.coverage, '-DENABLE_COVERAGE=OFF') && startsWith(matrix.sanitizer, '-DENABLE_SANITIZER=ON') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF') && startsWith(matrix.ndpid_zlib, '-DENABLE_ZLIB=ON')
run: |
cc -Wall -Wextra -std=gnu99 ${{ matrix.poll }} -DENABLE_ZLIB=1 -DENABLE_MEMORY_PROFILING=1 \
-fsanitize=address -fsanitize=undefined -fno-sanitize=alignment -fsanitize=enum -fsanitize=leak \
nDPId.c nio.c utils.c \
-I./build/libnDPI/include/ndpi -I. -I./dependencies -I./dependencies/jsmn -I./dependencies/uthash/include \
-o /tmp/a.out \
-lpcap ./build/libnDPI/lib/libndpi.a -pthread -lm -lz
- name: Test EXEC
run: |
./build/nDPId-test || test $? -eq 1
./build/nDPId-test
./build/nDPId -h || test $? -eq 1
./build/nDPIsrvd -h || test $? -eq 1
- name: Test DIFF
if: startsWith(matrix.os, 'ubuntu') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF')
if: startsWith(matrix.os, 'macOS') == false && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF')
run: |
./test/run_tests.sh ./libnDPI ./build/nDPId-test
./test/run_config_tests.sh ./libnDPI ./build/nDPId-test
- name: Daemon
if: startsWith(matrix.compiler, 'gcc') || endsWith(matrix.compiler, 'clang')
run: |
make -C ./build daemon VERBOSE=1
make -C ./build daemon VERBOSE=1
- name: Coverage
if: startsWith(matrix.coverage, '-DENABLE_COVERAGE=ON')
run: |
make -C ./build coverage
- name: Dist
if: startsWith(matrix.os, 'macOS') == false && matrix.upload == false
run: |
make -C ./build dist
RAND_ID=$(( ( RANDOM ) + 1 ))
mkdir "nDPId-dist-${RAND_ID}"
cd "nDPId-dist-${RAND_ID}"
tar -xjf ../nDPId-*.tar.bz2
cd ./nDPId-*
cmake -S . -B ./build \
-DENABLE_DBUS=ON -DENABLE_CURL=ON -DENABLE_SYSTEMD=ON \
${{ matrix.poll }} ${{ matrix.coverage }} ${{ matrix.sanitizer }} ${{ matrix.ndpi_build }} \
${{ matrix.ndpid_examples }} ${{ matrix.ndpid_rust_examples }} ${{ matrix.ndpid_zlib }} ${{ matrix.ndpid_gcrypt }} ${{ matrix.ndpid_extras }}
cd ../..
rm -rf "nDPId-dist-${RAND_ID}"
- name: CPack DEB
if: startsWith(matrix.os, 'macOS') == false
run: |
cd ./build && cpack -G DEB && cd ..
cd ./build && cpack -G DEB && \
sudo dpkg -i nDPId-*.deb && \
sudo apt purge ndpid && \
sudo dpkg -i nDPId-*.deb && cd ..
- name: Upload DEB
if: startsWith(matrix.os, 'macOS') == false && matrix.upload
uses: actions/upload-artifact@v4
with:
name: nDPId-debian-packages_${{ matrix.compiler }}${{ matrix.upload_suffix }}
path: build/*.deb
- name: Test systemd
if: startsWith(matrix.os, 'ubuntu') && startsWith(matrix.compiler, 'gcc')
run: |
ip -c address
sudo systemctl daemon-reload
sudo systemctl enable ndpid@lo
sudo systemctl start ndpid@lo
SYSTEMCTL_RET=3; while (( $SYSTEMCTL_RET == 3 )); do systemctl is-active ndpid@lo.service; SYSTEMCTL_RET=$?; sleep 1; done
sudo systemctl status ndpisrvd.service ndpid@lo.service
sudo systemctl show ndpisrvd.service ndpid@lo.service -p SubState,ActiveState
sudo dpkg -i ./build/nDPId-*.deb
sudo systemctl status ndpisrvd.service ndpid@lo.service
sudo systemctl show ndpisrvd.service ndpid@lo.service -p SubState,ActiveState
sudo systemctl stop ndpisrvd.service
journalctl --no-tail --no-pager -u ndpisrvd.service -u ndpid@lo.service
- name: Build PF_RING and nDPId (invoke CC directly - dynamic nDPI lib)
if: startsWith(matrix.ndpi_build, '-DBUILD_NDPI=ON') && startsWith(matrix.coverage, '-DENABLE_COVERAGE=OFF') && startsWith(matrix.sanitizer, '-DENABLE_SANITIZER=ON') && startsWith(matrix.ndpid_gcrypt, '-DNDPI_WITH_GCRYPT=OFF') && startsWith(matrix.ndpid_zlib, '-DENABLE_ZLIB=ON')
run: |
git clone --depth=1 https://github.com/ntop/PF_RING.git
cd PF_RING/userland && ./configure && make && sudo make install prefix=/usr
cd ../..
cc -Wall -Wextra -std=gnu99 ${{ matrix.poll }} -DENABLE_PFRING=1 -DENABLE_ZLIB=1 -DENABLE_MEMORY_PROFILING=1 \
-fsanitize=address -fsanitize=undefined -fno-sanitize=alignment -fsanitize=enum -fsanitize=leak \
nDPId.c npfring.c nio.c utils.c \
-I. -I./dependencies -I./dependencies/jsmn -I./dependencies/uthash/include \
-I./build/libnDPI/include/ndpi \
-I./PF_RING/userland/lib -I./PF_RING/kernel \
-o /tmp/a.out \
-ldl /usr/lib/libpfring.a -lpcap ./build/libnDPI/lib/libndpi.a -pthread -lm -lz
- name: Build against libnDPI-${{ matrix.ndpi_min_version }}
if: startsWith(matrix.os, 'ubuntu')
run: |
mkdir build-local-ndpi && cd build-local-ndpi
wget 'https://github.com/ntop/nDPI/archive/refs/tags/${{ matrix.ndpi_min_version }}.tar.gz'
tar -xzvf ${{ matrix.ndpi_min_version }}.tar.gz && cd nDPI-${{ matrix.ndpi_min_version }} && ./autogen.sh --prefix=/usr --with-only-libndpi CC=${{ matrix.compiler }} CXX=false CFLAGS='-Werror' && sudo make install && cd ..
cmake .. -DENABLE_COVERAGE=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=OFF -DENABLE_SANITIZER=ON ${{ matrix.ndpi_min_version }}
make all VERBOSE=1
WGET_RET=0
wget 'https://github.com/ntop/nDPI/archive/refs/tags/${{ matrix.ndpi_min_version }}.tar.gz' || { WGET_RET=$?; true; }
echo "wget returned: ${WGET_RET}"
test $WGET_RET -ne 8 && { \
tar -xzvf ${{ matrix.ndpi_min_version }}.tar.gz; }
test $WGET_RET -ne 8 || { \
echo "::warning file=nDPId.c::New libnDPI release required to build against release tarball, falling back to dev branch."; \
wget 'http://github.com/ntop/nDPI/archive/refs/heads/dev.tar.gz'; \
WGET_RET=$?; \
tar -xzvf dev.tar.gz; \
mv -v 'nDPI-dev' 'nDPI-${{ matrix.ndpi_min_version }}'; }
test $WGET_RET -ne 0 || { cd nDPI-${{ matrix.ndpi_min_version }}; \
NDPI_CONFIGURE_ARGS=''; \
test 'x${{ matrix.ndpid_gcrypt }}' != 'x-DNDPI_WITH_GCRYPT=ON' || NDPI_CONFIGURE_ARGS="$NDPI_CONFIGURE_ARGS --with-local-libgcrypt"; \
test 'x${{ matrix.sanitizer }}' != 'x-DENABLE_SANITIZER=ON' || NDPI_CONFIGURE_ARGS="$NDPI_CONFIGURE_ARGS --with-sanitizer"; \
echo "Configure arguments: '$NDPI_CONFIGURE_ARGS'"; \
./autogen.sh; \
./configure --prefix=/usr --with-only-libndpi $NDPI_CONFIGURE_ARGS CC="${{ matrix.compiler }}" CXX=false \
CFLAGS="$CMAKE_C_FLAGS" && make && sudo make install; cd ..; }
ls -alhR /usr/include/ndpi
cd ..
test $WGET_RET -ne 0 || { echo "Running CMake.. (pkgconfig)"; \
cmake -S . -B ./build-local-pkgconfig \
-DCMAKE_C_COMPILER="$CMAKE_C_COMPILER" -DCMAKE_C_FLAGS="$CMAKE_C_FLAGS" \
-DCMAKE_C_EXE_LINKER_FLAGS="$CMAKE_C_EXE_LINKER_FLAGS" \
-DBUILD_NDPI=OFF -DBUILD_EXAMPLES=ON \
-DENABLE_DBUS=ON -DENABLE_CURL=ON -DENABLE_SYSTEMD=ON \
${{ matrix.poll }} ${{ matrix.coverage }} \
${{ matrix.sanitizer }} ${{ matrix.ndpid_examples }} ${{ matrix.ndpid_rust_examples }}; }
test $WGET_RET -ne 0 || { echo "Running Make.. (pkgconfig)"; \
cmake --build ./build-local-pkgconfig --verbose; }
test $WGET_RET -ne 0 || { echo "Testing Executable.. (pkgconfig)"; \
./build-local-pkgconfig/nDPId-test; \
./build-local-pkgconfig/nDPId -h || test $? -eq 1; \
./build-local-pkgconfig/nDPIsrvd -h || test $? -eq 1; }
test $WGET_RET -ne 0 || { echo "Running CMake.. (static)"; \
cmake -S . -B ./build-local-static \
-DCMAKE_C_COMPILER="$CMAKE_C_COMPILER" -DCMAKE_C_FLAGS="$CMAKE_C_FLAGS" \
-DCMAKE_C_EXE_LINKER_FLAGS="$CMAKE_C_EXE_LINKER_FLAGS" \
-DBUILD_NDPI=OFF -DBUILD_EXAMPLES=ON \
-DENABLE_DBUS=ON -DENABLE_CURL=ON -DENABLE_SYSTEMD=ON \
-DNDPI_NO_PKGCONFIG=ON -DSTATIC_LIBNDPI_INSTALLDIR=/usr \
${{ matrix.poll }} ${{ matrix.coverage }} ${{ matrix.ndpid_gcrypt }} \
${{ matrix.sanitizer }} ${{ matrix.ndpid_examples }} ${{ matrix.ndpid_rust_examples }}; }
test $WGET_RET -ne 0 || { echo "Running Make.. (static)"; \
cmake --build ./build-local-static --verbose; }
test $WGET_RET -ne 0 || { echo "Testing Executable.. (static)"; \
./build-local-static/nDPId-test; \
./build-local-static/nDPId -h || test $? -eq 1; \
./build-local-static/nDPIsrvd -h || test $? -eq 1; }
test $WGET_RET -ne 0 || test ! -d ./PF_RING || { echo "Running CMake.. (PF_RING)"; \
cmake -S . -B ./build-local-pfring \
-DCMAKE_C_COMPILER="$CMAKE_C_COMPILER" -DCMAKE_C_FLAGS="$CMAKE_C_FLAGS" \
-DCMAKE_C_EXE_LINKER_FLAGS="$CMAKE_C_EXE_LINKER_FLAGS" \
-DBUILD_NDPI=OFF -DBUILD_EXAMPLES=ON -DENABLE_PFRING=ON \
-DENABLE_DBUS=ON -DENABLE_CURL=ON -DENABLE_SYSTEMD=ON \
-DNDPI_NO_PKGCONFIG=ON -DSTATIC_LIBNDPI_INSTALLDIR=/usr \
-DPFRING_LINK_STATIC=OFF \
-DPFRING_INSTALLDIR=/usr -DPFRING_KERNEL_INC="$(realpath ./PF_RING/kernel)" \
${{ matrix.poll }} ${{ matrix.coverage }} ${{ matrix.ndpid_gcrypt }} \
${{ matrix.sanitizer }} ${{ matrix.ndpid_examples }} ${{ matrix.ndpid_rust_examples }}; }
test $WGET_RET -ne 0 || test ! -d ./PF_RING || { echo "Running Make.. (PF_RING)"; \
cmake --build ./build-local-pfring --verbose; }
test $WGET_RET -ne 0 || test ! -d ./PF_RING || { echo "Testing Executable.. (PF_RING)"; \
./build-local-pfring/nDPId-test; \
./build-local-pfring/nDPId -h || test $? -eq 1; \
./build-local-pfring/nDPIsrvd -h || test $? -eq 1; }
test $WGET_RET -eq 0 -o $WGET_RET -eq 8

66
.github/workflows/sonarcloud.yml vendored Normal file

@@ -0,0 +1,66 @@
on:
push:
branches:
- main
- tmp
pull_request:
types: [opened, synchronize, reopened]
name: Sonarcloud Scan
jobs:
sonarcloud:
runs-on: ubuntu-latest
env:
BUILD_WRAPPER_OUT_DIR: build_wrapper_output_directory
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python 3.8 for gcovr
uses: actions/setup-python@v4
with:
python-version: 3.8
- name: install gcovr 5.0
run: |
pip install gcovr==5.0 # 5.1 is not supported
- name: Install sonar-scanner and build-wrapper
uses: SonarSource/sonarcloud-github-c-cpp@v3.2.0
- name: Install Prerequisites
run: |
sudo apt-get update
sudo apt-get install autoconf automake cmake lcov \
libtool pkg-config gettext \
libjson-c-dev flex bison \
libcurl4-openssl-dev libpcap-dev zlib1g-dev
- name: Run build-wrapper
run: |
build-wrapper-linux-x86-64 --out-dir ${{ env.BUILD_WRAPPER_OUT_DIR }} ./scripts/build-sonarcloud.sh
- name: Run tests
run: |
for file in $(ls libnDPI/tests/cfgs/*/pcap/*.pcap libnDPI/tests/cfgs/*/pcap/*.pcapng libnDPI/tests/cfgs/*/pcap/*.cap); do \
echo -n "${file} "; \
cd ./build-sonarcloud; \
./nDPId-test "../${file}" >/dev/null 2>/dev/null; \
cd ..; \
echo "[ok]"; \
done
mkdir -p gcov_report
cd gcov_report
gcov ../build-sonarcloud/CMakeFiles/nDPId-test.dir/nDPId-test.c.o
cd ..
- name: Run sonar-scanner
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
run: |
sonar-scanner \
--define sonar.projectName=nDPId \
--define sonar.projectVersion=1.7 \
--define sonar.sourceEncoding=UTF-8 \
--define sonar.branch.name=${GITHUB_HEAD_REF:-${GITHUB_REF#refs/heads/}} \
--define sonar.organization=lnslbrty \
--define sonar.projectKey=lnslbrty_nDPId \
--define sonar.python.version=3.8 \
--define sonar.cfamily.compile-commands=${{ env.BUILD_WRAPPER_OUT_DIR }}/compile_commands.json \
--define sonar.cfamily.gcov.reportsPath=gcov_report \
--define sonar.exclusions=build-sonarcloud/**,libnDPI/**,test/results/**,dependencies/jsmn/**,dependencies/uthash/**,examples/js-rt-analyzer-frontend/**,examples/js-rt-analyzer/**,examples/c-collectd/www/**,examples/py-flow-dashboard/assets/**


@@ -3,47 +3,126 @@ image: debian:stable
stages:
- build_and_test
variables:
GIT_CLONE_PATH: '$CI_BUILDS_DIR/$CI_JOB_ID/$CI_PROJECT_NAME'
before_script:
- export DEBIAN_FRONTEND=noninteractive
- apt-get update -qq
- >
apt-get install -y -qq \
coreutils sudo \
build-essential make cmake binutils gcc autoconf automake \
libtool pkg-config git \
build-essential make cmake binutils gcc clang autoconf automake \
libtool pkg-config git wget unzip \
libpcap-dev libgpg-error-dev libjson-c-dev zlib1g-dev \
netcat-openbsd python3 python3-jsonschema tree lcov
netcat-openbsd python3 python3-jsonschema tree lcov iproute2
after_script:
- cat /tmp/nDPIsrvd.log
- cat /tmp/nDPId.log
- test -r /tmp/nDPIsrvd.log && cat /tmp/nDPIsrvd.log
- test -r /tmp/nDPId.log && cat /tmp/nDPId.log
build_and_test:
build_and_test_static_libndpi_tsan:
script:
# test for NETWORK_BUFFER_MAX_SIZE C and Python value equality
- C_VAL=$(cat config.h | sed -n 's/^#define\s\+NETWORK_BUFFER_MAX_SIZE\s\+\([0-9]\+\).*$/\1/gp')
- PY_VAL=$(cat dependencies/nDPIsrvd.py | sed -n 's/^NETWORK_BUFFER_MAX_SIZE = \([0-9]\+\).*$/\1/gp')
- test ${C_VAL} = ${PY_VAL}
# test for nDPId_PACKETS_PLEN_MAX C and Python value equality
- C_VAL=$(cat config.h | sed -n 's/^#define\s\+nDPId_PACKETS_PLEN_MAX\s\+\([0-9]\+\).*$/\1/gp')
- PY_VAL=$(cat dependencies/nDPIsrvd.py | sed -n 's/^nDPId_PACKETS_PLEN_MAX = \([0-9]\+\).*$/\1/gp')
- test ${C_VAL} = ${PY_VAL}
# static linked build
- mkdir build-clang-tsan
- cd build-clang-tsan
- env CMAKE_C_FLAGS='-Werror' CMAKE_C_COMPILER='clang' cmake .. -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DBUILD_NDPI_FORCE_GIT_UPDATE=ON -DENABLE_SANITIZER_THREAD=ON -DENABLE_ZLIB=ON
- make clean-libnDPI
- make libnDPI
- tree libnDPI
- make install VERBOSE=1 DESTDIR="$(realpath ../_install)"
- cd ..
- ./_install/usr/local/bin/nDPId-test
- ./test/run_tests.sh ./libnDPI ./_install/usr/local/bin/nDPId-test
artifacts:
expire_in: 1 week
paths:
- _install/
stage: build_and_test
build_and_test_static_libndpi:
script:
- mkdir build-cmake-submodule
- cd build-cmake-submodule
- env CMAKE_C_FLAGS='-Werror' cmake .. -DENABLE_COVERAGE=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DENABLE_SANITIZER=ON -DENABLE_ZLIB=ON
- env CMAKE_C_FLAGS='-Werror' cmake .. -DENABLE_SYSTEMD=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DBUILD_NDPI_FORCE_GIT_UPDATE=ON -DENABLE_ZLIB=ON
- make clean-libnDPI
- make libnDPI
- tree libnDPI
- make install VERBOSE=1 DESTDIR="$(realpath ../_install)"
- cpack -G DEB
- sudo dpkg -i nDPId-*.deb
- cd ..
- test -x /bin/systemctl && sudo systemctl daemon-reload
- test -x /bin/systemctl && sudo systemctl enable ndpid@lo
- test -x /bin/systemctl && sudo systemctl start ndpid@lo
- test -x /bin/systemctl && sudo systemctl status ndpisrvd.service ndpid@lo.service
- test -x /bin/systemctl && sudo systemctl stop ndpid@lo
- ./build-cmake-submodule/nDPId-test
- ./test/run_tests.sh ./libnDPI ./build-cmake-submodule/nDPId-test
- >
if ldd ./build-cmake-submodule/nDPId | grep -qoEi libndpi; then \
echo 'nDPId linked against a static libnDPI should not contain a shared linked libnDPI.' >&2; false; fi
- cc -Wall -Wextra -std=gnu99 nDPId.c nio.c utils.c -I./build-cmake-submodule/libnDPI/include/ndpi -I. -I./dependencies -I./dependencies/jsmn -I./dependencies/uthash/include -o /tmp/a.out -lpcap ./build-cmake-submodule/libnDPI/lib/libndpi.a -pthread -lm -lz
artifacts:
expire_in: 1 week
paths:
- build-cmake-submodule/*.deb
- _install/
stage: build_and_test
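The `ldd | grep` guard in this job (a binary linked against the static libnDPI must not also pull in a shared libndpi) can be sketched as a small helper. This is an illustrative sketch only; the function name is hypothetical and it assumes a Linux host with `ldd` available, just like the CI runner.

```python
# Hedged sketch of the static-linkage guard: mirrors the CI's
# `ldd ./nDPId | grep -qoEi libndpi` pipeline. The helper name is
# illustrative and not part of the repository.
import re
import subprocess

def links_shared_libndpi(binary_path):
    # ldd prints one "libfoo.so => /path" line per shared dependency;
    # any mention of libndpi means the static build leaked a shared dep.
    out = subprocess.run(['ldd', binary_path], capture_output=True, text=True)
    return re.search(r'libndpi', out.stdout, re.IGNORECASE) is not None
```

Used in CI, a `True` result would fail the job, catching the case where the linker silently preferred a system-wide shared libndpi over the freshly built static archive.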
build_and_test_static_libndpi_coverage:
script:
- mkdir build-cmake-submodule
- cd build-cmake-submodule
- env CMAKE_C_FLAGS='-Werror' cmake .. -DENABLE_SYSTEMD=ON -DENABLE_COVERAGE=ON -DBUILD_EXAMPLES=ON -DBUILD_NDPI=ON -DBUILD_NDPI_FORCE_GIT_UPDATE=ON -DENABLE_SANITIZER=ON -DENABLE_ZLIB=ON
- make clean-libnDPI
- make libnDPI
- tree libnDPI
- make install VERBOSE=1 DESTDIR="$(realpath ../_install)"
- cd ..
- ./build-cmake-submodule/nDPId-test
- ./test/run_tests.sh ./libnDPI ./build-cmake-submodule/nDPId-test
# generate coverage report
- make -C ./build-cmake-submodule coverage
- make -C ./build-cmake-submodule coverage || true
- >
if ldd build/nDPId | grep -qoEi libndpi; then \
echo 'nDPId linked against a static libnDPI should not contain a shared linked libnDPI.' >&2; false; fi
artifacts:
expire_in: 1 week
paths:
- build-cmake-submodule/coverage_report
- _install/
stage: build_and_test
build_dynamic_libndpi:
script:
# pkg-config dynamic linked build
- git clone https://github.com/ntop/nDPI.git
- cd nDPI
- ./autogen.sh
- ./configure --prefix="$(realpath ../_install)" --enable-option-checking=fatal
- make install V=s
- cd ..
- tree ./_install
- mkdir build
- cd build
- export PKG_CONFIG_PATH="$(realpath ../build-cmake-submodule/libnDPI/lib/pkgconfig)"
- export CMAKE_PREFIX_PATH="$(realpath ../_install)"
- env CMAKE_C_FLAGS='-Werror' cmake .. -DBUILD_EXAMPLES=ON -DENABLE_SANITIZER=ON -DENABLE_MEMORY_PROFILING=ON -DENABLE_ZLIB=ON
- make all VERBOSE=1
- make install VERBOSE=1 DESTDIR="$(realpath ../_install)"
- cd ..
- ./build/nDPId-test || test $? -eq 1
- tree ./_install
- ./build/nDPId-test
- ./build/nDPId -h || test $? -eq 1
- ./build/nDPIsrvd -h || test $? -eq 1
# daemon start/stop test
- NUSER=nobody make -C ./build daemon VERBOSE=1
- NUSER=nobody make -C ./build daemon VERBOSE=1

.gitmodules

@@ -3,3 +3,12 @@
url = https://github.com/ntop/nDPI
branch = dev
update = rebase
[submodule "examples/js-rt-analyzer"]
path = examples/js-rt-analyzer
url = https://gitlab.com/verzulli/ndpid-rt-analyzer.git
[submodule "examples/js-rt-analyzer-frontend"]
path = examples/js-rt-analyzer-frontend
url = https://gitlab.com/verzulli/ndpid-rt-analyzer-frontend.git
[submodule "examples/cxx-graph"]
path = examples/cxx-graph
url = https://github.com/utoni/nDPId-Graph.git

CHANGELOG.md

@@ -0,0 +1,83 @@
# CHANGELOG
#### nDPId 1.7 (Oct 2024)
- Read and parse configuration files for nDPId (+ libnDPI) and nDPIsrvd
- Added loading risk domains from a file (`-R`, thanks to @UnveilTech)
- Added Filebeat configuration file
- Improved hostname handling; hostnames will now always be part of `analyse`/`end`/`idle` events (if dissected)
- Improved Documentation (INSTALL / Schema)
- Added PF\_RING support
- Improved nDPIsrvd-analyse to write global stats to a CSV
- Added global (heap) memory stats for daemon status events (if enabled)
- Fixed IPv6 address/netmask retrieval on some systems
- Improved nDPIsrvd-collect; gauges and counters are now handled correctly
- Added nDPId Grafana dashboard
- Fixed `detection-update` event bug; was thrown even if nothing changed
- Fixed `not-detected` event spam if detection not completed (in some rare cases)
- Improved InfluxDB push daemon (severity parsing / gauge handling)
- Improved zLib compression
- Fixed nDPIsrvd-collectd missing escape character
#### nDPId 1.6 (Nov 2023)
- Added Event I/O abstraction layer (supporting only poll/epoll by now)
- Support for OSX and *BSD systems
- Added proper DLT_RAW dissection for IPv4 and IPv6
- Improved TCP timeout handling when FIN/RST is seen, which previously caused Midstream TCP flows when there should not have been any
- Fixed a crash if `nDPId -o value=''` was used
- Added OpenWrt packaging
- Added new flow event `analyse`, providing statistical information about active flows
- Added a new analyse-event daemon that generates CSV files from such events
- Fixed a crash in nDPIsrvd if a collector closes a connection
- Support `nDPId` sending its data to a UDP endpoint instead of a nDPIsrvd collector
- Added events and flow states documentation
- Added basic systemd support
- Fixed a bug in base64 encoding which could lead to invalid base64 strings
- Added some machine learning examples
- Fixed various smaller bugs
- Fixed nDPIsrvd bug which causes invalid JSON messages sent to Distributors
#### nDPId 1.5 (Apr 2022)
- Improved nDPId cross compilation
- zLib flow memory compression (Experimental!)
- Memory profiling for nDPId-test
- JSMN with parent link support for subtoken iteration
- Refactored nDPIsrvd buffer and buffer bloat handling
- Upgraded JSMN/uthash
- Improved nDPIsrvd.(h|py) debugging capability for client apps
- Advanced flow usage logging usable for memory profiling
- Support for dissecting additional layer2/layer3 protocols
- Serialize more JSON information
- Add TCP/IP support for nDPIsrvd
- Improved nDPIsrvd connection lost behaviour
- Reworked Python/C distributor API
- Support read()/recv() timeouts and nonblocking I/O
#### nDPId 1.4 (Jun 2021)
- Use layer4 specific flow timeouts for nDPId
- Reworked layer4 flow length names and calculations (use only layer4 payload w/o any previous headers) for nDPId
- Build system cleanup and cosmetics
#### nDPId 1.3 (May 2021)
- Added missing datalink layer types
#### nDPId 1.2 (May 2021)
- OpenWrt compatible build system
#### nDPId 1.1 (May 2021)
- Added License information
#### nDPId 1.0 (May 2021)
- First public release

CMakeLists.txt

@@ -1,37 +1,85 @@
cmake_minimum_required(VERSION 3.12.4)
project(nDPId C)
if(CMAKE_COMPILER_IS_GNUCXX)
execute_process(COMMAND ${CMAKE_C_COMPILER} -dumpversion OUTPUT_VARIABLE GCC_VERSION)
if (GCC_VERSION VERSION_GREATER 4.7 OR GCC_VERSION VERSION_EQUAL 4.7)
message(STATUS "${CMAKE_C_COMPILER} supports C11 standard.")
else ()
message(FATAL_ERROR "C Compiler with C11 standard needed. Therefore a gcc compiler with a version equal or higher than 4.7 is needed.")
endif()
endif(CMAKE_COMPILER_IS_GNUCXX)
set(CMAKE_C_STANDARD 11)
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=c11 -D_DEFAULT_SOURCE=1 -D_GNU_SOURCE=1")
if("${PROJECT_SOURCE_DIR}" STREQUAL "${PROJECT_BINARY_DIR}")
message(FATAL_ERROR "In-source builds are not allowed.\n"
"Please remove ${PROJECT_SOURCE_DIR}/CMakeCache.txt\n"
"and\n"
"${PROJECT_SOURCE_DIR}/CMakeFiles\n"
"Create a build directory somewhere and run CMake again.")
"Create a build directory somewhere and run CMake again.\n"
"Or run: 'cmake -S ${PROJECT_SOURCE_DIR} -B ./your-custom-build-dir [CMAKE-OPTIONS]'")
endif()
set(CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR}/cmake)
find_package(PkgConfig REQUIRED)
set(CPACK_PACKAGE_CONTACT "toni@impl.cc")
set(CPACK_DEBIAN_PACKAGE_NAME "nDPId")
set(CPACK_DEBIAN_PACKAGE_SECTION "network")
set(CPACK_DEBIAN_PACKAGE_DESCRIPTION "nDPId is a set of daemons and tools to capture, process and classify network traffic.")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "Toni Uhlig <toni@impl.cc>")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/packages/debian/preinst;${CMAKE_SOURCE_DIR}/packages/debian/prerm;${CMAKE_SOURCE_DIR}/packages/debian/postrm")
set(CPACK_DEBIAN_PACKAGE_CONTROL_STRICT_PERMISSION TRUE)
set(CPACK_DEBIAN_PACKAGE_SHLIBDEPS ON)
set(CPACK_DEBIAN_DEBUGINFO_PACKAGE ON)
set(CPACK_RPM_PACKAGE_LICENSE "GPL-3")
set(CPACK_RPM_PACKAGE_VENDOR "Toni Uhlig")
set(CPACK_RPM_PACKAGE_URL "https://www.github.com/utoni/nDPId.git")
set(CPACK_RPM_PACKAGE_DESCRIPTION "nDPId is a set of daemons and tools to capture, process and classify network traffic.")
set(CPACK_RPM_PRE_INSTALL_SCRIPT_FILE "${CMAKE_SOURCE_DIR}/packages/redhat/pre_install")
set(CPACK_RPM_PRE_UNINSTALL_SCRIPT_FILE "${CMAKE_SOURCE_DIR}/packages/redhat/pre_uninstall")
set(CPACK_RPM_POST_UNINSTALL_SCRIPT_FILE "${CMAKE_SOURCE_DIR}/packages/redhat/post_uninstall")
set(CPACK_STRIP_FILES ON)
set(CPACK_PACKAGE_VERSION_MAJOR 1)
set(CPACK_PACKAGE_VERSION_MINOR 5)
set(CPACK_PACKAGE_VERSION_MINOR 7)
set(CPACK_PACKAGE_VERSION_PATCH 0)
# Note: CPACK_PACKAGING_INSTALL_PREFIX and CMAKE_INSTALL_PREFIX are *not* the same.
# It is used only to ease environment file loading via systemd.
set(CPACK_PACKAGING_INSTALL_PREFIX "${CMAKE_INSTALL_PREFIX}")
set(CMAKE_MACOSX_RPATH 1)
include(CPack)
include(CheckFunctionExists)
include(CheckLibraryExists)
include(CheckEpoll)
check_epoll(HAS_EPOLL)
if(HAS_EPOLL)
option(FORCE_POLL "Force the use of poll() instead of epoll()." OFF)
if(NOT FORCE_POLL)
set(EPOLL_DEFS "-DENABLE_EPOLL=1")
endif()
else()
if(BUILD_EXAMPLES)
message(FATAL_ERROR "Examples are using epoll event I/O. Without epoll available, you can not build/run those.")
endif()
endif()
if(NOT MATH_FUNCTION_EXISTS AND NOT NEED_LINKING_AGAINST_LIBM)
CHECK_FUNCTION_EXISTS(log2f MATH_FUNCTION_EXISTS)
if(NOT MATH_FUNCTION_EXISTS)
unset(MATH_FUNCTION_EXISTS CACHE)
list(APPEND CMAKE_REQUIRED_LIBRARIES m)
CHECK_FUNCTION_EXISTS(log2f MATH_FUNCTION_EXISTS)
if(MATH_FUNCTION_EXISTS)
set(NEED_LINKING_AGAINST_LIBM TRUE CACHE BOOL "" FORCE)
else()
message(FATAL_ERROR "Failed making the log2f() function available")
endif()
endif()
list(APPEND CMAKE_REQUIRED_LIBRARIES m)
CHECK_FUNCTION_EXISTS(log2f MATH_FUNCTION_EXISTS)
if(MATH_FUNCTION_EXISTS)
set(NEED_LINKING_AGAINST_LIBM TRUE CACHE BOOL "" FORCE)
else()
check_library_exists(m sqrt "" NEED_LINKING_AGAINST_LIBM)
if(NOT NEED_LINKING_AGAINST_LIBM)
# Was not able to figure out if explicit linkage against libm is required.
# Forcing libm linkage. Good idea?
set(NEED_LINKING_AGAINST_LIBM TRUE CACHE BOOL "" FORCE)
endif()
endif()
endif()
endif()
if(NEED_LINKING_AGAINST_LIBM)
@@ -45,9 +93,85 @@ option(ENABLE_SANITIZER "Enable ASAN/LSAN/UBSAN." OFF)
option(ENABLE_SANITIZER_THREAD "Enable TSAN (does not work together with ASAN)." OFF)
option(ENABLE_MEMORY_PROFILING "Enable dynamic memory tracking." OFF)
option(ENABLE_ZLIB "Enable zlib support for nDPId (experimental)." OFF)
option(ENABLE_SYSTEMD "Install systemd components." OFF)
option(ENABLE_CRYPTO "Enable OpenSSL cryptographic support in nDPId/nDPIsrvd." OFF)
option(BUILD_EXAMPLES "Build C examples." ON)
option(BUILD_RUST_EXAMPLES "Build Rust examples." OFF)
if(BUILD_EXAMPLES)
option(ENABLE_DBUS "Build DBus notification example." OFF)
option(ENABLE_CURL "Build influxdb data write example." OFF)
endif()
option(ENABLE_PFRING "Enable PF_RING support for nDPId (experimental)" OFF)
option(BUILD_NDPI "Clone and build nDPI from github." OFF)
if(ENABLE_PFRING)
option(PFRING_LINK_STATIC "Link against a static version of pfring." ON)
set(PFRING_KERNEL_INC "" CACHE STRING "Path to PFRING kernel module include directory.")
set(PFRING_DEFS "-DENABLE_PFRING=1")
if(PFRING_KERNEL_INC STREQUAL "")
message(FATAL_ERROR "PFRING_KERNEL_INC needs to be set to the PFRING kernel module include directory.")
endif()
if(NOT EXISTS "${PFRING_KERNEL_INC}/linux/pf_ring.h")
message(FATAL_ERROR "Expected to find <linux/pf_ring.h> below ${PFRING_KERNEL_INC}, but none found.")
endif()
set(PFRING_INSTALLDIR "/opt/PF_RING/usr" CACHE STRING "")
set(PFRING_INC "${PFRING_INSTALLDIR}/include")
if(NOT EXISTS "${PFRING_INC}")
message(FATAL_ERROR "Include directory \"${PFRING_INC}\" does not exist!")
endif()
if(PFRING_LINK_STATIC)
if(CMAKE_SIZEOF_VOID_P EQUAL 8)
if(EXISTS "${PFRING_INSTALLDIR}/lib64")
set(STATIC_PFRING_LIB "${PFRING_INSTALLDIR}/lib64/libpfring.a")
else()
set(STATIC_PFRING_LIB "${PFRING_INSTALLDIR}/lib/libpfring.a")
endif()
else()
if(EXISTS "${PFRING_INSTALLDIR}/lib32")
set(STATIC_PFRING_LIB "${PFRING_INSTALLDIR}/lib32/libpfring.a")
else()
set(STATIC_PFRING_LIB "${PFRING_INSTALLDIR}/lib/libpfring.a")
endif()
endif()
if(NOT EXISTS "${STATIC_PFRING_LIB}")
message(FATAL_ERROR "Static library \"${STATIC_PFRING_LIB}\" does not exist!")
endif()
else()
if(CMAKE_SIZEOF_VOID_P EQUAL 8)
if(EXISTS "${PFRING_INSTALLDIR}/lib64")
find_library(PF_RING_LIB pfring PATHS "${PFRING_INSTALLDIR}/lib64")
else()
find_library(PF_RING_LIB pfring PATHS "${PFRING_INSTALLDIR}/lib")
endif()
else()
if(EXISTS "${PFRING_INSTALLDIR}/lib32")
find_library(PF_RING_LIB pfring PATHS "${PFRING_INSTALLDIR}/lib32")
else()
find_library(PF_RING_LIB pfring PATHS "${PFRING_INSTALLDIR}/lib")
endif()
endif()
if(NOT PF_RING_LIB)
message(FATAL_ERROR "libpfring.so not found below ${PFRING_INSTALLDIR}/{lib,lib32,lib64}")
endif()
endif()
if(NOT EXISTS "${PFRING_INSTALLDIR}/include/pfring.h")
message(FATAL_ERROR "Expected to find <include/pfring.h> inside ${PFRING_INSTALLDIR}, but none found.")
endif()
else()
unset(PFRING_INSTALLDIR CACHE)
unset(PFRING_INC CACHE)
unset(STATIC_PFRING_LIB CACHE)
unset(PFRING_LINK_STATIC CACHE)
endif()
if(BUILD_NDPI)
option(BUILD_NDPI_FORCE_GIT_UPDATE "Forcefully instruments nDPI build script to update the git submodule." OFF)
unset(NDPI_NO_PKGCONFIG CACHE)
unset(STATIC_LIBNDPI_INSTALLDIR CACHE)
else()
@@ -72,27 +196,40 @@ else()
unset(NDPI_WITH_MAXMINDDB CACHE)
endif()
set(CROSS_COMPILE_TRIPLET "" CACHE STRING "Host triplet used to enable cross compiling.")
if(ENABLE_PFRING)
set(NDPID_PFRING_SRCS npfring.c)
endif()
if(ENABLE_CRYPTO)
set(CRYPTO_SRCS ncrypt.c)
endif()
add_executable(nDPId nDPId.c ${NDPID_PFRING_SRCS} ${CRYPTO_SRCS} nio.c utils.c)
add_executable(nDPIsrvd nDPIsrvd.c nio.c utils.c)
add_executable(nDPId-test nDPId-test.c ${NDPID_PFRING_SRCS} ${CRYPTO_SRCS})
add_executable(nDPId nDPId.c utils.c)
add_executable(nDPIsrvd nDPIsrvd.c utils.c)
add_executable(nDPId-test nDPId-test.c)
add_custom_target(umask_check)
add_custom_command(
TARGET umask_check
PRE_BUILD
COMMAND ${CMAKE_SOURCE_DIR}/scripts/umask-check.sh
)
add_dependencies(nDPId umask_check)
add_custom_target(dist)
add_custom_command(
TARGET dist
PRE_BUILD
COMMAND "${CMAKE_SOURCE_DIR}/scripts/make-dist.sh"
)
add_custom_target(daemon)
add_custom_command(
TARGET daemon
COMMAND "${CMAKE_SOURCE_DIR}/scripts/daemon.sh" "$<TARGET_FILE:nDPId>" "$<TARGET_FILE:nDPIsrvd>"
DEPENDS nDPId nDPIsrvd
TARGET daemon
POST_BUILD
COMMAND env nDPIsrvd_ARGS='-C 1024' "${CMAKE_SOURCE_DIR}/scripts/daemon.sh" "$<TARGET_FILE:nDPId>" "$<TARGET_FILE:nDPIsrvd>"
)
add_dependencies(daemon nDPId nDPIsrvd)
if(NOT CROSS_COMPILE_TRIPLET STREQUAL "")
set(CMAKE_C_COMPILER_TARGET ${CROSS_COMPILE_TRIPLET})
if(CMAKE_CROSSCOMPILING)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
@@ -115,14 +252,29 @@ if(ENABLE_COVERAGE)
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -fprofile-arcs -ftest-coverage")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} --coverage")
set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} --coverage")
add_custom_target(coverage)
add_custom_target(coverage DEPENDS "${CMAKE_BINARY_DIR}/coverage_report/nDPId/index.html")
add_custom_command(
TARGET coverage
COMMAND "${CMAKE_SOURCE_DIR}/scripts/code-coverage.sh"
DEPENDS nDPId nDPIsrvd nDPId-test
OUTPUT "${CMAKE_BINARY_DIR}/coverage_report/nDPId/index.html"
COMMAND lcov --directory "${CMAKE_BINARY_DIR}" --directory "${CMAKE_SOURCE_DIR}/libnDPI" --capture --output-file "${CMAKE_BINARY_DIR}/lcov.info"
COMMAND genhtml -o "${CMAKE_BINARY_DIR}/coverage_report" "${CMAKE_BINARY_DIR}/lcov.info"
DEPENDS nDPId nDPId-test nDPIsrvd
)
add_custom_target(coverage-clean)
add_custom_command(
TARGET coverage-clean
COMMAND find "${CMAKE_BINARY_DIR}" "${CMAKE_SOURCE_DIR}/libnDPI" -name "*.gcda" -delete
POST_BUILD
)
add_custom_target(coverage-view)
add_custom_command(
TARGET coverage-view
COMMAND cd "${CMAKE_BINARY_DIR}/coverage_report" && python3 -m http.server
POST_BUILD
)
add_dependencies(coverage-view coverage)
endif()
if(ENABLE_SANITIZER)
# TODO: Check for `-fsanitize-memory-track-origins` and add if available?
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -fsanitize=address -fsanitize=undefined -fno-sanitize=alignment -fsanitize=enum -fsanitize=leak")
endif()
if(ENABLE_SANITIZER_THREAD)
@@ -132,17 +284,35 @@ if(ENABLE_ZLIB)
set(ZLIB_DEFS "-DENABLE_ZLIB=1")
pkg_check_modules(ZLIB REQUIRED zlib)
endif()
if(NDPI_WITH_GCRYPT)
message(STATUS "Enable GCRYPT")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-local-libgcrypt")
if(BUILD_EXAMPLES)
if(ENABLE_DBUS)
pkg_check_modules(DBUS REQUIRED dbus-1)
endif()
if(ENABLE_CURL)
pkg_check_modules(CURL REQUIRED libcurl)
endif()
endif()
if(NDPI_WITH_PCRE)
message(STATUS "Enable PCRE")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-pcre")
endif()
if(NDPI_WITH_MAXMINDDB)
message(STATUS "Enable MAXMINDDB")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-maxminddb")
if(BUILD_NDPI)
if(NDPI_WITH_GCRYPT)
message(STATUS "nDPI: Enable GCRYPT")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-local-libgcrypt")
endif()
if(NDPI_WITH_PCRE)
message(STATUS "nDPI: Enable PCRE")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-pcre2")
endif()
if(NDPI_WITH_MAXMINDDB)
message(STATUS "nDPI: Enable MAXMINDDB")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --with-maxminddb")
endif()
if(ENABLE_COVERAGE)
message(STATUS "nDPI: Enable Coverage")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --enable-code-coverage")
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Debug" OR CMAKE_BUILD_TYPE STREQUAL "")
message(STATUS "nDPI: Enable Debug Build")
set(NDPI_ADDITIONAL_ARGS "${NDPI_ADDITIONAL_ARGS} --enable-debug-build --enable-debug-messages")
endif()
endif()
execute_process(
@@ -157,6 +327,7 @@ if(GIT_VERSION STREQUAL "" OR NOT IS_DIRECTORY "${CMAKE_SOURCE_DIR}/.git")
set(GIT_VERSION "${CPACK_PACKAGE_VERSION}-release")
endif()
endif()
set(PKG_VERSION "${CPACK_PACKAGE_VERSION}")
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -Wall -Wextra")
set(NDPID_DEFS -DJSMN_STATIC=1 -DJSMN_STRICT=1 -DJSMN_PARENT_LINKS=1)
@@ -164,8 +335,13 @@ set(NDPID_DEPS_INC "${CMAKE_SOURCE_DIR}"
"${CMAKE_SOURCE_DIR}/dependencies"
"${CMAKE_SOURCE_DIR}/dependencies/jsmn"
"${CMAKE_SOURCE_DIR}/dependencies/uthash/src")
if(CMAKE_CROSSCOMPILING)
add_definitions("-DCROSS_COMPILATION=1")
endif()
if(ENABLE_MEMORY_PROFILING)
message(WARNING "ENABLE_MEMORY_PROFILING should not be used in production environments.")
if(NOT CMAKE_BUILD_TYPE STREQUAL "Debug" AND NOT CMAKE_BUILD_TYPE STREQUAL "")
message(WARNING "ENABLE_MEMORY_PROFILING should not be used in production environments.")
endif()
add_definitions("-DENABLE_MEMORY_PROFILING=1"
"-Duthash_malloc=nDPIsrvd_uthash_malloc"
"-Duthash_free=nDPIsrvd_uthash_free")
@@ -190,13 +366,15 @@ if(BUILD_NDPI)
CONFIGURE_COMMAND env
CC=${CMAKE_C_COMPILER}
CXX=false
AR=${CMAKE_AR}
RANLIB=${CMAKE_RANLIB}
PKG_CONFIG=${PKG_CONFIG_EXECUTABLE}
CFLAGS=${CMAKE_C_FLAGS}
LDFLAGS=${CMAKE_MODULE_LINKER_FLAGS}
CROSS_COMPILE_TRIPLET=${CROSS_COMPILE_TRIPLET}
ADDITIONAL_ARGS=${NDPI_ADDITIONAL_ARGS}
MAKE_PROGRAM=${CMAKE_MAKE_PROGRAM}
DEST_INSTALL=${CMAKE_BINARY_DIR}/libnDPI
FORCE_GIT_UPDATE=${BUILD_NDPI_FORCE_GIT_UPDATE}
${CMAKE_CURRENT_SOURCE_DIR}/scripts/get-and-build-libndpi.sh
BUILD_BYPRODUCTS ${CMAKE_BINARY_DIR}/libnDPI/lib/libndpi.a
BUILD_COMMAND ""
@@ -204,7 +382,9 @@ if(BUILD_NDPI)
BUILD_IN_SOURCE 1)
add_custom_target(clean-libnDPI
COMMAND rm -rf ${CMAKE_BINARY_DIR}/libnDPI ${CMAKE_BINARY_DIR}/libnDPI-prefix
COMMAND ${CMAKE_BUILD_TOOL} clean
COMMAND rm -rf ${CMAKE_BINARY_DIR}/libnDPI
COMMAND rm -f ${CMAKE_BINARY_DIR}/libnDPI-prefix/src/libnDPI-stamp/libnDPI-configure
)
set(STATIC_LIBNDPI_INSTALLDIR "${CMAKE_BINARY_DIR}/libnDPI")
@@ -212,13 +392,19 @@ if(BUILD_NDPI)
add_dependencies(nDPId-test libnDPI)
endif()
if(ENABLE_CRYPTO)
find_package(OpenSSL REQUIRED)
set(OSSL_DEFS "-DENABLE_CRYPTO=1")
set(OSSL_LIBRARY "${OPENSSL_SSL_LIBRARY}" "${OPENSSL_CRYPTO_LIBRARY}")
endif()
if(STATIC_LIBNDPI_INSTALLDIR OR BUILD_NDPI OR NDPI_NO_PKGCONFIG)
if(NDPI_WITH_GCRYPT)
find_package(GCRYPT "1.4.2" REQUIRED)
endif()
if(NDPI_WITH_PCRE)
pkg_check_modules(PCRE REQUIRED libpcre>=8.39)
pkg_check_modules(PCRE REQUIRED libpcre2-8)
endif()
if(NDPI_WITH_MAXMINDDB)
@@ -229,7 +415,19 @@ endif()
if(STATIC_LIBNDPI_INSTALLDIR OR BUILD_NDPI)
add_definitions("-DLIBNDPI_STATIC=1")
set(STATIC_LIBNDPI_INC "${STATIC_LIBNDPI_INSTALLDIR}/include/ndpi")
set(STATIC_LIBNDPI_LIB "${STATIC_LIBNDPI_INSTALLDIR}/lib/libndpi.a")
if(CMAKE_SIZEOF_VOID_P EQUAL 8)
if(EXISTS "${STATIC_LIBNDPI_INSTALLDIR}/lib64/libndpi.a")
set(STATIC_LIBNDPI_LIB "${STATIC_LIBNDPI_INSTALLDIR}/lib64/libndpi.a")
else()
set(STATIC_LIBNDPI_LIB "${STATIC_LIBNDPI_INSTALLDIR}/lib/libndpi.a")
endif()
else()
if(EXISTS "${STATIC_LIBNDPI_INSTALLDIR}/lib32/libndpi.a")
set(STATIC_LIBNDPI_LIB "${STATIC_LIBNDPI_INSTALLDIR}/lib32/libndpi.a")
else()
set(STATIC_LIBNDPI_LIB "${STATIC_LIBNDPI_INSTALLDIR}/lib/libndpi.a")
endif()
endif()
if(STATIC_LIBNDPI_INSTALLDIR AND NOT BUILD_NDPI)
if(NOT EXISTS "${STATIC_LIBNDPI_INC}" OR NOT EXISTS "${STATIC_LIBNDPI_LIB}")
@@ -239,9 +437,13 @@ if(STATIC_LIBNDPI_INSTALLDIR OR BUILD_NDPI)
endif()
unset(DEFAULT_NDPI_INCLUDE CACHE)
unset(pkgcfg_lib_NDPI_ndpi CACHE)
else()
if(NOT NDPI_NO_PKGCONFIG)
pkg_check_modules(NDPI REQUIRED libndpi>=4.3.0)
pkg_check_modules(NDPI REQUIRED libndpi>=5.0.0)
if(NOT pkgcfg_lib_NDPI_ndpi)
find_package(NDPI "5.0.0" REQUIRED)
endif()
unset(STATIC_LIBNDPI_INC CACHE)
unset(STATIC_LIBNDPI_LIB CACHE)
@@ -250,34 +452,62 @@ else()
set(DEFAULT_NDPI_INCLUDE ${NDPI_INCLUDE_DIRS})
endif()
find_package(PCAP "1.8.1" REQUIRED)
pkg_check_modules(PCAP libpcap>=1.9.0) # no *.pc file before 1.9.0
if(NOT pkgcfg_lib_PCAP_pcap)
pkg_check_modules(PCAP libpcap>=1.8.1) # seems like some distributions provide their own *.pc file for 1.8.1 (e.g. Ubuntu-18.04)
endif()
if(NOT pkgcfg_lib_PCAP_pcap)
find_package(PCAP "1.9.0" REQUIRED)
endif()
target_compile_options(nDPId PRIVATE "-pthread")
target_compile_definitions(nDPId PRIVATE -D_GNU_SOURCE=1 -DGIT_VERSION=\"${GIT_VERSION}\" ${NDPID_DEFS} ${ZLIB_DEFS})
target_include_directories(nDPId PRIVATE "${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" ${NDPID_DEPS_INC})
target_link_libraries(nDPId "${STATIC_LIBNDPI_LIB}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre}" "${pkgcfg_lib_MAXMINDDB_maxminddb}" "${pkgcfg_lib_ZLIB_z}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}" "${LIBM_LIB}"
"-pthread")
target_compile_definitions(nDPId PRIVATE -D_GNU_SOURCE=1 -DPKG_VERSION=\"${PKG_VERSION}\" -DGIT_VERSION=\"${GIT_VERSION}\"
${NDPID_DEFS} ${EPOLL_DEFS} ${ZLIB_DEFS} ${PFRING_DEFS} ${OSSL_DEFS})
target_include_directories(nDPId PRIVATE "${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" ${NDPID_DEPS_INC} ${PFRING_KERNEL_INC} ${PFRING_INC})
target_link_libraries(nDPId "${STATIC_LIBNDPI_LIB}" "${STATIC_PFRING_LIB}" "${pkgcfg_lib_PCAP_pcap}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre2-8}" "${pkgcfg_lib_MAXMINDDB_maxminddb}" "${pkgcfg_lib_ZLIB_z}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}" "${LIBM_LIB}" "${PF_RING_LIB}"
"${OSSL_LIBRARY}" "-pthread")
target_compile_definitions(nDPIsrvd PRIVATE -D_GNU_SOURCE=1 -DGIT_VERSION=\"${GIT_VERSION}\" ${NDPID_DEFS})
target_compile_definitions(nDPIsrvd PRIVATE -D_GNU_SOURCE=1 -DPKG_VERSION=\"${PKG_VERSION}\" -DGIT_VERSION=\"${GIT_VERSION}\" ${NDPID_DEFS} ${EPOLL_DEFS})
target_include_directories(nDPIsrvd PRIVATE ${NDPID_DEPS_INC})
target_include_directories(nDPId-test PRIVATE ${NDPID_DEPS_INC})
target_compile_options(nDPId-test PRIVATE "-Wno-unused-function" "-pthread")
target_compile_definitions(nDPId-test PRIVATE -D_GNU_SOURCE=1 -DNO_MAIN=1 -DGIT_VERSION=\"${GIT_VERSION}\"
${NDPID_DEFS} ${ZLIB_DEFS} ${NDPID_TEST_MPROF_DEFS})
target_compile_definitions(nDPId-test PRIVATE -D_GNU_SOURCE=1 -DNO_MAIN=1 -DPKG_VERSION=\"${PKG_VERSION}\" -DGIT_VERSION=\"${GIT_VERSION}\"
${NDPID_DEFS} ${EPOLL_DEFS} ${ZLIB_DEFS} ${PFRING_DEFS} ${OSSL_DEFS} ${NDPID_TEST_MPROF_DEFS})
target_include_directories(nDPId-test PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" ${NDPID_DEPS_INC})
target_link_libraries(nDPId-test "${STATIC_LIBNDPI_LIB}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre}" "${pkgcfg_lib_MAXMINDDB_maxminddb}" "${pkgcfg_lib_ZLIB_z}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}" "${LIBM_LIB}"
"-pthread")
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" ${NDPID_DEPS_INC} ${PFRING_KERNEL_INC} ${PFRING_INC})
target_link_libraries(nDPId-test "${STATIC_LIBNDPI_LIB}" "${STATIC_PFRING_LIB}" "${pkgcfg_lib_PCAP_pcap}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre2-8}" "${pkgcfg_lib_MAXMINDDB_maxminddb}" "${pkgcfg_lib_ZLIB_z}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}" "${LIBM_LIB}" "${PF_RING_LIB}"
"${OSSL_LIBRARY}" "-pthread")
if(CMAKE_C_COMPILER_ID STREQUAL "Clang")
add_executable(fuzz_ndpi_process_packet test/fuzz_ndpi_process_packet.c)
if(BUILD_NDPI)
add_dependencies(fuzz_ndpi_process_packet libnDPI)
endif()
target_compile_options(fuzz_ndpi_process_packet PRIVATE "-Wno-unused-function" "-fsanitize=fuzzer" "-pthread")
target_compile_definitions(fuzz_ndpi_process_packet PRIVATE -D_GNU_SOURCE=1
-DPKG_VERSION=\"${PKG_VERSION}\" -DGIT_VERSION=\"${GIT_VERSION}\"
${NDPID_DEFS} ${EPOLL_DEFS} ${ZLIB_DEFS} ${PFRING_DEFS})
target_include_directories(fuzz_ndpi_process_packet PRIVATE "${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}"
${NDPID_DEPS_INC} ${PFRING_KERNEL_INC} ${PFRING_INC})
target_link_libraries(fuzz_ndpi_process_packet "${STATIC_LIBNDPI_LIB}" "${STATIC_PFRING_LIB}" "${pkgcfg_lib_PCAP_pcap}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre2-8}" "${pkgcfg_lib_MAXMINDDB_maxminddb}" "${pkgcfg_lib_ZLIB_z}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}" "${LIBM_LIB}" "${PF_RING_LIB}"
"-pthread")
target_link_options(fuzz_ndpi_process_packet PRIVATE "-fsanitize=fuzzer")
endif()
if(BUILD_EXAMPLES)
add_executable(nDPIsrvd-collectd examples/c-collectd/c-collectd.c)
add_executable(nDPIsrvd-collectd examples/c-collectd/c-collectd.c utils.c)
if(BUILD_NDPI)
add_dependencies(nDPIsrvd-collectd libnDPI)
endif()
target_compile_definitions(nDPIsrvd-collectd PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-collectd PRIVATE ${NDPID_DEPS_INC})
target_include_directories(nDPIsrvd-collectd PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" "${CMAKE_SOURCE_DIR}" ${NDPID_DEPS_INC})
add_executable(nDPIsrvd-captured examples/c-captured/c-captured.c utils.c)
if(BUILD_NDPI)
@@ -286,81 +516,171 @@ if(BUILD_EXAMPLES)
target_compile_definitions(nDPIsrvd-captured PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-captured PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" "${CMAKE_SOURCE_DIR}" ${NDPID_DEPS_INC})
target_link_libraries(nDPIsrvd-captured "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre}" "${pkgcfg_lib_MAXMINDDB_maxminddb}"
target_link_libraries(nDPIsrvd-captured "${pkgcfg_lib_PCAP_pcap}" "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre2-8}" "${pkgcfg_lib_MAXMINDDB_maxminddb}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}")
add_executable(nDPIsrvd-json-dump examples/c-json-stdout/c-json-stdout.c)
target_compile_definitions(nDPIsrvd-json-dump PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-json-dump PRIVATE ${NDPID_DEPS_INC})
add_executable(nDPIsrvd-analysed examples/c-analysed/c-analysed.c utils.c)
if(BUILD_NDPI)
add_dependencies(nDPIsrvd-analysed libnDPI)
endif()
target_compile_definitions(nDPIsrvd-analysed PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-analysed PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" "${CMAKE_SOURCE_DIR}" ${NDPID_DEPS_INC})
add_executable(nDPIsrvd-simple examples/c-simple/c-simple.c)
target_compile_definitions(nDPIsrvd-simple PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-simple PRIVATE ${NDPID_DEPS_INC})
target_link_libraries(nDPIsrvd-simple "${pkgcfg_lib_NDPI_ndpi}"
"${pkgcfg_lib_PCRE_pcre}" "${pkgcfg_lib_MAXMINDDB_maxminddb}"
"${GCRYPT_LIBRARY}" "${GCRYPT_ERROR_LIBRARY}" "${PCAP_LIBRARY}")
if(ENABLE_COVERAGE)
add_dependencies(coverage nDPIsrvd-collectd nDPIsrvd-captured nDPIsrvd-json-dump nDPIsrvd-simple)
add_dependencies(coverage nDPIsrvd-analysed nDPIsrvd-collectd nDPIsrvd-captured nDPIsrvd-simple)
if(BUILD_NDPI)
add_dependencies(coverage libnDPI)
endif()
endif()
install(TARGETS nDPIsrvd-collectd nDPIsrvd-captured nDPIsrvd-json-dump nDPIsrvd-simple DESTINATION bin)
if(ENABLE_DBUS)
add_executable(nDPIsrvd-notifyd examples/c-notifyd/c-notifyd.c utils.c)
if(BUILD_NDPI)
add_dependencies(nDPIsrvd-notifyd libnDPI)
endif()
target_compile_definitions(nDPIsrvd-notifyd PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-notifyd PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" "${CMAKE_SOURCE_DIR}" "${NDPID_DEPS_INC}"
"${DBUS_INCLUDE_DIRS}")
target_link_libraries(nDPIsrvd-notifyd "${DBUS_LIBRARIES}")
install(TARGETS nDPIsrvd-notifyd DESTINATION bin)
endif()
if(ENABLE_CURL)
add_executable(nDPIsrvd-influxd examples/c-influxd/c-influxd.c utils.c)
if(BUILD_NDPI)
add_dependencies(nDPIsrvd-influxd libnDPI)
endif()
target_compile_definitions(nDPIsrvd-influxd PRIVATE ${NDPID_DEFS})
target_include_directories(nDPIsrvd-influxd PRIVATE
"${STATIC_LIBNDPI_INC}" "${DEFAULT_NDPI_INCLUDE}" "${CMAKE_SOURCE_DIR}" "${NDPID_DEPS_INC}"
"${CURL_INCLUDE_DIRS}")
target_link_libraries(nDPIsrvd-influxd "${CURL_LIBRARIES}")
install(TARGETS nDPIsrvd-influxd DESTINATION bin)
endif()
install(TARGETS nDPIsrvd-analysed nDPIsrvd-collectd nDPIsrvd-captured nDPIsrvd-simple DESTINATION bin)
install(FILES examples/c-collectd/plugin_nDPIsrvd.conf examples/c-collectd/rrdgraph.sh DESTINATION share/nDPId/nDPIsrvd-collectd)
install(DIRECTORY examples/c-collectd/www DESTINATION share/nDPId/nDPIsrvd-collectd)
endif()
if(BUILD_RUST_EXAMPLES)
add_custom_command(
OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/target/release/rs-simple
COMMAND cargo build --release
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/examples/rs-simple
COMMENT "Build Rust executable with cargo: rs-simple"
)
add_custom_target(rs-simple ALL
DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/target/release/rs-simple
)
endif()
if(ENABLE_SYSTEMD)
configure_file(packages/systemd/ndpisrvd.service.in ndpisrvd.service @ONLY)
configure_file(packages/systemd/ndpid@.service.in ndpid@.service @ONLY)
install(DIRECTORY DESTINATION etc/nDPId)
install(FILES "ndpid.conf.example" DESTINATION share/nDPId)
install(FILES "ndpisrvd.conf.example" DESTINATION share/nDPId)
install(FILES "${CMAKE_BINARY_DIR}/ndpisrvd.service" DESTINATION lib/systemd/system)
install(FILES "${CMAKE_BINARY_DIR}/ndpid@.service" DESTINATION lib/systemd/system)
endif()
install(FILES config.h
dependencies/nDPIsrvd.h
dependencies/jsmn/jsmn.h
dependencies/uthash/src/utarray.h
dependencies/uthash/src/uthash.h
dependencies/uthash/src/utlist.h
dependencies/uthash/src/utringbuffer.h
dependencies/uthash/src/utstack.h
dependencies/uthash/src/utstring.h
DESTINATION include/nDPId)
install(TARGETS nDPId DESTINATION sbin)
install(TARGETS nDPIsrvd nDPId-test DESTINATION bin)
install(FILES dependencies/nDPIsrvd.py examples/py-flow-dashboard/plotly_dash.py
DESTINATION share/nDPId)
install(FILES examples/py-flow-info/flow-info.py
DESTINATION bin RENAME nDPIsrvd-flow-info.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-flow-dashboard/flow-dash.py
DESTINATION bin RENAME nDPIsrvd-flow-dash.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-ja3-checker/py-ja3-checker.py
DESTINATION bin RENAME nDPIsrvd-ja3-checker.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-json-stdout/json-stdout.py
DESTINATION bin RENAME nDPIsrvd-json-stdout.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-schema-validation/py-schema-validation.py
DESTINATION bin RENAME nDPIsrvd-schema-validation.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-semantic-validation/py-semantic-validation.py
DESTINATION bin RENAME nDPIsrvd-semantic-validation.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
if(BUILD_EXAMPLES)
install(FILES dependencies/nDPIsrvd.py
DESTINATION share/nDPId)
install(FILES examples/py-flow-info/flow-info.py
DESTINATION bin RENAME nDPIsrvd-flow-info.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-json-stdout/json-stdout.py
DESTINATION bin RENAME nDPIsrvd-json-stdout.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-schema-validation/py-schema-validation.py
DESTINATION bin RENAME nDPIsrvd-schema-validation.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-semantic-validation/py-semantic-validation.py
DESTINATION bin RENAME nDPIsrvd-semantic-validation.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
install(FILES examples/py-machine-learning/sklearn-random-forest.py
DESTINATION bin RENAME nDPIsrvd-sklearn.py
PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE)
endif()
install(FILES schema/error_event_schema.json schema/daemon_event_schema.json
schema/flow_event_schema.json schema/packet_event_schema.json DESTINATION share/nDPId/json-schema)
message(STATUS "--------------------------")
message(STATUS "nDPId GIT_VERSION........: ${GIT_VERSION}")
message(STATUS "CROSS_COMPILE_TRIPLET....: ${CROSS_COMPILE_TRIPLET}")
message(STATUS "Cross Compilation........: ${CMAKE_CROSSCOMPILING}")
message(STATUS "CMAKE_BUILD_TYPE.........: ${CMAKE_BUILD_TYPE}")
message(STATUS "CMAKE_C_FLAGS............: ${CMAKE_C_FLAGS}")
message(STATUS "NDPID_DEFS...............: ${NDPID_DEFS}")
message(STATUS "FORCE_POLL...............: ${FORCE_POLL}")
message(STATUS "ENABLE_PFRING............: ${ENABLE_PFRING}")
if(ENABLE_PFRING)
message(STATUS "PFRING_LINK_STATIC.......: ${PFRING_LINK_STATIC}")
endif()
message(STATUS "ENABLE_CRYPTO............: ${ENABLE_CRYPTO}")
message(STATUS "ENABLE_COVERAGE..........: ${ENABLE_COVERAGE}")
message(STATUS "ENABLE_SANITIZER.........: ${ENABLE_SANITIZER}")
message(STATUS "ENABLE_SANITIZER_THREAD..: ${ENABLE_SANITIZER_THREAD}")
message(STATUS "ENABLE_MEMORY_PROFILING..: ${ENABLE_MEMORY_PROFILING}")
message(STATUS "ENABLE_ZLIB..............: ${ENABLE_ZLIB}")
if(STATIC_LIBNDPI_INSTALLDIR)
message(STATUS "STATIC_LIBNDPI_INSTALLDIR: ${STATIC_LIBNDPI_INSTALLDIR}")
endif()
message(STATUS "BUILD_NDPI...............: ${BUILD_NDPI}")
message(STATUS "BUILD_EXAMPLES...........: ${BUILD_EXAMPLES}")
message(STATUS "BUILD_RUST_EXAMPLES......: ${BUILD_RUST_EXAMPLES}")
if(BUILD_EXAMPLES)
message(STATUS "ENABLE_DBUS..............: ${ENABLE_DBUS}")
message(STATUS "ENABLE_CURL..............: ${ENABLE_CURL}")
endif()
if(BUILD_NDPI)
message(STATUS "NDPI_ADDITIONAL_ARGS.....: ${NDPI_ADDITIONAL_ARGS}")
endif()
message(STATUS "NDPI_NO_PKGCONFIG........: ${NDPI_NO_PKGCONFIG}")
if(PFRING_INSTALLDIR)
message(STATUS "PFRING_INSTALLDIR........: ${PFRING_INSTALLDIR}")
message(STATUS "- PFRING_INC.............: ${PFRING_INC}")
message(STATUS "- PFRING_KERNEL_INC......: ${PFRING_KERNEL_INC}")
message(STATUS "- STATIC_PFRING_LIB......: ${STATIC_PFRING_LIB}")
message(STATUS "- SHARED_PFRING_LIB......: ${PF_RING_LIB}")
message(STATUS "--------------------------")
endif()
if(STATIC_LIBNDPI_INSTALLDIR)
message(STATUS "STATIC_LIBNDPI_INSTALLDIR: ${STATIC_LIBNDPI_INSTALLDIR}")
endif()
if(STATIC_LIBNDPI_INSTALLDIR OR BUILD_NDPI OR NDPI_NO_PKGCONFIG)
message(STATUS "- STATIC_LIBNDPI_INC.....: ${STATIC_LIBNDPI_INC}")
message(STATUS "- STATIC_LIBNDPI_LIB.....: ${STATIC_LIBNDPI_LIB}")
message(STATUS "- NDPI_WITH_GCRYPT.......: ${NDPI_WITH_GCRYPT}")
message(STATUS "- NDPI_WITH_PCRE.........: ${NDPI_WITH_PCRE}")
message(STATUS "- NDPI_WITH_MAXMINDDB....: ${NDPI_WITH_MAXMINDDB}")
endif()
if(NOT STATIC_LIBNDPI_INSTALLDIR AND NOT BUILD_NDPI)
message(STATUS "- DEFAULT_NDPI_INCLUDE...: ${DEFAULT_NDPI_INCLUDE}")
endif()
if(NOT NDPI_NO_PKGCONFIG)
message(STATUS "- pkgcfg_lib_NDPI_ndpi...: ${pkgcfg_lib_NDPI_ndpi}")
endif()
message(STATUS "--------------------------")
if(CMAKE_C_COMPILER_ID STREQUAL "Clang")
message(STATUS "Fuzzing enabled")
endif()

Dockerfile

@@ -0,0 +1,46 @@
FROM ubuntu:22.04 AS builder-ubuntu-2204
WORKDIR /root
RUN apt-get -y update \
&& apt-get install -y --no-install-recommends \
autoconf automake build-essential ca-certificates cmake git \
libpcap-dev libcurl4-openssl-dev libdbus-1-dev libtool make pkg-config unzip wget \
&& apt-get clean \
&& git clone https://github.com/utoni/nDPId.git
WORKDIR /root/nDPId
RUN cmake -S . -B build -DBUILD_NDPI=ON -DBUILD_EXAMPLES=ON \
-DENABLE_DBUS=ON -DENABLE_CURL=ON \
&& cmake --build build --verbose
FROM ubuntu:22.04
USER root
WORKDIR /
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPId /usr/sbin/nDPId
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd /usr/bin/nDPIsrvd
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPId-test /usr/bin/nDPId-test
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-collectd /usr/bin/nDPIsrvd-collectd
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-captured /usr/bin/nDPIsrvd-captured
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-analysed /usr/bin/nDPIsrvd-analysed
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-notifyd /usr/bin/nDPIsrvd-notifyd
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-influxd /usr/bin/nDPIsrvd-influxd
COPY --from=builder-ubuntu-2204 /root/nDPId/build/nDPIsrvd-simple /usr/bin/nDPIsrvd-simple
RUN apt-get -y update \
&& apt-get install -y --no-install-recommends libpcap-dev \
&& apt-get clean
USER nobody
RUN /usr/bin/nDPIsrvd -h || { RC=$?; test ${RC} -eq 1; }; \
/usr/sbin/nDPId -h || { RC=$?; test ${RC} -eq 1; }
FROM archlinux:base-devel AS builder-archlinux
WORKDIR /root
RUN pacman --noconfirm -Sy cmake git unzip wget && mkdir /build && chown nobody /build && cd /build \
&& runuser -u nobody git clone https://github.com/utoni/nDPId.git
WORKDIR /build/nDPId/packages/archlinux
RUN runuser -u nobody makepkg

README.md

@@ -1,22 +1,41 @@
[![Build](https://github.com/utoni/nDPId/actions/workflows/build.yml/badge.svg)](https://github.com/utoni/nDPId/actions/workflows/build.yml)
[![Gitlab-CI](https://gitlab.com/utoni/nDPId/badges/main/pipeline.svg)](https://gitlab.com/utoni/nDPId/-/pipelines)
[![Circle-CI](https://circleci.com/gh/utoni/nDPId.svg?style=shield "Circle-CI")](https://app.circleci.com/pipelines/github/utoni/nDPId)
[![Lines of Code](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=ncloc)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
[![Code Smells](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=code_smells)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
[![Bugs](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=bugs)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
[![Vulnerabilities](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=vulnerabilities)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
[![Reliability Rating](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=reliability_rating)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=lnslbrty_nDPId&metric=alert_status)](https://sonarcloud.io/summary/new_code?id=lnslbrty_nDPId)
![Docker Automated build](https://img.shields.io/docker/automated/utoni/ndpid)
# References
[ntop Webinar 2022](https://www.ntop.org/webinar/ntop-webinar-on-dec-14th-community-meeting-and-future-plans/)
[ntopconf 2023](https://www.ntop.org/ntopconf2023/)
# Disclaimer
Please respect & protect the privacy of others.
The purpose of this software is not to spy on others, but to detect network anomalies and malicious traffic.
# Abstract
nDPId is a set of daemons and tools to capture, process and classify network traffic.
Its minimal dependencies (besides a to some extent modern C library and POSIX threads) are libnDPI (>=5.0.0 or current github dev branch) and libpcap.
The daemon `nDPId` is capable of multithreading for packet processing, but w/o mutexes for performance reasons.
Instead, synchronization is achieved by a packet distribution mechanism.
To balance the workload to all threads (more or less) equally, a unique identifier represented as hash value is calculated using a 3-tuple consisting of: IPv4/IPv6 src/dst address; IP header value of the layer4 protocol; and (for TCP/UDP) src/dst port. Other protocols e.g. ICMP/ICMPv6 lack relevance for DPI, thus nDPId does not distinguish between different ICMP/ICMPv6 flows coming from the same host. This saves memory and performance, but might change in the future.
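The distribution scheme can be sketched as follows. This is not nDPId's actual hash function (the real constants and mixing differ), but it illustrates the key property: since XOR is commutative, both directions of a flow map to the same thread.

```python
import ipaddress

def flow_thread_index(src_ip: str, dst_ip: str, l4_proto: int,
                      src_port: int, dst_port: int, n_threads: int) -> int:
    """Map a flow's 3-tuple to a worker thread (illustrative sketch only)."""
    # XOR the numeric src/dst addresses so that A->B and B->A collapse
    # to the same value regardless of packet direction.
    h = int(ipaddress.ip_address(src_ip)) ^ int(ipaddress.ip_address(dst_ip))
    h ^= l4_proto
    h ^= src_port ^ dst_port  # only meaningful for TCP/UDP
    return h % n_threads
```

Because of the direction-independent mix, every packet of a flow is handled by one and the same thread, which is what makes the mutex-free design workable.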
`nDPId` uses libnDPI's JSON serialization interface to generate a JSON message for each event it receives from the library, which it then sends out to a UNIX-socket (default: `/tmp/ndpid-collector.sock`). From such a socket, `nDPIsrvd` (or other custom applications) can retrieve incoming JSON-messages and proceed with processing/distributing them to higher-level applications.
Unfortunately, `nDPIsrvd` does not yet support any encryption/authentication for TCP connections (TODO!).
# Architecture
This project uses a kind of microservice architecture.
```text
connect to UNIX socket [1] connect to UNIX/TCP socket [2]
@@ -36,12 +55,15 @@ _______________________ | | ________________
```
where:
* `nDPId` captures traffic, extracts traffic data (with libnDPI) and sends a JSON-serialized output stream to an already existing UNIX-socket;
* `nDPIsrvd`:
* create and manage an "incoming" UNIX-socket (ref [1] above), to fetch data from a local `nDPId`;
* apply a filtering logic to received data to select "flow_event_id" related JSONs;
* apply a buffering logic to received data;
* create and manage an "outgoing" UNIX or TCP socket (ref [2] above) to relay matched events
to connected clients
* `consumers` are common/custom applications being able to receive selected flows/events, via both UNIX-socket or TCP-socket.
@@ -49,26 +71,98 @@ where:
JSON messages streamed by both `nDPId` and `nDPIsrvd` are presented with:
* a 5-digit-number describing (as decimal number) the **entire** JSON message including the newline `\n` at the end;
* the JSON messages
```text
[5-digit-number][JSON message]
```
as with the following example:
```text
01223{"flow_event_id":7,"flow_event_name":"detection-update","thread_id":12,"packet_id":307,"source":"wlan0", ...snip...}
00458{"packet_event_id":2,"packet_event_name":"packet-flow","thread_id":11,"packet_id":324,"source":"wlan0", ...snip...}
00572{"flow_event_id":1,"flow_event_name":"new","thread_id":11,"packet_id":324,"source":"wlan0", ...snip...}
```
The full stream of `nDPId` generated JSON-events can be retrieved directly from `nDPId`, without relying on `nDPIsrvd`, by providing a properly managed UNIX-socket.
Technical details about the JSON-message format can be obtained from the related `.schema` file included in the `schema` directory
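A minimal parser for the length-prefixed framing above might look like this. It is only a sketch; `dependencies/nDPIsrvd.py` is the reference implementation.

```python
import json

NETWORK_BUFFER_LENGTH_DIGITS = 5  # as defined in config.h

def parse_framed(buf: bytes):
    """Split a byte buffer of length-prefixed nDPId JSON messages.

    Each message is prefixed with a 5-digit decimal length counting the
    entire JSON string including the trailing newline. Returns the parsed
    messages and any incomplete trailing bytes.
    """
    msgs = []
    while len(buf) > NETWORK_BUFFER_LENGTH_DIGITS:
        length = int(buf[:NETWORK_BUFFER_LENGTH_DIGITS])
        start = NETWORK_BUFFER_LENGTH_DIGITS
        if len(buf) < start + length:
            break  # incomplete message, wait for more data
        msgs.append(json.loads(buf[start:start + length]))
        buf = buf[start + length:]
    return msgs, buf
```

A partially received message is left in the buffer untouched, so the function can be called repeatedly as data arrives from the socket.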
# Events
`nDPId` generates JSON messages whereby each message is assigned to a certain event.
Those events specify the contents (key-value-pairs) of the JSON message.
They are divided into four categories, each with a number of subevents.
## Error Events
There are 17 distinct events, indicating that layer2 or layer3 packet processing failed or that not enough flow memory was available:
1. Unknown datalink layer packet
2. Unknown L3 protocol
3. Unsupported datalink layer
4. Packet too short
5. Unknown packet type
6. Packet header invalid
7. IP4 packet too short
8. Packet smaller than IP4 header
9. nDPI IPv4/L4 payload detection failed
10. IP6 packet too short
11. Packet smaller than IP6 header
12. nDPI IPv6/L4 payload detection failed
13. TCP packet smaller than expected
14. UDP packet smaller than expected
15. Captured packet size is smaller than expected packet size
16. Max flows to track reached
17. Flow memory allocation failed
Detailed JSON-schema is available [here](schema/error_event_schema.json)
## Daemon Events
There are 4 distinct events indicating startup/shutdown or status events as well as a reconnect event if there was a previous connection failure (collector):
1. init: `nDPId` startup
2. reconnect: (UNIX) socket connection lost previously and was established again
3. shutdown: `nDPId` terminates gracefully
4. status: statistics about the daemon itself e.g. memory consumption, zLib compressions (if enabled)
Detailed JSON-schema is available [here](schema/daemon_event_schema.json)
## Packet Events
There are 2 events containing base64 encoded packet payloads either belonging to a flow or not:
1. packet: does not belong to any flow
2. packet-flow: belongs to a flow e.g. TCP/UDP or ICMP
Detailed JSON-schema is available [here](schema/packet_event_schema.json)
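Recovering the raw packet bytes from such an event is a one-liner. The field name `"pkt"` used below is an assumption; consult `schema/packet_event_schema.json` for the exact keys.

```python
import base64

def decode_packet_payload(event: dict) -> bytes:
    """Return the raw packet bytes from a packet/packet-flow event.

    Assumes the base64 payload lives in the "pkt" field, which may not
    match the actual schema -- check packet_event_schema.json.
    """
    return base64.b64decode(event["pkt"])
```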
## Flow Events
There are 9 distinct events related to a flow:
1. new: a new TCP/UDP/ICMP flow seen which will be tracked
2. end: a TCP connection terminates
3. idle: a flow timed out, because there was no packet on the wire for a certain amount of time
4. update: inform nDPIsrvd or other apps about a long-lasting flow, whose detection was finished a long time ago but is still active
5. analyse: provide some information about extracted features of a flow (experimental; disabled by default, enable with `-A`)
6. guessed: `libnDPI` was not able to reliably detect a layer7 protocol and falls back to IP/Port based detection
7. detected: `libnDPI` successfully detected a layer7 protocol
8. detection-update: `libnDPI` dissected more layer7 protocol data (after detection already done)
9. not-detected: neither detected nor guessed
Detailed JSON-schema is available [here](schema/flow_event_schema.json). Also, a graphical representation of *Flow Events* timeline is available [here](schema/flow_events_diagram.png).
# Flow States
A flow can have three different states while it is being tracked by `nDPId`.
1. skipped: the flow will be tracked, but no detection will happen, to reduce memory usage.
See command line arguments `-I` and `-E`
2. finished: detection finished and the memory used for the detection is freed
3. info: detection is in progress and all flow memory required for `libnDPI` is allocated (this state consumes most memory)
# Build (CMake)
The `nDPId` build system is based on [CMake](https://cmake.org/)
@@ -88,7 +182,7 @@ see below for a full/test live-session
![](examples/ndpid_install_and_run.gif)
Based on your build environment and/or desiderata, you could need:
```shell
mkdir build
@@ -99,45 +193,62 @@ ccmake ..
or to build with a statically linked libnDPI:
```shell
cmake -S . -B ./build \
-DSTATIC_LIBNDPI_INSTALLDIR=[path/to/your/libnDPI/installdir] \
-DNDPI_NO_PKGCONFIG=ON
cmake --build ./build
```
If you use the latter, make sure that you've configured libnDPI with `./configure --prefix=[path/to/your/libnDPI/installdir]`
and remember to set all necessary CMake variables to link against shared libraries used by your nDPI build.
You'll also need to use `-DNDPI_NO_PKGCONFIG=ON` if `STATIC_LIBNDPI_INSTALLDIR` does not contain a pkg-config file.
e.g.:
```shell
cmake -S . -B ./build \
-DSTATIC_LIBNDPI_INSTALLDIR=[path/to/your/libnDPI/installdir] \
-DNDPI_NO_PKGCONFIG=ON \
-DNDPI_WITH_GCRYPT=ON -DNDPI_WITH_PCRE=OFF -DNDPI_WITH_MAXMINDDB=OFF
cmake --build ./build
```
Or let a shell script do the work for you:
```shell
cmake -S . -B ./build \
-DBUILD_NDPI=ON
cmake --build ./build
```
The CMake cache variable `-DBUILD_NDPI=ON` builds a version of `libnDPI` residing as a git submodule in this repository.
# run
As mentioned above, in order to run `nDPId`, a UNIX-socket needs to be provided in order to stream our related JSON-data.
Such a UNIX-socket can be provided by both the included `nDPIsrvd` daemon, or, if you simply need a quick check, with the [ncat](https://nmap.org/book/ncat-man.html) utility, with a simple `ncat -U /tmp/listen.sock -l -k`. Remember that OpenBSD `netcat` is not able to handle multiple connections reliably.
Once the socket is ready, you can run `nDPId` capturing and analyzing your own traffic, with something similar to: `sudo nDPId -c /tmp/listen.sock`
If you're using OpenBSD `netcat`, you need to run: `sudo nDPId -c /tmp/listen.sock -o max-reader-threads=1`
Make sure that the UNIX socket is accessible by the user (see `-u`) to which `nDPId` changes, default: `nobody`.
Of course, both `ncat` and `nDPId` need to point to the same UNIX-socket (`nDPId` provides the `-c` option, exactly for this. By default, `nDPId` refers to `/tmp/ndpid-collector.sock`, and the same default-path is also used by `nDPIsrvd` for the incoming socket).
Give `nDPId` some real-traffic. You can capture your own traffic, with something similar to:
```shell
socat -u UNIX-Listen:/tmp/listen.sock,fork - # does the same as `ncat`
sudo chown nobody:nobody /tmp/listen.sock # default `nDPId` user/group, see `-u` and `-g`
sudo ./nDPId -c /tmp/listen.sock -l
```
`nDPId` also supports UDP collector endpoints:
```shell
nc -d -u 127.0.0.1 7000 -l -k
sudo ./nDPId -c 127.0.0.1:7000 -l
```
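Instead of `nc`, a toy UDP collector can be sketched in a few lines of Python. Each datagram carries one length-prefixed JSON message, just like the UNIX-socket stream; this sketch only receives raw datagrams and is in no way a replacement for `nDPIsrvd`.

```python
import socket

def run_udp_collector(host='127.0.0.1', port=7000, max_messages=None):
    """Receive raw nDPId JSON datagrams on a UDP endpoint (toy sketch).

    Blocks until max_messages datagrams were received (forever if None)
    and returns them as a list of byte strings.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    received = []
    try:
        while max_messages is None or len(received) < max_messages:
            data, _addr = sock.recvfrom(65535)
            received.append(data)
    finally:
        sock.close()
    return received
```

Point `nDPId` at it with `sudo ./nDPId -c 127.0.0.1:7000 -l` as shown above.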
or you can generate a nDPId-compatible JSON dump with:
@@ -152,7 +263,7 @@ Daemons:
make -C [path-to-a-build-dir] daemon
```
Or a manual approach with:
```shell
./nDPIsrvd -d
@@ -170,34 +281,69 @@ And why not a flow-info example?
./examples/py-flow-info/flow-info.py
```
or
```shell
./nDPIsrvd-json-dump
```
or anything below `./examples`.
# nDPId tuning
It is possible to change `nDPId` internals w/o recompiling by using `-o subopt=value`.
But be careful: changing the default values may render `nDPId` useless and is not well tested.
Suboptions for `-o`:
Format: `subopt` (unit, comment): description
* `max-flows-per-thread` (N, caution advised): affects max. memory usage
* `max-idle-flows-per-thread` (N, safe): max. allowed idle flows whose memory gets freed after `flow-scan-interval`
* `max-reader-threads` (N, safe): amount of packet processing threads, every thread can have a max. of `max-flows-per-thread` flows
* `daemon-status-interval` (ms, safe): specifies how often daemon event `status` is generated
* `compression-scan-interval` (ms, untested): specifies how often `nDPId` scans for inactive flows ready for compression
* `compression-flow-inactivity` (ms, untested): the shortest period of time elapsed before `nDPId` considers compressing a flow (e.g. nDPI flow struct) that neither sent nor received any data
* `flow-scan-interval` (ms, safe): min. amount of time after which `nDPId` scans for idle or long-lasting flows
* `generic-max-idle-time` (ms, untested): time after which a non TCP/UDP/ICMP flow times out
* `icmp-max-idle-time` (ms, untested): time after which an ICMP flow times out
* `udp-max-idle-time` (ms, caution advised): time after which a UDP flow times out
* `tcp-max-idle-time` (ms, caution advised): time after which a TCP flow times out
* `tcp-max-post-end-flow-time` (ms, caution advised): a TCP flow that received a FIN or RST waits this amount of time before flow tracking stops and the flow memory is freed
* `max-packets-per-flow-to-send` (N, safe): max. `packet-flow` events generated for the first N packets of each flow
* `max-packets-per-flow-to-process` (N, caution advised): max. amount of packets processed by `libnDPI`
* `max-packets-per-flow-to-analyze` (N, safe): max. packets to analyze before sending an `analyse` event, requires `-A`
* `error-event-threshold-n` (N, safe): max. error events to send until threshold time has passed
* `error-event-threshold-time` (N, safe): time after which the error event threshold resets
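When driving `nDPId` from a wrapper script, the `-o subopt=value` pairs above can be assembled programmatically. A convenience sketch (it performs no validation of sub-option names or value ranges):

```python
def ndpid_cmdline(collector: str, subopts: dict) -> list:
    """Build an nDPId argument vector from tuning sub-options.

    Each key/value pair becomes one '-o key=value' argument; sorting
    keeps the command line deterministic for testing/logging.
    """
    args = ['nDPId', '-c', collector]
    for key, value in sorted(subopts.items()):
        args += ['-o', '{}={}'.format(key, value)]
    return args
```

The resulting list can be handed directly to `subprocess.run` (with appropriate privileges).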
# test
The recommended way to run regression / diff tests:
```shell
cmake -S . -B ./build-like-ci \
-DBUILD_NDPI=ON -DENABLE_ZLIB=ON -DBUILD_EXAMPLES=ON
# optional: -DENABLE_CURL=ON -DENABLE_SANITIZER=ON
./test/run_tests.sh ./libnDPI ./build-like-ci/nDPId-test
# or: make -C ./build-like-ci test
```
Alternatively you can run some integration tests manually:
`./test/run_tests.sh [/path/to/libnDPI/root/directory] [/path/to/nDPId-test]`
e.g.:
`./test/run_tests.sh ${HOME}/git/nDPI ${HOME}/git/nDPId/build/nDPId-test`
Run `./test/run_tests.sh` to see some usage information.
Remember that all test results are tied to a specific libnDPI commit hash
as part of the `git submodule`. Using `test/run_tests.sh` for other commit hashes
will most likely result in PCAP diffs.
Why not use `examples/py-flow-dashboard/flow-dash.py` to visualize nDPId's output?
# Code Coverage
You may generate code coverage by using:
```shell
cmake -S . -B ./build-coverage \
-DENABLE_COVERAGE=ON -DENABLE_ZLIB=ON
# optional: -DBUILD_NDPI=ON
make -C ./build-coverage coverage-clean
make -C ./build-coverage clean
make -C ./build-coverage all
./test/run_tests.sh ./libnDPI ./build-coverage/nDPId-test
make -C ./build-coverage coverage
make -C ./build-coverage coverage-view
```
# Contributors
Special thanks to Damiano Verzulli ([@verzulli](https://github.com/verzulli)) from [GARRLab](https://www.garrlab.it) for providing server and test infrastructure.

SECURITY.md

@@ -0,0 +1,18 @@
# Security Policy
I encourage you to submit a pull request if you have a solution or fix for anything, even security vulnerabilities.
Your contributions help advance and enhance safety for all users :star:.
## Reporting a Bug :bug: :bug:
Simply use GitHub issues to report a bug with related information to debug the issue :pencil:.
## Reporting a Vulnerability :closed_lock_with_key: :eyes:
For sensitive security issues, please email <toni@impl.cc> including the following information:
- Description of the vulnerability
- Steps to reproduce the issue
- Affected versions, i.e. release tags, git commit hashes or git branch
- If applicable, a data sample (preferably `pcap/pcapng`) to reproduce
- If known, any mitigations or fixes for the issue

TODO.md

@@ -1,5 +1,20 @@
# TODOs
1.8:
* let nDPIsrvd (collector) connect to other nDPIsrvd instances (as distributor)
* nDPIsrvd GnuTLS support for TCP/IP distributor connections
* provide nDPId-exportd daemon which will only send captured packets to an nDPId instance running on a different machine
2.0.0:
* switch to semantic versioning for the greater good ;)
no release plan:
* merge flow end/idle event into idle event (end is not really useful..)
* provide a shared library for C / C++ for distributor application developers
* improve UDP/TCP timeout handling by reading netfilter conntrack timeouts from /proc (or just read conntrack table entries)
* detect interface / timeout changes and apply them to nDPId
* switch to MIT or BSD License
* libdaq integration

cmake/CheckEpoll.cmake

@@ -0,0 +1,28 @@
# - Check if the system supports epoll.
# CHECK_EPOLL(<var>)
# <var> - variable to store the result
# (1 for success, empty for failure)
#=============================================================================
# This software is in the public domain, furnished "as is", without technical
# support, and with no warranty, express or implied, as to its usefulness for
# any purpose.
#=============================================================================
macro(CHECK_EPOLL VARIABLE)
if(UNIX)
if("${VARIABLE}" MATCHES "^${VARIABLE}$")
message(STATUS "Check if the system supports epoll")
include(CheckSymbolExists)
check_symbol_exists(epoll_create "sys/epoll.h" EPOLL_PROTOTYPE_EXISTS)
if(EPOLL_PROTOTYPE_EXISTS)
message(STATUS "Check if the system supports epoll - yes")
set(${VARIABLE} 1 CACHE INTERNAL "Result of CHECK_EPOLL" FORCE)
else(EPOLL_PROTOTYPE_EXISTS)
message(STATUS "Check if the system supports epoll - no")
set(${VARIABLE} "" CACHE INTERNAL "Result of CHECK_EPOLL" FORCE)
endif(EPOLL_PROTOTYPE_EXISTS)
endif("${VARIABLE}" MATCHES "^${VARIABLE}$")
endif(UNIX)
endmacro(CHECK_EPOLL)

config.h

@@ -2,6 +2,7 @@
#define CONFIG_H 1
/* macros shared across multiple executables */
#define DEFAULT_CHUSER "nobody"
#define COLLECTOR_UNIX_SOCKET "/tmp/ndpid-collector.sock"
#define DISTRIBUTOR_UNIX_SOCKET "/tmp/ndpid-distributor.sock"
#define DISTRIBUTOR_HOST "127.0.0.1"
@@ -11,34 +12,44 @@
* NOTE: Buffer size needs to keep in sync with other implementations
* e.g. dependencies/nDPIsrvd.py
*/
#define NETWORK_BUFFER_MAX_SIZE 33792u /* 8192 + 8192 + 8192 + 8192 + 1024 */
#define NETWORK_BUFFER_LENGTH_DIGITS 5u
#define NETWORK_BUFFER_LENGTH_DIGITS_STR "5"
#define PFRING_BUFFER_SIZE 65536u
#define TIME_S_TO_US(s) (s * 1000llu * 1000llu)
/* nDPId default config options */
#define nDPId_PIDFILE "/tmp/ndpid.pid"
#define nDPId_MAX_FLOWS_PER_THREAD 65536u
#define nDPId_MAX_IDLE_FLOWS_PER_THREAD (nDPId_MAX_FLOWS_PER_THREAD / 32u)
#define nDPId_TICK_RESOLUTION 1000u
#define nDPId_MAX_READER_THREADS 32u
#define nDPId_ERROR_EVENT_THRESHOLD_N 16u
#define nDPId_ERROR_EVENT_THRESHOLD_TIME TIME_S_TO_US(10u) /* 10 sec */
#define nDPId_DAEMON_STATUS_INTERVAL TIME_S_TO_US(600u) /* 600 sec */
#define nDPId_MEMORY_PROFILING_LOG_INTERVAL TIME_S_TO_US(5u) /* 5 sec */
#define nDPId_COMPRESSION_SCAN_INTERVAL TIME_S_TO_US(20u) /* 20 sec */
#define nDPId_COMPRESSION_FLOW_INACTIVITY TIME_S_TO_US(30u) /* 30 sec */
#define nDPId_FLOW_SCAN_INTERVAL TIME_S_TO_US(10u) /* 10 sec */
#define nDPId_GENERIC_IDLE_TIME TIME_S_TO_US(600u) /* 600 sec */
#define nDPId_ICMP_IDLE_TIME TIME_S_TO_US(120u) /* 120 sec */
#define nDPId_TCP_IDLE_TIME TIME_S_TO_US(7440u) /* 7440 sec */
#define nDPId_UDP_IDLE_TIME TIME_S_TO_US(180u) /* 180 sec */
#define nDPId_TCP_POST_END_FLOW_TIME TIME_S_TO_US(120u) /* 120 sec */
#define nDPId_THREAD_DISTRIBUTION_SEED 0x03dd018b
#define nDPId_PACKETS_PLEN_MAX 8192u /* 8kB */
#define nDPId_PACKETS_PER_FLOW_TO_SEND 15u
#define nDPId_PACKETS_PER_FLOW_TO_PROCESS NDPI_DEFAULT_MAX_NUM_PKTS_PER_FLOW_TO_DISSECT
#define nDPId_PACKETS_PER_FLOW_TO_PROCESS 32u
#define nDPId_PACKETS_PER_FLOW_TO_ANALYZE 32u
#define nDPId_ANALYZE_PLEN_MAX 1504u
#define nDPId_ANALYZE_PLEN_BIN_LEN 32u
#define nDPId_ANALYZE_PLEN_NUM_BINS 48u
#define nDPId_FLOW_STRUCT_SEED 0x5defc104
/* nDPIsrvd default config options */
#define nDPIsrvd_PIDFILE "/tmp/ndpisrvd.pid"
#define nDPIsrvd_MAX_REMOTE_DESCRIPTORS 32
#define nDPIsrvd_MAX_REMOTE_DESCRIPTORS 128
#define nDPIsrvd_MAX_WRITE_BUFFERS 1024
#endif


@@ -196,10 +196,10 @@ static int jsmn_parse_string(jsmn_parser *parser, const char *js,
jsmntok_t *token;
int start = parser->pos;
parser->pos++;
/* Skip starting quote */
parser->pos++;
for (; parser->pos < len && js[parser->pos] != '\0'; parser->pos++) {
char c = js[parser->pos];

File diff suppressed because it is too large.


@@ -21,19 +21,20 @@ DEFAULT_PORT = 7000
DEFAULT_UNIX = '/tmp/ndpid-distributor.sock'
NETWORK_BUFFER_MIN_SIZE = 6 # NETWORK_BUFFER_LENGTH_DIGITS + 1
NETWORK_BUFFER_MAX_SIZE = 16384 # Please keep this value in sync with the one in config.h
NETWORK_BUFFER_MAX_SIZE = 33792 # Please keep this value in sync with the one in config.h
nDPId_PACKETS_PLEN_MAX = 8192 # Please keep this value in sync with the one in config.h
PKT_TYPE_ETH_IP4 = 0x0800
PKT_TYPE_ETH_IP6 = 0x86DD
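`NETWORK_BUFFER_LENGTH_DIGITS` (5) together with `NETWORK_BUFFER_MIN_SIZE` (6, i.e. digits + 1) suggests a length-prefixed wire format, where each JSON message is preceded by its zero-padded byte length. The sketch below is an assumption derived from these constants, not code from this diff; `frame` is a hypothetical helper:

```python
import json

NETWORK_BUFFER_LENGTH_DIGITS = 5   # from config.h
NETWORK_BUFFER_MAX_SIZE = 33792    # from config.h

def frame(obj):
    """Hypothetical framing: serialize obj, prefix a 5-digit zero-padded length."""
    payload = json.dumps(obj).encode('ascii')
    assert len(payload) + NETWORK_BUFFER_LENGTH_DIGITS <= NETWORK_BUFFER_MAX_SIZE
    prefix = str(len(payload)).zfill(NETWORK_BUFFER_LENGTH_DIGITS).encode('ascii')
    return prefix + payload

msg = frame({'flow_event_name': 'detected'})
print(msg[:NETWORK_BUFFER_LENGTH_DIGITS])
```

Under this assumption a receiver can read the fixed-width prefix first and then consume exactly that many payload bytes, which is why the two constants must stay in sync across implementations.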
class TermColor:
HINT = '\033[33m'
HINT = '\033[33m'
WARNING = '\033[93m'
FAIL = '\033[91m'
BOLD = '\033[1m'
END = '\033[0m'
BLINK = '\x1b[5m'
FAIL = '\033[91m'
BOLD = '\033[1m'
END = '\033[0m'
BLINK = '\x1b[5m'
if USE_COLORAMA is True:
COLOR_TUPLES = [ (Fore.BLUE, [Back.RED, Back.MAGENTA, Back.WHITE]),
@@ -51,6 +52,21 @@ class TermColor:
(Fore.LIGHTWHITE_EX, [Back.LIGHTBLACK_EX, Back.BLACK]),
(Fore.LIGHTYELLOW_EX, [Back.LIGHTRED_EX, Back.RED]) ]
@staticmethod
def disableColor():
TermColor.HINT = ''
TermColor.WARNING = ''
TermColor.FAIL = ''
TermColor.BOLD = ''
TermColor.END = ''
TermColor.BLINK = ''
global USE_COLORAMA
USE_COLORAMA = False
@staticmethod
def disableBlink():
TermColor.BLINK = ''
@staticmethod
def calcColorHash(string):
h = 0
@@ -68,9 +84,9 @@ class TermColor:
@staticmethod
def setColorByString(string):
global USE_COLORAMA
if USE_COLORAMA is True:
fg_color, bg_color = TermColor.getColorsByHash(string)
color_hash = TermColor.calcColorHash(string)
return '{}{}{}{}{}'.format(Style.BRIGHT, fg_color, bg_color, string, Style.RESET_ALL)
else:
return '{}{}{}'.format(TermColor.BOLD, string, TermColor.END)
@@ -125,9 +141,9 @@ class Instance:
if 'thread_id' not in json_dict:
return
thread_id = json_dict['thread_id']
if 'thread_ts_msec' in json_dict:
if 'thread_ts_usec' in json_dict:
mrtf = self.getMostRecentFlowTime(thread_id) if thread_id in self.thread_data else 0
self.setMostRecentFlowTime(thread_id, max(json_dict['thread_ts_msec'], mrtf))
self.setMostRecentFlowTime(thread_id, max(json_dict['thread_ts_usec'], mrtf))
class Flow:
@@ -169,13 +185,16 @@ class FlowManager:
if alias not in self.instances:
self.instances[alias] = dict()
if source not in self.instances[alias]:
self.instances[alias][source] = dict()
self.instances[alias][source] = Instance(alias, source)
self.instances[alias][source].setMostRecentFlowTimeFromJSON(json_dict)
return self.instances[alias][source]
@staticmethod
def getLastPacketTime(instance, flow_id, json_dict):
return max(int(json_dict['flow_src_last_pkt_time']), int(json_dict['flow_dst_last_pkt_time']), instance.flows[flow_id].flow_last_seen)
def getFlow(self, instance, json_dict):
if 'flow_id' not in json_dict:
return None
@@ -183,13 +202,13 @@ class FlowManager:
flow_id = int(json_dict['flow_id'])
if flow_id in instance.flows:
instance.flows[flow_id].flow_last_seen = int(json_dict['flow_last_seen'])
instance.flows[flow_id].flow_last_seen = FlowManager.getLastPacketTime(instance, flow_id, json_dict)
instance.flows[flow_id].flow_idle_time = int(json_dict['flow_idle_time'])
return instance.flows[flow_id]
thread_id = int(json_dict['thread_id'])
instance.flows[flow_id] = Flow(flow_id, thread_id)
instance.flows[flow_id].flow_last_seen = int(json_dict['flow_last_seen'])
instance.flows[flow_id].flow_last_seen = FlowManager.getLastPacketTime(instance, flow_id, json_dict)
instance.flows[flow_id].flow_idle_time = int(json_dict['flow_idle_time'])
instance.flows[flow_id].cleanup_reason = FlowManager.CLEANUP_REASON_INVALID
@@ -279,6 +298,7 @@ class nDPIsrvdException(Exception):
INVALID_LINE_RECEIVED = 4
CALLBACK_RETURNED_FALSE = 5
SOCKET_TIMEOUT = 6
JSON_DECODE_ERROR = 7
def __init__(self, etype):
self.etype = etype
@@ -325,11 +345,51 @@ class SocketTimeout(nDPIsrvdException):
def __str__(self):
return 'Socket timeout.'
class JsonDecodeError(nDPIsrvdException):
def __init__(self, json_exception, failed_line):
super().__init__(nDPIsrvdException.JSON_DECODE_ERROR)
self.json_exception = json_exception
self.failed_line = failed_line
def __str__(self):
return '{}: {}'.format(self.json_exception, self.failed_line)
class JsonFilter():
def __init__(self, filter_string):
self.filter_string = filter_string
self.filter = compile(filter_string, '<string>', 'eval')
def evaluate(self, json_dict):
if type(json_dict) is not dict:
raise nDPIsrvdException('Could not evaluate JSON Filter: expected dictionary, got {}'.format(type(json_dict)))
return eval(self.filter, {'json_dict': json_dict})
class nDPIsrvdSocket:
def __init__(self):
self.sock_family = None
self.flow_mgr = FlowManager()
self.received_bytes = 0
self.json_filter = list()
def addFilter(self, filter_str):
self.json_filter.append(JsonFilter(filter_str))
def evalFilters(self, json_dict):
for jf in self.json_filter:
try:
json_filter_retval = jf.evaluate(json_dict)
except Exception as err:
print()
sys.stderr.write('Error while evaluating expression "{}"\n'.format(jf.filter_string))
raise err
if not isinstance(json_filter_retval, bool):
print()
sys.stderr.write('Error while evaluating expression "{}"\n'.format(jf.filter_string))
raise nDPIsrvdException('JSON Filter returned an invalid type: expected bool, got {}'.format(type(json_filter_retval)))
if json_filter_retval is False:
return False
return True
def connect(self, addr):
if type(addr) is tuple:
@@ -345,6 +405,8 @@ class nDPIsrvdSocket:
self.msglen = 0
self.digitlen = 0
self.lines = []
self.failed_lines = []
self.filtered_lines = 0
def timeout(self, timeout):
self.sock.settimeout(timeout)
@@ -398,25 +460,45 @@ class nDPIsrvdSocket:
def parse(self, callback_json, callback_flow_cleanup, global_user_data):
retval = True
index = 0
for received_line in self.lines:
json_dict = json.loads(received_line[0].decode('ascii', errors='replace'), strict=True)
try:
json_dict = json.loads(received_line[0].decode('ascii', errors='replace'), strict=True)
except json.decoder.JSONDecodeError as e:
json_dict = dict()
self.failed_lines += [received_line]
self.lines = self.lines[1:]
raise JsonDecodeError(e, received_line)
instance = self.flow_mgr.getInstance(json_dict)
if instance is None:
self.failed_lines += [received_line]
retval = False
continue
if callback_json(json_dict, instance, self.flow_mgr.getFlow(instance, json_dict), global_user_data) is not True:
retval = False
current_flow = self.flow_mgr.getFlow(instance, json_dict)
filter_eval = self.evalFilters(json_dict)
if filter_eval is True:
try:
if callback_json(json_dict, instance, current_flow, global_user_data) is not True:
self.failed_lines += [received_line]
retval = False
except Exception as e:
self.failed_lines += [received_line]
self.lines = self.lines[1:]
raise(e)
else:
self.filtered_lines += 1
for _, flow in self.flow_mgr.getFlowsToCleanup(instance, json_dict).items():
if callback_flow_cleanup is None:
pass
elif callback_flow_cleanup(instance, flow, global_user_data) is not True:
elif filter_eval is True and callback_flow_cleanup(instance, flow, global_user_data) is not True:
self.failed_lines += [received_line]
self.lines = self.lines[1:]
retval = False
index += 1
self.lines = self.lines[index:]
self.lines = self.lines[1:]
return retval
@@ -440,22 +522,31 @@ class nDPIsrvdSocket:
return self.flow_mgr.doShutdown().items()
def verify(self):
if len(self.failed_lines) > 0:
raise nDPIsrvdException('Failed lines > 0: {}'.format(len(self.failed_lines)))
return self.flow_mgr.verifyFlows()
def defaultArgumentParser(desc='nDPIsrvd Python Interface',
def defaultArgumentParser(desc='nDPIsrvd Python Interface', enable_json_filter=False,
help_formatter=argparse.ArgumentDefaultsHelpFormatter):
parser = argparse.ArgumentParser(description=desc, formatter_class=help_formatter)
parser.add_argument('--host', type=str, help='nDPIsrvd host IP')
parser.add_argument('--port', type=int, default=DEFAULT_PORT, help='nDPIsrvd TCP port')
parser.add_argument('--unix', type=str, help='nDPIsrvd unix socket path')
if enable_json_filter is True:
parser.add_argument('--filter', type=str, action='append',
help='Set a filter string which if evaluates to True will invoke the JSON callback.\n'
'Example: json_dict[\'flow_event_name\'] == \'detected\' will only process \'detected\' events.')
return parser
def toSeconds(usec):
return usec / (1000 * 1000)
def validateAddress(args):
tcp_addr_set = False
address = None
if args.host is None:
address_tcpip = (DEFAULT_HOST, DEFAULT_PORT)
address_tcpip = (DEFAULT_HOST, args.port)
else:
address_tcpip = (args.host, args.port)
tcp_addr_set = True
@@ -477,6 +568,23 @@ def validateAddress(args):
return address
def prepareJsonFilter(args, nsock):
# HowTo use JSON Filters:
# Add `--filter [FILTER_STRING]` to the Python scripts that support JSON filtering.
# Examples:
# ./examples/py-json-stdout/json-stdout.py --filter '"ndpi" in json_dict and "proto" in json_dict["ndpi"]'
# The command above will print only JSONs that have the subobjects json_dict["ndpi"] and json_dict["ndpi"]["proto"] available.
# ./examples/py-flow-info/flow-info.py --filter 'json_dict["source"] == "eth0"' --filter '"flow_event_name" in json_dict and json_dict["flow_event_name"] == "analyse"'
# Multiple JSON filters will be ANDed together.
# Note: You may *only* use the global "json_dict" in your expressions.
try:
json_filter = args.filter
if json_filter is not None:
for jf in json_filter:
nsock.addFilter(jf)
except AttributeError:
pass
global schema
schema = {'packet_event_schema' : None, 'error_event_schema' : None, 'daemon_event_schema' : None, 'flow_event_schema' : None}
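The filter machinery added above compiles each `--filter` string once and then `eval`s it per JSON object, with `json_dict` as the only visible name. A stripped-down, self-contained sketch of that flow (the helper name `matches` is illustrative, not part of nDPIsrvd.py):

```python
# Minimal sketch of the JsonFilter mechanism: the filter string is
# compiled once and evaluated per JSON dict, with `json_dict` as the
# only name the expression may reference.
filter_string = "json_dict['flow_event_name'] == 'detected'"
compiled = compile(filter_string, '<string>', 'eval')

def matches(json_dict):
    # Evaluate the compiled expression against one parsed JSON object.
    return eval(compiled, {'json_dict': json_dict})

print(matches({'flow_event_name': 'detected'}))   # True
print(matches({'flow_event_name': 'analyse'}))    # False
```

As in `evalFilters`, a well-behaved filter must return a bool; anything else is rejected at runtime rather than silently coerced.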


@@ -0,0 +1,41 @@
name: build # This name shows up in badge.svg
on:
push: # any branch
pull_request:
branches: [ "master" ]
jobs:
build-gcc:
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- run: make -C tests EXTRA_CFLAGS="-W -Wall -Wextra -Wswitch-default"
- run: make -C tests clean ; make -C tests pedantic
- run: make -C tests clean ; make -C tests pedantic EXTRA_CFLAGS=-DNO_DECLTYPE
- run: make -C tests clean ; make -C tests cplusplus
- run: make -C tests clean ; make -C tests cplusplus EXTRA_CFLAGS=-DNO_DECLTYPE
build-clang:
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
runs-on: ${{ matrix.os }}
env:
CC: clang
CXX: clang++
steps:
- uses: actions/checkout@v3
- run: make -C tests EXTRA_CFLAGS="-W -Wall -Wextra -Wswitch-default"
- run: make -C tests clean ; make -C tests pedantic
- run: make -C tests clean ; make -C tests pedantic EXTRA_CFLAGS=-DNO_DECLTYPE
- run: make -C tests clean ; make -C tests cplusplus
- run: make -C tests clean ; make -C tests cplusplus EXTRA_CFLAGS=-DNO_DECLTYPE
build-asciidoc:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- run: sudo apt-get update && sudo apt-get install asciidoc -y
- run: make -C doc


@@ -1,4 +1,4 @@
Copyright (c) 2005-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2005-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -1,5 +1,6 @@
[![Build status](https://api.travis-ci.org/troydhanson/uthash.svg?branch=master)](https://travis-ci.org/troydhanson/uthash)
[![GitHub CI status](https://github.com/troydhanson/uthash/actions/workflows/build.yml/badge.svg)](https://github.com/troydhanson/uthash/actions/workflows/build.yml)
Documentation for uthash is available at:


@@ -13,8 +13,8 @@
</div> <!-- banner -->
<div id="topnav">
<a href="http://github.com/troydhanson/uthash">GitHub page</a> &gt;
uthash home <!-- http://troydhanson.github.io/uthash/ -->
<a href="https://github.com/troydhanson/uthash">GitHub page</a> &gt;
uthash home <!-- https://troydhanson.github.io/uthash/ -->
<a href="https://twitter.com/share" class="twitter-share-button" data-via="troydhanson">Tweet</a>
<script>!function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0];if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src="//platform.twitter.com/widgets.js";fjs.parentNode.insertBefore(js,fjs);}}(document,"script","twitter-wjs");</script>
@@ -43,7 +43,7 @@
<h2>developer</h2>
<div><a href="http://troydhanson.github.io/">Troy D. Hanson</a></div>
<div><a href="https://troydhanson.github.io/">Troy D. Hanson</a></div>
<h2>maintainer</h2>
<div><a href="https://github.com/Quuxplusone">Arthur O'Dwyer</a></div>


@@ -13,7 +13,7 @@
</div> <!-- banner -->
<div id="topnav">
<a href="http://troydhanson.github.io/uthash/">uthash home</a> &gt;
<a href="https://troydhanson.github.io/uthash/">uthash home</a> &gt;
BSD license
</div>
@@ -21,7 +21,7 @@
<div id="mid">
<div id="main">
<pre>
Copyright (c) 2005-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2005-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -29,7 +29,6 @@ h1,p { margin: 0; } /* non-0 margin on firefox */
background-repeat: repeat-y;
/* background-color: #ffddaa; */
padding-top: 20px;
padding-top: 20px;
margin-bottom: 10px;
}


@@ -5,7 +5,7 @@ v2.3.0, February 2021
To download uthash, follow this link back to the
https://github.com/troydhanson/uthash[GitHub project page].
Back to my http://troydhanson.github.io/[other projects].
Back to my https://troydhanson.github.io/[other projects].
A hash in C
-----------
@@ -805,7 +805,7 @@ Here is a simple example where a structure has a pointer member, called `key`.
.A pointer key
----------------------------------------------------------------------
#include <stdio.h>
#include <assert.h>
#include <stdlib.h>
#include "uthash.h"
@@ -816,17 +816,16 @@ typedef struct {
} el_t;
el_t *hash = NULL;
char *someaddr = NULL;
void *someaddr = &hash;
int main() {
el_t *d;
el_t *e = (el_t *)malloc(sizeof *e);
if (!e) return -1;
e->key = (void*)someaddr;
e->key = someaddr;
e->i = 1;
HASH_ADD_PTR(hash, key, e);
HASH_FIND_PTR(hash, &someaddr, d);
if (d) printf("found\n");
assert(d == e);
/* release memory */
HASH_DEL(hash, e);
@@ -835,9 +834,7 @@ int main() {
}
----------------------------------------------------------------------
This example is included in `tests/test57.c`. Note that the end of the program
deletes the element out of the hash, (and since no more elements remain in the
hash), uthash releases its internal memory.
This example is included in `tests/test57.c`.
Structure keys
~~~~~~~~~~~~~~
@@ -893,7 +890,7 @@ int main(int argc, char *argv[]) {
----------------------------------------------------------------------
This usage is nearly the same as use of a compound key explained below.
This usage is nearly the same as the usage of a compound key explained below.
Note that the general macros require the name of the `UT_hash_handle` to be
passed as the first argument (here, this is `hh`). The general macros are
@@ -1153,17 +1150,16 @@ always used with the `users_by_name` hash table).
Sorted insertion of new items
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you would like to maintain a sorted hash you have two options. The first
option is to use the HASH_SRT() macro, which will sort any unordered list in
To maintain a sorted hash, you have two options. Your first
option is to use the `HASH_SRT` macro, which will sort any unordered list in
'O(n log(n))'. This is the best strategy if you're just filling up a hash
table with items in random order with a single final HASH_SRT() operation
when all is done. Obviously, this won't do what you want if you need
the list to be in an ordered state at times between insertion of
items. You can use HASH_SRT() after every insertion operation, but that will
yield a computational complexity of 'O(n^2 log n)'.
table with items in random order with a single final `HASH_SRT` operation
when all is done. If you need the table to remain sorted as you add and remove
items, you can use `HASH_SRT` after every insertion operation, but that gives
a computational complexity of 'O(n^2 log n)' to insert 'n' items.
The second route you can take is via the in-order add and replace macros.
The `HASH_ADD_INORDER*` macros work just like their `HASH_ADD*` counterparts, but
Your second option is to use the in-order add and replace macros.
The `HASH_ADD_*_INORDER` macros work just like their `HASH_ADD_*` counterparts, but
with an additional comparison-function argument:
int name_sort(struct my_struct *a, struct my_struct *b) {
@@ -1172,11 +1168,11 @@ with an additional comparison-function argument:
HASH_ADD_KEYPTR_INORDER(hh, items, &item->name, strlen(item->name), item, name_sort);
New items are sorted at insertion time in 'O(n)', thus resulting in a
total computational complexity of 'O(n^2)' for the creation of the hash
table with all items.
For in-order add to work, the list must be in an ordered state before
insertion of the new item.
These macros assume that the hash is already sorted according to the
comparison function, and insert the new item in its proper place.
A single insertion takes 'O(n)', resulting in a total computational
complexity of 'O(n^2)' to insert all 'n' items: slower than a single
`HASH_SRT`, but faster than doing a `HASH_SRT` after every insertion.
Several sort orders
~~~~~~~~~~~~~~~~~~~


@@ -139,7 +139,7 @@ a copy of the source string and pushes that copy into the array.
About UT_icd
~~~~~~~~~~~~
Arrays be made of any type of element, not just integers and strings. The
Arrays can be made of any type of element, not just integers and strings. The
elements can be basic types or structures. Unless you're dealing with integers
and strings (which use pre-defined `ut_int_icd` and `ut_str_icd`), you'll need
to define a `UT_icd` helper structure. This structure contains everything that


@@ -1,5 +1,5 @@
/*
Copyright (c) 2008-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2008-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without
@@ -38,11 +38,6 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#define UTARRAY_UNUSED
#endif
#ifdef oom
#error "The name of macro 'oom' has been changed to 'utarray_oom'. Please update your code."
#define utarray_oom() oom()
#endif
#ifndef utarray_oom
#define utarray_oom() exit(-1)
#endif
@@ -234,7 +229,16 @@ typedef struct {
static void utarray_str_cpy(void *dst, const void *src) {
char *const *srcc = (char *const *)src;
char **dstc = (char**)dst;
*dstc = (*srcc == NULL) ? NULL : strdup(*srcc);
if (*srcc == NULL) {
*dstc = NULL;
} else {
*dstc = (char*)malloc(strlen(*srcc) + 1);
if (*dstc == NULL) {
utarray_oom();
} else {
strcpy(*dstc, *srcc);
}
}
}
static void utarray_str_dtor(void *elt) {
char **eltc = (char**)elt;


@@ -1,5 +1,5 @@
/*
Copyright (c) 2003-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2003-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without
@@ -51,6 +51,8 @@ typedef unsigned char uint8_t;
#else /* VS2008 or older (or VS2010 in C mode) */
#define NO_DECLTYPE
#endif
#elif defined(__MCST__) /* Elbrus C Compiler */
#define DECLTYPE(x) (__typeof(x))
#elif defined(__BORLANDC__) || defined(__ICCARM__) || defined(__LCC__) || defined(__WATCOMC__)
#define NO_DECLTYPE
#else /* GNU, Sun and other compilers */
@@ -157,7 +159,7 @@ do {
if (head) { \
unsigned _hf_bkt; \
HASH_TO_BKT(hashval, (head)->hh.tbl->num_buckets, _hf_bkt); \
if (HASH_BLOOM_TEST((head)->hh.tbl, hashval) != 0) { \
if (HASH_BLOOM_TEST((head)->hh.tbl, hashval)) { \
HASH_FIND_IN_BKT((head)->hh.tbl, hh, (head)->hh.tbl->buckets[ _hf_bkt ], keyptr, keylen, hashval, out); \
} \
} \
@@ -194,7 +196,7 @@ do {
} while (0)
#define HASH_BLOOM_BITSET(bv,idx) (bv[(idx)/8U] |= (1U << ((idx)%8U)))
#define HASH_BLOOM_BITTEST(bv,idx) (bv[(idx)/8U] & (1U << ((idx)%8U)))
#define HASH_BLOOM_BITTEST(bv,idx) ((bv[(idx)/8U] & (1U << ((idx)%8U))) != 0)
#define HASH_BLOOM_ADD(tbl,hashv) \
HASH_BLOOM_BITSET((tbl)->bloom_bv, ((hashv) & (uint32_t)((1UL << (tbl)->bloom_nbits) - 1U)))
@@ -206,7 +208,7 @@ do {
#define HASH_BLOOM_MAKE(tbl,oomed)
#define HASH_BLOOM_FREE(tbl)
#define HASH_BLOOM_ADD(tbl,hashv)
#define HASH_BLOOM_TEST(tbl,hashv) (1)
#define HASH_BLOOM_TEST(tbl,hashv) 1
#define HASH_BLOOM_BYTELEN 0U
#endif
@@ -450,7 +452,7 @@ do {
#define HASH_DELETE_HH(hh,head,delptrhh) \
do { \
struct UT_hash_handle *_hd_hh_del = (delptrhh); \
const struct UT_hash_handle *_hd_hh_del = (delptrhh); \
if ((_hd_hh_del->prev == NULL) && (_hd_hh_del->next == NULL)) { \
HASH_BLOOM_FREE((head)->hh.tbl); \
uthash_free((head)->hh.tbl->buckets, \
@@ -593,7 +595,9 @@ do {
/* SAX/FNV/OAT/JEN hash functions are macro variants of those listed at
* http://eternallyconfuzzled.com/tuts/algorithms/jsw_tut_hashing.aspx */
* http://eternallyconfuzzled.com/tuts/algorithms/jsw_tut_hashing.aspx
* (archive link: https://archive.is/Ivcan )
*/
#define HASH_SAX(key,keylen,hashv) \
do { \
unsigned _sx_i; \


@@ -1,5 +1,5 @@
/*
Copyright (c) 2007-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2007-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without
@@ -70,6 +70,8 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#else /* VS2008 or older (or VS2010 in C mode) */
#define NO_DECLTYPE
#endif
#elif defined(__MCST__) /* Elbrus C Compiler */
#define LDECLTYPE(x) __typeof(x)
#elif defined(__BORLANDC__) || defined(__ICCARM__) || defined(__LCC__) || defined(__WATCOMC__)
#define NO_DECLTYPE
#else /* GNU, Sun and other compilers */
@@ -709,7 +711,8 @@ do {
assert((del)->prev != NULL); \
if ((del)->prev == (del)) { \
(head)=NULL; \
} else if ((del)==(head)) { \
} else if ((del) == (head)) { \
assert((del)->next != NULL); \
(del)->next->prev = (del)->prev; \
(head) = (del)->next; \
} else { \


@@ -1,5 +1,5 @@
/*
Copyright (c) 2015-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2015-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -1,5 +1,5 @@
/*
Copyright (c) 2018-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2018-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -1,5 +1,5 @@
/*
Copyright (c) 2008-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2008-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -12,7 +12,7 @@ PROGS = test1 test2 test3 test4 test5 test6 test7 test8 test9 \
test66 test67 test68 test69 test70 test71 test72 test73 \
test74 test75 test76 test77 test78 test79 test80 test81 \
test82 test83 test84 test85 test86 test87 test88 test89 \
test90 test91 test92 test93 test94 test95 test96
test90 test91 test92 test93 test94 test95 test96 test97
CFLAGS += -I$(HASHDIR)
#CFLAGS += -DHASH_BLOOM=16
#CFLAGS += -O2


@@ -98,6 +98,7 @@ test93: alt_fatal
test94: utlist with fields named other than 'next' and 'prev'
test95: utstack
test96: HASH_FUNCTION + HASH_KEYCMP
test97: deleting a const-qualified node from a hash
Other Make targets
================================================================================


@@ -1,5 +1,5 @@
/*
Copyright (c) 2005-2021, Troy D. Hanson http://troydhanson.github.io/uthash/
Copyright (c) 2005-2022, Troy D. Hanson https://troydhanson.github.io/uthash/
All rights reserved.
Redistribution and use in source and binary forms, with or without


@@ -1 +0,0 @@
found


@@ -1,5 +1,5 @@
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>
#include <stddef.h>
#include "uthash.h"
typedef struct {
@@ -8,25 +8,46 @@ typedef struct {
UT_hash_handle hh;
} el_t;
el_t *findit(el_t *hash, void *keytofind)
{
el_t *found;
HASH_FIND_PTR(hash, &keytofind, found);
return found;
}
int main()
{
el_t *d;
el_t *hash = NULL;
char *someaddr = NULL;
el_t *e = (el_t*)malloc(sizeof(el_t));
if (!e) {
return -1;
}
e->key = (void*)someaddr;
e->i = 1;
HASH_ADD_PTR(hash,key,e);
HASH_FIND_PTR(hash, &someaddr, d);
if (d != NULL) {
printf("found\n");
}
el_t e1;
el_t e2;
e1.key = NULL;
e1.i = 1;
e2.key = &e2;
e2.i = 2;
assert(findit(hash, NULL) == NULL);
assert(findit(hash, &e1) == NULL);
assert(findit(hash, &e2) == NULL);
HASH_ADD_PTR(hash, key, &e1);
assert(findit(hash, NULL) == &e1);
assert(findit(hash, &e1) == NULL);
assert(findit(hash, &e2) == NULL);
HASH_ADD_PTR(hash, key, &e2);
assert(findit(hash, NULL) == &e1);
assert(findit(hash, &e1) == NULL);
assert(findit(hash, &e2) == &e2);
HASH_DEL(hash, &e1);
assert(findit(hash, NULL) == NULL);
assert(findit(hash, &e1) == NULL);
assert(findit(hash, &e2) == &e2);
HASH_CLEAR(hh, hash);
assert(hash == NULL);
/* release memory */
HASH_DEL(hash,e);
free(e);
return 0;
}


@@ -3,7 +3,7 @@
#include "uthash.h"
// this is an example of how to do a LRU cache in C using uthash
// http://troydhanson.github.io/uthash/
// https://troydhanson.github.io/uthash/
// by Jehiah Czebotar 2011 - jehiah@gmail.com
// this code is in the public domain http://unlicense.org/

dependencies/uthash/tests/test97.ans (vendored, new file)

dependencies/uthash/tests/test97.c (vendored, new file)

@@ -0,0 +1,57 @@
#include <assert.h>
#include <stdlib.h>
#include <string.h>
#include "uthash.h"
struct item {
int payload;
UT_hash_handle hh;
};
void delete_without_modifying(struct item *head, const struct item *p)
{
struct item old;
memcpy(&old, p, sizeof(struct item)); // also copy the padding bits
assert(memcmp(&old, p, sizeof(struct item)) == 0);
assert(p->hh.tbl == head->hh.tbl); // class invariant
HASH_DEL(head, p);
assert(memcmp(&old, p, sizeof(struct item)) == 0); // unmodified by HASH_DEL
}
int main()
{
struct item *items = NULL;
struct item *found = NULL;
int fortytwo = 42;
int i;
for (i=0; i < 100; i++) {
struct item *p = (struct item *)malloc(sizeof *p);
p->payload = i;
HASH_ADD_INT(items, payload, p);
}
assert(HASH_COUNT(items) == 100);
// Delete item "42" from the hash, wherever it is.
HASH_FIND_INT(items, &fortytwo, found);
assert(found != NULL);
assert(found->payload == 42);
delete_without_modifying(items, found);
assert(HASH_COUNT(items) == 99);
HASH_FIND_INT(items, &fortytwo, found);
assert(found == NULL);
// Delete the very first item in the hash.
assert(items != NULL);
i = items->payload;
delete_without_modifying(items, items);
assert(HASH_COUNT(items) == 98);
HASH_FIND_INT(items, &i, found);
assert(found == NULL);
// leak the items, we don't care
return 0;
}


@@ -3,65 +3,96 @@
Some ready-2-use/ready-2-extend examples/utils.
All examples are prefixed with the language they are written in.
## c-analysed
A feature extractor useful for ML/DL use cases.
It generates CSV files from flow "analyse" events.
Used also by `tests/run_tests.sh` if available.
## c-captured
A capture daemon suitable for low-resource devices.
It saves flows that were guessed/undetected/risky/midstream to a PCAP file for manual analysis.
Basically a combination of `py-flow-undetected-to-pcap` and `py-risky-flow-to-pcap`.
Used also by `tests/run_tests.sh` if available.
## c-collectd
A collectd-exec compatible middleware that gathers statistics from nDPId.
Used also by `tests/run_tests.sh` if available.
## c-json-stdout
## c-influxd
Tiny nDPId json dumper. Does not provide any useful functionality besides dumping parsed JSON objects.
An InfluxDB push daemon. It aggregates various statistics gathered from nDPId.
The results are sent to a specified InfluxDB endpoint.
![](ndpid_grafana_example.png)
## c-notifyd
A notification daemon that sends information about suspicious flow events to DBUS.
## c-simple
Very tiny integration example.
Integration example that verifies flow timeouts on SIGUSR1.
## ~~go-dashboard~~ (DISCONTINUED!)
## cxx-graph
A discontinued tty UI nDPId dashboard.
Removed with commit 29c72fb30bb7d5614c0a8ebb73bee2ac7eca6608.
A standalone GLFW/OpenGL application that draws statistical data using ImWeb/ImPlot/ImGui.
## js-rt-analyzer
[nDPId-rt-analyzer](https://gitlab.com/verzulli/ndpid-rt-analyzer.git)
## js-rt-analyzer-frontend
[nDPId-rt-analyzer-frontend](https://gitlab.com/verzulli/ndpid-rt-analyzer-frontend.git)
## py-flow-info
Prints prettified information about flow events.
Console friendly, colorful, prettified event printer.
Required by `tests/run_tests.sh`
## py-flow-dashboard
## py-machine-learning
A realtime web based graph using Plotly/Dash.
Probably the most informative example.
Contains:
1. Classification via Random Forests and SciLearn
2. Anomaly Detection via Autoencoder and Keras (Work-In-Progress!)
Use sklearn together with CSVs created with **c-analysed** to train and predict DPI detections.
Try it with: `./examples/py-machine-learning/sklearn_random_forest.py --csv ./ndpi-analysed.csv --proto-class tls.youtube --proto-class tls.github --proto-class tls.spotify --proto-class tls.facebook --proto-class tls.instagram --proto-class tls.doh_dot --proto-class quic --proto-class icmp`
This way you should get 9 different classification classes.
You may notice that some classes e.g. TLS protocol classifications have a higher false-negative/false-positive rate.
Unfortunately, I cannot provide any datasets due to privacy concerns.
But you may use a [pre-trained model](https://drive.google.com/file/d/1KEwbP-Gx7KJr54wNoa63I56VI4USCAPL/view?usp=sharing) with `--load-model`.
## py-flow-multiprocess
Simple Python multiprocessing example spawning two worker processes, one connecting to nDPIsrvd and one printing flow IDs to STDOUT.
## py-flow-undetected-to-pcap
Captures and saves undetected flows to a PCAP file.
## py-json-stdout
Dump received and parsed JSON strings.
## py-risky-flow-to-pcap
Captures and saves risky flows to a PCAP file.
Dump received and parsed JSON objects.
## py-schema-validation
Validate nDPId JSON strings against pre-defined JSON schemas.
Validate nDPId JSON messages against pre-defined JSON schemas.
See `schema/`.
Required by `tests/run_tests.sh`
## py-semantic-validation
Validate nDPId JSON strings against internal event semantics.
Validate nDPId JSON messages against internal event semantics.
Required by `tests/run_tests.sh`
## py-ja3-checker
## rs-simple
Captures JA3 hashes from nDPIsrvd and checks them against known hashes from [ja3er.com](https://ja3er.com).
A straightforward Rust deserialization/parsing example.
## yaml-filebeat
An example filebeat configuration to parse and send nDPId JSON
messages to Elasticsearch, allowing long-term storage and data visualization with Kibana
and various other tools that interact with Elasticsearch (no Logstash required).

File diff suppressed because it is too large.

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,14 @@
HowTo use this
==============
This HowTo assumes that the examples were successfully compiled and installed within the prefix `/usr` on your target machine.
1. Make sure nDPId and Collectd are running.
2. Edit `collectd.conf`, usually found in `/etc`.
3. Add the lines in `plugin_nDPIsrvd.conf` to your `collectd.conf`.
You may need to adapt those lines depending on the command line arguments you supplied to `nDPId`.
4. Reload your Collectd instance.
5. Optional: Install an HTTP server of your choice.
Place the files in `/usr/share/nDPId/nDPIsrvd-collectd/www` somewhere in your www root.
6. Optional: Add `rrdgraph.sh` as cron job, e.g. `0 * * * * /usr/share/nDPId/nDPIsrvd-collectd/rrdgraph.sh [path-to-the-collectd-rrd-directory] [path-to-your-dpi-wwwroot]`.
This will run `rrdgraph.sh` once per hour. You can adjust the schedule to fit your needs.

LoadPlugin exec
<Plugin exec>
Exec "ndpi" "/usr/bin/nDPIsrvd-collectd"
# Exec "ndpi" "/usr/bin/nDPIsrvd-collectd" "-s" "/tmp/ndpid-distributor.sock"
# Exec "ndpi" "/usr/bin/nDPIsrvd-collectd" "-s" "127.0.0.1:7000"
</Plugin>
# Uncomment for testing
#LoadPlugin write_log
#LoadPlugin rrdtool
#<Plugin rrdtool>
# DataDir "nDPIsrvd-collectd"
#</Plugin>

# Add those types to collectd types.db
# e.g. `cat plugin_nDPIsrvd_types.db >> /usr/share/collectd/types.db`
# flow event counters
flow_new_count value:GAUGE:0:U
flow_end_count value:GAUGE:0:U
flow_idle_count value:GAUGE:0:U
flow_guessed_count value:GAUGE:0:U
flow_detected_count value:GAUGE:0:U
flow_detection_update_count value:GAUGE:0:U
flow_not_detected_count value:GAUGE:0:U
# flow additional counters
flow_total_bytes value:GAUGE:0:U
flow_risky_count value:GAUGE:0:U
# flow breed counters
flow_breed_safe_count value:GAUGE:0:U
flow_breed_acceptable_count value:GAUGE:0:U
flow_breed_fun_count value:GAUGE:0:U
flow_breed_unsafe_count value:GAUGE:0:U
flow_breed_potentially_dangerous_count value:GAUGE:0:U
flow_breed_dangerous_count value:GAUGE:0:U
flow_breed_unrated_count value:GAUGE:0:U
flow_breed_unknown_count value:GAUGE:0:U
# flow category counters
flow_category_media_count value:GAUGE:0:U
flow_category_vpn_count value:GAUGE:0:U
flow_category_email_count value:GAUGE:0:U
flow_category_data_transfer_count value:GAUGE:0:U
flow_category_web_count value:GAUGE:0:U
flow_category_social_network_count value:GAUGE:0:U
flow_category_download_count value:GAUGE:0:U
flow_category_game_count value:GAUGE:0:U
flow_category_chat_count value:GAUGE:0:U
flow_category_voip_count value:GAUGE:0:U
flow_category_database_count value:GAUGE:0:U
flow_category_remote_access_count value:GAUGE:0:U
flow_category_cloud_count value:GAUGE:0:U
flow_category_network_count value:GAUGE:0:U
flow_category_collaborative_count value:GAUGE:0:U
flow_category_rpc_count value:GAUGE:0:U
flow_category_streaming_count value:GAUGE:0:U
flow_category_system_count value:GAUGE:0:U
flow_category_software_update_count value:GAUGE:0:U
flow_category_music_count value:GAUGE:0:U
flow_category_video_count value:GAUGE:0:U
flow_category_shopping_count value:GAUGE:0:U
flow_category_productivity_count value:GAUGE:0:U
flow_category_file_sharing_count value:GAUGE:0:U
flow_category_mining_count value:GAUGE:0:U
flow_category_malware_count value:GAUGE:0:U
flow_category_advertisment_count value:GAUGE:0:U
flow_category_other_count value:GAUGE:0:U
flow_category_unknown_count value:GAUGE:0:U
# flow l3 / l4 counters
flow_l3_ip4_count value:GAUGE:0:U
flow_l3_ip6_count value:GAUGE:0:U
flow_l3_other_count value:GAUGE:0:U
flow_l4_icmp_count value:GAUGE:0:U
flow_l4_tcp_count value:GAUGE:0:U
flow_l4_udp_count value:GAUGE:0:U
flow_l4_other_count value:GAUGE:0:U

examples/c-collectd/rrdgraph.sh (executable file)
#!/usr/bin/env sh
RRDDIR="${1}"
OUTDIR="${2}"
RRDARGS="--width=800 --height=400"
REQUIRED_RRDCNT=130
if [ -z "${RRDDIR}" ]; then
printf '%s: Missing RRD directory which contains nDPIsrvd/Collectd files.\n' "${0}"
exit 1
fi
if [ -z "${OUTDIR}" ]; then
printf '%s: Missing Output directory which contains HTML files.\n' "${0}"
exit 1
fi
if [ "$(ls -1 "${RRDDIR}"/gauge-flow_*.rrd | wc -l)" -ne "${REQUIRED_RRDCNT}" ]; then
printf '%s: Missing some *.rrd files. Expected: %s, Got: %s\n' "${0}" "${REQUIRED_RRDCNT}" "$(ls -1 "${RRDDIR}"/gauge-flow_*.rrd | wc -l)"
exit 1
fi
if [ ! -r "${OUTDIR}/index.html" -o ! -r "${OUTDIR}/flows.html" -o ! -r "${OUTDIR}/other.html" -o ! -r "${OUTDIR}/detections.html" -o ! -r "${OUTDIR}/categories.html" ]; then
printf '%s: Missing some *.html files.\n' "${0}"
exit 1
fi
TIME_PAST_HOUR="--start=-3600 --end=-0"
TIME_PAST_12HOURS="--start=-43200 --end=-0"
TIME_PAST_DAY="--start=-86400 --end=-0"
TIME_PAST_WEEK="--start=-604800 --end=-0"
TIME_PAST_MONTH="--start=-2419200 --end=-0"
TIME_PAST_3MONTHS="--start=-8035200 --end=-0"
TIME_PAST_YEAR="--start=-31536000 --end=-0"
rrdtool_graph_colorize_missing_data() {
printf 'CDEF:offline=%s,UN,INF,* AREA:offline#B3B3B311:' "${1}"
}
rrdtool_graph_print_cur_min_max_avg() {
printf 'GPRINT:%s:LAST:Current\:%%8.2lf ' "${1}"
printf 'GPRINT:%s:MIN:Minimum\:%%8.2lf ' "${1}"
printf 'GPRINT:%s:MAX:Maximum\:%%8.2lf ' "${1}"
printf 'GPRINT:%s:AVERAGE:Average\:%%8.2lf\\n' "${1}"
}
rrdtool_graph() {
TITLE="${1}"
shift
YAXIS_NAME="${1}"
shift
OUTPNG="${1}"
shift
rrdtool graph ${RRDARGS} -t "${TITLE} (past hour)" -v ${YAXIS_NAME} -Y ${TIME_PAST_HOUR} "${OUTPNG}_past_hour.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past 12 hours)" -v ${YAXIS_NAME} -Y ${TIME_PAST_12HOURS} "${OUTPNG}_past_12hours.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past day)" -v ${YAXIS_NAME} -Y ${TIME_PAST_DAY} "${OUTPNG}_past_day.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past week)" -v ${YAXIS_NAME} -Y ${TIME_PAST_WEEK} "${OUTPNG}_past_week.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past month)" -v ${YAXIS_NAME} -Y ${TIME_PAST_MONTH} "${OUTPNG}_past_month.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past 3 months)" -v ${YAXIS_NAME} -Y ${TIME_PAST_3MONTHS} "${OUTPNG}_past_3months.png" ${*}
rrdtool graph ${RRDARGS} -t "${TITLE} (past year)" -v ${YAXIS_NAME} -Y ${TIME_PAST_YEAR} "${OUTPNG}_past_year.png" ${*}
}
rrdtool_graph Flows Amount "${OUTDIR}/flows" \
DEF:flows_active=${RRDDIR}/gauge-flow_active_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data flows_active) \
AREA:flows_active#54EC48::STACK \
LINE2:flows_active#24BC14:"Active" \
$(rrdtool_graph_print_cur_min_max_avg flows_active)
rrdtool_graph Detections Amount "${OUTDIR}/detections" \
DEF:flows_detected=${RRDDIR}/gauge-flow_detected_count.rrd:value:AVERAGE \
DEF:flows_guessed=${RRDDIR}/gauge-flow_guessed_count.rrd:value:AVERAGE \
DEF:flows_not_detected=${RRDDIR}/gauge-flow_not_detected_count.rrd:value:AVERAGE \
DEF:flows_detection_update=${RRDDIR}/counter-flow_detection_update_count.rrd:value:AVERAGE \
DEF:flows_risky=${RRDDIR}/counter-flow_risky_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data flows_detected) \
AREA:flows_detected#00bfff::STACK \
AREA:flows_detection_update#a1b8c4::STACK \
AREA:flows_guessed#ffff4d::STACK \
AREA:flows_not_detected#ffa64d::STACK \
AREA:flows_risky#ff4000::STACK \
LINE2:flows_detected#0000ff:"Detected........" \
$(rrdtool_graph_print_cur_min_max_avg flows_detected) \
LINE2:flows_guessed#cccc00:"Guessed........." \
$(rrdtool_graph_print_cur_min_max_avg flows_guessed) \
LINE2:flows_not_detected#ff8000:"Not-Detected...." \
$(rrdtool_graph_print_cur_min_max_avg flows_not_detected) \
LINE2:flows_detection_update#4f6e7d:"Detection-Update" \
$(rrdtool_graph_print_cur_min_max_avg flows_detection_update) \
LINE2:flows_risky#b32d00:"Risky..........." \
$(rrdtool_graph_print_cur_min_max_avg flows_risky)
rrdtool_graph "Traffic (IN/OUT)" Bytes "${OUTDIR}/traffic" \
DEF:total_src_bytes=${RRDDIR}/counter-flow_src_total_bytes.rrd:value:AVERAGE \
DEF:total_dst_bytes=${RRDDIR}/counter-flow_dst_total_bytes.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data total_src_bytes) \
AREA:total_src_bytes#00cc99:"Total-Bytes-Source2Dest":STACK \
$(rrdtool_graph_print_cur_min_max_avg total_src_bytes) \
STACK:total_dst_bytes#669999:"Total-Bytes-Dest2Source" \
$(rrdtool_graph_print_cur_min_max_avg total_dst_bytes)
rrdtool_graph Layer3-Flows Amount "${OUTDIR}/layer3" \
DEF:layer3_ip4=${RRDDIR}/gauge-flow_l3_ip4_count.rrd:value:AVERAGE \
DEF:layer3_ip6=${RRDDIR}/gauge-flow_l3_ip6_count.rrd:value:AVERAGE \
DEF:layer3_other=${RRDDIR}/gauge-flow_l3_other_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data layer3_ip4) \
AREA:layer3_ip4#73d97d::STACK \
AREA:layer3_ip6#66b3ff::STACK \
AREA:layer3_other#bea1c4::STACK \
LINE2:layer3_ip4#21772a:"IPv4." \
$(rrdtool_graph_print_cur_min_max_avg layer3_ip4) \
LINE2:layer3_ip6#0066cc:"IPv6." \
$(rrdtool_graph_print_cur_min_max_avg layer3_ip6) \
LINE2:layer3_other#92629d:"Other" \
$(rrdtool_graph_print_cur_min_max_avg layer3_other)
rrdtool_graph Layer4-Flows Amount "${OUTDIR}/layer4" \
DEF:layer4_tcp=${RRDDIR}/gauge-flow_l4_tcp_count.rrd:value:AVERAGE \
DEF:layer4_udp=${RRDDIR}/gauge-flow_l4_udp_count.rrd:value:AVERAGE \
DEF:layer4_icmp=${RRDDIR}/gauge-flow_l4_icmp_count.rrd:value:AVERAGE \
DEF:layer4_other=${RRDDIR}/gauge-flow_l4_other_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data layer4_tcp) \
AREA:layer4_tcp#73d97d::STACK \
AREA:layer4_udp#66b3ff::STACK \
AREA:layer4_icmp#ee5d9a::STACK \
AREA:layer4_other#bea1c4::STACK \
LINE2:layer4_tcp#21772a:"TCP.." \
$(rrdtool_graph_print_cur_min_max_avg layer4_tcp) \
LINE2:layer4_udp#0066cc:"UDP.." \
$(rrdtool_graph_print_cur_min_max_avg layer4_udp) \
LINE2:layer4_icmp#d01663:"ICMP." \
$(rrdtool_graph_print_cur_min_max_avg layer4_icmp) \
LINE2:layer4_other#83588d:"Other" \
$(rrdtool_graph_print_cur_min_max_avg layer4_other)
rrdtool_graph Confidence Amount "${OUTDIR}/confidence" \
DEF:conf_ip=${RRDDIR}/gauge-flow_confidence_by_ip.rrd:value:AVERAGE \
DEF:conf_port=${RRDDIR}/gauge-flow_confidence_by_port.rrd:value:AVERAGE \
DEF:conf_aggr=${RRDDIR}/gauge-flow_confidence_dpi_aggressive.rrd:value:AVERAGE \
DEF:conf_cache=${RRDDIR}/gauge-flow_confidence_dpi_cache.rrd:value:AVERAGE \
DEF:conf_pcache=${RRDDIR}/gauge-flow_confidence_dpi_partial_cache.rrd:value:AVERAGE \
DEF:conf_part=${RRDDIR}/gauge-flow_confidence_dpi_partial.rrd:value:AVERAGE \
DEF:conf_dpi=${RRDDIR}/gauge-flow_confidence_dpi.rrd:value:AVERAGE \
DEF:conf_nbpf=${RRDDIR}/gauge-flow_confidence_nbpf.rrd:value:AVERAGE \
DEF:conf_ukn=${RRDDIR}/gauge-flow_confidence_unknown.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data conf_ip) \
AREA:conf_ip#4dff4d::STACK \
AREA:conf_port#c2ff33::STACK \
AREA:conf_aggr#ffe433::STACK \
AREA:conf_cache#ffb133::STACK \
AREA:conf_pcache#ff5f33::STACK \
AREA:conf_part#e74b5b::STACK \
AREA:conf_dpi#a5aca0::STACK \
AREA:conf_nbpf#d7c1cc::STACK \
AREA:conf_ukn#ddccbb::STACK \
LINE2:conf_ip#00e600:"By-IP................" \
$(rrdtool_graph_print_cur_min_max_avg conf_ip) \
LINE2:conf_port#8fce00:"By-Port.............." \
$(rrdtool_graph_print_cur_min_max_avg conf_port) \
LINE2:conf_aggr#e6c700:"DPI-Aggressive......." \
$(rrdtool_graph_print_cur_min_max_avg conf_aggr) \
LINE2:conf_cache#e68e00:"DPI-Cache............" \
$(rrdtool_graph_print_cur_min_max_avg conf_cache) \
LINE2:conf_pcache#e63200:"DPI-Partial-Cache...." \
$(rrdtool_graph_print_cur_min_max_avg conf_pcache) \
LINE2:conf_part#c61b2b:"DPI-Partial.........." \
$(rrdtool_graph_print_cur_min_max_avg conf_part) \
LINE2:conf_dpi#7e8877:"DPI.................." \
$(rrdtool_graph_print_cur_min_max_avg conf_dpi) \
LINE2:conf_nbpf#ae849a:"nBPF................." \
$(rrdtool_graph_print_cur_min_max_avg conf_nbpf) \
LINE2:conf_ukn#aa9988:"Unknown.............." \
$(rrdtool_graph_print_cur_min_max_avg conf_ukn)
rrdtool_graph Breeds Amount "${OUTDIR}/breed" \
DEF:breed_safe=${RRDDIR}/gauge-flow_breed_safe_count.rrd:value:AVERAGE \
DEF:breed_acceptable=${RRDDIR}/gauge-flow_breed_acceptable_count.rrd:value:AVERAGE \
DEF:breed_fun=${RRDDIR}/gauge-flow_breed_fun_count.rrd:value:AVERAGE \
DEF:breed_unsafe=${RRDDIR}/gauge-flow_breed_unsafe_count.rrd:value:AVERAGE \
DEF:breed_potentially_dangerous=${RRDDIR}/gauge-flow_breed_potentially_dangerous_count.rrd:value:AVERAGE \
DEF:breed_dangerous=${RRDDIR}/gauge-flow_breed_dangerous_count.rrd:value:AVERAGE \
DEF:breed_unrated=${RRDDIR}/gauge-flow_breed_unrated_count.rrd:value:AVERAGE \
DEF:breed_unknown=${RRDDIR}/gauge-flow_breed_unknown_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data breed_safe) \
AREA:breed_safe#4dff4d::STACK \
AREA:breed_acceptable#c2ff33::STACK \
AREA:breed_fun#ffe433::STACK \
AREA:breed_unsafe#ffb133::STACK \
AREA:breed_potentially_dangerous#ff5f33::STACK \
AREA:breed_dangerous#e74b5b::STACK \
AREA:breed_unrated#a5aca0::STACK \
AREA:breed_unknown#d7c1cc::STACK \
LINE2:breed_safe#00e600:"Safe................." \
$(rrdtool_graph_print_cur_min_max_avg breed_safe) \
LINE2:breed_acceptable#8fce00:"Acceptable..........." \
$(rrdtool_graph_print_cur_min_max_avg breed_acceptable) \
LINE2:breed_fun#e6c700:"Fun.................." \
$(rrdtool_graph_print_cur_min_max_avg breed_fun) \
LINE2:breed_unsafe#e68e00:"Unsafe..............." \
$(rrdtool_graph_print_cur_min_max_avg breed_unsafe) \
LINE2:breed_potentially_dangerous#e63200:"Potentially-Dangerous" \
$(rrdtool_graph_print_cur_min_max_avg breed_potentially_dangerous) \
LINE2:breed_dangerous#c61b2b:"Dangerous............" \
$(rrdtool_graph_print_cur_min_max_avg breed_dangerous) \
LINE2:breed_unrated#7e8877:"Unrated.............." \
$(rrdtool_graph_print_cur_min_max_avg breed_unrated) \
LINE2:breed_unknown#ae849a:"Unknown.............." \
$(rrdtool_graph_print_cur_min_max_avg breed_unknown)
rrdtool_graph Categories 'Amount(SUM)' "${OUTDIR}/categories" \
DEF:cat_adlt=${RRDDIR}/gauge-flow_category_adult_content_count.rrd:value:AVERAGE \
DEF:cat_ads=${RRDDIR}/gauge-flow_category_advertisment_count.rrd:value:AVERAGE \
DEF:cat_chat=${RRDDIR}/gauge-flow_category_chat_count.rrd:value:AVERAGE \
DEF:cat_cloud=${RRDDIR}/gauge-flow_category_cloud_count.rrd:value:AVERAGE \
DEF:cat_collab=${RRDDIR}/gauge-flow_category_collaborative_count.rrd:value:AVERAGE \
DEF:cat_conn=${RRDDIR}/gauge-flow_category_conn_check_count.rrd:value:AVERAGE \
DEF:cat_cybr=${RRDDIR}/gauge-flow_category_cybersecurity_count.rrd:value:AVERAGE \
DEF:cat_xfer=${RRDDIR}/gauge-flow_category_data_transfer_count.rrd:value:AVERAGE \
DEF:cat_db=${RRDDIR}/gauge-flow_category_database_count.rrd:value:AVERAGE \
DEF:cat_dl=${RRDDIR}/gauge-flow_category_download_count.rrd:value:AVERAGE \
DEF:cat_mail=${RRDDIR}/gauge-flow_category_email_count.rrd:value:AVERAGE \
DEF:cat_fs=${RRDDIR}/gauge-flow_category_file_sharing_count.rrd:value:AVERAGE \
DEF:cat_game=${RRDDIR}/gauge-flow_category_game_count.rrd:value:AVERAGE \
DEF:cat_gamb=${RRDDIR}/gauge-flow_category_gambling_count.rrd:value:AVERAGE \
DEF:cat_iot=${RRDDIR}/gauge-flow_category_iot_scada_count.rrd:value:AVERAGE \
DEF:cat_mal=${RRDDIR}/gauge-flow_category_malware_count.rrd:value:AVERAGE \
DEF:cat_med=${RRDDIR}/gauge-flow_category_media_count.rrd:value:AVERAGE \
DEF:cat_min=${RRDDIR}/gauge-flow_category_mining_count.rrd:value:AVERAGE \
DEF:cat_mus=${RRDDIR}/gauge-flow_category_music_count.rrd:value:AVERAGE \
DEF:cat_net=${RRDDIR}/gauge-flow_category_network_count.rrd:value:AVERAGE \
DEF:cat_prod=${RRDDIR}/gauge-flow_category_productivity_count.rrd:value:AVERAGE \
DEF:cat_rem=${RRDDIR}/gauge-flow_category_remote_access_count.rrd:value:AVERAGE \
DEF:cat_rpc=${RRDDIR}/gauge-flow_category_rpc_count.rrd:value:AVERAGE \
DEF:cat_shop=${RRDDIR}/gauge-flow_category_shopping_count.rrd:value:AVERAGE \
DEF:cat_soc=${RRDDIR}/gauge-flow_category_social_network_count.rrd:value:AVERAGE \
DEF:cat_soft=${RRDDIR}/gauge-flow_category_software_update_count.rrd:value:AVERAGE \
DEF:cat_str=${RRDDIR}/gauge-flow_category_streaming_count.rrd:value:AVERAGE \
DEF:cat_sys=${RRDDIR}/gauge-flow_category_system_count.rrd:value:AVERAGE \
DEF:cat_ukn=${RRDDIR}/gauge-flow_category_unknown_count.rrd:value:AVERAGE \
DEF:cat_uns=${RRDDIR}/gauge-flow_category_unspecified_count.rrd:value:AVERAGE \
DEF:cat_vid=${RRDDIR}/gauge-flow_category_video_count.rrd:value:AVERAGE \
DEF:cat_vrt=${RRDDIR}/gauge-flow_category_virt_assistant_count.rrd:value:AVERAGE \
DEF:cat_voip=${RRDDIR}/gauge-flow_category_voip_count.rrd:value:AVERAGE \
DEF:cat_vpn=${RRDDIR}/gauge-flow_category_vpn_count.rrd:value:AVERAGE \
DEF:cat_web=${RRDDIR}/gauge-flow_category_web_count.rrd:value:AVERAGE \
DEF:cat_banned=${RRDDIR}/gauge-flow_category_banned_site_count.rrd:value:AVERAGE \
DEF:cat_unavail=${RRDDIR}/gauge-flow_category_site_unavail_count.rrd:value:AVERAGE \
DEF:cat_allowed=${RRDDIR}/gauge-flow_category_allowed_site_count.rrd:value:AVERAGE \
DEF:cat_antimal=${RRDDIR}/gauge-flow_category_antimalware_count.rrd:value:AVERAGE \
DEF:cat_crypto=${RRDDIR}/gauge-flow_category_crypto_currency_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data cat_adlt) \
AREA:cat_adlt#f0c032:"Adult.................." \
$(rrdtool_graph_print_cur_min_max_avg cat_adlt) \
STACK:cat_ads#f1c232:"Advertisment..........." \
$(rrdtool_graph_print_cur_min_max_avg cat_ads) \
STACK:cat_chat#6fa8dc:"Chat..................." \
$(rrdtool_graph_print_cur_min_max_avg cat_chat) \
STACK:cat_cloud#2986cc:"Cloud.................." \
$(rrdtool_graph_print_cur_min_max_avg cat_cloud) \
STACK:cat_collab#3212aa:"Collaborative.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_collab) \
STACK:cat_conn#22aa11:"Connection-Check......." \
$(rrdtool_graph_print_cur_min_max_avg cat_conn) \
STACK:cat_cybr#00ff00:"Cybersecurity.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_cybr) \
STACK:cat_xfer#16537e:"Data-Transfer.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_xfer) \
STACK:cat_db#cc0000:"Database..............." \
$(rrdtool_graph_print_cur_min_max_avg cat_db) \
STACK:cat_dl#6a329f:"Download..............." \
$(rrdtool_graph_print_cur_min_max_avg cat_dl) \
STACK:cat_mail#3600cc:"Mail..................." \
$(rrdtool_graph_print_cur_min_max_avg cat_mail) \
STACK:cat_fs#c90076:"File-Sharing..........." \
$(rrdtool_graph_print_cur_min_max_avg cat_fs) \
STACK:cat_game#00ff26:"Game..................." \
$(rrdtool_graph_print_cur_min_max_avg cat_game) \
STACK:cat_gamb#aa0026:"Gambling..............." \
$(rrdtool_graph_print_cur_min_max_avg cat_gamb) \
STACK:cat_iot#227867:"IoT-Scada.............." \
$(rrdtool_graph_print_cur_min_max_avg cat_iot) \
STACK:cat_mal#f44336:"Malware................" \
$(rrdtool_graph_print_cur_min_max_avg cat_mal) \
STACK:cat_med#ff8300:"Media.................." \
$(rrdtool_graph_print_cur_min_max_avg cat_med) \
STACK:cat_min#ff0000:"Mining................." \
$(rrdtool_graph_print_cur_min_max_avg cat_min) \
STACK:cat_mus#00fff0:"Music.................." \
$(rrdtool_graph_print_cur_min_max_avg cat_mus) \
STACK:cat_net#ddff00:"Network................" \
$(rrdtool_graph_print_cur_min_max_avg cat_net) \
STACK:cat_prod#29ff00:"Productivity..........." \
$(rrdtool_graph_print_cur_min_max_avg cat_prod) \
STACK:cat_rem#b52c2c:"Remote-Access.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_rem) \
STACK:cat_rpc#e15a5a:"Remote-Procedure-Call.." \
$(rrdtool_graph_print_cur_min_max_avg cat_rpc) \
STACK:cat_shop#0065ff:"Shopping..............." \
$(rrdtool_graph_print_cur_min_max_avg cat_shop) \
STACK:cat_soc#8fce00:"Social-Network........." \
$(rrdtool_graph_print_cur_min_max_avg cat_soc) \
STACK:cat_soft#007a0d:"Software-Update........" \
$(rrdtool_graph_print_cur_min_max_avg cat_soft) \
STACK:cat_str#ff00b8:"Streaming.............." \
$(rrdtool_graph_print_cur_min_max_avg cat_str) \
STACK:cat_sys#f4ff00:"System................." \
$(rrdtool_graph_print_cur_min_max_avg cat_sys) \
STACK:cat_ukn#999999:"Unknown................" \
$(rrdtool_graph_print_cur_min_max_avg cat_ukn) \
STACK:cat_uns#999999:"Unspecified............" \
$(rrdtool_graph_print_cur_min_max_avg cat_uns) \
STACK:cat_vid#518820:"Video.................." \
$(rrdtool_graph_print_cur_min_max_avg cat_vid) \
STACK:cat_vrt#216820:"Virtual-Assistant......" \
$(rrdtool_graph_print_cur_min_max_avg cat_vrt) \
STACK:cat_voip#ffc700:"Voice-Over-IP.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_voip) \
STACK:cat_vpn#378035:"Virtual-Private-Network" \
$(rrdtool_graph_print_cur_min_max_avg cat_vpn) \
STACK:cat_web#00fffb:"Web...................." \
$(rrdtool_graph_print_cur_min_max_avg cat_web) \
STACK:cat_banned#ff1010:"Banned-Sites..........." \
$(rrdtool_graph_print_cur_min_max_avg cat_banned) \
STACK:cat_unavail#ff1010:"Sites-Unavailable......" \
$(rrdtool_graph_print_cur_min_max_avg cat_unavail) \
STACK:cat_allowed#ff1010:"Allowed-Sites.........." \
$(rrdtool_graph_print_cur_min_max_avg cat_allowed) \
STACK:cat_antimal#ff1010:"Antimalware............" \
$(rrdtool_graph_print_cur_min_max_avg cat_antimal) \
STACK:cat_crypto#afaf00:"Crypto-Currency........" \
$(rrdtool_graph_print_cur_min_max_avg cat_crypto)
rrdtool_graph JSON 'Lines' "${OUTDIR}/json_lines" \
DEF:json_lines=${RRDDIR}/counter-json_lines.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data json_lines) \
AREA:json_lines#4dff4d::STACK \
LINE2:json_lines#00e600:"JSON-lines" \
$(rrdtool_graph_print_cur_min_max_avg json_lines)
rrdtool_graph JSON 'Bytes' "${OUTDIR}/json_bytes" \
DEF:json_bytes=${RRDDIR}/counter-json_bytes.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data json_bytes) \
AREA:json_bytes#4dff4d::STACK \
LINE2:json_bytes#00e600:"JSON-bytes" \
$(rrdtool_graph_print_cur_min_max_avg json_bytes)
rrdtool_graph Events 'Amount' "${OUTDIR}/events" \
DEF:init=${RRDDIR}/counter-init_count.rrd:value:AVERAGE \
DEF:reconnect=${RRDDIR}/counter-reconnect_count.rrd:value:AVERAGE \
DEF:shutdown=${RRDDIR}/counter-shutdown_count.rrd:value:AVERAGE \
DEF:status=${RRDDIR}/counter-status_count.rrd:value:AVERAGE \
DEF:packet=${RRDDIR}/counter-packet_count.rrd:value:AVERAGE \
DEF:packet_flow=${RRDDIR}/counter-packet_flow_count.rrd:value:AVERAGE \
DEF:new=${RRDDIR}/counter-flow_new_count.rrd:value:AVERAGE \
DEF:ewd=${RRDDIR}/counter-flow_end_count.rrd:value:AVERAGE \
DEF:idle=${RRDDIR}/counter-flow_idle_count.rrd:value:AVERAGE \
DEF:update=${RRDDIR}/counter-flow_update_count.rrd:value:AVERAGE \
DEF:detection_update=${RRDDIR}/counter-flow_detection_update_count.rrd:value:AVERAGE \
DEF:guessed=${RRDDIR}/counter-flow_guessed_count.rrd:value:AVERAGE \
DEF:detected=${RRDDIR}/counter-flow_detected_count.rrd:value:AVERAGE \
DEF:not_detected=${RRDDIR}/counter-flow_not_detected_count.rrd:value:AVERAGE \
DEF:analyse=${RRDDIR}/counter-flow_analyse_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data init) \
AREA:init#f1c232:"Init..................." \
$(rrdtool_graph_print_cur_min_max_avg init) \
STACK:reconnect#63bad9:"Reconnect.............." \
$(rrdtool_graph_print_cur_min_max_avg reconnect) \
STACK:shutdown#3a6f82:"Shutdown..............." \
$(rrdtool_graph_print_cur_min_max_avg shutdown) \
STACK:status#b7cbd1:"Status................." \
$(rrdtool_graph_print_cur_min_max_avg status) \
STACK:packet#0aff3f:"Packet................." \
$(rrdtool_graph_print_cur_min_max_avg packet) \
STACK:packet_flow#00c72b:"Packet-Flow............" \
$(rrdtool_graph_print_cur_min_max_avg packet_flow) \
STACK:new#c76700:"New...................." \
$(rrdtool_graph_print_cur_min_max_avg new) \
STACK:ewd#c78500:"End...................." \
$(rrdtool_graph_print_cur_min_max_avg ewd) \
STACK:idle#c7a900:"Idle..................." \
$(rrdtool_graph_print_cur_min_max_avg idle) \
STACK:update#c7c400:"Updates................" \
$(rrdtool_graph_print_cur_min_max_avg update) \
STACK:detection_update#a2c700:"Detection-Updates......" \
$(rrdtool_graph_print_cur_min_max_avg detection_update) \
STACK:guessed#7bc700:"Guessed................" \
$(rrdtool_graph_print_cur_min_max_avg guessed) \
STACK:detected#00c781:"Detected..............." \
$(rrdtool_graph_print_cur_min_max_avg detected) \
STACK:not_detected#00bdc7:"Not-Detected..........." \
$(rrdtool_graph_print_cur_min_max_avg not_detected) \
STACK:analyse#1400c7:"Analyse................" \
$(rrdtool_graph_print_cur_min_max_avg analyse)
rrdtool_graph Error-Events 'Amount' "${OUTDIR}/error_events" \
DEF:error_0=${RRDDIR}/counter-error_unknown_datalink.rrd:value:AVERAGE \
DEF:error_1=${RRDDIR}/counter-error_unknown_l3_protocol.rrd:value:AVERAGE \
DEF:error_2=${RRDDIR}/counter-error_unsupported_datalink.rrd:value:AVERAGE \
DEF:error_3=${RRDDIR}/counter-error_packet_too_short.rrd:value:AVERAGE \
DEF:error_4=${RRDDIR}/counter-error_packet_type_unknown.rrd:value:AVERAGE \
DEF:error_5=${RRDDIR}/counter-error_packet_header_invalid.rrd:value:AVERAGE \
DEF:error_6=${RRDDIR}/counter-error_ip4_packet_too_short.rrd:value:AVERAGE \
DEF:error_7=${RRDDIR}/counter-error_ip4_size_smaller_than_header.rrd:value:AVERAGE \
DEF:error_8=${RRDDIR}/counter-error_ip4_l4_payload_detection.rrd:value:AVERAGE \
DEF:error_9=${RRDDIR}/counter-error_ip6_packet_too_short.rrd:value:AVERAGE \
DEF:error_10=${RRDDIR}/counter-error_ip6_size_smaller_than_header.rrd:value:AVERAGE \
DEF:error_11=${RRDDIR}/counter-error_ip6_l4_payload_detection.rrd:value:AVERAGE \
DEF:error_12=${RRDDIR}/counter-error_tcp_packet_too_short.rrd:value:AVERAGE \
DEF:error_13=${RRDDIR}/counter-error_udp_packet_too_short.rrd:value:AVERAGE \
DEF:error_14=${RRDDIR}/counter-error_capture_size_smaller_than_packet.rrd:value:AVERAGE \
DEF:error_15=${RRDDIR}/counter-error_max_flows_to_track.rrd:value:AVERAGE \
DEF:error_16=${RRDDIR}/counter-error_flow_memory_alloc.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data error_0) \
AREA:error_0#ff6a00:"Unknown-datalink-layer-packet............................" \
$(rrdtool_graph_print_cur_min_max_avg error_0) \
STACK:error_1#bf7540:"Unknown-L3-protocol......................................" \
$(rrdtool_graph_print_cur_min_max_avg error_1) \
STACK:error_2#ffd500:"Unsupported-datalink-layer..............................." \
$(rrdtool_graph_print_cur_min_max_avg error_2) \
STACK:error_3#bfaa40:"Packet-too-short........................................." \
$(rrdtool_graph_print_cur_min_max_avg error_3) \
STACK:error_4#bfff00:"Unknown-packet-type......................................" \
$(rrdtool_graph_print_cur_min_max_avg error_4) \
STACK:error_5#9fbf40:"Packet-header-invalid...................................." \
$(rrdtool_graph_print_cur_min_max_avg error_5) \
STACK:error_6#55ff00:"IP4-packet-too-short....................................." \
$(rrdtool_graph_print_cur_min_max_avg error_6) \
STACK:error_7#6abf40:"Packet-smaller-than-IP4-header..........................." \
$(rrdtool_graph_print_cur_min_max_avg error_7) \
STACK:error_8#00ff15:"nDPI-IPv4/L4-payload-detection-failed...................." \
$(rrdtool_graph_print_cur_min_max_avg error_8) \
STACK:error_9#40bf4a:"IP6-packet-too-short....................................." \
$(rrdtool_graph_print_cur_min_max_avg error_9) \
STACK:error_10#00ff80:"Packet-smaller-than-IP6-header..........................." \
$(rrdtool_graph_print_cur_min_max_avg error_10) \
STACK:error_11#40bf80:"nDPI-IPv6/L4-payload-detection-failed...................." \
$(rrdtool_graph_print_cur_min_max_avg error_11) \
STACK:error_12#00ffea:"TCP-packet-smaller-than-expected........................." \
$(rrdtool_graph_print_cur_min_max_avg error_12) \
STACK:error_13#40bfb5:"UDP-packet-smaller-than-expected........................." \
$(rrdtool_graph_print_cur_min_max_avg error_13) \
STACK:error_14#00aaff:"Captured-packet-size-is-smaller-than-expected-packet-size" \
$(rrdtool_graph_print_cur_min_max_avg error_14) \
STACK:error_15#4095bf:"Max-flows-to-track-reached..............................." \
$(rrdtool_graph_print_cur_min_max_avg error_15) \
STACK:error_16#0040ff:"Flow-memory-allocation-failed............................" \
$(rrdtool_graph_print_cur_min_max_avg error_16)
rrdtool_graph Risk-Severities Amount "${OUTDIR}/severities" \
DEF:sever_crit=${RRDDIR}/gauge-flow_severity_critical.rrd:value:AVERAGE \
DEF:sever_emer=${RRDDIR}/gauge-flow_severity_emergency.rrd:value:AVERAGE \
DEF:sever_high=${RRDDIR}/gauge-flow_severity_high.rrd:value:AVERAGE \
DEF:sever_low=${RRDDIR}/gauge-flow_severity_low.rrd:value:AVERAGE \
DEF:sever_med=${RRDDIR}/gauge-flow_severity_medium.rrd:value:AVERAGE \
DEF:sever_sev=${RRDDIR}/gauge-flow_severity_severe.rrd:value:AVERAGE \
DEF:sever_ukn=${RRDDIR}/gauge-flow_severity_unknown.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data sever_crit) \
AREA:sever_crit#e68e00::STACK \
AREA:sever_emer#e63200::STACK \
AREA:sever_high#e6c700::STACK \
AREA:sever_low#00e600::STACK \
AREA:sever_med#8fce00::STACK \
AREA:sever_sev#c61b2b::STACK \
AREA:sever_ukn#7e8877::STACK \
LINE2:sever_crit#e68e00:"Critical............." \
$(rrdtool_graph_print_cur_min_max_avg sever_crit) \
LINE2:sever_emer#e63200:"Emergency............" \
$(rrdtool_graph_print_cur_min_max_avg sever_emer) \
LINE2:sever_high#e6c700:"High................." \
$(rrdtool_graph_print_cur_min_max_avg sever_high) \
LINE2:sever_low#00e600:"Low.................." \
$(rrdtool_graph_print_cur_min_max_avg sever_low) \
LINE2:sever_med#8fce00:"Medium..............." \
$(rrdtool_graph_print_cur_min_max_avg sever_med) \
LINE2:sever_sev#c61b2b:"Severe..............." \
$(rrdtool_graph_print_cur_min_max_avg sever_sev) \
LINE2:sever_ukn#7e8877:"Unknown.............." \
$(rrdtool_graph_print_cur_min_max_avg sever_ukn)
rrdtool_graph Risks 'Amount' "${OUTDIR}/risky_events" \
DEF:risk_1=${RRDDIR}/gauge-flow_risk_1_count.rrd:value:AVERAGE \
DEF:risk_2=${RRDDIR}/gauge-flow_risk_2_count.rrd:value:AVERAGE \
DEF:risk_3=${RRDDIR}/gauge-flow_risk_3_count.rrd:value:AVERAGE \
DEF:risk_4=${RRDDIR}/gauge-flow_risk_4_count.rrd:value:AVERAGE \
DEF:risk_5=${RRDDIR}/gauge-flow_risk_5_count.rrd:value:AVERAGE \
DEF:risk_6=${RRDDIR}/gauge-flow_risk_6_count.rrd:value:AVERAGE \
DEF:risk_7=${RRDDIR}/gauge-flow_risk_7_count.rrd:value:AVERAGE \
DEF:risk_8=${RRDDIR}/gauge-flow_risk_8_count.rrd:value:AVERAGE \
DEF:risk_9=${RRDDIR}/gauge-flow_risk_9_count.rrd:value:AVERAGE \
DEF:risk_10=${RRDDIR}/gauge-flow_risk_10_count.rrd:value:AVERAGE \
DEF:risk_11=${RRDDIR}/gauge-flow_risk_11_count.rrd:value:AVERAGE \
DEF:risk_12=${RRDDIR}/gauge-flow_risk_12_count.rrd:value:AVERAGE \
DEF:risk_13=${RRDDIR}/gauge-flow_risk_13_count.rrd:value:AVERAGE \
DEF:risk_14=${RRDDIR}/gauge-flow_risk_14_count.rrd:value:AVERAGE \
DEF:risk_15=${RRDDIR}/gauge-flow_risk_15_count.rrd:value:AVERAGE \
DEF:risk_16=${RRDDIR}/gauge-flow_risk_16_count.rrd:value:AVERAGE \
DEF:risk_17=${RRDDIR}/gauge-flow_risk_17_count.rrd:value:AVERAGE \
DEF:risk_18=${RRDDIR}/gauge-flow_risk_18_count.rrd:value:AVERAGE \
DEF:risk_19=${RRDDIR}/gauge-flow_risk_19_count.rrd:value:AVERAGE \
DEF:risk_20=${RRDDIR}/gauge-flow_risk_20_count.rrd:value:AVERAGE \
DEF:risk_21=${RRDDIR}/gauge-flow_risk_21_count.rrd:value:AVERAGE \
DEF:risk_22=${RRDDIR}/gauge-flow_risk_22_count.rrd:value:AVERAGE \
DEF:risk_23=${RRDDIR}/gauge-flow_risk_23_count.rrd:value:AVERAGE \
DEF:risk_24=${RRDDIR}/gauge-flow_risk_24_count.rrd:value:AVERAGE \
DEF:risk_25=${RRDDIR}/gauge-flow_risk_25_count.rrd:value:AVERAGE \
DEF:risk_26=${RRDDIR}/gauge-flow_risk_26_count.rrd:value:AVERAGE \
DEF:risk_27=${RRDDIR}/gauge-flow_risk_27_count.rrd:value:AVERAGE \
DEF:risk_28=${RRDDIR}/gauge-flow_risk_28_count.rrd:value:AVERAGE \
DEF:risk_29=${RRDDIR}/gauge-flow_risk_29_count.rrd:value:AVERAGE \
DEF:risk_30=${RRDDIR}/gauge-flow_risk_30_count.rrd:value:AVERAGE \
DEF:risk_31=${RRDDIR}/gauge-flow_risk_31_count.rrd:value:AVERAGE \
DEF:risk_32=${RRDDIR}/gauge-flow_risk_32_count.rrd:value:AVERAGE \
DEF:risk_33=${RRDDIR}/gauge-flow_risk_33_count.rrd:value:AVERAGE \
DEF:risk_34=${RRDDIR}/gauge-flow_risk_34_count.rrd:value:AVERAGE \
DEF:risk_35=${RRDDIR}/gauge-flow_risk_35_count.rrd:value:AVERAGE \
DEF:risk_36=${RRDDIR}/gauge-flow_risk_36_count.rrd:value:AVERAGE \
DEF:risk_37=${RRDDIR}/gauge-flow_risk_37_count.rrd:value:AVERAGE \
DEF:risk_38=${RRDDIR}/gauge-flow_risk_38_count.rrd:value:AVERAGE \
DEF:risk_39=${RRDDIR}/gauge-flow_risk_39_count.rrd:value:AVERAGE \
DEF:risk_40=${RRDDIR}/gauge-flow_risk_40_count.rrd:value:AVERAGE \
DEF:risk_41=${RRDDIR}/gauge-flow_risk_41_count.rrd:value:AVERAGE \
DEF:risk_42=${RRDDIR}/gauge-flow_risk_42_count.rrd:value:AVERAGE \
DEF:risk_43=${RRDDIR}/gauge-flow_risk_43_count.rrd:value:AVERAGE \
DEF:risk_44=${RRDDIR}/gauge-flow_risk_44_count.rrd:value:AVERAGE \
DEF:risk_45=${RRDDIR}/gauge-flow_risk_45_count.rrd:value:AVERAGE \
DEF:risk_46=${RRDDIR}/gauge-flow_risk_46_count.rrd:value:AVERAGE \
DEF:risk_47=${RRDDIR}/gauge-flow_risk_47_count.rrd:value:AVERAGE \
DEF:risk_48=${RRDDIR}/gauge-flow_risk_48_count.rrd:value:AVERAGE \
DEF:risk_49=${RRDDIR}/gauge-flow_risk_49_count.rrd:value:AVERAGE \
DEF:risk_50=${RRDDIR}/gauge-flow_risk_50_count.rrd:value:AVERAGE \
DEF:risk_51=${RRDDIR}/gauge-flow_risk_51_count.rrd:value:AVERAGE \
DEF:risk_52=${RRDDIR}/gauge-flow_risk_52_count.rrd:value:AVERAGE \
DEF:risk_53=${RRDDIR}/gauge-flow_risk_53_count.rrd:value:AVERAGE \
DEF:risk_unknown=${RRDDIR}/gauge-flow_risk_unknown_count.rrd:value:AVERAGE \
$(rrdtool_graph_colorize_missing_data risk_1) \
AREA:risk_1#ff0000:"XSS-Attack..............................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_1) \
STACK:risk_2#ff5500:"SQL-Injection............................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_2) \
STACK:risk_3#ffaa00:"RCE-Injection............................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_3) \
STACK:risk_4#ffff00:"Binary-App-Transfer......................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_4) \
STACK:risk_5#aaff00:"Known-Proto-on-Non-Std-Port.............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_5) \
STACK:risk_6#55ff00:"Self-signed-Cert........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_6) \
STACK:risk_7#00ff55:"Obsolete-TLS-v1.1-or-older..............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_7) \
STACK:risk_8#00ffaa:"Weak-TLS-Cipher.........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_8) \
STACK:risk_9#00ffff:"TLS-Cert-Expired........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_9) \
STACK:risk_10#00aaff:"TLS-Cert-Mismatch........................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_10) \
STACK:risk_11#0055ff:"HTTP-Suspicious-User-Agent..............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_11) \
STACK:risk_12#0000ff:"HTTP-Numeric-IP-Address.................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_12) \
STACK:risk_13#5500ff:"HTTP-Suspicious-URL......................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_13) \
STACK:risk_14#aa00ff:"HTTP-Suspicious-Header..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_14) \
STACK:risk_15#ff00ff:"TLS-probably-Not-Carrying-HTTPS.........................." \
$(rrdtool_graph_print_cur_min_max_avg risk_15) \
STACK:risk_16#ff00aa:"Suspicious-DGA-Domain-name..............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_16) \
STACK:risk_17#ff0055:"Malformed-Packet........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_17) \
STACK:risk_18#602020:"SSH-Obsolete-Client-Version/Cipher......................." \
$(rrdtool_graph_print_cur_min_max_avg risk_18) \
STACK:risk_19#603a20:"SSH-Obsolete-Server-Version/Cipher......................." \
$(rrdtool_graph_print_cur_min_max_avg risk_19) \
STACK:risk_20#605520:"SMB-Insecure-Version....................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_20) \
STACK:risk_21#506020:"TLS-Suspicious-ESNI-Usage................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_21) \
STACK:risk_22#356020:"Unsafe-Protocol.........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_22) \
STACK:risk_23#206025:"Suspicious-DNS-Traffic..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_23) \
STACK:risk_24#206040:"Missing-SNI-TLS-Extension................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_24) \
STACK:risk_25#20605a:"HTTP-Suspicious-Content.................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_25) \
STACK:risk_26#204a60:"Risky-ASN................................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_26) \
STACK:risk_27#203060:"Risky-Domain-Name........................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_27) \
STACK:risk_28#2a2060:"Malicious-JA3-Fingerprint................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_28) \
STACK:risk_29#452060:"Malicious-SSL-Cert/SHA1-Fingerprint......................" \
$(rrdtool_graph_print_cur_min_max_avg risk_29) \
STACK:risk_30#602060:"Desktop/File-Sharing....................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_30) \
STACK:risk_31#602045:"Uncommon-TLS-ALPN........................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_31) \
STACK:risk_32#df2020:"TLS-Cert-Validity-Too-Long..............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_32) \
STACK:risk_33#df6020:"TLS-Suspicious-Extension................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_33) \
STACK:risk_34#df9f20:"TLS-Fatal-Alert.........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_34) \
STACK:risk_35#dfdf20:"Suspicious-Entropy......................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_35) \
STACK:risk_36#9fdf20:"Clear-Text-Credentials..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_36) \
STACK:risk_37#60df20:"Large-DNS-Packet........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_37) \
STACK:risk_38#20df20:"Fragmented-DNS-Message..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_38) \
STACK:risk_39#20df60:"Text-With-Non-Printable-Chars............................" \
$(rrdtool_graph_print_cur_min_max_avg risk_39) \
STACK:risk_40#20df9f:"Possible-Exploit........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_40) \
STACK:risk_41#20dfdf:"TLS-Cert-About-To-Expire................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_41) \
STACK:risk_42#209fdf:"IDN-Domain-Name.........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_42) \
STACK:risk_43#2060df:"Error-Code..............................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_43) \
STACK:risk_44#2020df:"Crawler/Bot.............................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_44) \
STACK:risk_45#6020df:"Anonymous-Subscriber....................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_45) \
STACK:risk_46#9f20df:"Unidirectional-Traffic..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_46) \
STACK:risk_47#df20df:"HTTP-Obsolete-Server....................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_47) \
STACK:risk_48#df68df:"Periodic-Flow............................................" \
$(rrdtool_graph_print_cur_min_max_avg risk_48) \
STACK:risk_49#dfffdf:"Minor-Issues............................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_49) \
STACK:risk_50#ef20df:"TCP-Connection-Issues...................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_50) \
STACK:risk_51#ef60df:"Fully-Encrypted.........................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_51) \
STACK:risk_52#efa0df:"Invalid-ALPN/SNI-combination............................." \
$(rrdtool_graph_print_cur_min_max_avg risk_52) \
STACK:risk_53#efffdf:"Malware-Host-Contacted..................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_53) \
STACK:risk_unknown#df2060:"Unknown.................................................." \
$(rrdtool_graph_print_cur_min_max_avg risk_unknown)
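
The `rrdtool_graph_print_cur_min_max_avg` helper invoked after each DEF/STACK pair above is expected to emit `GPRINT` statements for one data source. A minimal sketch of such a helper, assuming the usual last/min/max/average legend layout (the exact format string is an assumption, not the repository's implementation):

```shell
# Hypothetical sketch: emit GPRINT legend entries (last/min/max/average)
# for one DEF'd data source name passed as $1. Colons inside GPRINT text
# must be escaped as '\:' for rrdtool.
rrdtool_graph_print_cur_min_max_avg() {
    printf 'GPRINT:%s:LAST:Current\\:%%8.2lf ' "${1}"
    printf 'GPRINT:%s:MIN:Min\\:%%8.2lf ' "${1}"
    printf 'GPRINT:%s:MAX:Max\\:%%8.2lf ' "${1}"
    printf 'GPRINT:%s:AVERAGE:Average\\:%%8.2lf\\n' "${1}"
}
```

Command substitution (`$(...)`) then splices these arguments into the single `rrdtool graph` invocation built by the surrounding backslash-continued lines.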

<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>
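
The dashboard page above embeds pre-rendered PNGs, one per time span (`*_past_hour.png` through `*_past_year.png`). A hedged sketch of how such a set of images might be produced with `rrdtool graph`; the function name, `OUTDIR`, and the minimal argument list are assumptions for illustration, not the repository's actual generation script:

```shell
# Hypothetical sketch: render one PNG per dashboard time span for a
# given graph name, forwarding the DEF/STACK/GPRINT arguments via "$@".
render_timespans() {
    name="$1"; shift
    for span in hour 12hours day week month year; do
        case "$span" in
            hour)    start='-1h' ;;
            12hours) start='-12h' ;;
            day)     start='-1d' ;;
            week)    start='-1w' ;;
            month)   start='-1m' ;;
            year)    start='-1y' ;;
        esac
        rrdtool graph "${OUTDIR}/${name}_past_${span}.png" \
            --start "$start" --title "${name} (past ${span})" "$@"
    done
}
```

Called as e.g. `render_timespans breed DEF:... STACK:...`, this yields exactly the six filenames the HTML references.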

body {
font-size: .875rem;
}
.feather {
width: 16px;
height: 16px;
vertical-align: text-bottom;
}
/*
* Sidebar
*/
.sidebar {
position: fixed;
top: 0;
bottom: 0;
left: 0;
z-index: 100; /* Behind the navbar */
padding: 0;
box-shadow: inset -1px 0 0 rgba(0, 0, 0, .1);
}
.sidebar-sticky {
position: -webkit-sticky;
position: sticky;
top: 48px; /* Height of navbar */
height: calc(100vh - 48px);
padding-top: .5rem;
overflow-x: hidden;
overflow-y: auto; /* Scrollable contents if viewport is shorter than content. */
}
.sidebar .nav-link {
font-weight: 500;
color: #333;
}
.sidebar .nav-link .feather {
margin-right: 4px;
color: #999;
}
.sidebar .nav-link.active {
color: #007bff;
}
.sidebar .nav-link:hover .feather,
.sidebar .nav-link.active .feather {
color: inherit;
}
.sidebar-heading {
font-size: .75rem;
text-transform: uppercase;
}
/*
* Navbar
*/
.navbar-brand {
padding-top: .75rem;
padding-bottom: .75rem;
font-size: 1rem;
background-color: rgba(0, 0, 0, .25);
box-shadow: inset -1px 0 0 rgba(0, 0, 0, .25);
}
.navbar .form-control {
padding: .75rem 1rem;
border-width: 0;
border-radius: 0;
}
.form-control-dark {
color: #fff;
background-color: rgba(255, 255, 255, .1);
border-color: rgba(255, 255, 255, .1);
}
.form-control-dark:focus {
border-color: transparent;
box-shadow: 0 0 0 3px rgba(255, 255, 255, .25);
}
/*
* Utilities
*/
.border-top { border-top: 1px solid #e5e5e5; }
.border-bottom { border-bottom: 1px solid #e5e5e5; }

<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>

<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>

File diff suppressed because one or more lines are too long


@@ -0,0 +1,179 @@
<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 01:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>


@@ -0,0 +1,425 @@
<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 01:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link active" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="flows_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="detections_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="confidence_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="breed_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="categories_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="error_events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>

File diff suppressed because one or more lines are too long


@@ -0,0 +1,198 @@
<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_lines_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="json_bytes_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>


@@ -0,0 +1,217 @@
<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="traffic_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer3_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="layer4_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>

File diff suppressed because one or more lines are too long


@@ -0,0 +1,198 @@
<!DOCTYPE html>
<html lang="en"><head>
<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT" />
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="nDPId RRD Graph">
<meta name="author" content="Toni Uhlig">
<link rel="icon" href="https://getbootstrap.com/docs/4.0/assets/img/favicons/favicon.ico">
<title>nDPId Dashboard</title>
<link rel="canonical" href="https://getbootstrap.com/docs/4.0/examples/dashboard/">
<!-- Bootstrap core CSS -->
<link href="bootstrap.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="dashboard.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-dark sticky-top bg-dark flex-md-nowrap p-0">
<a class="navbar-brand col-sm-3 col-md-2 mr-0" href="https://github.com/utoni/nDPId">nDPId Collectd RRD Graph</a>
</nav>
<div class="container-fluid">
<div class="row">
<nav class="col-md-2 d-none d-md-block bg-light sidebar">
<div class="sidebar-sticky">
<h6 class="sidebar-heading d-flex justify-content-between align-items-center px-3 mt-4 mb-1 text-muted">
<span>Graphs</span>
</h6>
<ul class="nav flex-column mb-2">
<li class="nav-item">
<a class="nav-link" href="index.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="flows.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line><polyline points="10 9 9 9 8 9"></polyline>
</svg>
Flows
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="other.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Other
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="detections.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Detections
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="categories.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Categories
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="risks.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Risks
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="jsons.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
JSONs
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="events.html">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="feather feather-file-text">
<path d="M14 2H6a2 2 0 0 0-2 2v16a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V8z"></path>
<polyline points="14 2 14 8 20 8"></polyline>
<line x1="16" y1="13" x2="8" y2="13"></line>
<line x1="16" y1="17" x2="8" y2="17"></line>
<polyline points="10 9 9 9 8 9"></polyline>
</svg>
Events
</a>
</li>
</ul>
</div>
</nav>
<main role="main" class="col-md-9 ml-sm-auto col-lg-10 pt-3 px-4">
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="severities_past_year.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_hour.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_12hours.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_day.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_week.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_month.png" class="img-fluid" alt="Responsive image">
</div>
<div class="d-flex justify-content-center flex-wrap flex-md-nowrap align-items-center pb-2 mb-3 border-bottom">
<img src="risky_events_past_year.png" class="img-fluid" alt="Responsive image">
</div>
</main>
</div>
</div>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="jquery-3.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
<script>window.jQuery || document.write('<script src="../../assets/js/vendor/jquery-slim.min.js"><\/script>')</script>
<script src="popper.js"></script>
<script src="bootstrap.js"></script>
<!-- Icons -->
<script src="feather.js"></script>
<script>
feather.replace()
</script>
</body></html>

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,138 +0,0 @@
#include <arpa/inet.h>
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>
#include "config.h"
#include "jsmn.h"
static char serv_listen_addr[INET_ADDRSTRLEN] = DISTRIBUTOR_HOST;
static uint16_t serv_listen_port = DISTRIBUTOR_PORT;
int main(void)
{
int sockfd = socket(AF_INET, SOCK_STREAM, 0);
struct sockaddr_in remote_addr = {};
socklen_t remote_addrlen = sizeof(remote_addr);
uint8_t buf[NETWORK_BUFFER_MAX_SIZE];
size_t buf_used = 0;
size_t json_start = 0;
unsigned long long int json_bytes = 0;
jsmn_parser parser;
jsmntok_t tokens[128];
if (sockfd < 0)
{
perror("socket");
return 1;
}
remote_addr.sin_family = AF_INET;
if (inet_pton(AF_INET, &serv_listen_addr[0], &remote_addr.sin_addr) != 1)
{
perror("inet_pton");
return 1;
}
remote_addr.sin_port = htons(serv_listen_port);
if (connect(sockfd, (struct sockaddr *)&remote_addr, remote_addrlen) != 0)
{
perror("connect");
return 1;
}
while (1)
{
errno = 0;
ssize_t bytes_read = read(sockfd, buf + buf_used, sizeof(buf) - buf_used);
if (bytes_read <= 0 || errno != 0)
{
fprintf(stderr, "Remote end disconnected.\n");
break;
}
buf_used += bytes_read;
while (buf_used >= NETWORK_BUFFER_LENGTH_DIGITS + 1)
{
if (buf[NETWORK_BUFFER_LENGTH_DIGITS] != '{')
{
fprintf(stderr, "BUG: JSON invalid opening character: '%c'\n", buf[NETWORK_BUFFER_LENGTH_DIGITS]);
exit(1);
}
char * json_str_start = NULL;
json_bytes = strtoull((char *)buf, &json_str_start, 10);
json_bytes += (uint8_t *)json_str_start - buf;
json_start = (uint8_t *)json_str_start - buf;
if (errno == ERANGE)
{
fprintf(stderr, "BUG: Size of JSON exceeds limit\n");
exit(1);
}
if ((uint8_t *)json_str_start == buf)
{
fprintf(stderr, "BUG: Missing size before JSON string: \"%.*s\"\n", NETWORK_BUFFER_LENGTH_DIGITS, buf);
exit(1);
}
if (json_bytes > sizeof(buf))
{
fprintf(stderr, "BUG: JSON string too big: %llu > %zu\n", json_bytes, sizeof(buf));
exit(1);
}
if (json_bytes > buf_used)
{
break;
}
if (buf[json_bytes - 2] != '}' ||
buf[json_bytes - 1] != '\n')
{
fprintf(stderr, "BUG: Invalid JSON string: \"%.*s\"\n", (int)json_bytes, buf);
exit(1);
}
int r;
jsmn_init(&parser);
r = jsmn_parse(&parser,
(char *)(buf + json_start),
json_bytes - json_start,
tokens,
sizeof(tokens) / sizeof(tokens[0]));
if (r < 0 || tokens[0].type != JSMN_OBJECT)
{
fprintf(stderr, "JSON parsing failed with return value %d at position %u\n", r, parser.pos);
fprintf(stderr, "JSON string: '%.*s'\n", (int)(json_bytes - json_start), (char *)(buf + json_start));
exit(1);
}
for (int i = 1; i < r; i++)
{
if (i % 2 == 1)
{
#ifdef JSMN_PARENT_LINKS
printf("[%d][%d]", i, tokens[i].parent);
#endif
printf("[%.*s : ", tokens[i].end - tokens[i].start, (char *)(buf + json_start) + tokens[i].start);
}
else
{
printf("%.*s] ", tokens[i].end - tokens[i].start, (char *)(buf + json_start) + tokens[i].start);
}
}
printf("EoF\n");
memmove(buf, buf + json_bytes, buf_used - json_bytes);
buf_used -= json_bytes;
json_bytes = 0;
json_start = 0;
}
}
return 0;
}
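The read loop above implements nDPIsrvd's length-prefixed framing: each message starts with a fixed-width decimal field of `NETWORK_BUFFER_LENGTH_DIGITS` characters giving the JSON length (including the trailing newline), followed by the JSON object and a `'\n'`. A minimal de-framing sketch in Python, assuming the usual 5-digit length field (the constant comes from `config.h` and may differ in a given build):

```python
NETWORK_BUFFER_LENGTH_DIGITS = 5  # assumption: matches nDPId's config.h


def parse_messages(buf: bytes):
    """Split a receive buffer into complete JSON strings using nDPIsrvd's
    '<digits><json>\\n' framing; returns (messages, leftover_bytes)."""
    out = []
    while len(buf) >= NETWORK_BUFFER_LENGTH_DIGITS + 1:
        if buf[NETWORK_BUFFER_LENGTH_DIGITS:NETWORK_BUFFER_LENGTH_DIGITS + 1] != b'{':
            raise ValueError('invalid JSON opening character')
        json_len = int(buf[:NETWORK_BUFFER_LENGTH_DIGITS])  # includes the '\n'
        total = NETWORK_BUFFER_LENGTH_DIGITS + json_len
        if total > len(buf):
            break  # need more data, keep the partial message buffered
        if buf[total - 2:total] != b'}\n':
            raise ValueError('invalid JSON terminator')
        out.append(buf[NETWORK_BUFFER_LENGTH_DIGITS:total - 1].decode())
        buf = buf[total:]
    return out, buf


msgs, rest = parse_messages(b'00010{"k":"v"}\n00020{"x": 1')
# msgs == ['{"k":"v"}'], rest keeps the incomplete second message
```

Unlike the C version, which `memmove()`s the remainder to the front of a fixed buffer, the sketch simply returns the leftover bytes for the caller to prepend to the next read.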

View File

@@ -0,0 +1,714 @@
#include <dbus-1.0/dbus/dbus.h>
#include <signal.h>
#include <stdint.h>
#include "nDPIsrvd.h"
#include "utstring.h"
#include "utils.h"
#define SLEEP_TIME_IN_S (3)
struct flow_user_data
{
nDPIsrvd_ull detected_risks;
};
enum dbus_level
{
DBUS_LOW = 0,
DBUS_NORMAL,
DBUS_CRITICAL
};
static char const * const flow_severities[] = {"Low", "Medium", "High", "Severe", "Critical", "Emergency"};
static char const * const flow_breeds[] = {
"Safe", "Acceptable", "Fun", "Unsafe", "Potentially_Dangerous", "Tracker_Ads", "Dangerous", "Unrated", "???"};
static char const * const flow_categories[] = {"Unspecified",
"Media",
"VPN",
"Email",
"DataTransfer",
"Web",
"SocialNetwork",
"Download",
"Game",
"Chat",
"VoIP",
"Database",
"RemoteAccess",
"Cloud",
"Network",
"Collaborative",
"RPC",
"Streaming",
"System",
"SoftwareUpdate",
"Music",
"Video",
"Shopping",
"Productivity",
"FileSharing",
"ConnCheck",
"IoT-Scada",
"VirtAssistant",
"Cybersecurity",
"AdultContent",
"Mining",
"Malware",
"Advertisement",
"Banned_Site",
"Site_Unavailable",
"Allowed_Site",
"Antimalware",
"Crypto_Currency",
"Gambling",
"Health",
"ArtifIntelligence",
"Finance",
"News",
"Sport",
"Business",
"Internet",
"Blockchain_Crypto",
"Blog_Forum",
"Government",
"Education",
"CDN_Proxy",
"Hw_Sw",
"Dating",
"Travel",
"Food",
"Bots",
"Scanners",
"Hosting",
"Art",
"Fashion",
"Books",
"Science",
"Maps_Navigation",
"Login_Portal",
"Legal",
"Environmental_Services",
"Culture",
"Housing",
"Telecommunication",
"Transportation",
"Design",
"Employment",
"Events",
"Weather",
"Lifestyle",
"Real_Estate",
"Security",
"Environment",
"Hobby",
"Computer_Science",
"Construction",
"Engineering",
"Religion",
"Entertainment",
"Agriculture",
"Technology",
"Beauty",
"History",
"Politics",
"Vehicles"};
static uint8_t desired_flow_severities[nDPIsrvd_ARRAY_LENGTH(flow_severities)] = {};
static uint8_t desired_flow_breeds[nDPIsrvd_ARRAY_LENGTH(flow_breeds)] = {};
static uint8_t desired_flow_categories[nDPIsrvd_ARRAY_LENGTH(flow_categories)] = {};
static unsigned int id = 0;
static char const * const application = "nDPIsrvd.notifyd";
static int main_thread_shutdown = 0;
static char * pidfile = NULL;
static char * serv_optarg = NULL;
#ifdef ENABLE_MEMORY_PROFILING
void nDPIsrvd_memprof_log_alloc(size_t alloc_size)
{
(void)alloc_size;
}
void nDPIsrvd_memprof_log_free(size_t free_size)
{
(void)free_size;
}
void nDPIsrvd_memprof_log(char const * const format, ...)
{
va_list ap;
va_start(ap, format);
fprintf(stderr, "%s", "nDPIsrvd MemoryProfiler: ");
vfprintf(stderr, format, ap);
fprintf(stderr, "%s\n", "");
va_end(ap);
}
#endif
static void send_to_dbus(char const * const icon,
char const * const urgency,
enum dbus_level level,
char const * const summary,
char const * const body,
int timeout)
{
DBusConnection * connection = dbus_bus_get(DBUS_BUS_SESSION, 0);
DBusMessage * message = dbus_message_new_method_call("org.freedesktop.Notifications",
"/org/freedesktop/Notifications",
"org.freedesktop.Notifications",
"Notify");
DBusMessageIter iter[4];
dbus_message_iter_init_append(message, iter);
dbus_message_iter_append_basic(iter, 's', &application);
dbus_message_iter_append_basic(iter, 'u', &id);
dbus_message_iter_append_basic(iter, 's', &icon);
dbus_message_iter_append_basic(iter, 's', &summary);
dbus_message_iter_append_basic(iter, 's', &body);
dbus_message_iter_open_container(iter, 'a', "s", iter + 1);
dbus_message_iter_close_container(iter, iter + 1);
dbus_message_iter_open_container(iter, 'a', "{sv}", iter + 1);
dbus_message_iter_open_container(iter + 1, 'e', 0, iter + 2);
dbus_message_iter_append_basic(iter + 2, 's', &urgency);
dbus_message_iter_open_container(iter + 2, 'v', "y", iter + 3);
dbus_message_iter_append_basic(iter + 3, 'y', &level);
dbus_message_iter_close_container(iter + 2, iter + 3);
dbus_message_iter_close_container(iter + 1, iter + 2);
dbus_message_iter_close_container(iter, iter + 1);
dbus_message_iter_append_basic(iter, 'i', &timeout);
dbus_connection_send(connection, message, 0);
dbus_connection_flush(connection);
dbus_message_unref(message);
dbus_connection_unref(connection);
id++;
}
static void notify(enum dbus_level level, char const * const summary, int timeout, char const * const body)
{
send_to_dbus("dialog-information", "urgency", level, summary, body, timeout);
}
__attribute__((format(printf, 4, 5))) static void notifyf(
enum dbus_level level, char const * const summary, int timeout, char const * const body_fmt, ...)
{
va_list ap;
char buf[BUFSIZ];
va_start(ap, body_fmt);
if (vsnprintf(buf, sizeof(buf), body_fmt, ap) > 0)
{
notify(level, summary, timeout, buf);
}
va_end(ap);
}
static ssize_t get_value_index(char const * const possible_values[],
size_t possible_values_size,
char const * const needle,
size_t needle_len)
{
size_t i;
for (i = 0; i < possible_values_size; ++i)
{
if (strncmp(needle, possible_values[i], needle_len) == 0)
{
break;
}
}
if (i == possible_values_size)
{
return -1;
}
return i;
}
static void check_value(char const * const possible_values[],
size_t possible_values_size,
char const * const needle,
size_t needle_len)
{
if (get_value_index(possible_values, possible_values_size, needle, needle_len) == -1)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 5000, "BUG: Unknown value: %.*s", (int)needle_len, needle);
}
}
static enum nDPIsrvd_callback_return notifyd_json_callback(struct nDPIsrvd_socket * const sock,
struct nDPIsrvd_instance * const instance,
struct nDPIsrvd_thread_data * const thread_data,
struct nDPIsrvd_flow * const flow)
{
(void)instance;
(void)thread_data;
struct nDPIsrvd_json_token const * const flow_event_name = TOKEN_GET_SZ(sock, "flow_event_name");
struct flow_user_data * flow_user_data = NULL;
if (flow != NULL)
{
flow_user_data = (struct flow_user_data *)flow->flow_user_data;
}
if (TOKEN_VALUE_EQUALS_SZ(sock, flow_event_name, "detected") != 0 ||
TOKEN_VALUE_EQUALS_SZ(sock, flow_event_name, "detection-update") != 0 ||
TOKEN_VALUE_EQUALS_SZ(sock, flow_event_name, "update") != 0)
{
struct nDPIsrvd_json_token const * const flow_risks = TOKEN_GET_SZ(sock, "ndpi", "flow_risk");
struct nDPIsrvd_json_token const * current = NULL;
int next_child_index = -1, desired_severity_found = 0;
UT_string risks;
utstring_init(&risks);
if (flow_risks != NULL)
{
while ((current = nDPIsrvd_get_next_token(sock, flow_risks, &next_child_index)) != NULL)
{
nDPIsrvd_ull numeric_risk_value = (nDPIsrvd_ull)-1;
size_t flow_risk_key_len = 0;
char const * const flow_risk_key = TOKEN_GET_KEY(sock, current, &flow_risk_key_len);
if (flow_risk_key == NULL || flow_risk_key_len == 0)
{
continue;
}
if (str_value_to_ull(flow_risk_key, &numeric_risk_value) == CONVERSION_OK && flow_user_data != NULL &&
(flow_user_data->detected_risks & (1ull << numeric_risk_value)) == 0)
{
flow_user_data->detected_risks |= (1ull << numeric_risk_value);
char flow_risk_sz[flow_risk_key_len + 1];
snprintf(flow_risk_sz, sizeof(flow_risk_sz), "%llu", numeric_risk_value);
size_t flow_risk_len = 0;
size_t flow_severity_len = 0;
char const * const flow_risk_str =
TOKEN_GET_VALUE(sock,
TOKEN_GET_SZ(sock, "ndpi", "flow_risk", flow_risk_sz, "risk"),
&flow_risk_len);
char const * const flow_severity_str =
TOKEN_GET_VALUE(sock,
TOKEN_GET_SZ(sock, "ndpi", "flow_risk", flow_risk_sz, "severity"),
&flow_severity_len);
if (flow_risk_str == NULL || flow_risk_len == 0 || flow_severity_str == NULL ||
flow_severity_len == 0)
{
continue;
}
ssize_t severity_index = get_value_index(flow_severities,
nDPIsrvd_ARRAY_LENGTH(flow_severities),
flow_severity_str,
flow_severity_len);
if (severity_index != -1 && desired_flow_severities[severity_index] != 0)
{
desired_severity_found = 1;
}
utstring_printf(&risks,
"Risk: '%.*s'\n"
"Severity: '%.*s'\n",
(int)flow_risk_len,
flow_risk_str,
(int)flow_severity_len,
flow_severity_str);
check_value(flow_severities,
nDPIsrvd_ARRAY_LENGTH(flow_severities),
flow_severity_str,
flow_severity_len);
}
}
}
{
size_t flow_srcip_len = 0;
size_t flow_dstip_len = 0;
size_t flow_breed_len = 0;
size_t flow_category_len = 0;
size_t flow_hostname_len = 0;
char const * const flow_srcip = TOKEN_GET_VALUE(sock, TOKEN_GET_SZ(sock, "src_ip"), &flow_srcip_len);
char const * const flow_dstip = TOKEN_GET_VALUE(sock, TOKEN_GET_SZ(sock, "dst_ip"), &flow_dstip_len);
char const * const flow_breed_str =
TOKEN_GET_VALUE(sock, TOKEN_GET_SZ(sock, "ndpi", "breed"), &flow_breed_len);
char const * const flow_category_str =
TOKEN_GET_VALUE(sock, TOKEN_GET_SZ(sock, "ndpi", "category"), &flow_category_len);
char const * const flow_hostname =
TOKEN_GET_VALUE(sock, TOKEN_GET_SZ(sock, "ndpi", "hostname"), &flow_hostname_len);
if (flow_breed_str != NULL && flow_breed_len != 0 && flow_category_str != NULL && flow_category_len != 0)
{
ssize_t breed_index =
get_value_index(flow_breeds, nDPIsrvd_ARRAY_LENGTH(flow_breeds), flow_breed_str, flow_breed_len);
ssize_t category_index = get_value_index(flow_categories,
nDPIsrvd_ARRAY_LENGTH(flow_categories),
flow_category_str,
flow_category_len);
if ((breed_index != -1 && desired_flow_breeds[breed_index] != 0) ||
(category_index != -1 && desired_flow_categories[category_index] != 0) ||
desired_severity_found != 0)
{
notifyf(DBUS_CRITICAL,
"Flow Notification",
5000,
"%.*s -> %.*s (%.*s)\nBreed: '%.*s', Category: '%.*s'\n%s",
(int)flow_srcip_len,
flow_srcip,
(int)flow_dstip_len,
flow_dstip,
(flow_hostname_len > 0 ? (int)flow_hostname_len : 1),
(flow_hostname_len > 0 ? flow_hostname : "-"),
(int)flow_breed_len,
flow_breed_str,
(int)flow_category_len,
flow_category_str,
(utstring_len(&risks) > 0 ? utstring_body(&risks) : "No flow risks detected\n"));
}
check_value(flow_breeds, nDPIsrvd_ARRAY_LENGTH(flow_breeds), flow_breed_str, flow_breed_len);
check_value(flow_categories,
nDPIsrvd_ARRAY_LENGTH(flow_categories),
flow_category_str,
flow_category_len);
}
else if (desired_severity_found != 0)
{
notifyf(DBUS_CRITICAL,
"Risky Flow",
5000,
"%.*s -> %.*s (%.*s)\n%s",
(int)flow_srcip_len,
flow_srcip,
(int)flow_dstip_len,
flow_dstip,
(flow_hostname_len > 0 ? (int)flow_hostname_len : 1),
(flow_hostname_len > 0 ? flow_hostname : "-"),
utstring_body(&risks));
}
}
utstring_done(&risks);
}
return CALLBACK_OK;
}
static void print_usage(char const * const arg0)
{
static char const usage[] =
"Usage: %s "
"[-d] [-p pidfile] [-s host] [-C category...] [-B breed...] [-S severity...]\n\n"
"\t-d\tDaemonize the process.\n"
"\t-p\tWrite the process id to the given file path.\n"
"\t-s\tDestination where nDPIsrvd is listening on.\n"
"\t-C\tDesired nDPI category which fires a notification.\n"
"\t \tCan be specified multiple times.\n"
"\t-B\tDesired nDPI breed which fires a notification.\n"
"\t \tCan be specified multiple times.\n"
"\t-S\tDesired nDPI risk severity which fires a notification.\n"
"\t \tCan be specified multiple times.\n"
"\n"
"Possible values for `-C': %s\n"
"Possible values for `-B': %s\n"
"Possible values for `-S': %s\n"
"\n";
UT_string flow_categories_str, flow_breeds_str, flow_severities_str;
utstring_init(&flow_categories_str);
utstring_init(&flow_breeds_str);
utstring_init(&flow_severities_str);
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(flow_categories); ++i)
{
utstring_printf(&flow_categories_str, "%s, ", flow_categories[i]);
}
flow_categories_str.d[flow_categories_str.i - 2] = '\0';
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(flow_breeds); ++i)
{
utstring_printf(&flow_breeds_str, "%s, ", flow_breeds[i]);
}
flow_breeds_str.d[flow_breeds_str.i - 2] = '\0';
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(flow_severities); ++i)
{
utstring_printf(&flow_severities_str, "%s, ", flow_severities[i]);
}
flow_severities_str.d[flow_severities_str.i - 2] = '\0';
fprintf(stderr,
usage,
arg0,
utstring_body(&flow_categories_str),
utstring_body(&flow_breeds_str),
utstring_body(&flow_severities_str));
utstring_done(&flow_severities_str);
utstring_done(&flow_breeds_str);
utstring_done(&flow_categories_str);
}
static int set_defaults(void)
{
char const * const default_severities[] = {"High", "Severe", "Critical", "Emergency"};
char const * const default_breeds[] = {"Unsafe", "Potentially_Dangerous", "Dangerous", "Unrated"};
char const * const default_categories[] = {"Mining", "Malware", "Banned_Site", "Crypto_Currency"};
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(default_severities); ++i)
{
ssize_t index = get_value_index(flow_severities,
nDPIsrvd_ARRAY_LENGTH(flow_severities),
default_severities[i],
strlen(default_severities[i]));
if (index == -1)
{
return 1;
}
desired_flow_severities[index] = 1;
}
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(default_breeds); ++i)
{
ssize_t index = get_value_index(flow_breeds,
nDPIsrvd_ARRAY_LENGTH(flow_breeds),
default_breeds[i],
strlen(default_breeds[i]));
if (index == -1)
{
return 1;
}
desired_flow_breeds[index] = 1;
}
for (size_t i = 0; i < nDPIsrvd_ARRAY_LENGTH(default_categories); ++i)
{
ssize_t index = get_value_index(flow_categories,
nDPIsrvd_ARRAY_LENGTH(flow_categories),
default_categories[i],
strlen(default_categories[i]));
if (index == -1)
{
return 1;
}
desired_flow_categories[index] = 1;
}
return 0;
}
static int parse_options(int argc, char ** argv, struct nDPIsrvd_socket * const sock)
{
int opt, force_defaults = 1;
while ((opt = getopt(argc, argv, "hdp:s:C:B:S:")) != -1)
{
switch (opt)
{
case 'd':
daemonize_enable();
break;
case 'p':
free(pidfile);
pidfile = strdup(optarg);
break;
case 's':
free(serv_optarg);
serv_optarg = strdup(optarg);
break;
case 'C':
{
ssize_t index =
get_value_index(flow_categories, nDPIsrvd_ARRAY_LENGTH(flow_categories), optarg, strlen(optarg));
if (index == -1)
{
fprintf(stderr, "Invalid argument for `-C': %s\n", optarg);
return 1;
}
else
{
desired_flow_categories[index] = 1;
}
force_defaults = 0;
break;
}
case 'B':
{
ssize_t index =
get_value_index(flow_breeds, nDPIsrvd_ARRAY_LENGTH(flow_breeds), optarg, strlen(optarg));
if (index == -1)
{
fprintf(stderr, "Invalid argument for `-B': %s\n", optarg);
return 1;
}
else
{
desired_flow_breeds[index] = 1;
}
force_defaults = 0;
break;
}
case 'S':
{
ssize_t index =
get_value_index(flow_severities, nDPIsrvd_ARRAY_LENGTH(flow_severities), optarg, strlen(optarg));
if (index == -1)
{
fprintf(stderr, "Invalid argument for `-S': %s\n", optarg);
return 1;
}
else
{
desired_flow_severities[index] = 1;
}
force_defaults = 0;
break;
}
default:
print_usage(argv[0]);
return 1;
}
}
if (force_defaults != 0 && set_defaults() != 0)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 5000, "%s\n", "BUG: Could not set default values.");
return 1;
}
if (serv_optarg == NULL)
{
serv_optarg = strdup(DISTRIBUTOR_UNIX_SOCKET);
}
if (nDPIsrvd_setup_address(&sock->address, serv_optarg) != 0)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 3000, "Could not parse address `%s'\n", serv_optarg);
return 1;
}
if (optind < argc)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 3000, "%s\n", "Unexpected argument after options");
return 1;
}
return 0;
}
static void sighandler(int signum)
{
switch (signum)
{
case SIGINT:
notify(DBUS_LOW, "nDPIsrvd-notifyd", 3000, "Received SIGINT, shutdown.");
break;
case SIGTERM:
notify(DBUS_LOW, "nDPIsrvd-notifyd", 3000, "Received SIGTERM, shutdown.");
break;
default:
notify(DBUS_LOW, "nDPIsrvd-notifyd", 3000, "Received unknown signal, shutdown.");
break;
}
main_thread_shutdown++;
}
int main(int argc, char ** argv)
{
signal(SIGINT, sighandler);
signal(SIGTERM, sighandler);
signal(SIGPIPE, SIG_IGN);
struct nDPIsrvd_socket * sock =
nDPIsrvd_socket_init(0, 0, 0, sizeof(struct flow_user_data), notifyd_json_callback, NULL, NULL);
if (sock == NULL)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 5000, "%s\n", "BUG: nDPIsrvd socket memory allocation failed!");
return 1;
}
if (parse_options(argc, argv, sock) != 0)
{
goto failure;
}
if (daemonize_with_pidfile(pidfile) != 0)
{
return 1;
}
int previous_connect_succeeded = 1;
do
{
if (nDPIsrvd_connect(sock) != CONNECT_OK)
{
if (main_thread_shutdown != 0)
{
break;
}
if (previous_connect_succeeded != 0)
{
notifyf(
DBUS_CRITICAL, "nDPIsrvd-notifyd", 3000, "nDPIsrvd socket connect to %s failed!\n", serv_optarg);
previous_connect_succeeded = 0;
}
nDPIsrvd_socket_close(sock);
sleep(SLEEP_TIME_IN_S);
continue;
}
previous_connect_succeeded = 1;
if (nDPIsrvd_set_read_timeout(sock, 3, 0) != 0)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 3000, "nDPIsrvd set read timeout failed: %s\n", strerror(errno));
goto failure;
}
notifyf(DBUS_NORMAL, "nDPIsrvd-notifyd", 3000, "Connected to '%s'\n", serv_optarg);
while (main_thread_shutdown == 0)
{
enum nDPIsrvd_read_return read_ret = nDPIsrvd_read(sock);
if (errno == EINTR)
{
continue;
}
if (read_ret == READ_TIMEOUT)
{
continue;
}
if (read_ret != READ_OK)
{
notifyf(DBUS_CRITICAL, "nDPIsrvd-notifyd", 3000, "nDPIsrvd socket read from %s failed!\n", serv_optarg);
break;
}
enum nDPIsrvd_parse_return parse_ret = nDPIsrvd_parse_all(sock);
if (parse_ret != PARSE_NEED_MORE_DATA)
{
notifyf(DBUS_CRITICAL,
"nDPIsrvd-notifyd",
3000,
"Could not parse JSON message %s: %.*s\n",
nDPIsrvd_enum_to_string(parse_ret),
nDPIsrvd_json_buffer_length(sock),
nDPIsrvd_json_buffer_string(sock));
break;
}
}
nDPIsrvd_socket_close(sock);
notifyf(DBUS_NORMAL, "nDPIsrvd-notifyd", 3000, "Disconnected from '%s'.\n", serv_optarg);
if (main_thread_shutdown == 0)
{
sleep(SLEEP_TIME_IN_S);
}
} while (main_thread_shutdown == 0);
failure:
nDPIsrvd_socket_free(&sock);
daemonize_shutdown(pidfile);
return 0;
}
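c-notifyd suppresses duplicate notifications per flow by recording each risk in `flow_user_data->detected_risks`, a 64-bit bitmask indexed by the numeric risk id. The same bookkeeping, sketched in Python (this mirrors the C logic only, it is not the nDPIsrvd API; risk ids above 63 would need a wider mask):

```python
class FlowRiskTracker:
    """Track which nDPI risk ids have already triggered a notification."""

    def __init__(self) -> None:
        self.detected_risks = 0  # bitmask: bit N set once risk id N was seen

    def should_notify(self, risk_id: int) -> bool:
        bit = 1 << risk_id
        if self.detected_risks & bit:
            return False  # this risk already fired a notification for the flow
        self.detected_risks |= bit
        return True


tracker = FlowRiskTracker()
tracker.should_notify(13)  # first occurrence: notify
tracker.should_notify(13)  # repeat within the same flow: suppressed
```

Keeping the state per flow (as `flow_user_data` does) means a recurring risk still notifies once per flow, while repeated `detection-update` events for the same flow stay quiet.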

View File

@@ -9,6 +9,16 @@ static int main_thread_shutdown = 0;
 static struct nDPIsrvd_socket * sock = NULL;
 #ifdef ENABLE_MEMORY_PROFILING
+void nDPIsrvd_memprof_log_alloc(size_t alloc_size)
+{
+    (void)alloc_size;
+}
+void nDPIsrvd_memprof_log_free(size_t free_size)
+{
+    (void)free_size;
+}
 void nDPIsrvd_memprof_log(char const * const format, ...)
 {
     va_list ap;
@@ -48,7 +58,7 @@ static void nDPIsrvd_write_flow_info_cb(struct nDPIsrvd_socket const * sock,
 #endif
            flow->last_seen,
            flow->idle_time,
-           (flow->last_seen + flow->idle_time >= thread_data->most_recent_flow_time
+           (thread_data != NULL && flow->last_seen + flow->idle_time >= thread_data->most_recent_flow_time
                 ? flow->last_seen + flow->idle_time - thread_data->most_recent_flow_time
                 : 0));
 }
@@ -83,8 +93,6 @@ static void nDPIsrvd_verify_flows_cb(struct nDPIsrvd_thread_data const * const t
     {
         fprintf(stderr, "Thread [UNKNOWN], Flow %llu verification failed\n", flow->id_as_ull);
     }
-    exit(1);
 }
static void sighandler(int signum)
@@ -109,6 +117,11 @@ static void sighandler(int signum)
         {
             fprintf(stderr, "%s\n", "Flow verification succeeded.");
         }
+        else
+        {
+            /* FATAL! */
+            exit(EXIT_FAILURE);
+        }
     }
     else if (main_thread_shutdown == 0)
     {
@@ -129,10 +142,21 @@ static enum nDPIsrvd_callback_return simple_json_callback(struct nDPIsrvd_socket
         return CALLBACK_OK;
     }
-    struct nDPIsrvd_json_token const * const flow_event_name = TOKEN_GET_SZ(sock, "flow_event_name");
-    if (TOKEN_VALUE_EQUALS_SZ(flow_event_name, "new") != 0)
+    struct nDPIsrvd_json_token const * const alias = TOKEN_GET_SZ(sock, "alias");
+    struct nDPIsrvd_json_token const * const source = TOKEN_GET_SZ(sock, "source");
+    if (alias == NULL || source == NULL)
     {
-        printf("Instance 0x%x, Thread %d, Flow %llu new\n",
+        return CALLBACK_ERROR;
+    }
+    struct nDPIsrvd_json_token const * const flow_event_name = TOKEN_GET_SZ(sock, "flow_event_name");
+    if (TOKEN_VALUE_EQUALS_SZ(sock, flow_event_name, "new") != 0)
+    {
+        printf("Instance %.*s/%.*s (HT-Key: 0x%x), Thread %d, Flow %llu new\n",
+               nDPIsrvd_get_token_size(sock, alias),
+               nDPIsrvd_get_token_value(sock, alias),
+               nDPIsrvd_get_token_size(sock, source),
+               nDPIsrvd_get_token_value(sock, source),
+               instance->alias_source_key,
                flow->thread_id,
                flow->id_as_ull);
@@ -150,8 +174,21 @@ static void simple_flow_cleanup_callback(struct nDPIsrvd_socket * const sock,
     (void)sock;
     (void)thread_data;
+    struct nDPIsrvd_json_token const * const alias = TOKEN_GET_SZ(sock, "alias");
+    struct nDPIsrvd_json_token const * const source = TOKEN_GET_SZ(sock, "source");
+    if (alias == NULL || source == NULL)
+    {
+        /* FATAL! */
+        fprintf(stderr, "BUG: Missing JSON token alias/source.\n");
+        exit(EXIT_FAILURE);
+    }
     char const * const reason_str = nDPIsrvd_enum_to_string(reason);
-    printf("Instance 0x%x, Thread %d, Flow %llu cleanup, reason: %s\n",
+    printf("Instance %.*s/%.*s (HT-Key: 0x%x), Thread %d, Flow %llu cleanup, reason: %s\n",
+           nDPIsrvd_get_token_size(sock, alias),
+           nDPIsrvd_get_token_value(sock, alias),
+           nDPIsrvd_get_token_size(sock, source),
+           nDPIsrvd_get_token_value(sock, source),
+           instance->alias_source_key,
            flow->thread_id,
            flow->id_as_ull,
@@ -159,7 +196,9 @@ static void simple_flow_cleanup_callback(struct nDPIsrvd_socket * const sock,
     if (reason == CLEANUP_REASON_FLOW_TIMEOUT)
     {
         /* FATAL! */
+        fprintf(stderr, "Flow %llu timeouted.\n", flow->id_as_ull);
+        exit(EXIT_FAILURE);
     }
}
@@ -214,14 +253,20 @@ int main(int argc, char ** argv)
             enum nDPIsrvd_parse_return parse_ret = nDPIsrvd_parse_all(sock);
             if (parse_ret != PARSE_NEED_MORE_DATA)
             {
-                printf("Could not parse json string: %s\n", nDPIsrvd_enum_to_string(parse_ret));
+                printf("Could not parse JSON message %s: %.*s\n",
+                       nDPIsrvd_enum_to_string(parse_ret),
+                       nDPIsrvd_json_buffer_length(sock),
+                       nDPIsrvd_json_buffer_string(sock));
                 break;
             }
         }
         if (main_thread_shutdown == 0 && read_ret != READ_OK)
         {
-            printf("Parse read %s\n", nDPIsrvd_enum_to_string(read_ret));
+            printf("Parse read %s at JSON: %.*s\n",
+                   nDPIsrvd_enum_to_string(read_ret),
+                   nDPIsrvd_json_buffer_length(sock),
+                   nDPIsrvd_json_buffer_string(sock));
         }
         return 1;

examples/cxx-graph Submodule

Submodule examples/cxx-graph added at 68eb1b105d

Binary file not shown (image, 62 KiB).

View File

@@ -1,3 +0,0 @@
body {
background: black;
}

View File

@@ -1,291 +0,0 @@
#!/usr/bin/env python3
import multiprocessing
import os
import sys
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket
import plotly_dash
FLOW_RISK_SEVERE = 4
FLOW_RISK_HIGH = 3
FLOW_RISK_MEDIUM = 2
FLOW_RISK_LOW = 1
def nDPIsrvd_worker_onFlowCleanup(instance, current_flow, global_user_data):
_, shared_flow_dict = global_user_data
flow_key = current_flow.flow_key
shared_flow_dict['current-flows'] -= 1
if flow_key not in shared_flow_dict:
return True
shared_flow_dict['total-l4-bytes'] += shared_flow_dict[flow_key]['total-l4-bytes']
if shared_flow_dict[flow_key]['is_detected'] is True:
shared_flow_dict['current-detected-flows'] -= 1
if shared_flow_dict[flow_key]['is_guessed'] is True:
shared_flow_dict['current-guessed-flows'] -= 1
if shared_flow_dict[flow_key]['is_not_detected'] is True:
shared_flow_dict['current-not-detected-flows'] -= 1
if shared_flow_dict[flow_key]['is_midstream'] is True:
shared_flow_dict['current-midstream-flows'] -= 1
if shared_flow_dict[flow_key]['is_risky'] > 0:
shared_flow_dict['current-risky-flows'] -= 1
if shared_flow_dict[flow_key]['is_risky'] == FLOW_RISK_LOW:
shared_flow_dict['current-risky-flows-low'] -= 1
elif shared_flow_dict[flow_key]['is_risky'] == FLOW_RISK_MEDIUM:
shared_flow_dict['current-risky-flows-medium'] -= 1
elif shared_flow_dict[flow_key]['is_risky'] == FLOW_RISK_HIGH:
shared_flow_dict['current-risky-flows-high'] -= 1
elif shared_flow_dict[flow_key]['is_risky'] == FLOW_RISK_SEVERE:
shared_flow_dict['current-risky-flows-severe'] -= 1
del shared_flow_dict[current_flow.flow_key]
return True
def nDPIsrvd_worker_onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
nsock, shared_flow_dict = global_user_data
shared_flow_dict['total-events'] += 1
shared_flow_dict['total-json-bytes'] = nsock.received_bytes
if 'error_event_name' in json_dict:
shared_flow_dict['total-base-events'] += 1
if 'daemon_event_name' in json_dict:
shared_flow_dict['total-daemon-events'] += 1
if 'packet_event_name' in json_dict and \
(json_dict['packet_event_name'] == 'packet' or \
json_dict['packet_event_name'] == 'packet-flow'):
shared_flow_dict['total-packet-events'] += 1
if 'flow_id' not in json_dict:
return True
else:
flow_key = json_dict['alias'] + '-' + json_dict['source'] + '-' + str(json_dict['flow_id'])
if flow_key not in shared_flow_dict:
current_flow.flow_key = flow_key
shared_flow_dict[flow_key] = mgr.dict()
shared_flow_dict[flow_key]['is_detected'] = False
shared_flow_dict[flow_key]['is_guessed'] = False
shared_flow_dict[flow_key]['is_not_detected'] = False
shared_flow_dict[flow_key]['is_midstream'] = False
shared_flow_dict[flow_key]['is_risky'] = 0
shared_flow_dict[flow_key]['total-l4-bytes'] = 0
shared_flow_dict[flow_key]['json'] = mgr.dict()
shared_flow_dict['total-flows'] += 1
shared_flow_dict['current-flows'] += 1
if current_flow.flow_key != flow_key:
return False
if 'flow_tot_l4_payload_len' in json_dict:
shared_flow_dict[flow_key]['total-l4-bytes'] = json_dict['flow_tot_l4_payload_len']
if 'midstream' in json_dict and json_dict['midstream'] != 0:
if shared_flow_dict[flow_key]['is_midstream'] is False:
shared_flow_dict['total-midstream-flows'] += 1
shared_flow_dict['current-midstream-flows'] += 1
shared_flow_dict[flow_key]['is_midstream'] = True
if 'ndpi' in json_dict:
shared_flow_dict[flow_key]['json']['ndpi'] = json_dict['ndpi']
if 'flow_risk' in json_dict['ndpi']:
if shared_flow_dict[flow_key]['is_risky'] == 0:
shared_flow_dict['total-risky-flows'] += 1
shared_flow_dict['current-risky-flows'] += 1
severity = shared_flow_dict[flow_key]['is_risky']
if severity == FLOW_RISK_LOW:
shared_flow_dict['current-risky-flows-low'] -= 1
elif severity == FLOW_RISK_MEDIUM:
shared_flow_dict['current-risky-flows-medium'] -= 1
elif severity == FLOW_RISK_HIGH:
shared_flow_dict['current-risky-flows-high'] -= 1
elif severity == FLOW_RISK_SEVERE:
shared_flow_dict['current-risky-flows-severe'] -= 1
for key in json_dict['ndpi']['flow_risk']:
if json_dict['ndpi']['flow_risk'][key]['severity'] == 'Low':
severity = max(severity, FLOW_RISK_LOW)
elif json_dict['ndpi']['flow_risk'][key]['severity'] == 'Medium':
severity = max(severity, FLOW_RISK_MEDIUM)
elif json_dict['ndpi']['flow_risk'][key]['severity'] == 'High':
severity = max(severity, FLOW_RISK_HIGH)
elif json_dict['ndpi']['flow_risk'][key]['severity'] == 'Severe':
severity = max(severity, FLOW_RISK_SEVERE)
else:
raise RuntimeError('Invalid flow risk severity: {}'.format(
json_dict['ndpi']['flow_risk'][key]['severity']))
shared_flow_dict[flow_key]['is_risky'] = severity
if severity == FLOW_RISK_LOW:
shared_flow_dict['current-risky-flows-low'] += 1
elif severity == FLOW_RISK_MEDIUM:
shared_flow_dict['current-risky-flows-medium'] += 1
elif severity == FLOW_RISK_HIGH:
shared_flow_dict['current-risky-flows-high'] += 1
elif severity == FLOW_RISK_SEVERE:
shared_flow_dict['current-risky-flows-severe'] += 1
if 'flow_event_name' not in json_dict:
return True
    if json_dict.get('flow_state') == 'finished' and \
       json_dict.get('ndpi', {}).get('proto', 'Unknown') != 'Unknown' and \
       shared_flow_dict[flow_key]['is_detected'] is False:
shared_flow_dict['total-detected-flows'] += 1
shared_flow_dict['current-detected-flows'] += 1
shared_flow_dict[flow_key]['is_detected'] = True
if json_dict['flow_event_name'] == 'new':
shared_flow_dict['total-flow-new-events'] += 1
elif json_dict['flow_event_name'] == 'update':
shared_flow_dict['total-flow-update-events'] += 1
elif json_dict['flow_event_name'] == 'end':
shared_flow_dict['total-flow-end-events'] += 1
elif json_dict['flow_event_name'] == 'idle':
shared_flow_dict['total-flow-idle-events'] += 1
elif json_dict['flow_event_name'] == 'guessed':
shared_flow_dict['total-flow-guessed-events'] += 1
if shared_flow_dict[flow_key]['is_guessed'] is False:
shared_flow_dict['total-guessed-flows'] += 1
shared_flow_dict['current-guessed-flows'] += 1
shared_flow_dict[flow_key]['is_guessed'] = True
elif json_dict['flow_event_name'] == 'not-detected':
shared_flow_dict['total-flow-not-detected-events'] += 1
if shared_flow_dict[flow_key]['is_not_detected'] is False:
shared_flow_dict['total-not-detected-flows'] += 1
shared_flow_dict['current-not-detected-flows'] += 1
shared_flow_dict[flow_key]['is_not_detected'] = True
elif json_dict['flow_event_name'] == 'detected' or \
json_dict['flow_event_name'] == 'detection-update':
if json_dict['flow_event_name'] == 'detection-update':
shared_flow_dict['total-flow-detection-update-events'] += 1
else:
shared_flow_dict['total-flow-detected-events'] += 1
if shared_flow_dict[flow_key]['is_detected'] is False:
shared_flow_dict['total-detected-flows'] += 1
shared_flow_dict['current-detected-flows'] += 1
shared_flow_dict[flow_key]['is_detected'] = True
if shared_flow_dict[flow_key]['is_guessed'] is True:
shared_flow_dict['total-guessed-flows'] -= 1
shared_flow_dict['current-guessed-flows'] -= 1
shared_flow_dict[flow_key]['is_guessed'] = False
return True
def nDPIsrvd_worker(address, shared_flow_dict):
sys.stderr.write('Recv buffer size: {}\n'
.format(nDPIsrvd.NETWORK_BUFFER_MAX_SIZE))
sys.stderr.write('Connecting to {} ..\n'
.format(address[0]+':'+str(address[1])
if type(address) is tuple else address))
try:
while True:
try:
nsock = nDPIsrvdSocket()
nsock.connect(address)
nsock.loop(nDPIsrvd_worker_onJsonLineRecvd,
nDPIsrvd_worker_onFlowCleanup,
(nsock, shared_flow_dict))
except nDPIsrvd.SocketConnectionBroken:
sys.stderr.write('Lost connection to {} .. reconnecting\n'
.format(address[0]+':'+str(address[1])
if type(address) is tuple else address))
except KeyboardInterrupt:
pass
if __name__ == '__main__':
argparser = nDPIsrvd.defaultArgumentParser()
argparser.add_argument('--listen-address', type=str, default='127.0.0.1', help='Plotly listen address')
argparser.add_argument('--listen-port', type=int, default=8050, help='Plotly listen port')
args = argparser.parse_args()
address = nDPIsrvd.validateAddress(args)
mgr = multiprocessing.Manager()
shared_flow_dict = mgr.dict()
shared_flow_dict['total-events'] = 0
shared_flow_dict['total-flow-new-events'] = 0
shared_flow_dict['total-flow-update-events'] = 0
shared_flow_dict['total-flow-end-events'] = 0
shared_flow_dict['total-flow-idle-events'] = 0
shared_flow_dict['total-flow-detected-events'] = 0
shared_flow_dict['total-flow-detection-update-events'] = 0
shared_flow_dict['total-flow-guessed-events'] = 0
shared_flow_dict['total-flow-not-detected-events'] = 0
shared_flow_dict['total-packet-events'] = 0
shared_flow_dict['total-base-events'] = 0
shared_flow_dict['total-daemon-events'] = 0
shared_flow_dict['total-json-bytes'] = 0
shared_flow_dict['total-l4-bytes'] = 0
shared_flow_dict['total-flows'] = 0
shared_flow_dict['total-detected-flows'] = 0
shared_flow_dict['total-risky-flows'] = 0
shared_flow_dict['total-midstream-flows'] = 0
shared_flow_dict['total-guessed-flows'] = 0
shared_flow_dict['total-not-detected-flows'] = 0
shared_flow_dict['current-flows'] = 0
shared_flow_dict['current-detected-flows'] = 0
shared_flow_dict['current-midstream-flows'] = 0
shared_flow_dict['current-guessed-flows'] = 0
shared_flow_dict['current-not-detected-flows'] = 0
shared_flow_dict['current-risky-flows'] = 0
shared_flow_dict['current-risky-flows-severe'] = 0
shared_flow_dict['current-risky-flows-high'] = 0
shared_flow_dict['current-risky-flows-medium'] = 0
shared_flow_dict['current-risky-flows-low'] = 0
nDPIsrvd_job = multiprocessing.Process(target=nDPIsrvd_worker,
args=(address, shared_flow_dict))
nDPIsrvd_job.start()
web_job = multiprocessing.Process(target=plotly_dash.web_worker,
args=(shared_flow_dict, args.listen_address, args.listen_port))
web_job.start()
nDPIsrvd_job.join()
web_job.terminate()
web_job.join()

@@ -1,414 +0,0 @@
import math
import dash
try:
from dash import dcc
except ImportError:
import dash_core_components as dcc
try:
from dash import html
except ImportError:
import dash_html_components as html
try:
from dash import dash_table as dt
except ImportError:
import dash_table as dt
from dash.dependencies import Input, Output, State
import dash_daq as daq
import plotly.graph_objects as go
global shared_flow_dict
app = dash.Dash(__name__)
def generate_box():
return {
'display': 'flex', 'flex-direction': 'row',
'background-color': '#082255'
}
def generate_led_display(div_id, label_name):
return daq.LEDDisplay(
id=div_id,
label={'label': label_name, 'style': {'color': '#C4CDD5'}},
labelPosition='bottom',
value='0',
backgroundColor='#082255',
color='#C4CDD5',
)
def generate_gauge(div_id, label_name, max_value=10):
return daq.Gauge(
id=div_id,
value=0,
label={'label': label_name, 'style': {'color': '#C4CDD5'}},
max=max_value,
min=0,
)
def build_gauge(key, max_value=100):
gauge_max = int(max(max_value,
shared_flow_dict[key]))
grad_green = [0, int(gauge_max * 1/3)]
grad_yellow = [int(gauge_max * 1/3), int(gauge_max * 2/3)]
grad_red = [int(gauge_max * 2/3), gauge_max]
grad_dict = {
"gradient":True,
"ranges":{
"green":grad_green,
"yellow":grad_yellow,
"red":grad_red
}
}
return shared_flow_dict[key], gauge_max, grad_dict
def build_piechart(labels, values, color_map=None):
lay = dict(
plot_bgcolor = '#082255',
paper_bgcolor = '#082255',
font={"color": "#fff"},
uirevision=True,
autosize=True,
height=250,
margin = {'autoexpand': True, 'b': 0, 'l': 0, 'r': 0, 't': 0, 'pad': 0},
width = 500,
uniformtext_minsize = 12,
uniformtext_mode = 'hide',
)
return go.Figure(layout=lay, data=[go.Pie(labels=labels, values=values, sort=False, marker_colors=color_map, textinfo='percent', textposition='inside')])
COLOR_MAP = {
'piechart-flows': ['rgb(153, 153, 255)', 'rgb(153, 204, 255)', 'rgb(255, 204, 153)', 'rgb(255, 255, 255)'],
'piechart-midstream-flows': ['rgb(255, 255, 153)', 'rgb(153, 153, 255)'],
'piechart-risky-flows': ['rgb(255, 0, 0)', 'rgb(255, 128, 0)', 'rgb(255, 255, 0)', 'rgb(128, 255, 0)', 'rgb(153, 153, 255)'],
'graph-flows': {'Current Active Flows': {'color': 'rgb(153, 153, 255)', 'width': 1},
'Current Risky Flows': {'color': 'rgb(255, 153, 153)', 'width': 3},
'Current Midstream Flows': {'color': 'rgb(255, 255, 153)', 'width': 3},
'Current Guessed Flows': {'color': 'rgb(153, 204, 255)', 'width': 1},
'Current Not-Detected Flows': {'color': 'rgb(255, 204, 153)', 'width': 1},
'Current Unclassified Flows': {'color': 'rgb(255, 255, 255)', 'width': 1},
},
}
def generate_tab_flow():
return html.Div([
html.Div(children=[
dcc.Interval(id="tab-flow-default-interval", interval=1 * 2000, n_intervals=0),
html.Div(children=[
dt.DataTable(
id='table-info',
columns=[{'id': c.lower(), 'name': c, 'editable': False}
for c in ['Name', 'Total']],
style_header={
'backgroundColor': '#082233',
'color': 'white'
},
style_data={
'backgroundColor': '#082244',
'color': 'white'
},
)
], style={'display': 'flex', 'flex-direction': 'row'}),
html.Div(children=[
dcc.Graph(
id='piechart-flows',
config={
'displayModeBar': False,
},
figure=build_piechart(['Detected', 'Guessed', 'Not-Detected', 'Unclassified'],
[0, 0, 0, 0], COLOR_MAP['piechart-flows']),
),
], style={'padding': 10, 'flex': 1}),
html.Div(children=[
dcc.Graph(
id='piechart-midstream-flows',
config={
'displayModeBar': False,
},
figure=build_piechart(['Midstream', 'Not Midstream'],
[0, 0], COLOR_MAP['piechart-midstream-flows']),
),
], style={'padding': 10, 'flex': 1}),
html.Div(children=[
dcc.Graph(
id='piechart-risky-flows',
config={
'displayModeBar': False,
},
figure=build_piechart(['Severe Risk', 'High Risk', 'Medium Risk', 'Low Risk', 'No Risk'],
[0, 0, 0, 0, 0], COLOR_MAP['piechart-risky-flows']),
),
], style={'padding': 10, 'flex': 1}),
], style=generate_box()),
html.Div(children=[
dcc.Interval(id="tab-flow-graph-interval", interval=4 * 1000, n_intervals=0),
dcc.Store(id="graph-traces"),
html.Div(children=[
dcc.Graph(
id="graph-flows",
config={
'displayModeBar': True,
'displaylogo': False,
},
style={'height':'60vh'},
),
], style={'padding': 10, 'flex': 1})
], style=generate_box())
])
def generate_tab_other():
return html.Div([
html.Div(children=[
dcc.Interval(id="tab-other-default-interval", interval=1 * 2000, n_intervals=0),
html.Div(children=[
dcc.Graph(
id='piechart-events',
config={
'displayModeBar': False,
},
),
], style={'padding': 10, 'flex': 1}),
], style=generate_box())
])
TABS_STYLES = {
'height': '34px'
}
TAB_STYLE = {
'borderBottom': '1px solid #d6d6d6',
'backgroundColor': '#385285',
'padding': '6px',
'fontWeight': 'bold',
}
TAB_SELECTED_STYLE = {
'borderTop': '1px solid #d6d6d6',
'borderBottom': '1px solid #d6d6d6',
'backgroundColor': '#119DFF',
'color': 'white',
'padding': '6px'
}
app.layout = html.Div([
dcc.Tabs(id="tabs-flow-dash", value="tab-flows", children=[
dcc.Tab(label="Flow", value="tab-flows", style=TAB_STYLE,
selected_style=TAB_SELECTED_STYLE,
children=generate_tab_flow()),
dcc.Tab(label="Other", value="tab-other", style=TAB_STYLE,
selected_style=TAB_SELECTED_STYLE,
children=generate_tab_other()),
], style=TABS_STYLES),
html.Div(id="tabs-content")
])
def prettifyBytes(bytes_received):
size_names = ['B', 'KB', 'MB', 'GB', 'TB']
if bytes_received == 0:
i = 0
else:
i = min(int(math.floor(math.log(bytes_received, 1024))), len(size_names) - 1)
p = math.pow(1024, i)
s = round(bytes_received / p, 2)
return '{:.2f} {}'.format(s, size_names[i])
@app.callback(output=[Output('table-info', 'data'),
Output('piechart-flows', 'figure'),
Output('piechart-midstream-flows', 'figure'),
Output('piechart-risky-flows', 'figure')],
inputs=[Input('tab-flow-default-interval', 'n_intervals')])
def tab_flow_update_components(n):
return [[{'name': 'JSON Events', 'total': shared_flow_dict['total-events']},
{'name': 'JSON Bytes', 'total': prettifyBytes(shared_flow_dict['total-json-bytes'])},
{'name': 'Layer4 Bytes', 'total': prettifyBytes(shared_flow_dict['total-l4-bytes'])},
{'name': 'Flows', 'total': shared_flow_dict['total-flows']},
{'name': 'Risky Flows', 'total': shared_flow_dict['total-risky-flows']},
{'name': 'Midstream Flows', 'total': shared_flow_dict['total-midstream-flows']},
{'name': 'Guessed Flows', 'total': shared_flow_dict['total-guessed-flows']},
{'name': 'Not Detected Flows', 'total': shared_flow_dict['total-not-detected-flows']}],
build_piechart(['Detected', 'Guessed', 'Not-Detected', 'Unclassified'],
[shared_flow_dict['current-detected-flows'],
shared_flow_dict['current-guessed-flows'],
shared_flow_dict['current-not-detected-flows'],
shared_flow_dict['current-flows']
- shared_flow_dict['current-detected-flows']
- shared_flow_dict['current-guessed-flows']
- shared_flow_dict['current-not-detected-flows']],
COLOR_MAP['piechart-flows']),
build_piechart(['Midstream', 'Not Midstream'],
[shared_flow_dict['current-midstream-flows'],
shared_flow_dict['current-flows'] -
shared_flow_dict['current-midstream-flows']],
COLOR_MAP['piechart-midstream-flows']),
build_piechart(['Severe', 'High', 'Medium', 'Low', 'No Risk'],
[shared_flow_dict['current-risky-flows-severe'],
shared_flow_dict['current-risky-flows-high'],
shared_flow_dict['current-risky-flows-medium'],
shared_flow_dict['current-risky-flows-low'],
shared_flow_dict['current-flows'] -
shared_flow_dict['current-risky-flows']],
COLOR_MAP['piechart-risky-flows'])]
@app.callback(output=[Output('graph-flows', 'figure'),
Output('graph-traces', 'data')],
inputs=[Input('tab-flow-graph-interval', 'n_intervals'),
Input('tab-flow-graph-interval', 'interval')],
state=[State('graph-traces', 'data')])
def tab_flow_update_graph(n, i, traces):
if traces is None:
traces = ([], [], [], [], [], [])
max_bins = 75
traces[0].append(shared_flow_dict['current-flows'])
traces[1].append(shared_flow_dict['current-risky-flows'])
traces[2].append(shared_flow_dict['current-midstream-flows'])
traces[3].append(shared_flow_dict['current-guessed-flows'])
traces[4].append(shared_flow_dict['current-not-detected-flows'])
traces[5].append(shared_flow_dict['current-flows']
- shared_flow_dict['current-detected-flows']
- shared_flow_dict['current-guessed-flows']
- shared_flow_dict['current-not-detected-flows'])
if len(traces[0]) > max_bins:
traces[0] = traces[0][1:]
traces[1] = traces[1][1:]
traces[2] = traces[2][1:]
traces[3] = traces[3][1:]
traces[4] = traces[4][1:]
traces[5] = traces[5][1:]
i /= 1000.0
x = list(range(max(n - max_bins, 0) * int(i), n * int(i), max(int(i), 1)))
if len(x) > 0 and x[0] > 60:
x = [round(t / 60, 2) for t in x]
x_div = 60
x_axis_title = 'Time (min)'
else:
x_div = 1
x_axis_title = 'Time (sec)'
min_x = max(0, x[0] if len(x) >= max_bins else 0)
max_x = max((max_bins * i) / x_div, x[max_bins - 1] if len(x) >= max_bins else 0)
lay = dict(
plot_bgcolor = '#082255',
paper_bgcolor = '#082255',
font={"color": "#fff"},
xaxis = {
'title': x_axis_title,
"showgrid": False,
"showline": False,
"fixedrange": True,
"tickmode": 'linear',
"tick0": round(max_bins / x_div, 2),
"dtick": round(max_bins / x_div, 2),
},
yaxis = {
'title': 'Flow Count',
"showgrid": False,
"showline": False,
"zeroline": False,
"fixedrange": True,
"tickmode": 'linear',
"dtick": 10,
},
uirevision=True,
autosize=True,
bargap=0.01,
bargroupgap=0,
hovermode="closest",
margin = {'b': 0, 'l': 0, 'r': 0, 't': 30, 'pad': 0},
legend = {'borderwidth': 0},
)
fig = go.Figure(layout=lay)
fig.update_xaxes(showgrid=True, gridwidth=1, gridcolor='#004D80', zeroline=True, zerolinewidth=1, range=[min_x, max_x])
fig.update_yaxes(showgrid=True, gridwidth=1, gridcolor='#004D80', zeroline=True, zerolinewidth=1)
fig.add_trace(go.Scatter(
x=x,
y=traces[0],
name='Current Active Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Active Flows'],
))
fig.add_trace(go.Scatter(
x=x,
y=traces[1],
name='Current Risky Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Risky Flows'],
))
fig.add_trace(go.Scatter(
x=x,
y=traces[2],
name='Current Midstream Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Midstream Flows'],
))
fig.add_trace(go.Scatter(
x=x,
y=traces[3],
name='Current Guessed Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Guessed Flows'],
))
fig.add_trace(go.Scatter(
x=x,
y=traces[4],
name='Current Not-Detected Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Not-Detected Flows'],
))
fig.add_trace(go.Scatter(
x=x,
y=traces[5],
name='Current Unclassified Flows',
mode='lines+markers',
line=COLOR_MAP['graph-flows']['Current Unclassified Flows'],
))
return [fig, traces]
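The callback above maintains a fixed-size sliding window: each tick appends the newest sample to every trace and drops the oldest once `max_bins` entries are exceeded. That pattern can be factored into a small helper (a hypothetical refactoring sketch; the dashboard inlines this per trace):

```python
def append_sample(traces, sample, max_bins=75):
    """Append one value per trace and keep only the newest max_bins
    entries, mirroring the sliding-window logic of the flow graph."""
    for trace, value in zip(traces, sample):
        trace.append(value)
        # empty slice when the trace is still shorter than max_bins
        del trace[:-max_bins]
    return traces
```

With `max_bins=75` and a 4-second interval, the graph always shows the last five minutes of samples.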
@app.callback(output=[Output('piechart-events', 'figure')],
inputs=[Input('tab-other-default-interval', 'n_intervals')])
def tab_other_update_components(n):
return [build_piechart(['Base', 'Daemon', 'Packet',
'Flow New', 'Flow Update', 'Flow End', 'Flow Idle',
'Flow Detection', 'Flow Detection-Updates', 'Flow Guessed', 'Flow Not-Detected'],
[shared_flow_dict['total-base-events'],
shared_flow_dict['total-daemon-events'],
shared_flow_dict['total-packet-events'],
shared_flow_dict['total-flow-new-events'],
shared_flow_dict['total-flow-update-events'],
shared_flow_dict['total-flow-end-events'],
shared_flow_dict['total-flow-idle-events'],
shared_flow_dict['total-flow-detected-events'],
shared_flow_dict['total-flow-detection-update-events'],
shared_flow_dict['total-flow-guessed-events'],
shared_flow_dict['total-flow-not-detected-events']])]
def web_worker(mp_shared_flow_dict, listen_host, listen_port):
global shared_flow_dict
shared_flow_dict = mp_shared_flow_dict
try:
app.run_server(debug=False, host=listen_host, port=listen_port)
except KeyboardInterrupt:
pass

@@ -1,3 +0,0 @@
dash
dash_daq
Werkzeug==2.0

@@ -8,6 +8,7 @@ import datetime
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor
@@ -31,12 +32,12 @@ def set_attr_if_not_set(some_object, attr_name, value):
class Stats:
def __init__(self, nDPIsrvd_sock):
self.statusbar_enabled = True
self.start_time = time.time()
self.nsock = nDPIsrvd_sock
self.last_status_length = 0
self.avg_xfer_json_bytes = 0.0
self.expired_tot_l4_payload_len = 0
self.expired_avg_l4_payload_len = 0
self.total_flows = 0
self.risky_flows = 0
self.midstream_flows = 0
@@ -46,11 +47,14 @@ class Stats:
self.json_lines = 0
self.spinner_state = 0
def disableStatusbar(self):
self.statusbar_enabled = False
def updateSpinner(self):
if self.current_time + 0.25 <= time.time():
self.spinner_state += 1
def getSpinner(self):
def __getSpinner(self):
#spinner_states = ['-', '\\', '|', '/']
#spinner_states = ['▉', '▊', '▋', '▌', '▍', '▎', '▏', '▎', '▍', '▌', '▋', '▊', '▉']
spinner_states = ['⣾', '⣽', '⣻', '⢿', '⡿', '⣟', '⣯', '⣷']
@@ -59,12 +63,12 @@ class Stats:
#spinner_states = ['┤', '┘', '┴', '└', '├', '┌', '┬', '┐']
return spinner_states[self.spinner_state % len(spinner_states)]
def getDataFromJson(self, json_dict, current_flow):
def __getDataFromJson(self, json_dict, current_flow):
if current_flow is None:
return
set_attr_from_dict(current_flow, json_dict, 'flow_tot_l4_payload_len', 0)
set_attr_from_dict(current_flow, json_dict, 'flow_avg_l4_payload_len', 0)
set_attr_from_dict(current_flow, json_dict, 'flow_src_tot_l4_payload_len', 0)
set_attr_from_dict(current_flow, json_dict, 'flow_dst_tot_l4_payload_len', 0)
if 'ndpi' in json_dict:
set_attr_from_dict(current_flow, json_dict['ndpi'], 'flow_risk', {})
else:
@@ -87,23 +91,21 @@ class Stats:
self.json_lines += 1
self.current_time = time.time()
self.avg_xfer_json_bytes = self.nsock.received_bytes / (self.current_time - self.start_time)
self.getDataFromJson(json_dict, current_flow)
self.__getDataFromJson(json_dict, current_flow)
def updateOnCleanup(self, current_flow):
self.total_flows += 1
self.expired_tot_l4_payload_len += current_flow.flow_tot_l4_payload_len
self.expired_avg_l4_payload_len += current_flow.flow_avg_l4_payload_len
self.expired_tot_l4_payload_len += current_flow.flow_src_tot_l4_payload_len + current_flow.flow_dst_tot_l4_payload_len
self.risky_flows += 1 if len(current_flow.flow_risk) > 0 else 0
self.midstream_flows += 1 if current_flow.midstream != 0 else 0
self.guessed_flows += 1 if current_flow.guessed != 0 else 0
self.not_detected_flows += 1 if current_flow.not_detected != 0 else 0
def getStatsFromFlowMgr(self):
def __getStatsFromFlowMgr(self):
alias_count = 0
source_count = 0
flow_count = 0
flow_tot_l4_payload_len = 0.0
flow_avg_l4_payload_len = 0.0
risky = 0
midstream = 0
guessed = 0
@@ -118,45 +120,74 @@ class Stats:
flow_count += 1
current_flow = instances[alias][source].flows[flow_id]
flow_tot_l4_payload_len += current_flow.flow_tot_l4_payload_len
flow_avg_l4_payload_len += current_flow.flow_avg_l4_payload_len
risky += 1 if len(current_flow.flow_risk) > 0 else 0
midstream += 1 if current_flow.midstream != 0 else 0
guessed += 1 if current_flow.guessed != 0 else 0
not_detected = 1 if current_flow.not_detected != 0 else 0
try:
flow_src_tot_l4_payload_len = current_flow.flow_src_tot_l4_payload_len
flow_dst_tot_l4_payload_len = current_flow.flow_dst_tot_l4_payload_len
flow_risk = current_flow.flow_risk
flow_midstream = current_flow.midstream
flow_guessed = current_flow.guessed
flow_not_detected = current_flow.not_detected
except AttributeError:
flow_src_tot_l4_payload_len = 0
flow_dst_tot_l4_payload_len = 0
flow_risk = []
flow_midstream = 0
flow_guessed = 0
flow_not_detected = 0
flow_tot_l4_payload_len += flow_src_tot_l4_payload_len + flow_dst_tot_l4_payload_len
risky += 1 if len(flow_risk) > 0 else 0
midstream += 1 if flow_midstream != 0 else 0
guessed += 1 if flow_guessed != 0 else 0
not_detected += 1 if flow_not_detected != 0 else 0
return alias_count, source_count, flow_count, \
flow_tot_l4_payload_len, flow_avg_l4_payload_len, \
flow_tot_l4_payload_len, \
risky, midstream, guessed, not_detected
@staticmethod
def prettifyBytes(bytes_received):
size_names = ['B', 'KB', 'MB', 'GB', 'TB']
def prettifyBytes(bytes_received, is_byte_unit = True):
if not is_byte_unit:
size_names = ['', 'K', 'M', 'G', 'T']
divisor = 1000
else:
size_names = ['B', 'KiB', 'MiB', 'GiB', 'TiB']
divisor = 1024
if bytes_received == 0:
i = 0
else:
i = min(int(math.floor(math.log(bytes_received, 1024))), len(size_names) - 1)
p = math.pow(1024, i)
i = min(int(math.floor(math.log(bytes_received, divisor))), len(size_names) - 1)
p = math.pow(divisor, i)
s = round(bytes_received / p, 2)
return '{:.2f} {}'.format(s, size_names[i])
if not is_byte_unit:
return '{:.0f}{}'.format(s, ' ' + size_names[i] if len(size_names[i]) > 0 else size_names[i])
else:
return '{:.2f} {}'.format(s, size_names[i])
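The patched `prettifyBytes` now distinguishes binary byte units (KiB/MiB, divisor 1024) from plain SI magnitude suffixes (K/M, divisor 1000) used for packet counts. A self-contained sketch of that behavior (names here are illustrative, not the class method itself):

```python
import math

def prettify_bytes(n, is_byte_unit=True):
    """Format a count as binary byte units (KiB/MiB/...) or, with
    is_byte_unit=False, as SI magnitude suffixes (K/M/...)."""
    if is_byte_unit:
        size_names, divisor = ['B', 'KiB', 'MiB', 'GiB', 'TiB'], 1024
    else:
        size_names, divisor = ['', 'K', 'M', 'G', 'T'], 1000
    if n == 0:
        i = 0
    else:
        # clamp to the largest available unit
        i = min(int(math.floor(math.log(n, divisor))), len(size_names) - 1)
    s = round(n / math.pow(divisor, i), 2)
    if is_byte_unit:
        return '{:.2f} {}'.format(s, size_names[i])
    return '{:.0f}{}'.format(s, ' ' + size_names[i] if size_names[i] else '')
```

So 2048 bytes renders as "2.00 KiB", while 2000000 packets render as "2 M".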
def resetStatus(self):
if self.statusbar_enabled is False:
return
sys.stdout.write('\r' + str(' ' * self.last_status_length) + '\r')
sys.stdout.flush()
def printStatus(self):
alias_count, source_count, flow_count, \
tot_l4_payload_len, avg_l4_payload_len, \
risky, midstream, guessed, not_detected = self.getStatsFromFlowMgr()
if self.statusbar_enabled is False:
return
out_str = '\r[n|tot|avg JSONs: {}|{}|{}/s] [tot|avg l4: {}|{}] ' \
alias_count, source_count, flow_count, \
tot_l4_payload_len, \
risky, midstream, guessed, not_detected = self.__getStatsFromFlowMgr()
out_str = '\r[n|tot|avg JSONs: {}|{}|{}/s] [tot l4: {}] ' \
'[lss|srcs: {}|{}] ' \
'[flws|rsky|mdstrm|!dtctd|gssd: {}|{}|{}|{}|{} / {}|{}|{}|{}|{}] [{}]' \
''.format(self.json_lines,
Stats.prettifyBytes(self.nsock.received_bytes),
Stats.prettifyBytes(self.avg_xfer_json_bytes),
Stats.prettifyBytes(tot_l4_payload_len + self.expired_tot_l4_payload_len),
Stats.prettifyBytes(avg_l4_payload_len + self.expired_avg_l4_payload_len),
alias_count, source_count,
flow_count, risky, midstream, not_detected, guessed,
flow_count + self.total_flows,
@@ -164,7 +195,7 @@ class Stats:
midstream + self.midstream_flows,
not_detected + self.not_detected_flows,
guessed + self.guessed_flows,
self.getSpinner())
self.__getSpinner())
self.last_status_length = len(out_str) - 1 # '\r'
sys.stdout.write(out_str)
@@ -188,7 +219,7 @@ def checkEventFilter(json_dict):
'guessed': args.guessed, 'detected': args.detected,
'detection-update': args.detection_update,
'not-detected': args.not_detected,
'update': args.update}
'update': args.update, 'analyse': args.analyse}
if flow_events[json_dict['flow_event_name']] is True:
return True
@@ -226,9 +257,19 @@ def onFlowCleanup(instance, current_flow, global_user_data):
return True
def limitFloatValue(value, fmt, limit):
if float(value) < float(limit) and float(value) > 0.0:
return '<' + str(fmt).format(limit)
else:
return ' ' + str(fmt).format(value)
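`limitFloatValue` keeps the analyse-event columns aligned: a positive value below the threshold is printed as "<limit" instead of a misleading row of zeros, and in-range values get a leading space so both branches have equal width. A standalone reproduction for illustration:

```python
def limit_float_value(value, fmt, limit):
    """Render value with fmt; positive values below limit become '<limit'.
    Both branches produce strings of equal width for column alignment."""
    if 0.0 < float(value) < float(limit):
        return '<' + str(fmt).format(limit)
    return ' ' + str(fmt).format(value)
```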
def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
stats = global_user_data
stats.update(json_dict, current_flow)
if 'packet_event_id' in json_dict:
return True
stats.resetStatus()
instance_and_source = ''
@@ -243,26 +284,27 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
basic_daemon_event_prefix = ''
timestamp = ''
if args.print_timestamp is True:
if 'thread_ts_msec' in json_dict:
if 'thread_ts_usec' in json_dict:
timestamp += '[{}]'.format(time.strftime('%H:%M:%S',
time.localtime(json_dict['thread_ts_msec'] / 1000)))
elif 'global_ts_msec' in json_dict:
time.localtime(nDPIsrvd.toSeconds(json_dict['thread_ts_usec']))))
elif 'global_ts_usec' in json_dict:
timestamp += '[{}]'.format(time.strftime('%H:%M:%S',
time.localtime(json_dict['global_ts_msec'] / 1000)))
time.localtime(nDPIsrvd.toSeconds(json_dict['global_ts_usec']))))
first_seen = ''
if args.print_first_seen is True:
basic_daemon_event_prefix += ' ' * 11
if 'flow_first_seen' in json_dict:
first_seen = '[' + prettifyTimediff(json_dict['flow_first_seen'] / 1000,
json_dict['thread_ts_msec'] / 1000) + ']'
first_seen = '[' + prettifyTimediff(nDPIsrvd.toSeconds(json_dict['flow_first_seen']),
nDPIsrvd.toSeconds(json_dict['thread_ts_usec'])) + ']'
last_seen = ''
if args.print_last_seen is True:
basic_daemon_event_prefix += ' ' * 11
if 'flow_last_seen' in json_dict:
last_seen = '[' + prettifyTimediff(json_dict['flow_last_seen'] / 1000,
json_dict['thread_ts_msec'] / 1000) + ']'
if current_flow is not None:
flow_last_seen = nDPIsrvd.FlowManager.getLastPacketTime(instance, current_flow.flow_id, json_dict)
last_seen = '[' + prettifyTimediff(nDPIsrvd.toSeconds(flow_last_seen),
nDPIsrvd.toSeconds(json_dict['thread_ts_usec'])) + ']'
if 'daemon_event_id' in json_dict:
if json_dict['daemon_event_name'] == 'status':
@@ -287,9 +329,9 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
stats.printStatus()
return True
if 'error_event_id' in json_dict:
print('{}{}{} {}: {}'.format(timestamp, basic_daemon_event_prefix, instance_and_source,
print('{}{}{} {}: {} [{}/{}]'.format(timestamp, basic_daemon_event_prefix, instance_and_source,
prettifyEvent([TermColor.FAIL, TermColor.BLINK], 15, 'ERROR-EVENT'),
json_dict['error_event_name']))
json_dict['error_event_name'], json_dict['threshold_n'], json_dict['threshold_n_max']))
stats.printStatus()
return True
elif 'flow_event_id' not in json_dict:
@@ -301,24 +343,50 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
return True
ndpi_proto_categ_breed = ''
ndpi_frisk = ''
next_lines = []
if 'ndpi' in json_dict:
ndpi_proto_categ_breed += ' '
if 'proto' in json_dict['ndpi']:
if args.ignore_protocol is not None:
for proto in args.ignore_protocol:
if json_dict['ndpi']['proto'].lower().startswith(proto.lower()) is True:
stats.printStatus()
return True
ndpi_proto_categ_breed += '[' + str(json_dict['ndpi']['proto']) + ']'
if 'proto_by_ip' in json_dict['ndpi']:
if args.ignore_ip_protocol is not None:
for proto in args.ignore_ip_protocol:
if json_dict['ndpi']['proto_by_ip'].lower().startswith(proto.lower()) is True:
stats.printStatus()
return True
ndpi_proto_categ_breed += '[' + str(json_dict['ndpi']['proto_by_ip']) + ']'
if 'category' in json_dict['ndpi']:
if args.ignore_category is not None:
for cat in args.ignore_category:
if json_dict['ndpi']['category'].lower().startswith(cat.lower()) is True:
stats.printStatus()
return True
ndpi_proto_categ_breed += '[' + str(json_dict['ndpi']['category']) + ']'
if 'breed' in json_dict['ndpi']:
if args.ignore_breed is not None:
for breed in args.ignore_breed:
if json_dict['ndpi']['breed'].lower().startswith(breed.lower()) is True:
stats.printStatus()
return True
ndpi_proto_categ_breed += '[' + str(json_dict['ndpi']['breed']) + ']'
if 'flow_risk' in json_dict['ndpi']:
if 'flow_risk' in json_dict['ndpi'] and args.hide_risk_info == False:
severity = 0
cnt = 0
next_lines += ['']
for key in json_dict['ndpi']['flow_risk']:
ndpi_frisk += str(json_dict['ndpi']['flow_risk'][key]['risk']) + ', '
next_lines[0] += str(json_dict['ndpi']['flow_risk'][key]['risk']) + ', '
if json_dict['ndpi']['flow_risk'][key]['severity'] == 'Low':
severity = max(severity, 1)
elif json_dict['ndpi']['flow_risk'][key]['severity'] == 'Medium':
@@ -340,7 +408,10 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
else:
color = ''
ndpi_frisk = '{}{}{}: {}'.format(color, 'RISK', TermColor.END, ndpi_frisk[:-2])
if severity >= args.min_risk_severity:
next_lines[0] = '{}{}{}: {}'.format(color, 'RISK', TermColor.END, next_lines[0][:-2])
else:
del next_lines[0]
line_suffix = ''
flow_event_name = ''
@@ -351,15 +422,65 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
elif json_dict['flow_event_name'] == 'not-detected':
flow_event_name += '{}{:>16}{}'.format(TermColor.WARNING + TermColor.BOLD + TermColor.BLINK,
json_dict['flow_event_name'], TermColor.END)
elif json_dict['flow_event_name'] == 'analyse':
flow_event_name += '{}{:>16}{}'.format(TermColor.WARNING,
json_dict['flow_event_name'], TermColor.END)
if args.print_analyse_results is True:
next_lines = [' {:>10}|{:>10}|{:>10}|{:>10}|{:>17}|{:>9}'.format(
'min', 'max', 'avg', 'stddev', 'variance', 'entropy')]
next_lines += ['[IAT.........: {}|{}|{}|{}|{}|{}]'.format(
limitFloatValue(nDPIsrvd.toSeconds(json_dict['data_analysis']['iat']['min']),
'{:>9.3f}', 0.001),
limitFloatValue(nDPIsrvd.toSeconds(json_dict['data_analysis']['iat']['max']),
'{:>9.3f}', 0.001),
limitFloatValue(nDPIsrvd.toSeconds(json_dict['data_analysis']['iat']['avg']),
'{:>9.3f}', 0.001),
limitFloatValue(nDPIsrvd.toSeconds(json_dict['data_analysis']['iat']['stddev']),
'{:>9.3f}', 0.001),
limitFloatValue(nDPIsrvd.toSeconds(json_dict['data_analysis']['iat']['var']),
'{:>16.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['iat']['ent'],
'{:>8.3f}', 0.001)
)]
next_lines += ['']
next_lines[-1] += '[PKTLEN......: {}|{}|{}|{}|{}|{}]'.format(
limitFloatValue(json_dict['data_analysis']['pktlen']['min'], '{:>9.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['pktlen']['max'], '{:>9.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['pktlen']['avg'], '{:>9.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['pktlen']['stddev'],
'{:>9.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['pktlen']['var'], '{:>16.3f}', 0.001),
limitFloatValue(json_dict['data_analysis']['pktlen']['ent'], '{:>8.3f}', 0.001)
)
next_lines += ['']
next_lines[-1] += '[BINS(c->s)..: {}]'.format(','.join([str(n) for n in json_dict['data_analysis']['bins']['c_to_s']]))
next_lines += ['']
next_lines[-1] += '[BINS(s->c)..: {}]'.format(','.join([str(n) for n in json_dict['data_analysis']['bins']['s_to_c']]))
next_lines += ['']
next_lines[-1] += '[DIRECTIONS..: {}]'.format(','.join([str(n) for n in json_dict['data_analysis']['directions']]))
next_lines += ['']
iats = ''
for n in json_dict['data_analysis']['iat']['data']:
iats += '{:.1f},'.format(n / 1000.0)
iats = iats[:-1]
next_lines[-1] += '[IATS(ms)....: {}]'.format(iats)
next_lines += ['']
next_lines[-1] += '[PKTLENS.....: {}]'.format(','.join([str(n) for n in json_dict['data_analysis']['pktlen']['data']]))
next_lines += ['']
ents = ''
for n in json_dict['data_analysis']['entropies']:
ents += '{:.1f},'.format(n)
ents = ents[:-1]
next_lines[-1] += '[ENTROPIES...: {}]'.format(ents)
else:
if json_dict['flow_event_name'] == 'new':
line_suffix = ''
if json_dict['midstream'] != 0:
line_suffix += '[{}] '.format(TermColor.WARNING + TermColor.BLINK + 'MIDSTREAM' + TermColor.END)
line_suffix += ' [{}]'.format(TermColor.WARNING + TermColor.BLINK + 'MIDSTREAM' + TermColor.END)
if args.ipwhois is True:
src_whois = whois(json_dict['src_ip'].lower())
dst_whois = whois(json_dict['dst_ip'].lower())
line_suffix += '['
line_suffix += ' ['
if src_whois is not None:
line_suffix += '{}'.format(src_whois)
if dst_whois is not None:
@@ -371,17 +492,54 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
line_suffix += ']'
flow_event_name += '{}{:>16}{}'.format(flow_active_color, json_dict['flow_event_name'], TermColor.END)
if 'ndpi' in json_dict and 'hostname' in json_dict['ndpi']:
if args.ignore_hostname is not None:
for hostname in args.ignore_hostname:
if json_dict['ndpi']['hostname'].lower().endswith(hostname.lower()) is True:
stats.printStatus()
return True
if args.print_hostname is True:
line_suffix += '[{}]'.format(json_dict['ndpi']['hostname'])
if args.skip_empty is True:
if json_dict['flow_src_tot_l4_payload_len'] == 0 or json_dict['flow_dst_tot_l4_payload_len'] == 0:
stats.printStatus()
return True
if args.print_bytes is True:
src_color = ''
dst_color = ''
tot_color = ''
if json_dict['flow_src_tot_l4_payload_len'] >= 1 * 1024 * 1024:
tot_color = src_color = TermColor.HINT
if json_dict['flow_src_tot_l4_payload_len'] >= 1 * 1024 * 1024 * 1024:
src_color += TermColor.BOLD + TermColor.BLINK
if json_dict['flow_dst_tot_l4_payload_len'] >= 1 * 1024 * 1024:
tot_color = dst_color = TermColor.HINT
if json_dict['flow_dst_tot_l4_payload_len'] >= 1 * 1024 * 1024 * 1024:
dst_color += TermColor.BOLD + TermColor.BLINK
line_suffix += '[' + src_color + Stats.prettifyBytes(json_dict['flow_src_tot_l4_payload_len']) + TermColor.END + ']' \
'[' + dst_color + Stats.prettifyBytes(json_dict['flow_dst_tot_l4_payload_len']) + TermColor.END +']' \
'[' + tot_color + Stats.prettifyBytes(json_dict['flow_src_tot_l4_payload_len'] + \
json_dict['flow_dst_tot_l4_payload_len']) + TermColor.END + ']'
if args.print_packets is True:
line_suffix += '[' + Stats.prettifyBytes(json_dict['flow_src_packets_processed'], False) + ']' \
'[' + Stats.prettifyBytes(json_dict['flow_dst_packets_processed'], False) + ']'
if json_dict['l3_proto'] == 'ip4':
print('{}{}{}{}{}: [{:.>6}] [{}][{:.>5}] [{:.>15}]{} -> [{:.>15}]{} {}{}' \
print('{}{}{}{}{}: [{:.>6}]{} [{}][{:.>5}] [{:.>15}]{} -> [{:.>15}]{}{}{}' \
''.format(timestamp, first_seen, last_seen, instance_and_source, flow_event_name,
json_dict['flow_id'], json_dict['l3_proto'], json_dict['l4_proto'],
json_dict['flow_id'],
'[{:.>4}]'.format(json_dict['vlan_id']) if 'vlan_id' in json_dict else '',
json_dict['l3_proto'], json_dict['l4_proto'],
json_dict['src_ip'].lower(),
'[{:.>5}]'.format(json_dict['src_port']) if 'src_port' in json_dict else '',
json_dict['dst_ip'].lower(),
'[{:.>5}]'.format(json_dict['dst_port']) if 'dst_port' in json_dict else '',
ndpi_proto_categ_breed, line_suffix))
elif json_dict['l3_proto'] == 'ip6':
print('{}{}{}{}{}: [{:.>6}] [{}][{:.>5}] [{:.>39}]{} -> [{:.>39}]{} {}{}' \
print('{}{}{}{}{}: [{:.>6}] [{}][{:.>5}] [{:.>39}]{} -> [{:.>39}]{}{}{}' \
''.format(timestamp, first_seen, last_seen, instance_and_source, flow_event_name,
json_dict['flow_id'], json_dict['l3_proto'], json_dict['l4_proto'],
json_dict['src_ip'].lower(),
@@ -392,24 +550,38 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
else:
raise RuntimeError('unsupported l3 protocol: {}'.format(json_dict['l3_proto']))
if len(ndpi_frisk) > 0:
for line in next_lines:
print('{}{}{}{}{:>18}{}'.format(timestamp, first_seen, last_seen,
instance_and_source, '', line))
stats.printStatus()
return True
if __name__ == '__main__':
argparser = nDPIsrvd.defaultArgumentParser('Prettify and print events using the nDPIsrvd Python interface.', True)
argparser.add_argument('--no-color', action='store_true', default=False,
help='Disable all terminal colors.')
argparser.add_argument('--no-blink', action='store_true', default=False,
help='Disable all blink effects.')
argparser.add_argument('--no-statusbar', action='store_true', default=False,
help='Disable informational status bar.')
argparser.add_argument('--hide-instance-info', action='store_true', default=False,
help='Hide instance Alias/Source prefixed every line.')
argparser.add_argument('--hide-risk-info', action='store_true', default=False,
help='Skip printing risks.')
argparser.add_argument('--print-timestamp', action='store_true', default=False,
help='Print received event timestamps.')
argparser.add_argument('--print-first-seen', action='store_true', default=False,
help='Print first seen flow time diff.')
argparser.add_argument('--print-last-seen', action='store_true', default=False,
help='Print last seen flow time diff.')
argparser.add_argument('--print-bytes', action='store_true', default=False,
help='Print received/transmitted source/dest bytes for every flow.')
argparser.add_argument('--print-packets', action='store_true', default=False,
help='Print received/transmitted source/dest packets for every flow.')
argparser.add_argument('--skip-empty', action='store_true', default=False,
help='Do not print flows that did not carry any layer7 payload.')
argparser.add_argument('--guessed', action='store_true', default=False, help='Print only guessed flow events.')
argparser.add_argument('--not-detected', action='store_true', default=False, help='Print only undetected flow events.')
argparser.add_argument('--detected', action='store_true', default=False, help='Print only detected flow events.')
@@ -420,27 +592,55 @@ if __name__ == '__main__':
argparser.add_argument('--end', action='store_true', default=False, help='Print only end flow events.')
argparser.add_argument('--idle', action='store_true', default=False, help='Print only idle flow events.')
argparser.add_argument('--update', action='store_true', default=False, help='Print only update flow events.')
argparser.add_argument('--analyse', action='store_true', default=False, help='Print only analyse flow events.')
argparser.add_argument('--detection', action='store_true', default=False, help='Print only detected/guessed/not-detected flow events.')
argparser.add_argument('--ipwhois', action='store_true', default=False, help='Use Python-IPWhois to print additional location information.')
argparser.add_argument('--print-hostname', action='store_true', default=False, help='Print detected hostnames if available.')
argparser.add_argument('--print-analyse-results', action='store_true', default=False,
help='Print detailed results of analyse events.')
argparser.add_argument('--ignore-protocol', action='append', help='Ignore printing lines with a certain protocol.')
argparser.add_argument('--ignore-ip-protocol', action='append', help='Ignore printing lines with a certain IP protocol.')
argparser.add_argument('--ignore-category', action='append', help='Ignore printing lines with a certain category.')
argparser.add_argument('--ignore-breed', action='append', help='Ignore printing lines with a certain breed.')
argparser.add_argument('--ignore-hostname', action='append', help='Ignore printing lines with a certain hostname.')
argparser.add_argument('--min-risk-severity', action='store', type=int, default=0, help='Print only risks with a risk severity greater or equal to the given argument')
args = argparser.parse_args()
if args.no_color is True:
TermColor.disableColor()
if args.no_blink is True:
TermColor.disableBlink()
if args.ipwhois is True:
import dns, ipwhois
whois_db = dict()
if args.detection is True:
args.detected = True
args.guessed = True
args.not_detected = True
address = nDPIsrvd.validateAddress(args)
sys.stderr.write('Recv buffer size: {}\n'.format(nDPIsrvd.NETWORK_BUFFER_MAX_SIZE))
sys.stderr.write('Connecting to {} ..\n'.format(address[0]+':'+str(address[1]) if type(address) is tuple else address))
nsock = nDPIsrvdSocket()
nDPIsrvd.prepareJsonFilter(args, nsock)
nsock.connect(address)
nsock.timeout(1.0)
stats = Stats(nsock)
if args.no_statusbar is True:
stats.disableStatusbar()
while True:
try:
nsock.loop(onJsonLineRecvd, onFlowCleanup, stats)
except nDPIsrvd.SocketConnectionBroken as err:
sys.stderr.write('\n{}\n'.format(err))
break
except KeyboardInterrupt:
print('\n\nKeyboard Interrupt: cleaned up {} flows.'.format(len(nsock.shutdown())))
break
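Stats.prettifyBytes is called throughout the flow printer above, but its definition lives outside this excerpt. A plausible stand-in (an assumption, not the project's actual implementation) with the usual 1024-based units could look like:

```python
def prettify_bytes(num_bytes, use_unit=True):
    # Hypothetical stand-in for Stats.prettifyBytes (defined elsewhere in
    # the script); renders a byte count with 1024-based units, or the raw
    # count when use_unit is False (as used above for packet counters).
    if use_unit is False:
        return '{:.0f}'.format(num_bytes)
    units = ['B', 'KiB', 'MiB', 'GiB', 'TiB']
    value = float(num_bytes)
    for unit in units:
        if value < 1024.0 or unit == units[-1]:
            return '{:.2f} {}'.format(value, unit)
        value /= 1024.0
```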

View File

@@ -6,7 +6,8 @@ import sys
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../usr/share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket

View File

@@ -1,142 +0,0 @@
#!/usr/bin/env python3
import io
import json
import os
import pandas
import requests
import sys
import time
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../usr/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket
global ja3_fps
ja3_fps = dict()
# 1 hour = 3600 sec/hour = (60 minutes/hour) * (60 seconds/minute)
JA3_FP_MAX_AGE = 60 * 60
global ja3_bl
ja3_bl = None
global ja3_bl_printed
ja3_bl_printed = dict()
def downloadJA3Blacklist():
response = requests.get(
'https://sslbl.abuse.ch/blacklist/ja3_fingerprints.csv'
)
if response.status_code == 200:
global ja3_bl
ja3_bl = pandas.read_csv(io.StringIO(response.text), header=9)
return True
return False
def getBlacklisted(ja3_hash):
global ja3_bl
return ja3_bl[(ja3_bl['# ja3_md5'] == ja3_hash)]
def checkBlacklisted(ja3_hash):
if ja3_bl is None:
return
csv_entry = getBlacklisted(ja3_hash)
if not csv_entry.empty and ja3_hash not in ja3_bl_printed:
print('Found CSV JA3 blacklist entry:')
print(csv_entry)
ja3_bl_printed[ja3_hash] = True
class JA3ER(object):
def __init__(self, json_dict):
self.json = json_dict
self.last_checked = time.time()
def isTooOld(self):
current_time = time.time()
if current_time - self.last_checked >= JA3_FP_MAX_AGE:
return True
return False
def isJA3InfoTooOld(ja3_hash):
global ja3_fps
if ja3_hash in ja3_fps:
if ja3_fps[ja3_hash].isTooOld() is True:
print('Fingerprint {} too old, renewing..'.format(ja3_hash))
return True
else:
return True
return False
def getInfoFromJA3ER(ja3_hash):
global ja3_fps
response = requests.get('https://ja3er.com/search/' + ja3_hash)
if response.status_code == 200:
ja3_fps[ja3_hash] = JA3ER(json.loads(response.text, strict=True))
if 'error' not in ja3_fps[ja3_hash].json:
print('Fingerprints for JA3 {}:'.format(ja3_hash))
for ua in ja3_fps[ja3_hash].json:
if 'User-Agent' in ua:
print('\tUser-Agent: {}\n'
'\t Last seen: {}, '
'Count: {}'.format(ua['User-Agent'],
ua['Last_seen'],
ua['Count']))
elif 'Comment' in ua:
print('\tComment...: {}\n'
'\t Reported: {}'
.format(ua['Comment'].replace('\r', '')
.replace('\n', ' '), ua['Reported']))
else:
print(ua)
else:
print('No fingerprint for JA3 {} found.'.format(ja3_hash))
def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
if 'tls' in json_dict and 'ja3' in json_dict['tls']:
if json_dict['tls']['client_requested_server_name'] == 'ja3er.com':
return True
if isJA3InfoTooOld(json_dict['tls']['ja3']) is True:
getInfoFromJA3ER(json_dict['tls']['ja3'])
if isJA3InfoTooOld(json_dict['tls']['ja3s']) is True:
getInfoFromJA3ER(json_dict['tls']['ja3s'])
checkBlacklisted(json_dict['tls']['ja3'])
return True
if __name__ == '__main__':
argparser = nDPIsrvd.defaultArgumentParser()
args = argparser.parse_args()
address = nDPIsrvd.validateAddress(args)
sys.stderr.write('Recv buffer size: {}\n'
.format(nDPIsrvd.NETWORK_BUFFER_MAX_SIZE))
sys.stderr.write('Connecting to {} ..\n'
.format(address[0] + ':' +
str(address[1])
if type(address) is tuple else address))
if downloadJA3Blacklist() is False:
print('Could not download JA3 blacklist.')
nsock = nDPIsrvdSocket()
nsock.connect(address)
try:
nsock.loop(onJsonLineRecvd, None, None)
except nDPIsrvd.SocketConnectionBroken as err:
sys.stderr.write('\n{}\n'.format(err))
except KeyboardInterrupt:
print()
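downloadJA3Blacklist/getBlacklisted above rely on pandas and a live HTTP download; the core hash lookup can be sketched dependency-free with the csv module (the sample rows below are made up, not real blacklist data):

```python
import csv

# Hypothetical sample mimicking the layout of sslbl.abuse.ch's
# ja3_fingerprints.csv: '#'-prefixed banner lines, then data rows.
SAMPLE_CSV = (
    "# JA3 fingerprint blacklist (hypothetical sample data)\n"
    "# ja3_md5,Firstseen,Listingreason\n"
    "8991a387e4cc841740f25d6f5139f92d,2020-01-01,Example C2\n"
)

def load_ja3_blacklist(text):
    # Build {ja3_md5: listing reason}, skipping '#'-prefixed banner lines.
    entries = {}
    for row in csv.reader(text.splitlines()):
        if not row or row[0].startswith('#'):
            continue
        entries[row[0]] = row[-1]
    return entries

def is_blacklisted(entries, ja3_hash):
    return ja3_hash in entries

blacklist = load_ja3_blacklist(SAMPLE_CSV)
```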

View File

@@ -5,7 +5,8 @@ import sys
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../usr/share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor
@@ -14,7 +15,7 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
return True
if __name__ == '__main__':
argparser = nDPIsrvd.defaultArgumentParser('Plain and simple nDPIsrvd JSON event printer with filter capabilities.', True)
args = argparser.parse_args()
address = nDPIsrvd.validateAddress(args)
@@ -22,5 +23,6 @@ if __name__ == '__main__':
sys.stderr.write('Connecting to {} ..\n'.format(address[0]+':'+str(address[1]) if type(address) is tuple else address))
nsock = nDPIsrvdSocket()
nDPIsrvd.prepareJsonFilter(args, nsock)
nsock.connect(address)
nsock.loop(onJsonLineRecvd, None, None)

View File

@@ -0,0 +1,384 @@
#!/usr/bin/env python3
import base64
import binascii
import datetime as dt
import math
import matplotlib.animation as ani
import matplotlib.pyplot as plt
import multiprocessing as mp
import numpy as np
import os
import queue
import sys
import tensorflow as tf
from tensorflow.keras import models, layers, preprocessing
from tensorflow.keras.layers import Embedding, Masking, Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.utils import plot_model
from tensorflow.keras.losses import MeanSquaredError, KLDivergence
from tensorflow.keras.optimizers import Adam, SGD
from tensorflow.keras.callbacks import TensorBoard, EarlyStopping
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor
INPUT_SIZE = nDPIsrvd.nDPId_PACKETS_PLEN_MAX
LATENT_SIZE = 8
TRAINING_SIZE = 512
EPOCH_COUNT = 3
BATCH_SIZE = 16
LEARNING_RATE = 0.000001
ES_PATIENCE = 3
PLOT = False
PLOT_HISTORY = 100
TENSORBOARD = False
TB_LOGPATH = 'logs/' + dt.datetime.now().strftime("%Y%m%d-%H%M%S")
VAE_USE_KLDIV = False
VAE_USE_SGD = False
def generate_autoencoder():
# TODO: The current model handles *each* packet separately.
# But in fact, depending on the nDPId settings (nDPId_PACKETS_PER_FLOW_TO_SEND), packets can be in relation to each other.
# The accuracy may (or may not) improve significantly, but some changes in the code are required.
input_i = Input(shape=(), name='input_i')
input_e = Embedding(input_dim=INPUT_SIZE, output_dim=INPUT_SIZE, mask_zero=True, name='input_e')(input_i)
masked_e = Masking(mask_value=0.0, name='masked_e')(input_e)
encoded_h1 = Dense(4096, activation='relu', name='encoded_h1')(masked_e)
encoded_h2 = Dense(2048, activation='relu', name='encoded_h2')(encoded_h1)
encoded_h3 = Dense(1024, activation='relu', name='encoded_h3')(encoded_h2)
encoded_h4 = Dense(512, activation='relu', name='encoded_h4')(encoded_h3)
encoded_h5 = Dense(128, activation='relu', name='encoded_h5')(encoded_h4)
encoded_h6 = Dense(64, activation='relu', name='encoded_h6')(encoded_h5)
encoded_h7 = Dense(32, activation='relu', name='encoded_h7')(encoded_h6)
latent = Dense(LATENT_SIZE, activation='relu', name='latent')(encoded_h7)
input_l = Input(shape=(LATENT_SIZE,), name='input_l')
decoder_h1 = Dense(32, activation='relu', name='decoder_h1')(input_l)
decoder_h2 = Dense(64, activation='relu', name='decoder_h2')(decoder_h1)
decoder_h3 = Dense(128, activation='relu', name='decoder_h3')(decoder_h2)
decoder_h4 = Dense(512, activation='relu', name='decoder_h4')(decoder_h3)
decoder_h5 = Dense(1024, activation='relu', name='decoder_h5')(decoder_h4)
decoder_h6 = Dense(2048, activation='relu', name='decoder_h6')(decoder_h5)
decoder_h7 = Dense(4096, activation='relu', name='decoder_h7')(decoder_h6)
output_i = Dense(INPUT_SIZE, activation='sigmoid', name='output_i')(decoder_h7)
encoder = Model(input_i, latent, name='encoder')
decoder = Model(input_l, output_i, name='decoder')
return KLDivergence() if VAE_USE_KLDIV else MeanSquaredError(), \
SGD() if VAE_USE_SGD else Adam(learning_rate=LEARNING_RATE), \
Model(input_i, decoder(encoder(input_i)), name='VAE')
def compile_autoencoder():
loss, optimizer, autoencoder = generate_autoencoder()
autoencoder.compile(loss=loss, optimizer=optimizer, metrics=[])
return autoencoder
def get_autoencoder(load_from_file=None):
if load_from_file is None:
autoencoder = compile_autoencoder()
else:
autoencoder = models.load_model(load_from_file)
encoder_submodel = autoencoder.layers[1]
decoder_submodel = autoencoder.layers[2]
return encoder_submodel, decoder_submodel, autoencoder
def on_json_line(json_dict, instance, current_flow, global_user_data):
if 'packet_event_name' not in json_dict:
return True
if json_dict['packet_event_name'] != 'packet' and \
json_dict['packet_event_name'] != 'packet-flow':
return True
shutdown_event, training_event, padded_pkts, print_dots = global_user_data
if shutdown_event.is_set():
return False
try:
buf = base64.b64decode(json_dict['pkt'], validate=True)
except binascii.Error as err:
sys.stderr.write('\nBase64 Exception: {}\n'.format(str(err)))
sys.stderr.write('Affected JSON: {}\n'.format(str(json_dict)))
sys.stderr.flush()
return False
# Generate decimal byte buffer with values from 0-255
int_buf = []
for v in buf:
int_buf.append(int(v))
mat = np.array([int_buf], dtype='float64')
# Normalize the values
mat = mat.astype('float64') / 255.0
# Mean removal
matmean = np.mean(mat, dtype='float64')
mat -= matmean
# Pad the resulting matrix
buf = preprocessing.sequence.pad_sequences(mat, padding="post", maxlen=INPUT_SIZE, truncating='post', dtype='float64')
padded_pkts.put(buf[0])
#print(list(buf[0]))
if not training_event.is_set():
sys.stdout.write('.' * print_dots)
sys.stdout.flush()
print_dots = 1
else:
print_dots += 1
return True
def ndpisrvd_worker(address, shared_shutdown_event, shared_training_event, shared_packet_list):
nsock = nDPIsrvdSocket()
try:
nsock.connect(address)
print_dots = 1
nsock.loop(on_json_line, None, (shared_shutdown_event, shared_training_event, shared_packet_list, print_dots))
except nDPIsrvd.SocketConnectionBroken as err:
sys.stderr.write('\nnDPIsrvd-Worker Socket Error: {}\n'.format(err))
except KeyboardInterrupt:
sys.stderr.write('\n')
except Exception as err:
sys.stderr.write('\nnDPIsrvd-Worker Exception: {}\n'.format(str(err)))
sys.stderr.flush()
shared_shutdown_event.set()
def keras_worker(load_model, save_model, shared_shutdown_event, shared_training_event, shared_packet_queue, shared_plot_queue):
shared_training_event.set()
try:
encoder, _, autoencoder = get_autoencoder(load_model)
except Exception as err:
sys.stderr.write('Could not load Keras model from file: {}\n'.format(str(err)))
sys.stderr.flush()
encoder, _, autoencoder = get_autoencoder()
autoencoder.summary()
additional_callbacks = []
if TENSORBOARD is True:
tensorboard = TensorBoard(log_dir=TB_LOGPATH, histogram_freq=1)
additional_callbacks += [tensorboard]
early_stopping = EarlyStopping(monitor='val_loss', min_delta=0.0001, patience=ES_PATIENCE, restore_best_weights=True, start_from_epoch=0, verbose=0, mode='auto')
additional_callbacks += [early_stopping]
shared_training_event.clear()
try:
packets = list()
while not shared_shutdown_event.is_set():
try:
packet = shared_packet_queue.get(timeout=1)
except queue.Empty:
packet = None
if packet is None:
continue
packets.append(packet)
if len(packets) % TRAINING_SIZE == 0:
shared_training_event.set()
print('\nGot {} packets, training..'.format(len(packets)))
tmp = np.array(packets)
history = autoencoder.fit(
tmp, tmp, epochs=EPOCH_COUNT, batch_size=BATCH_SIZE,
validation_split=0.2,
shuffle=True,
callbacks=additional_callbacks
)
reconstructed_data = autoencoder.predict(tmp)
mse = np.mean(np.square(tmp - reconstructed_data))
reconstruction_accuracy = (1.0 / mse)
encoded_data = encoder.predict(tmp)
latent_activations = encoded_data
shared_plot_queue.put((reconstruction_accuracy, history.history['val_loss'], encoded_data[:, 0], encoded_data[:, 1], latent_activations))
packets.clear()
shared_training_event.clear()
except KeyboardInterrupt:
sys.stderr.write('\n')
except Exception as err:
if len(str(err)) == 0:
err = 'Unknown'
sys.stderr.write('\nKeras-Worker Exception: {}\n'.format(str(err)))
sys.stderr.flush()
if save_model is not None:
sys.stderr.write('Saving model to {}\n'.format(save_model))
sys.stderr.flush()
autoencoder.save(save_model)
try:
shared_shutdown_event.set()
except Exception:
pass
def plot_animate(i, shared_plot_queue, ax, xs, ys):
if not shared_plot_queue.empty():
accuracy, loss, encoded_data0, encoded_data1, latent_activations = shared_plot_queue.get(timeout=1)
epochs = len(loss)
loss_mean = sum(loss) / epochs
else:
return
(ax1, ax2, ax3, ax4) = ax
(ys1, ys2, ys3, ys4) = ys
if len(xs) == 0:
xs.append(epochs)
else:
xs.append(xs[-1] + epochs)
ys1.append(accuracy)
ys2.append(loss_mean)
xs = xs[-PLOT_HISTORY:]
ys1 = ys1[-PLOT_HISTORY:]
ys2 = ys2[-PLOT_HISTORY:]
ax1.clear()
ax1.plot(xs, ys1, '-')
ax2.clear()
ax2.plot(xs, ys2, '-')
ax3.clear()
ax3.scatter(encoded_data0, encoded_data1, marker='.')
ax4.clear()
ax4.imshow(latent_activations, cmap='viridis', aspect='auto')
ax1.set_xlabel('Epoch Count')
ax1.set_ylabel('Accuracy')
ax2.set_xlabel('Epoch Count')
ax2.set_ylabel('Validation Loss')
ax3.set_title('Latent Space')
ax4.set_title('Latent Space Heatmap')
ax4.set_xlabel('Latent Dimensions')
ax4.set_ylabel('Datapoints')
def plot_worker(shared_shutdown_event, shared_plot_queue):
try:
fig, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2)
fig.tight_layout()
ax1.set_xlabel('Epoch Count')
ax1.set_ylabel('Accuracy')
ax2.set_xlabel('Epoch Count')
ax2.set_ylabel('Validation Loss')
ax3.set_title('Latent Space')
ax4.set_title('Latent Space Heatmap')
ax4.set_xlabel('Latent Dimensions')
ax4.set_ylabel('Datapoints')
xs = []
ys1 = []
ys2 = []
ys3 = []
ys4 = []
ani.FuncAnimation(fig, plot_animate, fargs=(shared_plot_queue, (ax1, ax2, ax3, ax4), xs, (ys1, ys2, ys3, ys4)), interval=1000, cache_frame_data=False)
plt.subplots_adjust(left=0.05, right=0.95, top=0.95, bottom=0.05)
plt.margins(x=0, y=0)
plt.show()
except Exception as err:
sys.stderr.write('\nPlot-Worker Exception: {}\n'.format(str(err)))
sys.stderr.flush()
shared_shutdown_event.set()
return
if __name__ == '__main__':
sys.stderr.write('\n***************\n')
sys.stderr.write('*** WARNING ***\n')
sys.stderr.write('***************\n')
sys.stderr.write('\nThis is an immature Autoencoder example.\n')
sys.stderr.write('Please do not rely on any of its output!\n\n')
argparser = nDPIsrvd.defaultArgumentParser()
argparser.add_argument('--load-model', action='store',
help='Load a pre-trained model file.')
argparser.add_argument('--save-model', action='store',
help='Save the trained model to a file.')
argparser.add_argument('--training-size', action='store', type=int, default=TRAINING_SIZE,
help='Set the amount of captured packets required to start the training phase.')
argparser.add_argument('--batch-size', action='store', type=int, default=BATCH_SIZE,
help='Set the batch size used for the training phase.')
argparser.add_argument('--learning-rate', action='store', type=float, default=LEARNING_RATE,
help='Set the (initial) learning rate for the optimizer.')
argparser.add_argument('--plot', action='store_true', default=PLOT,
help='Show some model metrics using pyplot.')
argparser.add_argument('--plot-history', action='store', type=int, default=PLOT_HISTORY,
help='Set the history size of Line plots. Requires --plot')
argparser.add_argument('--tensorboard', action='store_true', default=TENSORBOARD,
help='Enable TensorBoard compatible logging callback.')
argparser.add_argument('--tensorboard-logpath', action='store', default=TB_LOGPATH,
help='TensorBoard logging path.')
argparser.add_argument('--use-sgd', action='store_true', default=VAE_USE_SGD,
help='Use SGD optimizer instead of Adam.')
argparser.add_argument('--use-kldiv', action='store_true', default=VAE_USE_KLDIV,
help='Use Kullback-Leibler loss function instead of Mean-Squared-Error.')
argparser.add_argument('--patience', action='store', type=int, default=ES_PATIENCE,
help='Epoch patience for EarlyStopping: training stops early if no improvement is achieved for this many epochs.')
args = argparser.parse_args()
address = nDPIsrvd.validateAddress(args)
LEARNING_RATE = args.learning_rate
TRAINING_SIZE = args.training_size
BATCH_SIZE = args.batch_size
PLOT = args.plot
PLOT_HISTORY = args.plot_history
TENSORBOARD = args.tensorboard
TB_LOGPATH = args.tensorboard_logpath if args.tensorboard_logpath is not None else TB_LOGPATH
VAE_USE_SGD = args.use_sgd
VAE_USE_KLDIV = args.use_kldiv
ES_PATIENCE = args.patience
sys.stderr.write('Recv buffer size: {}\n'.format(nDPIsrvd.NETWORK_BUFFER_MAX_SIZE))
sys.stderr.write('Connecting to {} ..\n'.format(address[0]+':'+str(address[1]) if type(address) is tuple else address))
sys.stderr.write('PLOT={}, PLOT_HISTORY={}, LEARNING_RATE={}, TRAINING_SIZE={}, BATCH_SIZE={}\n\n'.format(PLOT, PLOT_HISTORY, LEARNING_RATE, TRAINING_SIZE, BATCH_SIZE))
mgr = mp.Manager()
shared_training_event = mgr.Event()
shared_training_event.clear()
shared_shutdown_event = mgr.Event()
shared_shutdown_event.clear()
shared_packet_queue = mgr.JoinableQueue()
shared_plot_queue = mgr.JoinableQueue()
nDPIsrvd_job = mp.Process(target=ndpisrvd_worker, args=(
address,
shared_shutdown_event,
shared_training_event,
shared_packet_queue
))
nDPIsrvd_job.start()
keras_job = mp.Process(target=keras_worker, args=(
args.load_model,
args.save_model,
shared_shutdown_event,
shared_training_event,
shared_packet_queue,
shared_plot_queue
))
keras_job.start()
if PLOT is True:
plot_job = mp.Process(target=plot_worker, args=(shared_shutdown_event, shared_plot_queue))
plot_job.start()
try:
shared_shutdown_event.wait()
except KeyboardInterrupt:
print('\nShutting down worker processes..')
if PLOT is True:
plot_job.terminate()
plot_job.join()
nDPIsrvd_job.terminate()
nDPIsrvd_job.join()
keras_job.join(timeout=3)
keras_job.terminate()
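The per-packet preprocessing in on_json_line (base64 decode, scale to [0, 1], mean removal, post-padding) can be sketched without NumPy/Keras; INPUT_SIZE below is a small stand-in for nDPIsrvd.nDPId_PACKETS_PLEN_MAX:

```python
import base64

INPUT_SIZE = 64  # stand-in; the script uses nDPIsrvd.nDPId_PACKETS_PLEN_MAX

def preprocess_packet(b64_pkt, input_size=INPUT_SIZE):
    # Decode the base64 'pkt' field, scale bytes to [0, 1], subtract the
    # mean, then zero-pad (or truncate) to a fixed length -- mirroring the
    # NumPy + pad_sequences(padding='post') steps in on_json_line.
    buf = base64.b64decode(b64_pkt, validate=True)
    vals = [b / 255.0 for b in buf]
    mean = sum(vals) / len(vals) if vals else 0.0
    vals = [v - mean for v in vals][:input_size]
    return vals + [0.0] * (input_size - len(vals))

vec = preprocess_packet(base64.b64encode(bytes(range(16))))
```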

View File

@@ -0,0 +1,7 @@
joblib
tensorflow
scikit-learn
scipy
matplotlib
numpy
pandas

View File

@@ -0,0 +1,352 @@
#!/usr/bin/env python3
import csv
import joblib
import matplotlib.pyplot
import numpy
import os
import pandas
import sklearn
import sklearn.ensemble
import sklearn.inspection
import sys
import time
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor
N_DIRS = 0
N_BINS = 0
ENABLE_FEATURE_IAT = False
ENABLE_FEATURE_PKTLEN = False
ENABLE_FEATURE_DIRS = True
ENABLE_FEATURE_BINS = True
PROTO_CLASSES = None
def getFeatures(json):
return [json['flow_src_packets_processed'],
json['flow_dst_packets_processed'],
json['flow_src_tot_l4_payload_len'],
json['flow_dst_tot_l4_payload_len']]
def getFeaturesFromArray(json, expected_len=0):
if type(json) is str:
dirs = numpy.fromstring(json, sep=',', dtype=int)
dirs = numpy.asarray(dirs, dtype=int).tolist()
elif type(json) is list:
dirs = json
else:
raise TypeError('Invalid type: {}.'.format(type(json)))
if expected_len > 0 and len(dirs) != expected_len:
raise RuntimeError('Invalid array length; Expected {}, Got {}.'.format(expected_len, len(dirs)))
return dirs
def getRelevantFeaturesCSV(line):
ret = list()
ret.extend(getFeatures(line))
if ENABLE_FEATURE_IAT is True:
ret.extend(getFeaturesFromArray(line['iat_data'], N_DIRS - 1))
if ENABLE_FEATURE_PKTLEN is True:
ret.extend(getFeaturesFromArray(line['pktlen_data'], N_DIRS))
if ENABLE_FEATURE_DIRS is True:
ret.extend(getFeaturesFromArray(line['directions'], N_DIRS))
if ENABLE_FEATURE_BINS is True:
ret.extend(getFeaturesFromArray(line['bins_c_to_s'], N_BINS))
ret.extend(getFeaturesFromArray(line['bins_s_to_c'], N_BINS))
return [ret]
def getRelevantFeaturesJSON(line):
ret = list()
ret.extend(getFeatures(line))
if ENABLE_FEATURE_IAT is True:
ret.extend(getFeaturesFromArray(line['data_analysis']['iat']['data'], N_DIRS - 1))
if ENABLE_FEATURE_PKTLEN is True:
ret.extend(getFeaturesFromArray(line['data_analysis']['pktlen']['data'], N_DIRS))
if ENABLE_FEATURE_DIRS is True:
ret.extend(getFeaturesFromArray(line['data_analysis']['directions'], N_DIRS))
if ENABLE_FEATURE_BINS is True:
ret.extend(getFeaturesFromArray(line['data_analysis']['bins']['c_to_s'], N_BINS))
ret.extend(getFeaturesFromArray(line['data_analysis']['bins']['s_to_c'], N_BINS))
return [ret]
def getRelevantFeatureNames():
names = list()
names.extend(['flow_src_packets_processed', 'flow_dst_packets_processed',
'flow_src_tot_l4_payload_len', 'flow_dst_tot_l4_payload_len'])
if ENABLE_FEATURE_IAT is True:
for x in range(N_DIRS - 1):
names.append('iat_{}'.format(x))
if ENABLE_FEATURE_PKTLEN is True:
for x in range(N_DIRS):
names.append('pktlen_{}'.format(x))
if ENABLE_FEATURE_DIRS is True:
for x in range(N_DIRS):
names.append('dirs_{}'.format(x))
if ENABLE_FEATURE_BINS is True:
for x in range(N_BINS):
names.append('bins_c_to_s_{}'.format(x))
for x in range(N_BINS):
names.append('bins_s_to_c_{}'.format(x))
return names
def plotPermutatedImportance(model, X, y):
result = sklearn.inspection.permutation_importance(model, X, y, n_repeats=10, random_state=42, n_jobs=-1)
forest_importances = pandas.Series(result.importances_mean, index=getRelevantFeatureNames())
fig, ax = matplotlib.pyplot.subplots()
forest_importances.plot.bar(yerr=result.importances_std, ax=ax)
ax.set_title("Feature importances using permutation on full model")
ax.set_ylabel("Mean accuracy decrease")
fig.tight_layout()
matplotlib.pyplot.show()
def isProtoClass(proto_class, line):
if type(proto_class) != list or type(line) != str:
raise TypeError('Invalid type: {}/{}.'.format(type(proto_class), type(line)))
s = line.lower()
for x in range(len(proto_class)):
if s.startswith(proto_class[x].lower()) is True:
return x + 1
return 0
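isProtoClass encodes the training labels: 0 means 'n/a', otherwise a 1-based index into the --proto-class list found by case-insensitive prefix match. A standalone sketch of that convention (the class list below is hypothetical):

```python
def proto_class_index(proto_classes, proto_name):
    # 1-based index of the first prefix-matching class, 0 ('n/a') when none
    # matches -- the same label convention isProtoClass uses.
    s = proto_name.lower()
    for i, cls in enumerate(proto_classes):
        if s.startswith(cls.lower()):
            return i + 1
    return 0

classes = ['tls.youtube', 'tls', 'http']  # hypothetical --proto-class values
```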
def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
if 'flow_event_name' not in json_dict:
return True
if json_dict['flow_event_name'] != 'analyse':
return True
if 'ndpi' not in json_dict:
return True
if 'proto' not in json_dict['ndpi']:
return True
#print(json_dict)
model, proto_class, disable_colors = global_user_data
try:
X = getRelevantFeaturesJSON(json_dict)
y = model.predict(X)
p = model.predict_log_proba(X)
if y[0] <= 0:
y_text = 'n/a'
else:
y_text = proto_class[y[0] - 1]
color_start = ''
color_end = ''
pred_failed = False
if disable_colors is False:
if json_dict['ndpi']['proto'].lower().startswith(y_text) is True:
color_start = TermColor.BOLD
color_end = TermColor.END
elif y_text not in proto_class and \
json_dict['ndpi']['proto'].lower() not in proto_class:
pass
else:
pred_failed = True
color_start = TermColor.WARNING + TermColor.BOLD
color_end = TermColor.END
probs = str()
for i in range(len(p[0])):
if json_dict['ndpi']['proto'].lower().startswith(proto_class[i - 1]) and disable_colors is False:
probs += '{}{:>2.1f}{}, '.format(TermColor.BOLD + TermColor.BLINK if pred_failed is True else '',
p[0][i], TermColor.END)
elif i == y[0]:
probs += '{}{:>2.1f}{}, '.format(color_start, p[0][i], color_end)
else:
probs += '{:>2.1f}, '.format(p[0][i])
probs = probs[:-2]
print('DPI Engine detected: {}{:>24}{}, Predicted: {}{:>24}{}, Probabilities: {}'.format(
color_start, json_dict['ndpi']['proto'].lower(), color_end,
color_start, y_text, color_end, probs))
if pred_failed is True:
pclass = isProtoClass(args.proto_class, json_dict['ndpi']['proto'].lower())
if pclass == 0:
msg = 'false positive'
else:
msg = 'false negative'
print('{:>46} {}{}{}'.format('[-]', TermColor.FAIL + TermColor.BOLD + TermColor.BLINK, msg, TermColor.END))
except Exception as err:
print('Got exception `{}\'\nfor json: {}'.format(err, json_dict))
return True
if __name__ == '__main__':
argparser = nDPIsrvd.defaultArgumentParser()
argparser.add_argument('--load-model', action='store',
help='Load a pre-trained model file.')
argparser.add_argument('--save-model', action='store',
help='Save the trained model to a file.')
argparser.add_argument('--csv', action='store',
help='Input CSV file generated with nDPIsrvd-analysed.')
argparser.add_argument('--proto-class', action='append', required=False,
help='nDPId protocol class of interest used for training and prediction. ' +
'Can be specified multiple times. Example: tls.youtube')
argparser.add_argument('--generate-feature-importance', action='store_true',
help='Generates the permutated feature importance with matplotlib.')
argparser.add_argument('--enable-iat', action='store_true', default=None,
help='Enable packet (I)nter (A)rrival (T)ime for learning and prediction.')
argparser.add_argument('--enable-pktlen', action='store_true', default=None,
help='Enable layer 4 packet lengths for learning and prediction.')
argparser.add_argument('--disable-dirs', action='store_true', default=None,
help='Disable packet directions for learning and prediction.')
argparser.add_argument('--disable-bins', action='store_true', default=None,
help='Disable packet length distribution for learning and prediction.')
argparser.add_argument('--disable-colors', action='store_true', default=False,
help='Disable any coloring.')
argparser.add_argument('--sklearn-jobs', action='store', type=int, default=1,
help='Number of sklearn processes during training.')
argparser.add_argument('--sklearn-estimators', action='store', type=int, default=1000,
help='Number of trees in the forest.')
argparser.add_argument('--sklearn-min-samples-leaf', action='store', type=float, default=0.0001,
help='The minimum number of samples required to be at a leaf node.')
argparser.add_argument('--sklearn-class-weight', default='balanced', const='balanced', nargs='?',
choices=['balanced', 'balanced_subsample'],
help='Weights associated with the protocol classes.')
argparser.add_argument('--sklearn-max-features', default='sqrt', const='sqrt', nargs='?',
choices=['sqrt', 'log2'],
help='The number of features to consider when looking for the best split.')
argparser.add_argument('--sklearn-max-depth', action='store', type=int, default=128,
help='The maximum depth of a tree.')
argparser.add_argument('--sklearn-verbosity', action='store', type=int, default=0,
help='Controls the verbosity of sklearn\'s random forest classifier.')
args = argparser.parse_args()
address = nDPIsrvd.validateAddress(args)
if args.csv is None and args.load_model is None:
sys.stderr.write('{}: Either `--csv` or `--load-model` required!\n'.format(sys.argv[0]))
sys.exit(1)
if args.csv is None and args.generate_feature_importance is True:
sys.stderr.write('{}: `--generate-feature-importance` requires `--csv`.\n'.format(sys.argv[0]))
sys.exit(1)
if args.proto_class is None or len(args.proto_class) == 0:
if args.csv is None and args.load_model is None:
sys.stderr.write('{}: `--proto-class` missing, no useful classification can be performed.\n'.format(sys.argv[0]))
else:
if args.load_model is not None:
sys.stderr.write('{}: `--proto-class` set, but you want to load an existing model.\n'.format(sys.argv[0]))
sys.exit(1)
if args.load_model is not None:
sys.stderr.write('{}: You are loading an existing model file. ' \
'Some --sklearn-* command line parameters won\'t have any effect!\n'.format(sys.argv[0]))
if args.enable_iat is not None:
sys.stderr.write('{}: `--enable-iat` set, but you want to load an existing model.\n'.format(sys.argv[0]))
sys.exit(1)
if args.enable_pktlen is not None:
sys.stderr.write('{}: `--enable-pktlen` set, but you want to load an existing model.\n'.format(sys.argv[0]))
sys.exit(1)
if args.disable_dirs is not None:
sys.stderr.write('{}: `--disable-dirs` set, but you want to load an existing model.\n'.format(sys.argv[0]))
sys.exit(1)
if args.disable_bins is not None:
sys.stderr.write('{}: `--disable-bins` set, but you want to load an existing model.\n'.format(sys.argv[0]))
sys.exit(1)
ENABLE_FEATURE_IAT = args.enable_iat if args.enable_iat is not None else ENABLE_FEATURE_IAT
ENABLE_FEATURE_PKTLEN = args.enable_pktlen if args.enable_pktlen is not None else ENABLE_FEATURE_PKTLEN
ENABLE_FEATURE_DIRS = not args.disable_dirs if args.disable_dirs is not None else ENABLE_FEATURE_DIRS
ENABLE_FEATURE_BINS = not args.disable_bins if args.disable_bins is not None else ENABLE_FEATURE_BINS
PROTO_CLASSES = args.proto_class
numpy.set_printoptions(formatter={'float_kind': "{:.1f}".format}, sign=' ')
numpy.seterr(divide = 'ignore')
if args.proto_class is not None:
for i in range(len(args.proto_class)):
args.proto_class[i] = args.proto_class[i].lower()
if args.load_model is not None:
sys.stderr.write('Loading model from {}\n'.format(args.load_model))
model, options = joblib.load(args.load_model)
ENABLE_FEATURE_IAT, ENABLE_FEATURE_PKTLEN, ENABLE_FEATURE_DIRS, ENABLE_FEATURE_BINS, args.proto_class = options
if args.csv is not None:
sys.stderr.write('Learning via CSV..\n')
with open(args.csv, newline='\n') as csvfile:
reader = csv.DictReader(csvfile, delimiter=',', quotechar='"')
X = list()
y = list()
for line in reader:
N_DIRS = len(getFeaturesFromArray(line['directions']))
N_BINS = len(getFeaturesFromArray(line['bins_c_to_s']))
break
for line in reader:
try:
X += getRelevantFeaturesCSV(line)
except RuntimeError as err:
print('Runtime Error: `{}\'\non line {}: {}'.format(err, reader.line_num - 1, line))
continue
except TypeError as err:
print('Type Error: `{}\'\non line {}: {}'.format(err, reader.line_num - 1, line))
continue
try:
y += [isProtoClass(args.proto_class, line['proto'])]
except TypeError as err:
X.pop()
print('Type Error: `{}\'\non line {}: {}'.format(err, reader.line_num - 1, line))
continue
sys.stderr.write('CSV data set contains {} entries.\n'.format(len(X)))
if args.load_model is None:
model = sklearn.ensemble.RandomForestClassifier(bootstrap=False,
class_weight = args.sklearn_class_weight,
n_jobs = args.sklearn_jobs,
n_estimators = args.sklearn_estimators,
verbose = args.sklearn_verbosity,
min_samples_leaf = args.sklearn_min_samples_leaf,
max_features = args.sklearn_max_features,
max_depth = args.sklearn_max_depth
)
options = (ENABLE_FEATURE_IAT, ENABLE_FEATURE_PKTLEN, ENABLE_FEATURE_DIRS, ENABLE_FEATURE_BINS, args.proto_class)
sys.stderr.write('Training model..\n')
model.fit(X, y)
if args.generate_feature_importance is True:
sys.stderr.write('Generating feature importance .. this may take some time\n')
plotPermutatedImportance(model, X, y)
if args.save_model is not None:
sys.stderr.write('Saving model to {}\n'.format(args.save_model))
joblib.dump([model, options], args.save_model)
print('ENABLE_FEATURE_PKTLEN: {}'.format(ENABLE_FEATURE_PKTLEN))
print('ENABLE_FEATURE_BINS..: {}'.format(ENABLE_FEATURE_BINS))
print('ENABLE_FEATURE_DIRS..: {}'.format(ENABLE_FEATURE_DIRS))
print('ENABLE_FEATURE_IAT...: {}'.format(ENABLE_FEATURE_IAT))
print('Map[*] -> [0]')
for x in range(len(args.proto_class)):
print('Map["{}"] -> [{}]'.format(args.proto_class[x], x + 1))
sys.stderr.write('Predicting realtime traffic..\n')
sys.stderr.write('Recv buffer size: {}\n'.format(nDPIsrvd.NETWORK_BUFFER_MAX_SIZE))
sys.stderr.write('Connecting to {} ..\n'.format(address[0]+':'+str(address[1]) if type(address) is tuple else address))
nsock = nDPIsrvdSocket()
nsock.connect(address)
nsock.loop(onJsonLineRecvd, None, (model, args.proto_class, args.disable_colors))
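The `Map[*] -> [0]` output above reflects how protocol names are encoded as class labels: any protocol not matching a `--proto-class` entry becomes class 0, and each entry maps to its index plus one. A minimal sketch of that mapping (hypothetical helper; the script's actual `isProtoClass` may match differently, e.g. by exact name rather than substring):

```python
def proto_class_index(proto_classes, proto):
    """Map a protocol name to a class label: 0 for 'no match',
    index + 1 for the first matching --proto-class entry."""
    proto = proto.lower()
    for i, pc in enumerate(proto_classes):
        if pc in proto:  # assumption: substring match
            return i + 1
    return 0
```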

View File

@@ -5,7 +5,8 @@ import sys
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../usr/share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor

View File

@@ -0,0 +1 @@
jsonschema

View File

@@ -1,15 +1,23 @@
#!/usr/bin/env python3
import base64
import os
import sys
sys.path.append(os.path.dirname(sys.argv[0]) + '/../../dependencies')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]) + '/../usr/share/nDPId')
sys.path.append(os.path.dirname(sys.argv[0]))
sys.path.append(sys.base_prefix + '/share/nDPId')
import nDPIsrvd
from nDPIsrvd import nDPIsrvdSocket, TermColor
class Stats:
KEYS = [ ['init','reconnect','shutdown','status' ], \
[ 'new','end','idle','update', ],
[ 'analyse' ], \
[ 'guessed','detected','detection-update','not-detected' ], \
[ 'packet', 'packet-flow'] ]
ALL_KEYS = KEYS[0] + KEYS[1] + KEYS[2] + KEYS[3] + KEYS[4]
def __init__(self, nDPIsrvd_sock):
self.nsock = nDPIsrvd_sock
@@ -20,11 +28,7 @@ class Stats:
self.print_nmb_every = self.print_dot_every * 5
def resetEventCounter(self):
keys = ['init','reconnect','shutdown','status', \
'new','end','idle','update',
'guessed','detected','detection-update','not-detected', \
'packet', 'packet-flow']
for k in keys:
for k in Stats.ALL_KEYS:
self.event_counter[k] = 0
def incrementEventCounter(self, json_dict):
@@ -52,13 +56,9 @@ class Stats:
return True
def getEventCounterStr(self):
keys = [ [ 'init','reconnect','shutdown','status' ], \
[ 'new','end','idle','update' ], \
[ 'guessed','detected','detection-update','not-detected' ], \
[ 'packet', 'packet-flow' ] ]
retval = str()
retval += '-' * 98 + '--\n'
for klist in keys:
for klist in Stats.KEYS:
for k in klist:
retval += '| {:<16}: {:<4} '.format(k, self.event_counter[k])
retval += '\n--' + '-' * 98 + '\n'
@@ -74,6 +74,24 @@ class SemanticValidationException(Exception):
else:
return 'Flow ID {}: {}'.format(self.current_flow.flow_id, self.text)
def verifyFlows(nsock, instance):
invalid_flows = nsock.verify()
if len(invalid_flows) > 0:
invalid_flows_str = ''
for flow_id in invalid_flows:
flow = instance.flows[flow_id]
try:
l4_proto = flow.l4_proto
except AttributeError:
l4_proto = 'n/a'
invalid_flows_str += '{} proto[{},{}] ts[{} + {} < {}] diff[{}], '.format(flow_id, l4_proto, flow.flow_idle_time,
flow.flow_last_seen, flow.flow_idle_time,
instance.getMostRecentFlowTime(flow.thread_id),
instance.getMostRecentFlowTime(flow.thread_id) -
(flow.flow_last_seen + flow.flow_idle_time))
raise SemanticValidationException(None, 'Flow Manager verification failed for: {}'.format(invalid_flows_str[:-2]))
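The `ts[a + b < c]` fragment in the diagnostic string above encodes the timeout invariant being verified: a flow is invalid if it should already have been expired. A one-line sketch of that check:

```python
def flow_timed_out(flow_last_seen, flow_idle_time, most_recent_flow_time):
    # Mirrors the ts[a + b < c] check rendered in the diagnostic above:
    # a flow should have been expired once last_seen + idle_time falls
    # behind the most recent flow time seen on the same thread.
    return flow_last_seen + flow_idle_time < most_recent_flow_time
```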
def onFlowCleanup(instance, current_flow, global_user_data):
if type(instance) is not nDPIsrvd.Instance:
raise SemanticValidationException(current_flow,
@@ -101,28 +119,14 @@ def onFlowCleanup(instance, current_flow, global_user_data):
except AttributeError:
l4_proto = 'n/a'
invalid_flows = stats.nsock.verify()
if len(invalid_flows) > 0:
invalid_flows_str = ''
for flow_id in invalid_flows:
flow = instance.flows[flow_id]
try:
l4_proto = flow.l4_proto
except AttributeError:
l4_proto = 'n/a'
invalid_flows_str += '{} proto[{},{}] ts[{} + {} < {}] diff[{}], '.format(flow_id, l4_proto, flow.flow_idle_time,
flow.flow_last_seen, flow.flow_idle_time,
instance.most_recent_flow_time,
instance.most_recent_flow_time -
(flow.flow_last_seen + flow.flow_idle_time))
raise SemanticValidationException(None, 'Flow Manager verification failed for: {}'.format(invalid_flows_str[:-2]))
verifyFlows(stats.nsock, instance)
return True
def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
_, stats = global_user_data
stats.incrementEventCounter(json_dict)
verifyFlows(stats.nsock, instance)
if type(instance) is not nDPIsrvd.Instance:
raise SemanticValidationException(current_flow,
@@ -174,9 +178,11 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
elif json_dict['packet_event_name'] != 'packet-flow':
raise SemanticValidationException(current_flow, 'Layer4 protocol not found in JSON')
if 'flow_last_seen' in json_dict:
if json_dict['flow_last_seen'] != current_flow.flow_last_seen:
raise SemanticValidationException(current_flow, 'Flow last seen: {} != {}'.format(json_dict['flow_last_seen'],
flow_last_seen = None
if 'flow_src_last_pkt_time' in json_dict or 'flow_dst_last_pkt_time' in json_dict:
flow_last_seen = max(json_dict['flow_src_last_pkt_time'], json_dict['flow_dst_last_pkt_time'])
if flow_last_seen != current_flow.flow_last_seen:
raise SemanticValidationException(current_flow, 'Flow last seen: {} != {}'.format(flow_last_seen,
current_flow.flow_last_seen))
if 'flow_idle_time' in json_dict:
@@ -184,15 +190,14 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
raise SemanticValidationException(current_flow, 'Flow idle time mismatch: {} != {}'.format(json_dict['flow_idle_time'],
current_flow.flow_idle_time))
if ('flow_last_seen' in json_dict and 'flow_idle_time' not in json_dict) or \
('flow_last_seen' not in json_dict and 'flow_idle_time' in json_dict):
if (flow_last_seen is not None and 'flow_idle_time' not in json_dict) or \
(flow_last_seen is None and 'flow_idle_time' in json_dict):
raise SemanticValidationException(current_flow,
'Got a JSON string with only one of both keys, ' \
'both required for timeout handling:' \
'flow_last_seen, flow_idle_time')
'Got a JSON message with an incomplete key set, ' \
'required for timeout handling: flow_src_last_pkt_time, flow_dst_last_pkt_time, flow_idle_time')
if 'thread_ts_msec' in json_dict:
current_flow.thread_ts_msec = int(json_dict['thread_ts_msec'])
if 'thread_ts_usec' in json_dict:
current_flow.thread_ts_usec = int(json_dict['thread_ts_usec'])
if 'flow_packet_id' in json_dict:
try:
@@ -208,11 +213,13 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
try:
if current_flow.flow_ended == True:
raise SemanticValidationException(current_flow,
'Received JSON string for a flow that already ended/idled.')
'Received JSON message for a flow that already ended/idled.')
except AttributeError:
pass
if 'packet_event_name' in json_dict:
base64.b64decode(json_dict['pkt'], validate=True)
if json_dict['packet_event_name'] == 'packet-flow':
if lowest_possible_packet_id > json_dict['packet_id']:
raise SemanticValidationException(current_flow,
@@ -249,11 +256,27 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
except AttributeError:
pass
try:
if current_flow.flow_finished == True and \
json_dict['flow_event_name'] == 'detection-update':
raise SemanticValidationException(current_flow,
'Flow state already finished, but another detection-update received.')
except AttributeError:
pass
try:
if json_dict['flow_state'] == 'finished':
current_flow.flow_finished = True
elif json_dict['flow_state'] == 'info' and \
current_flow.flow_finished is True:
raise SemanticValidationException(current_flow,
'Flow state already finished, but switched back to info state.')
except AttributeError:
pass
try:
if current_flow.flow_finished == True and \
json_dict['flow_event_name'] != 'analyse' and \
json_dict['flow_event_name'] != 'update' and \
json_dict['flow_event_name'] != 'idle' and \
json_dict['flow_event_name'] != 'end':
@@ -264,14 +287,14 @@ def onJsonLineRecvd(json_dict, instance, current_flow, global_user_data):
pass
try:
if json_dict['flow_first_seen'] > current_flow.thread_ts_msec or \
json_dict['flow_last_seen'] > current_flow.thread_ts_msec or \
json_dict['flow_first_seen'] > json_dict['flow_last_seen']:
if json_dict['flow_first_seen'] > current_flow.thread_ts_usec or \
flow_last_seen > current_flow.thread_ts_usec or \
json_dict['flow_first_seen'] > flow_last_seen:
raise SemanticValidationException(current_flow,
'Last packet timestamp is invalid: ' \
'first_seen({}) <= {} >= last_seen({})'.format(json_dict['flow_first_seen'],
current_flow.thread_ts_msec,
json_dict['flow_last_seen']))
current_flow.thread_ts_usec,
flow_last_seen))
except AttributeError:
if json_dict['flow_event_name'] == 'new':
pass
@@ -341,6 +364,10 @@ if __name__ == '__main__':
sys.stderr.write('\n{}\n'.format(err))
except KeyboardInterrupt:
print()
except Exception as e:
for failed_line in nsock.failed_lines:
sys.stderr.write('Affected JSON line: {}\n'.format(failed_line[0]))
raise e
sys.stderr.write('\nEvent counter:\n' + stats.getEventCounterStr() + '\n')
if args.strict is True:

View File

@@ -0,0 +1,21 @@
[package]
name = "rs-simple"
version = "0.1.0"
authors = ["Toni Uhlig <toni@impl.cc>"]
edition = "2024"
[dependencies]
argh = "0.1"
bytes = "1"
crossterm = "0.29.0"
io = "0.0.2"
moka = { version = "0.12.10", features = ["future"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1", features = ["full"] }
tui = "0.19.0"
[profile.release]
strip = true
lto = true
codegen-units = 1

View File

@@ -0,0 +1,860 @@
use argh::FromArgs;
use bytes::BytesMut;
use crossterm::{
cursor,
event::{self, KeyCode, KeyEvent},
ExecutableCommand,
terminal::{self, ClearType},
};
use moka::{future::Cache, Expiry};
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::{
collections::HashMap,
fmt,
hash::{Hash, Hasher},
io,
sync::Arc,
time::{Duration, Instant, SystemTime, UNIX_EPOCH},
};
use tokio::io::AsyncReadExt;
use tokio::sync::mpsc;
use tokio::sync::Mutex;
use tokio::sync::MutexGuard;
use tokio::net::TcpStream;
use tui::{
backend::CrosstermBackend,
layout::{Layout, Constraint, Direction},
style::{Style, Color, Modifier},
Terminal,
widgets::{Block, Borders, List, ListItem, Row, Table, TableState},
};
#[derive(FromArgs, Debug)]
/// Simple Rust nDPIsrvd Client Example
struct Args {
/// nDPIsrvd host(s) to connect to
#[argh(option)]
host: Vec<String>,
}
#[derive(Debug)]
enum ParseError {
Protocol(),
Json(),
Schema(),
}
impl From<serde_json::Error> for ParseError {
fn from(_: serde_json::Error) -> Self {
ParseError::Json()
}
}
#[derive(Serialize, Deserialize, Debug, PartialEq)]
#[serde(rename_all = "lowercase")]
enum EventName {
Invalid, New, End, Idle, Update, Analyse,
Guessed, Detected,
#[serde(rename = "detection-update")]
DetectionUpdate,
#[serde(rename = "not-detected")]
NotDetected,
}
#[derive(Serialize, Deserialize, Copy, Clone, Debug)]
#[serde(rename_all = "lowercase")]
enum State {
Unknown, Info, Finished,
}
#[derive(Serialize, Deserialize, Debug)]
struct FlowEventNdpiFlowRisk {
#[serde(rename = "risk")]
risk: String,
}
#[derive(Serialize, Deserialize, Debug)]
struct FlowEventNdpi {
#[serde(rename = "proto")]
proto: String,
#[serde(rename = "flow_risk")]
risks: Option<HashMap<String, FlowEventNdpiFlowRisk>>,
}
#[derive(Serialize, Deserialize, Debug)]
struct FlowEvent {
#[serde(rename = "flow_event_name")]
name: EventName,
#[serde(rename = "flow_id")]
id: u64,
#[serde(rename = "alias")]
alias: String,
#[serde(rename = "source")]
source: String,
#[serde(rename = "thread_id")]
thread_id: u64,
#[serde(rename = "flow_state")]
state: State,
#[serde(rename = "flow_first_seen")]
first_seen: u64,
#[serde(rename = "flow_src_last_pkt_time")]
src_last_pkt_time: u64,
#[serde(rename = "flow_dst_last_pkt_time")]
dst_last_pkt_time: u64,
#[serde(rename = "flow_idle_time")]
idle_time: u64,
#[serde(rename = "flow_src_packets_processed")]
src_packets_processed: u64,
#[serde(rename = "flow_dst_packets_processed")]
dst_packets_processed: u64,
#[serde(rename = "flow_src_tot_l4_payload_len")]
src_tot_l4_payload_len: u64,
#[serde(rename = "flow_dst_tot_l4_payload_len")]
dst_tot_l4_payload_len: u64,
#[serde(rename = "l3_proto")]
l3_proto: String,
#[serde(rename = "l4_proto")]
l4_proto: String,
#[serde(rename = "ndpi")]
ndpi: Option<FlowEventNdpi>,
}
#[derive(Serialize, Deserialize, Debug)]
struct PacketEvent {
pkt_datalink: u16,
pkt_caplen: u64,
pkt_len: u64,
pkt_l4_len: u64,
}
#[derive(Serialize, Deserialize, Clone, Debug)]
struct DaemonEventStatus {
#[serde(rename = "alias")]
alias: String,
#[serde(rename = "source")]
source: String,
#[serde(rename = "thread_id")]
thread_id: u64,
#[serde(rename = "packets-captured")]
packets_captured: u64,
#[serde(rename = "packets-processed")]
packets_processed: u64,
#[serde(rename = "total-skipped-flows")]
total_skipped_flows: u64,
#[serde(rename = "total-l4-payload-len")]
total_l4_payload_len: u64,
#[serde(rename = "total-not-detected-flows")]
total_not_detected_flows: u64,
#[serde(rename = "total-guessed-flows")]
total_guessed_flows: u64,
#[serde(rename = "total-detected-flows")]
total_detected_flows: u64,
#[serde(rename = "total-detection-updates")]
total_detection_updates: u64,
#[serde(rename = "total-updates")]
total_updates: u64,
#[serde(rename = "current-active-flows")]
current_active_flows: u64,
#[serde(rename = "total-active-flows")]
total_active_flows: u64,
#[serde(rename = "total-idle-flows")]
total_idle_flows: u64,
#[serde(rename = "total-compressions")]
total_compressions: u64,
#[serde(rename = "total-compression-diff")]
total_compression_diff: u64,
#[serde(rename = "current-compression-diff")]
current_compression_diff: u64,
#[serde(rename = "global-alloc-bytes")]
global_alloc_bytes: u64,
#[serde(rename = "global-alloc-count")]
global_alloc_count: u64,
#[serde(rename = "global-free-bytes")]
global_free_bytes: u64,
#[serde(rename = "global-free-count")]
global_free_count: u64,
#[serde(rename = "total-events-serialized")]
total_events_serialized: u64,
}
#[derive(Debug)]
enum EventType {
Flow(FlowEvent),
Packet(PacketEvent),
DaemonStatus(DaemonEventStatus),
Other(),
}
#[derive(Default)]
struct Stats {
ui_updates: u64,
flow_count: u64,
parse_errors: u64,
events: u64,
flow_events: u64,
packet_events: u64,
daemon_events: u64,
packet_events_total_caplen: u64,
packet_events_total_len: u64,
packet_events_total_l4_len: u64,
packets_captured: u64,
packets_processed: u64,
flows_total_skipped: u64,
flows_total_l4_payload_len: u64,
flows_total_not_detected: u64,
flows_total_guessed: u64,
flows_current_active: u64,
flows_total_compressions: u64,
flows_total_compression_diff: u64,
flows_current_compression_diff: u64,
global_alloc_bytes: u64,
global_alloc_count: u64,
global_free_bytes: u64,
global_free_count: u64,
total_events_serialized: u64,
}
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum FlowExpiration {
IdleTime(u64),
}
struct FlowExpiry;
#[derive(Clone, Eq, Default, Debug)]
struct FlowKey {
id: u64,
alias: String,
source: String,
thread_id: u64,
}
#[derive(Clone, Debug)]
struct FlowValue {
state: State,
total_src_packets: u64,
total_dst_packets: u64,
total_src_bytes: u64,
total_dst_bytes: u64,
first_seen: std::time::SystemTime,
last_seen: std::time::SystemTime,
timeout_in: std::time::SystemTime,
risks: usize,
proto: String,
app_proto: String,
}
#[derive(Clone, Eq, Default, Debug)]
struct DaemonKey {
alias: String,
source: String,
thread_id: u64,
}
impl Default for State {
fn default() -> State {
State::Unknown
}
}
impl fmt::Display for State {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
State::Unknown => write!(f, "N/A"),
State::Info => write!(f, "Info"),
State::Finished => write!(f, "Finished"),
}
}
}
impl FlowExpiration {
fn as_duration(&self) -> Option<Duration> {
match self {
FlowExpiration::IdleTime(value) => Some(Duration::from_micros(*value)),
}
}
}
impl fmt::Display for FlowExpiration {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.as_duration() {
Some(duration) => {
let secs = duration.as_secs();
write!(f, "{} s", secs)
}
None => write!(f, "N/A"),
}
}
}
impl Expiry<FlowKey, (FlowExpiration, FlowValue)> for FlowExpiry {
fn expire_after_create(
&self,
_key: &FlowKey,
value: &(FlowExpiration, FlowValue),
_current_time: Instant,
) -> Option<Duration> {
value.0.as_duration()
}
}
impl Hash for FlowKey {
fn hash<H: Hasher>(&self, state: &mut H) {
self.id.hash(state);
self.alias.hash(state);
self.source.hash(state);
self.thread_id.hash(state);
}
}
impl PartialEq for FlowKey {
fn eq(&self, other: &Self) -> bool {
self.id == other.id &&
self.alias == other.alias &&
self.source == other.source &&
self.thread_id == other.thread_id
}
}
impl Hash for DaemonKey {
fn hash<H: Hasher>(&self, state: &mut H) {
self.alias.hash(state);
self.source.hash(state);
self.thread_id.hash(state);
}
}
impl PartialEq for DaemonKey {
fn eq(&self, other: &Self) -> bool {
self.alias == other.alias &&
self.source == other.source &&
self.thread_id == other.thread_id
}
}
#[tokio::main]
async fn main() {
let args: Args = argh::from_env();
if args.host.is_empty() {
eprintln!("At least one --host required");
return;
}
let mut connections: Vec<TcpStream> = Vec::new();
for host in args.host {
match TcpStream::connect(host.clone()).await {
Ok(stream) => {
connections.push(stream);
}
Err(e) => {
eprintln!("Error connecting to {}: {}", host, e);
}
}
}
if let Err(e) = terminal::enable_raw_mode() {
eprintln!("Could not enable terminal raw mode: {}", e);
return;
}
let mut stdout = io::stdout();
if let Err(e) = stdout.execute(terminal::Clear(ClearType::All)) {
eprintln!("Could not clear your terminal: {}", e);
return;
}
if let Err(e) = stdout.execute(cursor::Hide) {
eprintln!("Could not hide your cursor: {}", e);
return;
}
let backend = CrosstermBackend::new(stdout);
let mut terminal = Terminal::new(backend);
let (tx, mut rx): (mpsc::Sender<String>, mpsc::Receiver<String>) = mpsc::channel(1024);
let data = Arc::new(Mutex::new(Stats::default()));
let data_tx = Arc::clone(&data);
let data_rx = Arc::clone(&data);
let flow_cache: Arc<Cache<FlowKey, (FlowExpiration, FlowValue)>> = Arc::new(Cache::builder()
.expire_after(FlowExpiry)
.build());
let flow_cache_rx = Arc::clone(&flow_cache);
let daemon_cache: Arc<Cache<DaemonKey, DaemonEventStatus>> = Arc::new(Cache::builder()
.time_to_live(Duration::from_secs(1800))
.build());
tokio::spawn(async move {
while let Some(msg) = rx.recv().await {
match parse_json(&msg) {
Ok(message) => {
let mut data_lock = data_tx.lock().await;
data_lock.events += 1;
update_stats(&message, &mut data_lock, &flow_cache, &daemon_cache).await;
}
Err(_message) => {
let mut data_lock = data_tx.lock().await;
data_lock.parse_errors += 1;
}
}
}
});
for mut stream in connections {
let cloned_tx = tx.clone();
tokio::spawn(async move {
let mut buffer = BytesMut::with_capacity(33792usize);
loop {
let n = match stream.read_buf(&mut buffer).await {
Ok(len) => len,
Err(_) => {
continue; // retry on read error
}
};
if n == 0 {
break;
}
while let Some(message) = parse_message(&mut buffer) {
match cloned_tx.send(message).await {
Ok(_) => (),
Err(_) => return
}
}
}
});
}
let mut table_state = TableState::default();
let mut old_selected: Option<FlowKey> = None;
loop {
let flows: Vec<(FlowKey, (FlowExpiration, FlowValue))> = flow_cache_rx.iter().map(|(k, v)| (k.as_ref().clone(), v.clone()))
.take(128)
.collect();
let mut table_selected = match table_state.selected() {
Some(mut table_index) => {
if table_index >= flows.len() {
flows.len().saturating_sub(1)
} else {
if let Some(ref old_flow_key_selected) = old_selected {
if let Some(old_index) = flows.iter().position(|x| x.0 == *old_flow_key_selected) {
if old_index != table_index {
table_index = old_index;
}
} else {
old_selected = Some(flows.get(table_index).unwrap().0.clone());
}
}
table_index
}
}
None => 0,
};
match read_keypress() {
Some(KeyCode::Esc) => break,
Some(KeyCode::Char('q')) => break,
Some(KeyCode::Up) => {
table_selected = match table_selected {
i if i == 0 => flows.len().saturating_sub(1),
i => i - 1,
};
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(KeyCode::Down) => {
table_selected = match table_selected {
i if i >= flows.len().saturating_sub(1) => 0,
i => i + 1,
};
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(KeyCode::PageUp) => {
table_selected = match table_selected {
i if i == 0 => flows.len().saturating_sub(1),
i if i < 25 => 0,
i => i - 25,
};
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(KeyCode::PageDown) => {
table_selected = match table_selected {
i if i >= flows.len().saturating_sub(1) => 0,
i if i >= flows.len().saturating_sub(25) => flows.len().saturating_sub(1),
i => i + 25,
};
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(KeyCode::Home) => {
table_selected = 0;
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(KeyCode::End) => {
table_selected = flows.len().saturating_sub(1);
if let Some(new_selected) = flows.get(table_selected) {
old_selected = Some(new_selected.0.clone());
}
},
Some(_) => (),
None => ()
};
let mut data_lock = data_rx.lock().await;
data_lock.ui_updates += 1;
draw_ui(terminal.as_mut().unwrap(), &mut table_state, table_selected, &data_lock, &flows);
}
if let Err(e) = terminal.unwrap().backend_mut().execute(cursor::Show) {
eprintln!("Could not show your cursor: {}", e);
return;
}
let mut stdout = io::stdout();
if let Err(e) = stdout.execute(terminal::Clear(ClearType::All)) {
eprintln!("Could not clear your terminal: {}", e);
return;
}
if let Err(e) = terminal::disable_raw_mode() {
eprintln!("Could not disable raw mode: {}", e);
return;
}
println!("\nDone.");
}
fn read_keypress() -> Option<KeyCode> {
if event::poll(Duration::from_millis(1000)).unwrap() {
if let event::Event::Key(KeyEvent { code, .. }) = event::read().unwrap() {
return Some(code);
}
}
None
}
fn parse_message(buffer: &mut BytesMut) -> Option<String> {
if let Some(pos) = buffer.iter().position(|&b| b == b'\n') {
let message = buffer.split_to(pos + 1);
return Some(String::from_utf8_lossy(&message).to_string());
}
None
}
fn parse_json(data: &str) -> Result<EventType, ParseError> {
let first_non_digit = data.find(|c: char| !c.is_ascii_digit()).unwrap_or(0);
let length_str = &data[0..first_non_digit];
let length: usize = length_str.parse().unwrap_or(0);
if length == 0 {
return Err(ParseError::Protocol());
}
if first_non_digit + length > data.len() {
return Err(ParseError::Protocol());
}
let json_str = &data[first_non_digit..first_non_digit + length];
let value: Value = serde_json::from_str(json_str)?;
if value.get("flow_event_name").is_some() {
let flow_event: FlowEvent = serde_json::from_value(value)?;
return Ok(EventType::Flow(flow_event));
} else if value.get("packet_event_name").is_some() {
let packet_event: PacketEvent = serde_json::from_value(value)?;
return Ok(EventType::Packet(packet_event));
} else if value.get("daemon_event_name").is_some() {
if value.get("daemon_event_name").unwrap() == "status" ||
value.get("daemon_event_name").unwrap() == "shutdown"
{
let daemon_status_event: DaemonEventStatus = serde_json::from_value(value)?;
return Ok(EventType::DaemonStatus(daemon_status_event));
}
return Ok(EventType::Other());
} else if value.get("error_event_name").is_some() {
return Ok(EventType::Other());
}
Err(ParseError::Schema())
}
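parse_json() above expects each record to start with a decimal length prefix followed by that many bytes of JSON. A Python sketch of the same framing (an assumption based on this Rust client; the field name used below is illustrative):

```python
def split_ndpisrvd_record(data):
    """Split one '<length><json>' record as parse_json() above does.
    Returns (json_str, rest); raises ValueError on malformed input."""
    i = 0
    while i < len(data) and data[i].isdigit():
        i += 1
    if i == 0:
        raise ValueError('missing length prefix')
    length = int(data[:i])
    json_str = data[i:i + length]
    if len(json_str) != length:
        raise ValueError('truncated record')
    return json_str, data[i + length:]
```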
async fn update_stats(event: &EventType, stats: &mut MutexGuard<'_, Stats>, cache: &Cache<FlowKey, (FlowExpiration, FlowValue)>, daemon_cache: &Cache<DaemonKey, DaemonEventStatus>) {
match &event {
EventType::Flow(flow_event) => {
stats.flow_events += 1;
stats.flow_count = cache.entry_count();
let key = FlowKey { id: flow_event.id, alias: flow_event.alias.to_string(),
source: flow_event.source.to_string(), thread_id: flow_event.thread_id };
if flow_event.name == EventName::End ||
flow_event.name == EventName::Idle
{
cache.remove(&key).await;
return;
}
let first_seen_seconds = flow_event.first_seen / 1_000_000;
let first_seen_nanos = (flow_event.first_seen % 1_000_000) * 1_000;
let first_seen_epoch = std::time::Duration::new(first_seen_seconds, first_seen_nanos as u32);
let first_seen_system = UNIX_EPOCH + first_seen_epoch;
let last_seen = std::cmp::max(flow_event.src_last_pkt_time,
flow_event.dst_last_pkt_time);
let last_seen_seconds = last_seen / 1_000_000;
let last_seen_nanos = (last_seen % 1_000_000) * 1_000;
let last_seen_epoch = std::time::Duration::new(last_seen_seconds, last_seen_nanos as u32);
let last_seen_system = UNIX_EPOCH + last_seen_epoch;
let timeout_seconds = (last_seen + flow_event.idle_time) / 1_000_000;
let timeout_nanos = ((last_seen + flow_event.idle_time) % 1_000_000) * 1_000;
let timeout_epoch = std::time::Duration::new(timeout_seconds, timeout_nanos as u32);
let timeout_system = UNIX_EPOCH + timeout_epoch;
let risks = match &flow_event.ndpi {
None => 0,
Some(ndpi) => match &ndpi.risks {
None => 0,
Some(risks) => risks.len(),
},
};
let app_proto = match &flow_event.ndpi {
None => "-",
Some(ndpi) => &ndpi.proto,
};
let value = FlowValue {
state: flow_event.state,
total_src_packets: flow_event.src_packets_processed,
total_dst_packets: flow_event.dst_packets_processed,
total_src_bytes: flow_event.src_tot_l4_payload_len,
total_dst_bytes: flow_event.dst_tot_l4_payload_len,
first_seen: first_seen_system,
last_seen: last_seen_system,
timeout_in: timeout_system,
risks: risks,
proto: flow_event.l3_proto.to_string() + "/" + &flow_event.l4_proto,
app_proto: app_proto.to_string(),
};
cache.insert(key, (FlowExpiration::IdleTime(flow_event.idle_time), value)).await;
}
EventType::Packet(packet_event) => {
stats.packet_events += 1;
stats.packet_events_total_caplen += packet_event.pkt_caplen;
stats.packet_events_total_len += packet_event.pkt_len;
stats.packet_events_total_l4_len += packet_event.pkt_l4_len;
}
EventType::DaemonStatus(daemon_status_event) => {
let key = DaemonKey { alias: daemon_status_event.alias.to_string(),
source: daemon_status_event.source.to_string(),
thread_id: daemon_status_event.thread_id };
stats.daemon_events += 1;
daemon_cache.insert(key, daemon_status_event.clone()).await;
stats.packets_captured = 0;
stats.packets_processed = 0;
stats.flows_total_skipped = 0;
stats.flows_total_l4_payload_len = 0;
stats.flows_total_not_detected = 0;
stats.flows_total_guessed = 0;
stats.flows_current_active = 0;
stats.flows_total_compressions = 0;
stats.flows_total_compression_diff = 0;
stats.flows_current_compression_diff = 0;
stats.global_alloc_bytes = 0;
stats.global_alloc_count = 0;
stats.global_free_bytes = 0;
stats.global_free_count = 0;
stats.total_events_serialized = 0;
let daemons: Vec<DaemonEventStatus> = daemon_cache.iter().map(|(_, v)| (v.clone())).collect();
for daemon in daemons {
stats.packets_captured += daemon.packets_captured;
stats.packets_processed += daemon.packets_processed;
stats.flows_total_skipped += daemon.total_skipped_flows;
stats.flows_total_l4_payload_len += daemon.total_l4_payload_len;
stats.flows_total_not_detected += daemon.total_not_detected_flows;
stats.flows_total_guessed += daemon.total_guessed_flows;
stats.flows_current_active += daemon.current_active_flows;
stats.flows_total_compressions += daemon.total_compressions;
stats.flows_total_compression_diff += daemon.total_compression_diff;
stats.flows_current_compression_diff += daemon.current_compression_diff;
stats.global_alloc_bytes += daemon.global_alloc_bytes;
stats.global_alloc_count += daemon.global_alloc_count;
stats.global_free_bytes += daemon.global_free_bytes;
stats.global_free_count += daemon.global_free_count;
stats.total_events_serialized += daemon.total_events_serialized;
}
}
EventType::Other() => {}
}
}
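The timestamp arithmetic above treats flow event times as microseconds since the Unix epoch: last-seen is the maximum of the per-direction packet times, and the expiry point adds `flow_idle_time`. The same arithmetic in Python, for reference:

```python
from datetime import datetime, timezone

def usec_to_datetime(usec):
    # Microseconds since the Unix epoch, matching the
    # Duration::new(secs, nanos) conversion above.
    return datetime.fromtimestamp(usec / 1_000_000, tz=timezone.utc)

def flow_timeout_usec(src_last_pkt_time, dst_last_pkt_time, idle_time):
    # A flow expires idle_time microseconds after its most recent packet.
    return max(src_last_pkt_time, dst_last_pkt_time) + idle_time
```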
fn format_bytes(bytes: u64) -> String {
const KB: u64 = 1024;
const MB: u64 = KB * 1024;
const GB: u64 = MB * 1024;
if bytes >= GB {
format!("{} GB", bytes / GB)
} else if bytes >= MB {
format!("{} MB", bytes / MB)
} else if bytes >= KB {
format!("{} kB", bytes / KB)
} else {
format!("{} B", bytes)
}
}
fn draw_ui<B: tui::backend::Backend>(terminal: &mut Terminal<B>, table_state: &mut TableState, table_selected: usize, data: &MutexGuard<Stats>, flows: &Vec<(FlowKey, (FlowExpiration, FlowValue))>) {
let general_items = vec![
ListItem::new("TUI Updates..: ".to_owned() + &data.ui_updates.to_string()),
ListItem::new("Flows Cached.: ".to_owned() + &data.flow_count.to_string()),
ListItem::new("Total Events.: ".to_owned() + &data.events.to_string()),
ListItem::new("Parse Errors.: ".to_owned() + &data.parse_errors.to_string()),
ListItem::new("Flow Events..: ".to_owned() + &data.flow_events.to_string()),
];
let packet_items = vec![
ListItem::new("Total Events........: ".to_owned() + &data.packet_events.to_string()),
ListItem::new("Total Capture Length: ".to_owned() + &format_bytes(data.packet_events_total_caplen)),
ListItem::new("Total Length........: ".to_owned() + &format_bytes(data.packet_events_total_len)),
ListItem::new("Total L4 Length.....: ".to_owned() + &format_bytes(data.packet_events_total_l4_len)),
];
let daemon_items = vec![
ListItem::new("Total Events.............: ".to_owned() + &data.daemon_events.to_string()),
ListItem::new("Total Packets Captured...: ".to_owned() + &data.packets_captured.to_string()),
ListItem::new("Total Packets Processed..: ".to_owned() + &data.packets_processed.to_string()),
ListItem::new("Total Flows Skipped......: ".to_owned() + &data.flows_total_skipped.to_string()),
ListItem::new("Total Flows Not-Detected.: ".to_owned() + &data.flows_total_not_detected.to_string()),
ListItem::new("Total Compressions/Memory: ".to_owned() + &data.flows_total_compressions.to_string()
+ " / " + &format_bytes(data.flows_total_compression_diff) + " deflate"),
ListItem::new("Total Memory in Use......: ".to_owned() + &format_bytes(data.global_alloc_bytes - data.global_free_bytes)
+ " (" + &format_bytes(data.flows_current_compression_diff) + " deflate)"),
ListItem::new("Total Events Serialized..: ".to_owned() + &data.total_events_serialized.to_string()),
ListItem::new("Current Flows Active.....: ".to_owned() + &data.flows_current_active.to_string()),
];
let table_rows: Vec<Row> = flows
.into_iter()
.map(|(key, (_exp, val))| {
let first_seen_display = match val.first_seen.elapsed() {
Ok(elapsed) => {
match elapsed.as_secs() {
t if t > (3_600 * 24) => format!("{} d ago", t / (3_600 * 24)),
t if t > 3_600 => format!("{} h ago", t / 3_600),
t if t > 60 => format!("{} min ago", t / 60),
t if t > 0 => format!("{} s ago", t),
t if t == 0 => "< 1 s ago".to_string(),
t => format!("INVALID: {}", t),
}
}
Err(err) => format!("ERROR: {}", err)
};
let last_seen_display = match val.last_seen.elapsed() {
Ok(elapsed) => {
match elapsed.as_secs() {
t if t > (3_600 * 24) => format!("{} d ago", t / (3_600 * 24)),
t if t > 3_600 => format!("{} h ago", t / 3_600),
t if t > 60 => format!("{} min ago", t / 60),
t if t > 0 => format!("{} s ago", t),
t if t == 0 => "< 1 s ago".to_string(),
t => format!("INVALID: {}", t),
}
}
Err(_err) => "ERROR".to_string()
};
let timeout_display = match val.timeout_in.duration_since(SystemTime::now()) {
Ok(elapsed) => {
match elapsed.as_secs() {
t if t > (3_600 * 24) => format!("in {} d", t / (3_600 * 24)),
t if t > 3_600 => format!("in {} h", t / 3_600),
t if t > 60 => format!("in {} min", t / 60),
t if t > 0 => format!("in {} s", t),
t if t == 0 => "in < 1 s".to_string(),
t => format!("INVALID: {}", t),
}
}
Err(_err) => "EXPIRED".to_string()
};
Row::new(vec![
key.id.to_string(),
val.state.to_string(),
first_seen_display,
last_seen_display,
timeout_display,
(val.total_src_packets + val.total_dst_packets).to_string(),
format_bytes(val.total_src_bytes + val.total_dst_bytes),
val.risks.to_string(),
val.proto.to_string(),
val.app_proto.to_string(),
])
})
.collect();
terminal.draw(|f| {
let size = f.size();
let chunks = Layout::default()
.direction(Direction::Vertical)
.constraints(
[
Constraint::Length(11),
Constraint::Percentage(100),
].as_ref()
)
.split(size);
let top_chunks = Layout::default()
.direction(Direction::Horizontal)
.constraints(
[
Constraint::Percentage(25),
Constraint::Percentage(30),
Constraint::Percentage(55),
].as_ref()
)
.split(chunks[0]);
let table_selected_abs = match table_selected {
_ if flows.len() == 0 => 0,
i => i + 1,
};
let table = Table::new(table_rows)
.header(Row::new(vec!["Flow ID", "State", "First Seen", "Last Seen", "Timeout", "Total Packets", "Total Bytes", "Risks", "L3/L4", "L7"])
.style(Style::default().fg(Color::Yellow).add_modifier(Modifier::BOLD)))
.block(Block::default().title("Flow Table (selected: ".to_string() +
&table_selected_abs.to_string() +
"): " +
&flows.len().to_string() +
" item(s)").borders(Borders::ALL))
.highlight_style(Style::default().bg(Color::Blue))
.widths(&[
Constraint::Length(10),
Constraint::Length(10),
Constraint::Length(12),
Constraint::Length(12),
Constraint::Length(10),
Constraint::Length(13),
Constraint::Length(12),
Constraint::Length(6),
Constraint::Length(12),
Constraint::Length(15),
]);
let general_list = List::new(general_items)
.block(Block::default().title("General").borders(Borders::ALL));
let packet_list = List::new(packet_items)
.block(Block::default().title("Packet Events").borders(Borders::ALL));
let daemon_list = List::new(daemon_items)
.block(Block::default().title("Daemon Events").borders(Borders::ALL));
table_state.select(Some(table_selected));
f.render_widget(general_list, top_chunks[0]);
f.render_widget(packet_list, top_chunks[1]);
f.render_widget(daemon_list, top_chunks[2]);
f.render_stateful_widget(table, chunks[1], table_state);
}).unwrap();
}


@@ -0,0 +1,28 @@
filebeat.inputs:
- type: unix
id: "NDPId-logs" # adjust this id to your preference
max_message_size: 100MiB
index: "index-name" # Replace this with your desired index name in Elasticsearch
enabled: true
path: "/var/run/nDPId.sock" # point nDPId to this Unix Socket (Collector)
processors:
- script: # execute JavaScript to strip the leading 5-digit number and the trailing newline
lang: javascript
id: trim
source: >
function process(event) {
event.Put("message", event.Get("message").trim().slice(5));
}
- decode_json_fields: # Decode the JSON output
fields: ["message"]
process_array: true
max_depth: 10
target: ""
overwrite_keys: true
add_error_key: false
- drop_fields: # Delete the message field, which holds the undecoded JSON (comment this out if you need the original message)
fields: ["message"]
- rename:
fields:
- from: "source" # Prevents a conflict in Elasticsearch and renames the field
to: "Source_Interface"

Submodule libnDPI updated: 7c19de4904...75db1a8a66

File diff suppressed because it is too large

4207
nDPId.c

File diff suppressed because it is too large

File diff suppressed because it is too large

207
ncrypt.c Normal file

@@ -0,0 +1,207 @@
#include "ncrypt.h"
#include <endian.h>
#include <openssl/conf.h>
#include <openssl/core_names.h>
#include <openssl/err.h>
#include <openssl/pem.h>
#include <openssl/ssl.h>
#include <unistd.h>
int ncrypt_init(void)
{
SSL_load_error_strings();
OpenSSL_add_all_algorithms();
return NCRYPT_SUCCESS;
}
static int ncrypt_init_ctx(struct ncrypt_ctx * const ctx, SSL_METHOD const * const meth)
{
if (meth == NULL)
{
return NCRYPT_NULL_PTR;
}
if (ctx->ssl_ctx != NULL)
{
return NCRYPT_ALREADY_INITIALIZED;
}
ctx->ssl_ctx = SSL_CTX_new(meth);
if (ctx->ssl_ctx == NULL)
{
return NCRYPT_NOT_INITIALIZED;
}
SSL_CTX_set_min_proto_version(ctx->ssl_ctx, TLS1_3_VERSION);
SSL_CTX_set_max_proto_version(ctx->ssl_ctx, TLS1_3_VERSION);
SSL_CTX_set_ciphersuites(ctx->ssl_ctx, "TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256");
return NCRYPT_SUCCESS;
}
static int ncrypt_load_pems(struct ncrypt_ctx * const ctx,
char const * const ca_path,
char const * const privkey_pem_path,
char const * const pubkey_pem_path)
{
if (SSL_CTX_use_certificate_file(ctx->ssl_ctx, pubkey_pem_path, SSL_FILETYPE_PEM) <= 0 ||
SSL_CTX_use_PrivateKey_file(ctx->ssl_ctx, privkey_pem_path, SSL_FILETYPE_PEM) <= 0 ||
SSL_CTX_load_verify_locations(ctx->ssl_ctx, ca_path, NULL) <= 0)
{
return NCRYPT_PEM_LOAD_FAILED;
}
SSL_CTX_set_verify(ctx->ssl_ctx, SSL_VERIFY_PEER | SSL_VERIFY_FAIL_IF_NO_PEER_CERT, NULL);
SSL_CTX_set_verify_depth(ctx->ssl_ctx, 4);
return NCRYPT_SUCCESS;
}
int ncrypt_init_client(struct ncrypt_ctx * const ctx,
char const * const ca_path,
char const * const privkey_pem_path,
char const * const pubkey_pem_path)
{
if (ca_path == NULL || privkey_pem_path == NULL || pubkey_pem_path == NULL)
{
return NCRYPT_NULL_PTR;
}
int rv = ncrypt_init_ctx(ctx, TLS_client_method());
if (rv != NCRYPT_SUCCESS)
{
return rv;
}
return ncrypt_load_pems(ctx, ca_path, privkey_pem_path, pubkey_pem_path);
}
int ncrypt_init_server(struct ncrypt_ctx * const ctx,
char const * const ca_path,
char const * const privkey_pem_path,
char const * const pubkey_pem_path)
{
if (ca_path == NULL || privkey_pem_path == NULL || pubkey_pem_path == NULL)
{
return NCRYPT_NULL_PTR;
}
int rv = ncrypt_init_ctx(ctx, TLS_server_method());
if (rv != NCRYPT_SUCCESS)
{
return rv;
}
return ncrypt_load_pems(ctx, ca_path, privkey_pem_path, pubkey_pem_path);
}
int ncrypt_on_connect(struct ncrypt_ctx * const ctx, int connect_fd, struct ncrypt_entity * const ent)
{
if (ent->ssl == NULL)
{
ent->ssl = SSL_new(ctx->ssl_ctx);
if (ent->ssl == NULL)
{
return NCRYPT_NOT_INITIALIZED;
}
SSL_set_fd(ent->ssl, connect_fd);
SSL_set_connect_state(ent->ssl);
}
int rv = SSL_do_handshake(ent->ssl);
if (rv != 1)
{
return SSL_get_error(ent->ssl, rv);
}
return NCRYPT_SUCCESS;
}
int ncrypt_on_accept(struct ncrypt_ctx * const ctx, int accept_fd, struct ncrypt_entity * const ent)
{
if (ent->ssl == NULL)
{
ent->ssl = SSL_new(ctx->ssl_ctx);
if (ent->ssl == NULL)
{
return NCRYPT_NOT_INITIALIZED;
}
SSL_set_fd(ent->ssl, accept_fd);
SSL_set_accept_state(ent->ssl);
}
int rv = SSL_accept(ent->ssl);
if (rv != 1)
{
return SSL_get_error(ent->ssl, rv);
}
return NCRYPT_SUCCESS;
}
ssize_t ncrypt_read(struct ncrypt_entity * const ent, char * const json_msg, size_t json_msg_len)
{
if (ent->ssl == NULL)
{
errno = EPROTO;
return -1;
}
int rv = SSL_read(ent->ssl, json_msg, json_msg_len);
if (rv <= 0)
{
int err = SSL_get_error(ent->ssl, rv);
if (err == SSL_ERROR_WANT_WRITE || err == SSL_ERROR_WANT_READ)
{
errno = EAGAIN;
}
else if (err != SSL_ERROR_SYSCALL)
{
errno = EPROTO;
}
return -1;
}
return rv;
}
ssize_t ncrypt_write(struct ncrypt_entity * const ent, char const * const json_msg, size_t json_msg_len)
{
if (ent->ssl == NULL)
{
errno = EPROTO;
return -1;
}
int rv = SSL_write(ent->ssl, json_msg, json_msg_len);
if (rv <= 0)
{
int err = SSL_get_error(ent->ssl, rv);
if (err == SSL_ERROR_WANT_WRITE || err == SSL_ERROR_WANT_READ)
{
errno = EAGAIN;
}
else if (err != SSL_ERROR_SYSCALL)
{
errno = EPROTO;
}
return -1;
}
return rv;
}
void ncrypt_free_entity(struct ncrypt_entity * const ent)
{
SSL_free(ent->ssl);
ent->ssl = NULL;
}
void ncrypt_free_ctx(struct ncrypt_ctx * const ctx)
{
SSL_CTX_free(ctx->ssl_ctx);
ctx->ssl_ctx = NULL;
EVP_cleanup();
}

73
ncrypt.h Normal file

@@ -0,0 +1,73 @@
#ifndef NCRYPT_H
#define NCRYPT_H 1
#include <stdlib.h>
#define ncrypt_ctx(x) \
do \
{ \
(x)->ssl_ctx = NULL; \
} while (0);
#define ncrypt_entity(x) \
do \
{ \
(x)->ssl = NULL; \
(x)->handshake_done = 0; \
} while (0);
#define ncrypt_handshake_done(x) ((x)->handshake_done)
#define ncrypt_set_handshake(x) \
do \
{ \
(x)->handshake_done = 1; \
} while (0)
#define ncrypt_clear_handshake(x) \
do \
{ \
(x)->handshake_done = 0; \
} while (0)
enum
{
NCRYPT_SUCCESS = 0,
NCRYPT_NOT_INITIALIZED = -1,
NCRYPT_ALREADY_INITIALIZED = -2,
NCRYPT_NULL_PTR = -3,
NCRYPT_PEM_LOAD_FAILED = -4
};
struct ncrypt_ctx
{
void * ssl_ctx;
};
struct ncrypt_entity
{
void * ssl;
int handshake_done;
};
int ncrypt_init(void);
int ncrypt_init_client(struct ncrypt_ctx * const ctx,
char const * const ca_path,
char const * const privkey_pem_path,
char const * const pubkey_pem_path);
int ncrypt_init_server(struct ncrypt_ctx * const ctx,
char const * const ca_path,
char const * const privkey_pem_path,
char const * const pubkey_pem_path);
int ncrypt_on_connect(struct ncrypt_ctx * const ctx, int connect_fd, struct ncrypt_entity * const ent);
int ncrypt_on_accept(struct ncrypt_ctx * const ctx, int accept_fd, struct ncrypt_entity * const ent);
ssize_t ncrypt_read(struct ncrypt_entity * const ent, char * const json_msg, size_t json_msg_len);
ssize_t ncrypt_write(struct ncrypt_entity * const ent, char const * const json_msg, size_t json_msg_len);
void ncrypt_free_entity(struct ncrypt_entity * const ent);
void ncrypt_free_ctx(struct ncrypt_ctx * const ctx);
#endif

102
ndpid.conf.example Normal file

@@ -0,0 +1,102 @@
[general]
# Set the network interface from which packets are captured and processed.
# Leave it empty to let nDPId choose the default network interface.
#netif = eth0
# Set a Berkeley Packet Filter.
# This will work for libpcap as well as with PF_RING.
#bpf = udp or tcp
# Decapsulate Layer4 tunnel protocols.
# Supported protocols: GRE
#decode-tunnel = true
#pidfile = /tmp/ndpid.pid
#user = nobody
#group = daemon
#riskdomains = /path/to/libnDPI/example/risky_domains.txt
#protocols = /path/to/libnDPI/example/protos.txt
#categories = /path/to/libnDPI/example/categories.txt
#ja4 = /path/to/libnDPI/example/ja4_fingerprints.csv
#sha1 = /path/to/libnDPI/example/sha1_fingerprints.csv
# Collector endpoint as UNIX socket (usually nDPIsrvd)
#collector = /run/nDPIsrvd/collector
# Collector endpoint as UDP socket (usually a custom application)
#collector = 127.0.0.1:7777
# Set a name for this nDPId instance
#alias = myhostname
# Set an optional UUID for this instance
# If the value starts with a '/' or '.', it is interpreted as a path
# from which the uuid is read from.
#uuid = 00000000-dead-c0de-0000-123456789abc
#uuid = ./path/to/some/file
#uuid = /proc/sys/kernel/random/uuid
#uuid = /sys/class/dmi/id/product_uuid
# Process only internal initial connections (src->dst)
#internal = true
# Process only external initial connections (dst->src)
#external = true
# Enable zLib compression of flow memory for long lasting flows
compression = true
# Enable "analyse" events, which can be used for machine learning.
# The daemon will generate some statistical values for every single flow.
# An "analyse" event is thrown after "max-packets-per-flow-to-analyse".
# Please note that the daemon will require a lot more heap memory for every flow.
#analysis = true
# Force poll() on systems that support epoll() as well
#poll = false
# Enable PF_RING packet capture instead of libpcap
#pfring = false
[tuning]
max-flows-per-thread = 2048
max-idle-flows-per-thread = 64
max-reader-threads = 10
daemon-status-interval = 600000000
#memory-profiling-log-interval = 5
compression-scan-interval = 20000000
compression-flow-inactivity = 30000000
flow-scan-interval = 10000000
generic-max-idle-time = 600000000
icmp-max-idle-time = 120000000
tcp-max-idle-time = 180000000
udp-max-idle-time = 7440000000
tcp-max-post-end-flow-time = 120000000
max-packets-per-flow-to-send = 15
max-packets-per-flow-to-process = 32
max-packets-per-flow-to-analyse = 32
error-event-threshold-n = 16
error-event-threshold-time = 10000000
# Please note that the following options are libnDPI related and can only be set via the config file,
# not as a command line parameter.
# See libnDPI/doc/configuration_parameters.md for detailed information.
[ndpi]
packets_limit_per_flow = 32
flow.direction_detection = enable
flow.track_payload = disable
tcp_ack_payload_heuristic = disable
fully_encrypted_heuristic = enable
libgcrypt.init = 1
dpi.compute_entropy = 1
fpc = disable
dpi.guess_on_giveup = 0x03
flow_risk_lists.load = 1
# Currently broken (upstream)
#flow_risk.crawler_bot.list.load = 1
log.level = 0
[protos]
tls.certificate_expiration_threshold = 7
tls.application_blocks_tracking = enable
stun.max_packets_extra_dissection = 8

31
ndpisrvd.conf.example Normal file

@@ -0,0 +1,31 @@
[general]
#pidfile = /tmp/ndpisrvd.pid
#user = nobody
#group = nogroup
# Collector listener as UNIX socket
#collector = /run/nDPIsrvd/collector
# Distributor listener as UNIX socket
#distributor-unix = /run/nDPIsrvd/distributor
# Distributor listener as IP socket
#distributor-in = 127.0.0.1:7000
# Change group of the collector socket
#collector-group = daemon
# Change group of the distributor socket
#distributor-group = staff
# Max (distributor) clients allowed to connect to nDPIsrvd
max-remote-descriptors = 128
# Additional output buffers, useful if a distributor sink's speed is unstable
max-write-buffers = 1024
# Fall back to blocking I/O if the output buffers are full
blocking-io-fallback = true
# Force poll() on systems that support epoll() as well
#poll = false

Some files were not shown because too many files have changed in this diff