
Commit

Merge pull request #13 from Elektrobit/next
Next
gehwolf authored Aug 15, 2024
2 parents fec825e + 245c2ef commit 1c3ca28
Showing 202 changed files with 926 additions and 1,702 deletions.
4 changes: 3 additions & 1 deletion ci/create_debian_release.sh
@@ -20,6 +20,8 @@ function setup_env() {

git config --local user.name "${GIT_AUTHOR_NAME}"
git config --local user.email "${GIT_AUTHOR_EMAIL}"

export ELOS_DEPENDENCY_CONFIG=./ci/dependencies_emlix.json
}

function create_and_publish_debian_main() {
@@ -105,7 +107,7 @@ setup_env
# from source, because the one from the official repositories is broken.
# Remove this when nosql-plugin and dependency to libmongoc is removed.
sudo apt-get update
sudo apt-get install libmongoc-dev
sudo apt-get install -y libmongoc-dev

if [ $UPDATE_ALL -eq 1 ] || [ $UPDATE_RELEASE -eq 1 ]; then
create_and_publish_debian_main
1 change: 1 addition & 0 deletions ci/index.rst
@@ -120,6 +120,7 @@ In addition the `install_deps.py` provides the following command line options to

.. program-output:: ./ci/install_deps.py -h

To use binaries installed from dependencies, add `build/deps/bin` to the `PATH` environment variable.
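The note above can be applied, for example, as follows (a sketch; it assumes the current working directory is the repository root — adjust the prefix if you build elsewhere):

```shell
# Prepend the dependency install prefix to PATH so that binaries
# installed by install_deps.py resolve without full paths.
export PATH="$(pwd)/build/deps/bin:$PATH"
```

Afterwards, tools installed into `build/deps/bin` can be invoked by name in the same shell session.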

`ci/build.sh`
-------------
12 changes: 12 additions & 0 deletions cmake/index.rst
@@ -4,3 +4,15 @@ CMake options
.. program-output:: cmake -LH 2>/dev/null | sed '0,/^-- Cache values$/d'
:shell:


Usage of find_package
=====================

* Always specify a version: `find_package(dependency X.Y.Z REQUIRED)`
* Specify the version used for development


* The version does not guarantee that future builds will still work with it.
* The version does not necessarily mean that previous versions will not work.
* The version is an indicator for later issue or bug tracking, to say: "It was developed with this version and it worked".
* We usually build against the latest available version of our dependencies, so we can only guarantee that the latest upstream version will work.
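As a sketch of the convention above (the package name and version here are illustrative, taken from the version bumps elsewhere in this commit):

```cmake
# Require the dependency at the version it was developed against.
# The version documents "it worked with this version"; it is an
# indicator for issue tracking, not a compatibility guarantee.
find_package(samconf 0.56.3 REQUIRED)
```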
2 changes: 1 addition & 1 deletion cmake/project.cmake
@@ -1,5 +1,5 @@
# SPDX-License-Identifier: MIT
set(ELOS_VERSION 0.60.11)
set(ELOS_VERSION 0.62.6)

# Attention: Aside from the version, as many things as possible in this file
# should be put into functions, as this solves potential issues with commands
10 changes: 4 additions & 6 deletions conf.py
@@ -21,6 +21,8 @@
'sphinxcontrib.plantuml',
'sphinx_favicon',

"sphinx.ext.autosectionlabel", # better cross referencing

# copy button on code blocks in HTML doc
'sphinx_copybutton',

@@ -33,6 +35,8 @@
]

myst_enable_extensions = ["tasklist"]
autosectionlabel_prefix_document = True
autosectionlabel_maxdepth = 2

templates_path = ['doc/_templates']
exclude_patterns = ['build/deps/**', 'build/*/cmake/_deps/*', 'README.md', '.venv']
@@ -106,12 +110,6 @@
'./src/components/rpnfilter/interface',
'./src/components/rpnfilter/private',
'./src/components/rpnfilter/public',
'./src/components/scanner_legacy/interface',
'./src/components/scanner_legacy/private',
'./src/components/scanner_legacy/public',
'./src/components/scanner_manager_legacy/interface',
'./src/components/scanner_manager_legacy/private',
'./src/components/scanner_manager_legacy/public',
'./src/components/scannermanager/interface',
'./src/components/scannermanager/private',
'./src/components/scannermanager/public',
6 changes: 4 additions & 2 deletions debian.native/README.md
@@ -48,8 +48,10 @@ should be added to a new `.install` file matching the name of the new package.

To prepare a new release on `debian/main` run
1. `git checkout -b <task/-new-release> origin/debian/main`
2. `./ci/docker-run.sh ./ci/create_debian_release.sh <x.y.z>`
3. push branch and create MR for **debian/main** , not *main* and not *integration*!
2. ensure to set `SOURCE_URI` to a repo containing the latest debian releases
of the dependency projects
3. `./ci/docker-run.sh ./ci/create_debian_release.sh <x.y.z>`
4. push branch and create MR for **debian/main** , not *main* and not *integration*!
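The steps above can be sketched as a shell session (the branch name, version, and `SOURCE_URI` value are placeholders for illustration, not values taken from this repository):

```shell
# 1. Branch off the debian packaging branch.
git checkout -b task/new-release origin/debian/main

# 2. Point SOURCE_URI at a repo containing the latest debian releases
#    of the dependency projects (hypothetical URI).
export SOURCE_URI="https://example.org/debian"

# 3. Create the release inside the CI container.
./ci/docker-run.sh ./ci/create_debian_release.sh 1.2.3

# 4. Push the branch; then open the MR against debian/main.
git push origin HEAD
```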

## Packaging Script Maintainer

28 changes: 14 additions & 14 deletions doc/Architecture_Design_Records/event_storage_backends.md
@@ -56,7 +56,7 @@ are designed to search for through and for particular attributes of documents.
*cons*
* needs a mongoDB server which comes with additional dependencies like python

### 2) Custom File Storage – Json File
### 3) Custom File Storage – Json File

To address the special requirements on storing events a sequential approach to
store events serialized as newline Json separated strings is possible.
@@ -78,7 +78,7 @@ approach can be obtained from the corresponding design decision.
* danger of reinventing some other stream or file storage system over time, as more and more "lessons learned"


### 3) systemd like storage of logs
### 4) systemd like storage of logs

https://systemd.io/JOURNAL_FILE_FORMAT/
https://github.com/systemd/systemd
@@ -88,14 +88,14 @@ systemds journald subsystem is a logging system not too different from syslog.
It is, effectively, a block-based protocol, writing its logs to a socket.


## Decision
#### Decision

Systemds journald will not be used.
If the decision is reached to implement a completely new logging mechanism,
the data storage format from journald is a good reference on how to write
a logging format that is easily searchable.

### Rationale
#### Rationale

The API of journald does not support writing to a custom file/location,
which means that we can not simply use the API for logging.
@@ -146,7 +146,7 @@ encoding of our field names. Combining that with the efficient search with
field names as search parameters would make lookup pretty efficient.


### Open Points
#### Open Points
It is unclear if, should we be able to create a shared library for the journald
server, how much of systemds other sources we would need to install as well to
enable the server to run.
@@ -158,20 +158,20 @@ sync is and how many logs would accumulate in that time.
It is unclear how well the corruption protection would work for elos, depending
on how many lookups actually happen.

### 4) Apache Avro Storage of logs
### 5) Apache Avro Storage of logs

https://avro.apache.org/docs/1.11.1/api/c/

Avro supports storing of binary data in an easy way.

# Decision
#### Decision

Creating a code poc is necessary to determine how the API performs with regard
to writing blocks.
During the creation of the poc, further development was halted and avro was
abandoned as a possible logging backend.

## Rationale
#### Rationale

It is certain that we can store an event fully in the data structures available
from Avro.
@@ -191,19 +191,19 @@ in the necessary version locally, which would require building them as well.
When trying to build avro locally, while supplying the necessary dependencies,
the build failed for varying reasons, even with the same setup.

## Open Points
#### Open Points

The number of actual writes that happen when storing an event is unclear,
but at least from the poc development, it seems reasonable to assume that it
is possible to cache multiple events before actually writing them to file.

### 5) Time-Series Databases
### 6) Time-Series Databases

As a representative for Time-Series Databases, InfluxDb was chosen.

https://www.influxdata.com/products/influxdb-overview/.

# Decision
#### Decision

Creating a code poc is necessary to determine how the API performs.
Due to the unavailability of InfluxDBv2 for yocto, the poc was implemented
@@ -216,7 +216,7 @@ the local storing we need for elos.

Further development has not been decided as of yet.

# Rational
#### Rationale

It is confirmed that we can store an elos event to an InfluxDb table and
read it again.
@@ -226,15 +226,15 @@ similar writes have been done, presumably since it needs to write its
metadata for the table only once, and subsequent writes are a lot smaller than for
the other loggers.

## Open Points
#### Open Points
Version 2 of InfluxDb uses a different storage format. The assumption
is that it could perform better in writes than the previous storage formats.

It is also unclear how the write performance changes should we decide to cache
events and write multiple at once, which is easily possible with the InfluxDb
API, in both versions.

### Test Results
## Test Results

The following table displays the results of performance tests, executed
on the S32G.
91 changes: 10 additions & 81 deletions doc/userManual.rst
@@ -100,7 +100,7 @@ filters. The event communication is realized in a publish-poll pattern:
Clients interested in certain events will create an event filter on the
server, which collects incoming matching events in an event queue. A
filter can relate to one or more members of an event. For example only
events from a specific aplication or with a specific severity could be
events from a specific application or with a specific severity could be
collected. Listening clients must keep up the connection to the server,
so their filter and queues will exist further. When an event gets
published by a client and doesn’t match any existing filters, it will be
@@ -195,7 +195,7 @@ Only match events of severity ``warning`` and above. (note: severity is

.event.severity 3 LE

Only match events releated to security incidents with severity
Only match events related to security incidents with severity
``warning`` or higher. (note: severity is 1 == FATAL to 6 == VERBOSE )

::
@@ -675,79 +675,10 @@ elosd Configuration - Options Explained
By default the elosd config options, stored in ‘/etc/elos/elosd.json’ do
look like this:

.. code:: json
{
"root": {
"elos": {
"UseEnv": false,
"Port": 54321,
"Interface": "127.0.0.1",
"ConnectionLimit": 200,
"LogFilter": "connectionmanager.c;dispatcher.c;message_handler.c;message_event_create.c;message_event_push.c",
"LogLevel": "DEBUG",
"EventBlacklist": ".event.messageCode 2000 EQ",
"authorizedProcesses": [
".process.uid 0 EQ .process.gid 0 EQ .process.exec '/bin/elosc' STRCMP AND",
".process.gid 200 EQ .process.exec '/bin/elosc' STRCMP AND",
".process.pid 1 EQ"
],
"EventLogging": {
"PluginSearchPath": "/usr/lib/x86_64-linux-gnu/elos/backend",
"Plugins": {
"Dummy": {
"File": "backend_dummy.so",
"Run": "always",
"Filters": [
"1 1 EQ"
]
},
"JsonBackend": {
"File": "backend_json.so",
"Run": "always",
"Filters": [
"1 1 EQ"
]
}
}
},
"StorageBackend": {
"Json": {
"File": "/var/log/elos/elosd_event.log"
}
},
"Scanner": {
"Path": "/usr/local/lib/elos/scanner",
"Plugins": {
"SyslogScanner": {
"File": "scanner_syslog.so",
"Run": "always",
"Config": {
"SyslogPath": "/dev/log",
"MappingRules": {
"MessageCodes": {
"4000": ".event.source.appName 'ssh' STRCMP",
"2000": ".event.source.appName 'crinit' STRCMP",
"1000": ".event.source.appName 'login' STRCMP"
}
}
}
},
"KmsgScanner": {
"File": "scanner_kmsg.so",
"Run": "always",
"Config": {
"KmsgFile": "/dev/kmsg"
}
}
},
"KmsgScanner": {
"KmsgFile": "/dev/kmsg"
}
}
}
}
}
.. literalinclude:: /src/components/config/elosd.json
:language: json
:caption: elos default config shipped with elos
:linenos:

These options are used if the UseEnv variable is set to false. Otherwise
elos will use environment variables (if there are any defined). The
@@ -780,10 +711,6 @@ use another default value, decided by us.
by unauthorized clients.
- **authorizedProcesses**: A list of process filters that determines if
a client is authorized to publish critical events.
- **StorageBackend/Json/File**: The file where elosd will store all the
logged events (``ELOS_STORAGE_BACKEND_JSON_FILE``)
- **Scanner/Path**: Path to the scanner plugins (``ELOS_SCANNER_PATH``
default value: ``"/usr/lib/elos/scanner"``)
- **Scanner/Plugins/<KmsgScanner>/Config/KmsgFile**: Character device or FIFO file node
to receive logs in kmsg format from (``ELOS_KMSG_FILE`` default
value: ``"/dev/kmsg"``)
@@ -794,6 +721,8 @@ use another default value, decided by us.
``message code, filter`` pairs to set a specific ``message code`` for
an event if the given filter matches the event.

For more details see :ref:`src/components/config/index:Elos Configuration`.

Note: You can create/overwrite environment variables, e.g.
``export ELOSD_PORT='1234'``. If you want to store them
permanently, you can add them to your ``~/.bashrc`` file. You can also
@@ -806,8 +735,8 @@ Event Authorization
Event authorization is implemented by setting two filters in the config
file. The two filters are :

- EventBlacklist: This is an event filter which separtes an event into
critical and non-crtical events and blacklists it, if critical, to
- EventBlacklist: This is an event filter which separates an event into
critical and non-critical events and blacklists it, if critical, to
prevent it from being published by an unauthorized client.

- authorizedProcesses: This is a list of process filters which
4 changes: 2 additions & 2 deletions src/clients/coredump/CMakeLists.txt
@@ -1,6 +1,6 @@
# SPDX-License-Identifier: MIT
find_package(samconf 0.50.1 REQUIRED)
find_package(safu 0.50.1 REQUIRED)
find_package(samconf 0.56.3 REQUIRED)
find_package(safu 0.56.2 REQUIRED)

add_executable(
elos-coredump
2 changes: 1 addition & 1 deletion src/common/CMakeLists.txt
@@ -1,5 +1,5 @@
# SPDX-License-Identifier: MIT
find_package(safu 0.50.1 REQUIRED)
find_package(safu 0.56.2 REQUIRED)

include(../../cmake/shared_library.cmake)

2 changes: 0 additions & 2 deletions src/components/CMakeLists.txt
@@ -13,7 +13,5 @@ add_subdirectory(plugincontrol)
add_subdirectory(pluginmanager)
add_subdirectory(processfilter)
add_subdirectory(rpnfilter)
add_subdirectory(scanner_legacy)
add_subdirectory(scanner_manager_legacy)
add_subdirectory(scannermanager)
add_subdirectory(storagemanager)
4 changes: 2 additions & 2 deletions src/components/clientmanager/CMakeLists.txt
@@ -1,6 +1,6 @@
# SPDX-License-Identifier: MIT
find_package(safu 0.50.1 REQUIRED)
find_package(samconf 0.50.1 REQUIRED)
find_package(safu 0.56.2 REQUIRED)
find_package(samconf 0.56.3 REQUIRED)

create_interface_library(
FROM
4 changes: 2 additions & 2 deletions src/components/config/CMakeLists.txt
@@ -1,6 +1,6 @@
# SPDX-License-Identifier: MIT
find_package(safu 0.50.1 REQUIRED)
find_package(samconf 0.50.1 REQUIRED)
find_package(safu 0.56.2 REQUIRED)
find_package(samconf 0.56.3 REQUIRED)

create_interface_library(
FROM