Merge branch 'Checkmk:master' into checkpoint_fix
Bastian-Kuhn authored Jan 22, 2025
2 parents 65dec4b + b8bc0f3 commit fb86e29
Showing 2,384 changed files with 83,945 additions and 55,183 deletions.
24 changes: 20 additions & 4 deletions .bazelrc
@@ -1,6 +1,11 @@
# https://docs.bazel.build/versions/main/best-practices.html#using-the-bazelrc-file
try-import %workspace%/user.bazelrc

# The remote.bazelrc file contains the secrets for accessing the remote cache and must be added by the corresponding
# Jenkins job (or a user) in the form of:
# common --remote_cache=grpcs://${USER}:${PASSWORD}@{URL}
try-import %workspace%/remote.bazelrc

# default/common bazel args for all commands supporting it
common --experimental_ui_max_stdouterr_bytes=10000000
common --experimental_remote_cache_async
@@ -17,6 +22,7 @@ common --@//:filesystem_layout=lsb
common --enable_bzlmod
# Do not upload artifacts other than from CI, see below, CMK-18656
common --remote_upload_local_results=false
common --lockfile_mode=error

## For specific commands
# Always require debug info.
@@ -43,9 +49,8 @@ common:ci --show_progress_rate_limit=0
common:ci --show_timestamps
common:ci --memory_profile=bazel-memory.profile
common:ci --extra_toolchains="//bazel/toolchains/cc:ci"
common:ci --local_resources=cpu=HOST_CPUS
common:ci --local_resources=cpu=HOST_CPUS*.21
common:ci --local_resources=memory=HOST_RAM*.67
common:ci --lockfile_mode=error
# the later flags will override the previous flags
# upload artifacts only from CI, CMK-18656
common:ci --remote_upload_local_results=true
@@ -69,8 +74,6 @@ build:ci --@rules_python//python/config_settings:bootstrap_impl=script
# Flags for Debug builds
# Definition of "bazel x --config=debug -- ..."
## Common
common:debug --sandbox_debug
common:debug --subcommands=pretty_print
common:debug --announce_rc
# gRPC errors provide stack trace as well
common:debug --verbose_failures
@@ -106,6 +109,19 @@ build:iwyu --@bazel_iwyu//:iwyu_mappings=//bazel/tools:iwyu_mappings
build:mypy --aspects //bazel/tools:aspects.bzl%mypy_aspect
build:mypy --output_groups=mypy

################################################################################
# Example: bazel build --config=clippy //packages/host/cmk-agent-ctl:all
################################################################################
build:clippy --aspects=@rules_rust//rust:defs.bzl%rust_clippy_aspect
build:clippy --output_groups=+clippy_checks
build:clippy --@rules_rust//:clippy_flags=-Dwarnings

################################################################################
# Example: bazel build --config=rustfmt //packages/site/check-http:all
################################################################################
build:rustfmt --aspects=@rules_rust//rust:defs.bzl%rustfmt_aspect
build:rustfmt --output_groups=+rustfmt_checks

# Turn off automatic capturing of environmental stuff like PATH - those will be set explicitly
# (see https://blog.aspect.build/bazelrc-flags)
build --incompatible_strict_action_env
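
The remote.bazelrc referenced at the top of this file is not tracked; as an illustration only, it could be created like this (user, password and cache host below are placeholders, not values from this repository):

```sh
# Sketch: create the untracked remote.bazelrc in the workspace root.
# Credentials and cache URL are placeholders.
cat > remote.bazelrc <<'EOF'
common --remote_cache=grpcs://builduser:secret@remote-cache.example.com
EOF
```
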
31 changes: 10 additions & 21 deletions .envrc
@@ -1,24 +1,13 @@
# -*- mode: sh -*-
export PIPENV_VENV_IN_PROJECT=true

# Enable packaging and Microcore and Livestatus builds to use our build cache.
# See omd/README.md for further information.
export NEXUS_BUILD_CACHE_URL=https://artifacts.lan.tribe29.com/repository/omd-build-cache

layout_cmk_pipenv() {
if [[ ! -f Pipfile ]]; then
log_error 'No Pipfile found. Use `pipenv` to create a Pipfile first.'
exit 2
fi

local VENV=$(pipenv --bare --venv 2>/dev/null)
if [[ -z $VENV || ! -d $VENV ]]; then
make .venv
fi

export VIRTUAL_ENV=$(pipenv --venv)
PATH_add "$VIRTUAL_ENV/bin"
export PIPENV_ACTIVE=1
layout_cmk_uv() {
VIRTUAL_ENV="$(pwd)/.venv"
if [[ -z $VIRTUAL_ENV || ! -d $VIRTUAL_ENV ]]; then
make .venv
fi

PATH_add "$VIRTUAL_ENV/bin"
export UV_ACTIVE=1 # or VENV_ACTIVE=1
export VIRTUAL_ENV
}

layout cmk_pipenv
layout cmk_uv
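
For context, direnv dispatches the `layout cmk_uv` call above to the `layout_cmk_uv` function; a minimal usage sketch (the checkout path is hypothetical):

```sh
# direnv requires explicit approval after .envrc changes before it loads the file.
cd ~/git/checkmk      # hypothetical checkout path
direnv allow .        # trust this .envrc
# From now on, entering the directory builds .venv via `make .venv` if needed
# and puts "$VIRTUAL_ENV/bin" on PATH, so tools from the venv resolve directly:
which python
```
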
9 changes: 4 additions & 5 deletions .github/workflows/pr-autoclose.yml
@@ -23,9 +23,8 @@ jobs:
exempt-pr-labels: 'tracked'
# After 14 days of inactivity, mark PRs as "Stale"
days-before-pr-stale: 14
stale-pr-message: "This PR is stale because it has been open for 14 days with no activity and the Github Actions are not passing."
# Autoclose stale PRs after 14 days.
days-before-pr-close: 14
close-pr-message: "This PR was closed because it has been inactive for 14 days since being marked as stale."
stale-pr-message: "Thank you for your contribution. This pull request has been marked as stale as it has not passed the automated tests and there was no activity for the last 14 days.\nPlease take a look at the ‘Checks’ section for details on the test results and make the necessary changes.\n\nThis pull request will be closed due to inactivity after 60 days, if no action is taken."
# Autoclose stale PRs after 60 days.
days-before-pr-close: 60
close-pr-message: "This pull request has been stale for 60 days and no action has been taken by the author. Unfortunately we have to close this contribution due to inactivity."
repo-token: ${{ secrets.GITHUB_TOKEN }}

38 changes: 16 additions & 22 deletions .github/workflows/pr.yaml
@@ -1,5 +1,5 @@
# This is a simple entry point to execute the basic and most important Python
# tests for Checkmk. We run tools like pylint, black and our pytest based unit
# tests for Checkmk. We run tools like ruff, black and our pytest based unit
# tests here. Some tests, like integration tests or tests of very specific
# components are not executed.
#
@@ -14,18 +14,18 @@ jobs:
testing:
runs-on: ubuntu-22.04
env:
PIPENV_IGNORE_VIRTUALENVS: 1
USE_EXTERNAL_PIPENV_MIRROR: true
PYTHONWARNINGS: ignore:DeprecationWarning
# Avoid falling back to our internal bazel remote cache on github actions
BAZEL_CACHE_URL: ""
strategy:
matrix:
include:
- name: Bandit tests
target: test-bandit
- name: Python formatting
target: test-format-python
- name: Pylint tests
target: test-pylint
- name: Ruff lint tests
target: test-ruff
- name: Python unit tests
target: test-unit
- name: mypy tests
@@ -35,27 +35,22 @@
run: |
# Using existing environment variables within another variables is not working in the jobs.*.env section
# more infos see: https://brandur.org/fragments/github-actions-env-vars-in-env-vars
echo "PIP_CACHE_DIR=$HOME/.cache/pip" >> $GITHUB_ENV
echo "PIPENV_CACHE_DIR=$HOME/.cache/pipenv" >> $GITHUB_ENV
echo "UV_CACHE_DIR=$HOME/.cache/uv" >> $GITHUB_ENV
- name: Checkout Repository
uses: actions/checkout@v3
- name: Parse Python version from defines.make
run: |
echo "PYTHON_VERSION=$(make --no-print-directory --file=defines.make print-PYTHON_VERSION)" >> $GITHUB_ENV
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: Cache pip
- name: Cache uv
uses: actions/cache@v3
with:
key: pip-${{ runner.os }}-${{ hashFiles('Pipfile.lock') }}
path: ${{ env.PIP_CACHE_DIR }}
- name: Cache pipenv
key: uv-${{ runner.os }}-${{ hashFiles('requirements_all_lock.txt') }}
path: ${{ env.UV_CACHE_DIR }}
- name: Cache bazel
uses: actions/cache@v3
with:
key: pipenv-${{ runner.os }}-${{ hashFiles('Pipfile.lock') }}
path: ${{ env.PIPENV_CACHE_DIR }}
key: ${{ runner.os }}-bazel-${{ hashFiles('.bazelversion', '.bazelrc', 'WORKSPACE', 'MODULE.bazel') }}
path: |
~/.cache/bazel
restore-keys: |
${{ runner.os }}-bazel-
- name: Setup Environment
run: |
# ksh: Needed for some "unit test" (test_mk_errpt_aix).
@@ -67,8 +62,7 @@
# gettext: Needed for some "unit tests" (test_i18n.py)
sudo add-apt-repository -y ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install ksh librrd-dev libldap2-dev libsasl2-dev libkrb5-dev libglib2.0-dev gettext g++-13
buildscripts/infrastructure/build-nodes/scripts/install-pipenv.sh
sudo apt-get install ksh libpango1.0-dev librrd-dev libldap2-dev libsasl2-dev libkrb5-dev libglib2.0-dev gettext g++-13
make .venv
- name: Run ${{ matrix.name }}
env:
38 changes: 15 additions & 23 deletions .pre-commit-config.yaml
@@ -1,6 +1,6 @@
#
# This is the configuration file for the pre-commit framework.
# To use this you need to install it seperately and activate it for your repository.
# To use this you need to install it separately and activate it for your repository.
# To do so issue the following commands:
#
# pip3 install pre-commit
@@ -17,10 +17,13 @@ default_stages:
- manual
repos:
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: 'v0.4.4'
rev: "v0.8.3"
hooks:
- id: ruff
- id: ruff-format
- id: ruff
- id: ruff-format
# Run ruff import sorting
- id: ruff
args: [--select, I, --fix]
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
@@ -46,7 +49,7 @@ repos:
- repo: https://github.com/gitleaks/gitleaks
rev: v8.16.1
hooks:
- id: gitleaks
- id: gitleaks
- repo: local
hooks:
- id: check-cmk-namespace
@@ -62,7 +65,7 @@ repos:
- id: bandit
name: Run bandit
# -l level low -ll = level medium -lll level high
entry: scripts/run-pipenv run bandit --config bandit.yaml -ll
entry: scripts/run-uvenv bandit --config bandit.yaml -ll
language: script
types: [file, python]
- id: omd-python-modules
@@ -73,16 +76,11 @@
types: [file]
- id: sphinx
name: Sphinx Documentation
entry: scripts/run-pipenv run make -C doc/documentation html
entry: make -C doc/documentation html
files: ^doc/documentation/.*(rst|puml)$
pass_filenames: false
language: script
types: [file]
- id: pylint
name: Check pylint
entry: scripts/check-pylint
language: script
types: [file, python]
- id: doctest
name: Doctests
entry: scripts/run-doctests
@@ -104,12 +102,6 @@ repos:
entry: scripts/check-absolute-imports.py
language: script
types: [file, python]
- id: flake8
name: Check flake8
entry: scripts/check-flake8
language: script
types: [file, python]
verbose: true
- id: unittest
name: unittests
entry: scripts/run-unittests
@@ -129,10 +121,10 @@
language: script
types: ["bazel", "file", "non-executable", "text"]
verbose: true
- id: pipfile-locking
name: Pipfile.lock
entry: scripts/run-pipenv verify
- id: requirements-locking
name: requirements locking
entry: bazel test //:requirements_test
pass_filenames: false
language: script
files: Pipfile
language: system
files: ^requirements.*\.txt$
verbose: true
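
As a usage sketch with the standard pre-commit CLI (hook ids as defined in the configuration above):

```sh
pip3 install pre-commit
pre-commit install                # register the git hook for this repository
pre-commit run ruff --all-files   # run a single hook, e.g. the ruff linter
pre-commit run --all-files        # run all configured hooks once
```
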
3 changes: 2 additions & 1 deletion .pylintrc
@@ -3,9 +3,10 @@
init-hook=
import sys;
sys.path.insert(0, __file__[:__file__.rfind("/.venv")]); # __file__ is somewhere deep inside the .venv
from tests.testlib.repo import add_protocols_path, add_python_paths;
from tests.testlib.repo import add_protocols_path, add_python_paths, add_otel_collector_path;
add_protocols_path();
add_python_paths();
add_otel_collector_path();
load-plugins=
tests.pylint.cmk_edition_ignores,
tests.pylint.checker_localization,
18 changes: 18 additions & 0 deletions .werks/15345.md
@@ -0,0 +1,18 @@
[//]: # (werk v2)
# mk-job: Discover running jobs

key | value
---------- | ---
date | 2024-11-04T14:53:00+00:00
version | 2.4.0b1
class | fix
edition | cre
component | checks
level | 1
compatible | yes

Previously, only finished jobs were discovered. This led to problems with
long-running jobs: if a service discovery was executed while a job was running,
the job service vanished, because the running job was no longer discovered.

Now running jobs are discovered as well.
18 changes: 18 additions & 0 deletions .werks/15346.md
@@ -0,0 +1,18 @@
[//]: # (werk v2)
# mk-job: clean up old running jobs

key | value
---------- | ---
date | 2024-11-05T13:53:56+00:00
version | 2.4.0b1
class | fix
edition | cre
component | checks
level | 1
compatible | yes

In certain situations, the trap that moves the file indicating a currently
running job is not executed.

The files are now removed if there is no process with the corresponding PID or the
process command name does not contain mk-job.
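
A minimal sketch of the described cleanup check; the marker file name, its location and the assumption that it stores the job's PID are illustrative, not taken from mk-job itself:

```sh
# Remove a leftover "running" marker whose process is gone or is no longer mk-job.
running_file="$1"                    # hypothetical marker file of a running job
pid="$(cat "$running_file")"         # assumption: the marker contains the job's PID
if ! ps -p "$pid" -o comm= | grep -q "mk-job"; then
    rm -f "$running_file"            # no such process, or its command name is not mk-job
fi
```
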
21 changes: 21 additions & 0 deletions .werks/15347.md
@@ -0,0 +1,21 @@
[//]: # (werk v2)
# mk-job: currently running job was not correctly reported

key | value
---------- | ---
date | 2024-11-05T15:06:41+00:00
version | 2.4.0b1
class | fix
edition | cre
component | checks
level | 1
compatible | yes

`mk-job` creates multiple files: a stat file for finished jobs, and a running
file for currently running jobs. The `check_mk_agent` then collects those files.
Prior to this change, the order of those files was important.

The first file belonging to a certain job determined the state of the aggregated job.

Now the aggregated job is set to running even if the running file is not in
the first position.
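
A minimal sketch of the new precedence rule; the file naming is an assumption, and the real aggregation happens in the Checkmk check, not in a shell script:

```sh
# A job counts as running if any of its files is a running marker,
# regardless of the order in which the files are listed.
state="finished"
for f in "$@"; do                    # hypothetical list of files belonging to one job
    case "$f" in
        *running*) state="running" ;;
    esac
done
echo "$state"
```
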
18 changes: 18 additions & 0 deletions .werks/15349.md
@@ -0,0 +1,18 @@
[//]: # (werk v2)
# oracle: Do not discover uptime service for template databases

key | value
---------- | ---
date | 2024-11-27T09:55:10+00:00
version | 2.4.0b1
class | fix
edition | cre
component | checks
level | 1
compatible | yes

Template databases return `-1` for the uptime, which makes the corresponding uptime services crash.

Databases reporting `-1` for the uptime are no longer discovered.

You have to run a service discovery to make the broken uptime services vanish.