RHEL-93205

selinux role does not work during container build


    • Type: Task
    • Resolution: Done
    • Priority: Major
    • Component: rhel-system-roles
    • Sprint: Sprint 13
    • Story Points: 2

      The first attempt produces a lot of failures in both a podman system container and a bootc build. The first failures are:

      TASK [linux-system-roles.selinux : Get SELinux modules facts] ******************
      task path: /home/runner/work/lsr-selinux/lsr-selinux/tests/roles/linux-system-roles.selinux/tasks/main.yml:116
      fatal: [sut]: FAILED! => {"changed": false, "module_stderr": "libsemanage.semanage_direct_list_all: Error while scanning directory /var/lib/selinux/targeted/active/modules. (No such file or directory).\nTraceback (most recent call last):\n  File \"/root/.ansible/tmp/ansible-tmp-1747919177.4415762-11040-95291907400413/AnsiballZ_selinux_modules_facts.py\", line 107, in <module>\n    _ansiballz_main()\n    ~~~~~~~~~~~~~~~^^\n  File \"/root/.ansible/tmp/ansible-tmp-1747919177.4415762-11040-95291907400413/AnsiballZ_selinux_modules_facts.py\", line 99, in _ansiballz_main\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n    ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/root/.ansible/tmp/ansible-tmp-1747919177.4415762-11040-95291907400413/AnsiballZ_selinux_modules_facts.py\", line 47, in invoke_module\n    runpy.run_module(mod_name='ansible.modules.selinux_modules_facts', init_globals=dict(_module_fqn='ansible.modules.selinux_modules_facts', _modlib_path=modlib_path),\n    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n                     run_name='__main__', alter_sys=True)\n                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"<frozen runpy>\", line 226, in run_module\n  File \"<frozen runpy>\", line 98, in _run_module_code\n  File \"<frozen runpy>\", line 88, in _run_code\n  File \"/tmp/ansible_selinux_modules_facts_payload_i147fom9/ansible_selinux_modules_facts_payload.zip/ansible/modules/selinux_modules_facts.py\", line 171, in <module>\n  File \"/tmp/ansible_selinux_modules_facts_payload_i147fom9/ansible_selinux_modules_facts_payload.zip/ansible/modules/selinux_modules_facts.py\", line 92, in run_module\n  File \"/usr/lib64/python3.13/site-packages/semanage.py\", line 249, in semanage_module_list_all\n    return _semanage.semanage_module_list_all(sh)\n           
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^\nFileNotFoundError: [Errno 2] No such file or directory\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
      

      and

      TASK [Add some mapping] ****************************************************************************************************************
      task path: /var/home/martin/upstream/lsr/selinux/tests/tests_all_purge.yml:24
      fatal: [sut]: FAILED! => {"changed": false, "cmd": "set -euo pipefail\necho -e -n \"boolean -m --on samba_enable_home_dirs\nport -a -p tcp -t ssh_port_t 22100\nfcontext -a -t user_home_dir_t /tmp/test_dir\nlogin -a -s staff_u sar-user\n\" | /usr/sbin/semanage -i -", "delta": "0:00:00.125274", "end": "2025-05-22 13:17:14.014630", "msg": "non-zero return code", "rc": 1, "start": "2025-05-22 13:17:13.889356", "stderr": "ValueError: SELinux policy is not managed or store cannot be accessed.", "stderr_lines": ["ValueError: SELinux policy is not managed or store cannot be accessed."], "stdout": "", "stdout_lines": []}
      

      This requires some research, so I am starting by allocating 2 story points (SPs).
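      Both failures point at the same root cause: during a container build there is no managed SELinux policy store under /var/lib/selinux, so libsemanage cannot scan the modules directory and semanage reports "SELinux policy is not managed or store cannot be accessed". A minimal sketch of a guard the role or tests could use before calling semanage (the selinux_store_ready helper and its optional base-path argument are illustrative assumptions, not role API):

```shell
#!/bin/sh
# Sketch only: probe for a usable SELinux policy store before running
# semanage or gathering module facts. The helper name and the optional
# store-root argument are hypothetical, not part of the selinux role.

selinux_store_ready() {
    # $1: policy type (e.g. "targeted")
    # $2: store root, defaulting to the path libsemanage scans in the
    #     reported error (/var/lib/selinux/<type>/active/modules)
    _base="${2:-/var/lib/selinux}"
    [ -d "$_base/$1/active/modules" ]
}

if selinux_store_ready targeted; then
    echo "policy store present: semanage operations should work"
else
    echo "policy store missing: skip semanage, as in a container build"
fi
```

      In the role itself the equivalent check would more likely be an Ansible stat task feeding a when: guard, but the shell form shows the exact directory that libsemanage complained about in the first traceback.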

              Assignee: Martin Pitt (rhn-engineering-mpitt)
              Reporter: Martin Pitt (rhn-engineering-mpitt)
              Votes: 0
              Watchers: 3
